
RX 7900XTX and 7900XT review thread

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
It might have something to do with AMD drivers too - I think they are still rather immature.
AMD's drivers are immature?
I don't think the XT will magically be worth the money with some driver updates.
Granted, I am certain they will boost performance as drivers keep getting updated - that's the lay of the land, the same as with Nvidia's driver updates.

But the gap isn't going to change.

And the hardware issues... are hardware issues.
Coil whine and transient spikes aren't something you can software away.
 

Cyborg

Member
It's on par with the 4080, right? If so... why would you go for the AMD XTX when they are the same price? If you go for the 4080 you get the benefits of much better DLSS and RT.
 

Irobot82

Member
AMD's drivers are immature?
I don't think the XT will magically be worth the money with some driver updates.
Granted, I am certain they will boost performance as drivers keep getting updated - that's the lay of the land, the same as with Nvidia's driver updates.

But the gap isn't going to change.

And the hardware issues... are hardware issues.
Coil whine and transient spikes aren't something you can software away.
Don't buy a reference board
 
It's on par with the 4080, right? If so... why would you go for the AMD XTX when they are the same price? If you go for the 4080 you get the benefits of much better DLSS and RT.
Not exactly. It wins some and loses some, depending on the current-gen graphics engine.

On UE5.1 it is losing about 20% to the RTX 4080, on the Halo engine it's the same ~20%, and on id Tech 7 it's losing nearly 20% as well.


UE5.1 will be used by many developers, just like UE4 was.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Don't buy a reference board


I haven't done that since the 8800 GTS... no, I'm lying, I had a reference GTX 260 Core 216.


But for people who want 2x 8-pin connectors and reasonably sized cards, the reference boards are their best bet.
 
GN is calling out the 7900XT as overpriced, like others have today:



The weird thing is that it totally doesn't bother me, because it makes sense to position the full-chip product as the best performance value within the group of cards manufactured with that chip. The cut-down cards should offer slightly cheaper pricing and help fill a price gap, while not being quite as good a value per frame. At least the FPS per $ still improved over the launch pricing of the previous generation.
 
The reference 7900xtx is £1,050 at Overclockers UK. The lower-end 4080s are around £1,150 to £1,200, and the 7900xtx reference cooler seems a bit shit, so you'll probably want an AIB one. Add at least another £100 and you are looking at... /drumroll... £1,150 for no DLSS and worse RT performance.

Is this DOA in the UK at least? Maybe 4080 prices will go up if people were waiting to see what AMD offered. Maybe there is a use case where you need more raster performance and more than 16 GB of VRAM, but otherwise the already rip-off 4080 looks an OK deal in comparison.

If you managed to get a £1,600 4090 FE you hit the jackpot, as completely gross as that sounds.

Edit: they put the AIB prices up for the 7900xtx and they are £1,200 to £1,300. AMD doesn't even have a price advantage in the UK, as you can pick up a 4080 for less than that.
 

Celcius

°Temp. member
The weird thing is that it totally doesn't bother me, because it makes sense to position the full-chip product as the best performance value within the group of cards manufactured with that chip. The cut-down cards should offer slightly cheaper pricing and help fill a price gap, while not being quite as good a value per frame. At least the FPS per $ still improved over the launch pricing of the previous generation.
The XT is 80% of the performance at 90% of the price of the XTX.
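A quick back-of-the-envelope check of what that means for value (a rough sketch in Python; the $999/$899 launch MSRPs are AMD's, and the 80% performance figure is a review ballpark, not an exact benchmark average):

# Rough perf-per-dollar comparison using launch MSRPs.
# Performance is normalized to the XTX = 1.00; the 0.80 for the XT
# is the ballpark figure quoted above, not a measured average.
cards = {
    "7900 XTX": {"price": 999, "perf": 1.00},
    "7900 XT":  {"price": 899, "perf": 0.80},
}

baseline = cards["7900 XTX"]["perf"] / cards["7900 XTX"]["price"]
for name, c in cards.items():
    value = (c["perf"] / c["price"]) / baseline
    print(f"{name}: {value:.0%} of the XTX's frames per dollar")
# -> the cheaper XT ends up ~11% worse value per frame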
 

Buggy Loop

Member
Is it a limitation with the interconnect? I can't see why it would react so differently from RDNA 2 - the GCD is basically an improvement of it. What's new that could cause these wild variations and the overall underperformance of the silicon dedicated here? Is it a hardware bug with the interconnect?
 
The XT is 80% of the performance at 90% of the price of the XTX.

That's really the way the cut-down cards should have always been sold. Sell the full chip the most aggressively; the cut-down is a cheaper option, but you lose a little of the fps-per-$ value.

I've always thought it was crazy that the GPU vendors would kneecap the full SKU by making the cut-down the better value - that's how you end up with full chips being used to create the lower-priced SKU. Always seemed like a waste of silicon resources.
 

octiny

Banned
Excellent performance, especially at 4K resolution, which was the opposite last gen.

With that said, it's definitely power starved. Undervolting + OC is the way to go until Igor's MorePowerTool becomes compatible. The XTX is going to be insane @ 500W, as it's clear the architecture has much higher OC headroom than the 40 series. It makes sense now why AMD was referring to AIBs unleashing the beast (higher vBIOS power limits).

Link



3.2 GHz.

Once MPT becomes compatible we should see some cards hitting close to 3.5 GHz regularly. Lots of OC potential here.
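For a rough sense of how big that headroom is (a sketch; 2.5 GHz is AMD's advertised boost clock for the reference XTX, and the 3.2/3.5 GHz figures are the ones above):

# Clock headroom relative to AMD's advertised boost clock.
boost_clock = 2.5   # GHz, reference 7900 XTX spec
oc_today = 3.2      # GHz, the result shown above
oc_with_mpt = 3.5   # GHz, speculated once MPT raises power limits

print(f"OC today: +{oc_today / boost_clock - 1:.0%}")       # +28%
print(f"OC with MPT: +{oc_with_mpt / boost_clock - 1:.0%}")  # +40%

Clock alone isn't performance, of course, but a 28-40% frequency bump is far beyond typical out-of-the-box overclocks.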
 

Irobot82

Member
Excellent performance, especially at 4K resolution, which was the opposite last gen.

With that said, it's definitely power starved. Undervolting + OC is the way to go until Igor's MorePowerTool becomes compatible. The XTX is going to be insane @ 500W, as it's clear the architecture has much higher OC headroom than the 40 series. It makes sense now why AMD was referring to AIBs unleashing the beast (higher vBIOS power limits).

Link



3.2 GHz.

Once MPT becomes compatible we should see some cards hitting close to 3.5 GHz regularly. Lots of OC potential here.
That actually makes OC'ing worthwhile like in the old days.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
That's really the way the cut-down cards should have always been sold. Sell the full chip the most aggressively; the cut-down is a cheaper option, but you lose a little of the fps-per-$ value.

I've always thought it was crazy that the GPU vendors would kneecap the full SKU by making the cut-down the better value - that's how you end up with full chips being used to create the lower-priced SKU. Always seemed like a waste of silicon resources.
Cuz the people most concerned with perceived value are the people lower down the totem pole.
At the upper end people don't give a shit about value.
Halo products can be priced at whatever you want regardless of how "worth it" they are.


The RTX 3090 and especially the 3090 Ti were joke-level cards from a value perspective, but 3090/Ti customers didn't/don't give a shit cuz they just want it and can hopefully afford it.
"Average" consumers look at the 3090 and think: I get 90% of the performance from a 3080 for much less?
I'm in, SOLD.
Halo product buyers look at the 3090, see 10% better performance for theycantseethis, and think... fuck yeah, I need it!!!!


The xx60s and xx70s sell so well partly because their perceived value is high.
The xx70 would always beat the range-topper of the previous generation.
Even the 3070 looked like good value at 500 dollars cuz it could match or beat the 1,100-dollar 2080 Ti. (Can't believe I pissed myself laughing at that price, and now Nvidia's cheapest card is 200 dollars more expensive than it.)
Now if you try to sell me a 4070 at a price higher than or equal to the 3080's and tell me it about matches it... I'm not seeing the value.
That thing can go fuck itself.


Pricing people out of products you are actually hoping to move is bad for business.
You'll get your markup from the halo product.
Everything below the halo product has to prove its worth, otherwise potential customers simply won't buy it.
Being stuck with stock is more costly than simply lowering the price so people actually buy the product.
 
Cuz the people most concerned with perceived value are the people lower down the totem pole.
At the upper end people don't give a shit about value.
Halo products can be priced at whatever you want regardless of how "worth it" they are.


The RTX 3090 and especially the 3090 Ti were joke-level cards from a value perspective, but 3090/Ti customers didn't/don't give a shit cuz they just want it and can hopefully afford it.
"Average" consumers look at the 3090 and think: I get 90% of the performance from a 3080 for much less?
I'm in, SOLD.
Halo product buyers look at the 3090, see 10% better performance for theycantseethis, and think... fuck yeah, I need it!!!!


The xx60s and xx70s sell so well partly because their perceived value is high.
The xx70 would always beat the range-topper of the previous generation.
Even the 3070 looked like good value at 500 dollars cuz it could match or beat the 1,100-dollar 2080 Ti. (Can't believe I pissed myself laughing at that price, and now Nvidia's cheapest card is 200 dollars more expensive than it.)
Now if you try to sell me a 4070 at a price higher than or equal to the 3080's and tell me it about matches it... I'm not seeing the value.
That thing can go fuck itself.


Pricing people out of products you are actually hoping to move is bad for business.
You'll get your markup from the halo product.
Everything below the halo product has to prove its worth, otherwise potential customers simply won't buy it.
Being stuck with stock is more costly than simply lowering the price so people actually buy the product.

But you should have different chips for the different price brackets.

Big chip:

Full chip best frames per $ for the chip

Cut down weaker value for the dollar

Maybe a further cut down chip with a little more frames per $ loss

Medium chip:

Full chip best frames per dollar for the chip

Cut down again with a little less fps per $

and so on down the line (rough sketch with made-up numbers below).
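As a concrete illustration of that ladder - entirely made-up prices and relative performance figures, just to show the intended ordering:

# Hypothetical ladder: within each chip, the full die has the best
# frames-per-$ and each cut-down step trades a little value for a
# lower absolute price. All numbers are invented for illustration.
ladder = {
    "Big chip": [("full", 999, 1.00), ("cut", 849, 0.82), ("cut x2", 749, 0.70)],
    "Mid chip": [("full", 599, 0.62), ("cut", 499, 0.50)],
}

for chip, skus in ladder.items():
    full_value = skus[0][2] / skus[0][1]  # fps/$ of that chip's full die
    for sku, price, perf in skus:
        rel = (perf / price) / full_value
        print(f"{chip} ({sku}): ${price}, {rel:.0%} of that chip's full-die fps/$")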

@octiny that's a crazy result for the OC. Would really make those giant coolers seem worth it.
 

DonkeyPunchJr

World’s Biggest Weeb
Nailed it

I was in that hype train for oh so long with ATI/AMD. It was always going to be fixed with X driver or Y card, or they had a master plan to catch Nvidia with their pants down.

When I went to university and started my electrical engineering degree, including a semiconductor course, and read all the university research papers Nvidia participates in to develop the tech, it sort of clicked that they know wtf they are doing at another level. They've had no competition for CUDA for like 15 years, they have 80% of the ML market, they're way ahead in RT, and now we see with this release that they're also way ahead in chip design and efficiency.
Yeah I mean, I don’t want to denigrate AMD’s engineers. They have made some awesome advancements over the years.

There’s just something about this hype machine that is especially ridiculous. The fans/media have a way of fixating on some upcoming AMD tech and building it up in their imaginations to be some secret super weapon that’s going to restore them to former glory.

The fanboys choose to believe it because they have this romanticized The Empire vs Rebel Alliance view of the tech industry. Tech sites/YouTube channels run with it because, well, it makes for a hell of a story and it gets them lots of clicks.

Then when we find out it was mostly hyperbole, instead of choosing to learn some healthy skepticism, they kick the can down the road with “oh just wait for drivers to improve, just wait for the next version, THEN it’s really going to shine.”

It’s almost like some kind of religious faith in the Second Coming.

Anyway TL;DR I respect AMD and their engineers, but fanboys and clickbait media deserve mockery.
 

GymWolf

Member
The reference 7900xtx is £1,050 at Overclockers UK. The lower-end 4080s are around £1,150 to £1,200, and the 7900xtx reference cooler seems a bit shit, so you'll probably want an AIB one. Add at least another £100 and you are looking at... /drumroll... £1,150 for no DLSS and worse RT performance.

Is this DOA in the UK at least? Maybe 4080 prices will go up if people were waiting to see what AMD offered. Maybe there is a use case where you need more raster performance and more than 16 GB of VRAM, but otherwise the already rip-off 4080 looks an OK deal in comparison.

If you managed to get a £1,600 4090 FE you hit the jackpot, as completely gross as that sounds.

Edit: they put the AIB prices up for the 7900xtx and they are £1,200 to £1,300. AMD doesn't even have a price advantage in the UK, as you can pick up a 4080 for less than that.
It is DOA in all of Europe and all the places that have a major price difference compared to America.

Nobody is gonna buy a 7900xtx for the same price as (or more than) a 4080.
 

manfestival

Member
Yeah, it really does appear that AMD and Nvidia made similar moves, though I think the 7900xtx is clearly the move for anyone that doesn't quite want to reach into the upper hemisphere of pricing. Maybe my brain rotted from overpaying for my card last gen, because that $1k doesn't seem too terrible. But in my case this is really too much power for 1440p, unless maybe I start using ray tracing.
 

V1LÆM

Gold Member
I don't know if I can live without DLSS. RTX? Yeah, definitely - but I think if it wasn't for DLSS I'd have upgraded my GPU by now.

I have a 2080. I wanted to get a 4080 but the price is insane, plus I'd need to factor in the cost of a new PSU.

Not sure what to do.
 

GreatnessRD

Member
Yeah I mean, I don’t want to denigrate AMD’s engineers. They have made some awesome advancements over the years.

There’s just something about this hype machine that is especially ridiculous. The fans/media have a way of fixating on some upcoming AMD tech and building it up in their imaginations to be some secret super weapon that’s going to restore them to former glory.

The fanboys choose to believe it because they have this romanticized The Empire vs Rebel Alliance view of the tech industry. Tech sites/YouTube channels run with it because, well, it makes for a hell of a story and it gets them lots of clicks.

Then when we find out it was mostly hyperbole, instead of choosing to learn some healthy skepticism, they kick the can down the road with “oh just wait for drivers to improve, just wait for the next version, THEN it’s really going to shine.”

It’s almost like some kind of religious faith in the Second Coming.

Anyway TL;DR I respect AMD and their engineers, but fanboys and clickbait media deserve mockery.
What I think it is, is just the yearning for someone to dethrone Nvidia, so people latch onto anything they can. AMD is the great hope for a lot of people, at least until Intel decides what they're gonna do with Arc. But folks should be thankful to AMD for putting a little pep in Nvidia's step. The 4090 didn't just become god tier outta nowhere. Just like the rest of the world, Nvidia thought AMD might actually bring some smoke with the 7000 series and brought out the boom stick to make sure the 4090 was still top dog. They saw the 6950 XT creep up on the 3090/Ti just like the rest of us did. With that said, Nvidia is still king and will be for the foreseeable future. AMD has some work to do with their chiplet/hardware design.
 

hlm666

Member
I don't know if I can live without DLSS. RTX? Yeah, definitely - but I think if it wasn't for DLSS I'd have upgraded my GPU by now.

I have a 2080. I wanted to get a 4080 but the price is insane, plus I'd need to factor in the cost of a new PSU.

Not sure what to do.
Won't you still have to buy a PSU if you go with an RDNA3 GPU? The XT uses about the same power as the 4080, and the XTX uses more.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
But you should have different chips for the different price brackets.

Big chip:

Full chip best frames per $ for the chip

Cut down weaker value for the dollar

Maybe a further cut down chip with a little more frames per $ loss

Medium chip:

Full chip best frames per dollar for the chip

Cut down again with a little less fps per $

and so on down the line.
Let's look at last gen for example.

You wanted the 3090 Ti, the full GA102, to be the best bang for buck?
Then the 3080, also GA102, to be much worse bang for buck?

No one will buy the 3080 then, and if the 3090 Ti is 2000 dollars, then again very few people buy it.
So you are left with a shit ton of GA102 chips, some which made yield and some which didn't, none of which are seen as being of value.

It leads you right back to the same point.
The halo product doesn't need to be the most logical from a value perspective, it simply needs to be the best.
Which is why 3090s and 3090 Tis actually sold.
Everything else needs to make sense to consumers.

Doesn't matter what the underlying chip is, cuz who the fuck cares.
Outside of those halo products, price/performance starts to matter.
And if you can't meet consumer expectations there, you are left with stock on your hands.
 

octiny

Banned
I snagged a 7900XT XFX. What is the difference between the XFX and the XFX Merc?

The XFX card is simply a rebranded AMD reference model, versus a custom cooler/board design for the Merc with a higher power-limit vBIOS.

There are quite a few partners selling the reference model rebranded.

 

octiny

Banned
So the reference model is worse?

Yes.

Not to say the reference model is bad - once Igor's MorePowerTool becomes compatible you'll be able to raise the max power limit on the reference model like last gen. However, out of the box, the custom models have more of an advantage than usual in noise, cooling, and OC potential.
 

SantaC

Member
Yes.

Not to say the reference model is bad - once Igor's MorePowerTool becomes compatible you'll be able to raise the max power limit on the reference model like last gen. However, out of the box, the custom models have more of an advantage than usual in noise, cooling, and OC potential.
I made a mistake. The Merc was €100 more, so I didn't order it.
 

Buggy Loop

Member
Why did AMD lie just a month ago in their presentation? That's a lot of goodwill gone out the window. Even MLID and Daniel Owen felt that slap from AMD. The 5000 and 6000 series presentations were legitimately close to retail performance. Strangely, you typically doubt Nvidia's claims, but this time it seems even the 4000 series presentation kind of undersold them. This is upside-down world.
 
Let's look at last gen for example.

You wanted the 3090 Ti, the full GA102, to be the best bang for buck?
Then the 3080, also GA102, to be much worse bang for buck?

No one will buy the 3080 then, and if the 3090 Ti is 2000 dollars, then again very few people buy it.
So you are left with a shit ton of GA102 chips, some which made yield and some which didn't, none of which are seen as being of value.

It leads you right back to the same point.
The halo product doesn't need to be the most logical from a value perspective, it simply needs to be the best.
Which is why 3090s and 3090 Tis actually sold.
Everything else needs to make sense to consumers.

Doesn't matter what the underlying chip is, cuz who the fuck cares.
Outside of those halo products, price/performance starts to matter.
And if you can't meet consumer expectations there, you are left with stock on your hands.

Realistically, the full chip and its derivatives shouldn't be in different classes, unless it's just a supplement to the supply line (like the 2060 KO or whatever they were called). I can see how the 3080 really devalued the GA102: if that chip were only used in the 3090 or better, they'd just manufacture fewer of them, and the 3080 would have been built on a more price-appropriate chip, silicon-wise (which is what they've done with the 4000 series). Unfortunately they've overpriced the 4080, but using a more efficient chip there is a smart choice. That's also likely a behavior you'll see from all vendors going forward: every chip will be designed more around its price point, because the transistors are expensive. I don't think the fact that the 4080 is a 103 chip hurts it at all; the performance level offered is fine. It's the price that hurts it. If it were priced in line with the 3080, or even $100 more, it would be one hell of an upgrade.

When the cost per transistor was constantly going down, the old way was more acceptable. But now, with the yields they get, it's unlikely that there are enough broken chips to supply a main-line product, so you end up using full chips in cards that aren't priced correctly for the transistor costs. Eventually, once monolithic designs are dropped, there will probably be just one GPU core and a cut-down version that get mixed and matched to create the tiers.

With the 7900 series positioning they can just give the XT less availability; it will be the less popular product anyway.
 
^ Using a limiter is a weird choice. You aren't really testing for peak perf per watt there, just the way the cards scale power down - which the 7900 obviously needs some tuning on, given where its idle usage sits.

Max it out and compare the fps per watt to the 4080 or 6950 XT; the way the info is presented there is kind of useless.
 
There’s just something about this hype machine that is especially ridiculous. The fans/media have a way of fixating on some upcoming AMD tech and building it up in their imaginations to be some secret super weapon that’s going to restore them to former glory.
Folks love a good underdog story. AMD coming back from the brink with Ryzen was good for the industry, and I think a lot of people are hoping they repeat that success with RDNA3 against nVidia. Foolish, but to be expected.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
^ Using a limiter is a weird choice. You aren't really testing for peak perf per watt there, just the way the cards scale power down - which the 7900 obviously needs some tuning on, given where its idle usage sits.

Max it out and compare the fps per watt to the 4080 or 6950 XT; the way the info is presented there is kind of useless.
If you have a 144 Hz monitor, why wouldn't you limit the FPS to 144? It's not weird to me - I lock the FPS to match the display, otherwise I'm wasting energy and heating up my room unnecessarily.
 

winjer

Gold Member
Came here to share the same thing. Yikes.

Some techtubers claimed that AMD saw Nvidia's Lovelace as "hot, loud and noisy". MLID even claimed that Nvidia would struggle with diminishing returns on Blackwell (50 series) because they're cramming everything into Lovelace.

It's so laughable now.

Seriously, MLID was never a trustworthy source for information or leaks.
People really have to stop using his nonsense videos as a source.
 
If you have a 144 Hz monitor, why wouldn't you limit the FPS to 144? It's not weird to me - I lock the FPS to match the display, otherwise I'm wasting energy and heating up my room unnecessarily.

It's weird to use that data for an FPS-per-watt calculation. Max out the cards and use those figures.

What's posted just points out the power-scaling issues the 7900 series currently has. It doesn't measure the performance per watt the card offers.
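To make the distinction concrete, here's a minimal sketch (every number is a made-up placeholder, not a measurement). With both cards capped at the same frame rate, the performance side of the ratio is identical by construction, so the comparison only shows how far each card backs off its power, not its efficiency ceiling:

# Toy illustration: frame-capped power draw vs. real perf per watt.
def perf_per_watt(fps, watts):
    return fps / watts

# Uncapped, each card runs flat out -> this is the perf/W comparison.
uncapped = {"Card A": (150, 300), "Card B": (140, 350)}  # (fps, W), hypothetical
for name, (fps, w) in uncapped.items():
    print(f"{name} uncapped: {perf_per_watt(fps, w):.2f} fps/W")

# Capped at 144 fps, both deliver identical performance, so comparing
# watts here measures downscaling behavior, not the efficiency ceiling.
capped = {"Card A": (144, 260), "Card B": (144, 320)}  # (fps, W), hypothetical
for name, (fps, w) in capped.items():
    print(f"{name} @ 144 fps cap: {w} W for the same output")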
 

Crayon

Member
Something tells me that, just like Zen 4, these GPUs are going to sell poorly at launch and AMD will have to lower prices soon after. Especially the 7900XT.
I wouldn't be surprised if, a month or two from now, we find the 7900XTX under $900 and the 7900XT at $750.

Those prices would make way more sense. Still on the high side, depending on real-world prices and what Nvidia does when the Ampere stock dries up. Does the 4080 get a little cut or a big one? Because maybe it should be $150-$200 more than the 7900, but at these prices neither is a great deal.
 