
RX 7900XTX and 7900XT review thread

Buggy Loop

Member
The very definition of "chat shit, get banged".

Everything about this release seems rushed.


 

hlm666

Member
If you need to go for an AIB custom model to avoid whatever the hell is going on with that reference cooler, it puts the price of the XTX into the lower-end bounds of the 4080 AIB models. I'm sure in that GN video with that young guy from Nvidia talking about their cooler, he mentioned hotspots and vapor/condensation design considerations at one point.

Timestamped where he is talking about what der8auer seems to think the problem is. At the end of the day, at least with reference vs reference, you're not just paying more for the silicon (or the Nvidia tax), you're paying for more R&D on the cooler, by the looks of it.
 

Topher

Identifies as young
If you need to go for an AIB custom model to avoid whatever the hell is going on with that reference cooler, it puts the price of the XTX into the lower-end bounds of the 4080 AIB models.

That's a good point. The value in the XTX is definitely the $999 reference model, but it seems a risky buy. AIBs start around $1099, but some models creep up to $1199, which makes for a more difficult decision.
 

GreatnessRD

Member

I love Hardware Unboxed, but for the life of me I don't understand why he isn't using the 13900K for benching these enthusiast cards. As a 5800X3D owner, it's nice to see what the benches look like, but these cards should be benched on the best CPUs available, especially since he has both the i9 and the R9. Shout-out to him for the 60-game bench, though. A true workhorse.
 

winjer

Gold Member
I love Hardware Unboxed, but for the life of me I don't understand why he isn't using the 13900K for benching these enthusiast cards. As a 5800X3D owner, it's nice to see what the benches look like, but these cards should be benched on the best CPUs available, especially since he has both the i9 and the R9. Shout-out to him for the 60-game bench, though. A true workhorse.

In games, the 5800X3D is not far from the 13900K.
But the most likely reason is that changing to a new CPU means redoing all the tests. And doing that for 50 games, with 3 runs each, for 2 GPUs, is a lot.
So most reviewers only change the base system once in a while.
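To put rough numbers on that, a quick back-of-the-envelope sketch in Python; the per-run time is an assumed figure, purely for illustration:

```python
# Rough cost of swapping the bench CPU: every GPU/game combo must be redone.
games = 50          # titles in the suite (from the post above)
runs_per_game = 3   # repeat runs for a stable average
gpus = 2            # cards being compared

minutes_per_run = 5  # assumed average, including loading and setup

total_runs = games * runs_per_game * gpus
hours = total_runs * minutes_per_run / 60
print(total_runs, "benchmark passes")          # 300
print(f"~{hours:.0f} hours of pure benching")  # ~25, and that's per resolution tested
```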
 

GreatnessRD

Member
In games, the 5800X3D is not far from the 13900K.
But the most likely reason is that changing to a new CPU means redoing all the tests. And doing that for 50 games, with 3 runs each, for 2 GPUs, is a lot.
So most reviewers only change the base system once in a while.
I guess that makes sense. I do expect a change once the 7000-series X3D chips arrive, however. It would be interesting to see the comparison between those and the KS chip.
 

winjer

Gold Member
I guess that makes sense. I do expect a change once the 7000-series X3D chips arrive, however. It would be interesting to see the comparison between those and the KS chip.

Some reviewers might change to the 7800X3D, if the performance difference is large enough compared to the CPU they were previously using.
For people already on Raptor Lake or Zen 4, it might not be justified.
But for people still on Zen 3 or Alder Lake, it probably will be.
 

GymWolf

Gold Member
Oh shit, the 7900 XTX does a bit better in the Fortnite test with Lumen/Nanite on...

And that difference in CoD...
 

winjer

Gold Member
Oh shit, the 7900 XTX does a bit better in the Fortnite test with Lumen/Nanite on...

And that difference in CoD...

I think HU only used software Lumen in this test. With hardware RT Lumen, the 4080 should have a decent margin.
CoD is very well optimized for consoles, which use AMD GPUs, and that carries over to the PC version.
But then in titles that use RT, the 7900 XTX takes a beating.
Still, for people who don't care much about RT, it's a decent card that costs $200 less than the 4080.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
I love Hardware Unboxed, but for the life of me I don't understand why he isn't using the 13900K for benching these enthusiast cards. As a 5800X3D owner, it's nice to see what the benches look like, but these cards should be benched on the best CPUs available, especially since he has both the i9 and the R9. Shout-out to him for the 60-game bench, though. A true workhorse.
Also strange that he included Call of Duty MW2 twice in the "60 game" average.
 

GHG

Gold Member


The way he twists himself in knots during the conclusion to avoid calling the 7900 XTX the same things he's previously called the 4080 is quite something.

If you're going to call the 4080 a bad buy, then surely you also have to say the 7900 XTX is the same, considering the reference cards are now out of the equation and the fact that the two perform within 1% of each other at 4K?

The problem is they've spent years amassing and pandering to an audience made up mostly of people who sway towards AMD. Just say it as it is, don't be scared, Hardware Unboxed.
 

winjer

Gold Member
The way he twists himself in knots during the conclusion to avoid calling the 7900 XTX the same things he's previously called the 4080 is quite something.

If you're going to call the 4080 a bad buy, then surely you also have to say the 7900 XTX is the same, considering the reference cards are now out of the equation and the fact that the two perform within 1% of each other at 4K?

The problem is they've spent years amassing and pandering to an audience made up mostly of people who sway towards AMD. Just say it as it is, don't be scared, Hardware Unboxed.

From what I understand, the reference model had a limited run of 1,500 cards. And even those were mostly snatched up by scalpers.
So in a funny twist, those scalpers are now stuck with the version of the GPU no one wants.
 

GHG

Gold Member
From what I understand, the reference model had a limited run of 1,500 cards. And even those were mostly snatched up by scalpers.
So in a funny twist, those scalpers are now stuck with the version of the GPU no one wants.

Karma can indeed be harsh and swift sometimes.
 

hlm666

Member
From what I understand, the reference model had a limited run of 1,500 cards. And even those were mostly snatched up by scalpers.
So in a funny twist, those scalpers are now stuck with the version of the GPU no one wants.
1,500 cards??? This was the ample supply of MSRP cards AMD said they would have, so you wouldn't have to worry about fighting bots. The only way you could feed more shit to someone is going human centipede mode.
 

winjer

Gold Member
1,500 cards??? This was the ample supply of MSRP cards AMD said they would have, so you wouldn't have to worry about fighting bots. The only way you could feed more shit to someone is going human centipede mode.

Reference cards.
 

Xyphie

Member
I love Hardware Unboxed, but for the life of me I don't understand why he isn't using the 13900K for benching these enthusiast cards. As a 5800X3D owner, it's nice to see what the benches look like, but these cards should be benched on the best CPUs available, especially since he has both the i9 and the R9. Shout-out to him for the 60-game bench, though. A true workhorse.

Yeah, the 13900K is the fastest gaming CPU on the market, except for perhaps some outliers, so it's what he should use for GPU testing. It won't make too much of a difference at 1440p/4K either way.
 

Warnen

Don't pass gaas, it is your Destiny!
Dunno if my research is right, but I think I found the solution to my issue. It might actually be related to my monitor (Samsung Odyssey Neo G9); it looks like the issue has been affecting last-gen cards too, ever since a driver rollout months ago…

Guess that seals it, I'm not getting a new monitor to save $200 on a video card. 32:9 at 240 Hz is too much for AMD cards. The fix is to turn the refresh rate down to 60….
 

Topher

Identifies as young
The way he twists himself in knots during the conclusion to avoid calling the 7900 XTX the same things he's previously called the 4080 is quite something.

If you're going to call the 4080 a bad buy, then surely you also have to say the 7900 XTX is the same, considering the reference cards are now out of the equation and the fact that the two perform within 1% of each other at 4K?

The problem is they've spent years amassing and pandering to an audience made up mostly of people who sway towards AMD. Just say it as it is, don't be scared, Hardware Unboxed.

It was interesting how he attributed the GeForce ray tracing advantage to titles being "sponsored by Nvidia".
 

hlm666

Member
Reference cards.
All those AIB models using the reference PCB and reference cooler design are reference cards, but I concede they technically probably fall under a different bar on a chart because they aren't sold by AMD directly. Do you have any numbers on the total cards shipped from partners as well using the reference PCB/cooler designs? None of the custom cards are close to MSRP, and their supply seems even worse.
 

winjer

Gold Member
All those AIB models using the reference PCB and reference cooler design are reference cards, but I concede they technically probably fall under a different bar on a chart because they aren't sold by AMD directly. Do you have any numbers on the total cards shipped from partners as well using the reference PCB/cooler designs? None of the custom cards are close to MSRP, and their supply seems even worse.

Reference cards are cards made to the reference design of AMD, Intel or Nvidia, regardless of the AIB that brands them.
I haven't found any numbers for custom cards yet. These companies usually don't publish them.
We'll have to wait a few days for stores to start talking about it, and a few months until we get more reliable numbers from sources like JPR.
 

TrebleShot

Member
Anyone have any experience with Overclockers in the UK and their B-Tier stock?

Just ordered a Sapphire 7900 XTX B-Tier card from there and am wondering what I should expect.

It was a decent price at £1070, but I'm interested to know what will come in the post, or whether anyone has ordered B-Tier before anywhere else?
 

Topher

Identifies as young
Anyone have any experience with Overclockers in the UK and their B-Tier stock?

Just ordered a Sapphire 7900 XTX B-Tier card from there and am wondering what I should expect.

It was a decent price at £1070, but I'm interested to know what will come in the post, or whether anyone has ordered B-Tier before anywhere else?

What is a “B-tier card”?
 
What is a “B-tier card”?

I'm assuming he's referring to the custom AIB designs that mimic the reference design as far as power draw limits, etc. There are several AIB versions available with custom coolers at the standard MSRP (or very close to it; the twin power-connector models).

Edit: @TrebleShot I see what you are getting at now. I'd never heard that one. LOL
 

hlm666

Member
Reference cards are cards made to the reference design of AMD, Intel or Nvidia, regardless of the AIB that brands them.
OK, so now I'm confused. Does the 1,500 number you gave above include reference boards made by Sapphire/PowerColor etc.? Because I whined above about how small that amount was, since I automatically assumed it covered all reference designs, like you just implied there, but you seemed to imply something else in your first reply.
 

winjer

Gold Member
OK, so now I'm confused. Does the 1,500 number you gave above include reference boards made by Sapphire/PowerColor etc.? Because I whined above about how small that amount was, since I automatically assumed it covered all reference designs, like you just implied there, but you seemed to imply something else in your first reply.

For reference cards. All of them.
You're just confusing everything because you don't understand the definition of a reference card.
 

hlm666

Member
For reference cards. All of them.
You're just confusing everything because you don't understand the definition of a reference card.
lol I do though. So AMD made 1,500 cards at MSRP. Thank you, got it, and my first bitch was valid.
 

winjer

Gold Member
lol I do though. So AMD made 1,500 cards at MSRP. Thank you, got it, and my first bitch was valid.

AMD released the reference model first, by 2 weeks.
This is something that AMD and Nvidia have done a few times before. Even Intel did it with their Arc GPUs.
Only then do AIBs get to release their custom solutions. And of course, AIB cards are almost always above MSRP.
 

Buggy Loop

Member
From what I understand, the reference model had a limited run of 1,500 cards. And even those were mostly snatched up by scalpers.
So in a funny twist, those scalpers are now stuck with the version of the GPU no one wants.

From what I understand, a lot of vendors carry the MBA (Made By AMD) reference design. Almost anything close to MSRP is likely to come with that cooler: Sapphire, XFX, etc. The upper tiers from those vendors are their custom boards. Using "AIB" (add-in board) as a term for the vendors themselves is slang; those vendors can carry reference-design cards too.
 

winjer

Gold Member
From what I understand, a lot of vendors carry the MBA (Made By AMD) reference design. Almost anything close to MSRP is likely to come with that cooler: Sapphire, XFX, etc.

Throughout all of GPU history, only reference cards have sold at MSRP.
AIBs almost always price their custom cards at a premium.
In this, AMD is doing what all GPU makers have always done.
 

Buggy Loop

Member
It was interesting how he attributed the GeForce ray tracing advantage to titles being "sponsored by Nvidia".

« Here's a game where AMD performs well and was sponsored for including *checks notes*, oh yeah, err, RT shadows »

HU is down on RDNA 3, but it's still nothing like their hatred for Nvidia lol.
 

RoboFu

One of the green rats
So have the 4 or 5 of you convinced each other the XTX isn't a 4080 competitor yet, or are you still trying? 😂
 

Topher

Identifies as young
« Here's a game where AMD performs well and was sponsored for including *checks notes*, oh yeah, err, RT shadows »

HU is down on RDNA 3, but it's still nothing like their hatred for Nvidia lol.

I don't know these guys very well. Is this really just Nvidia hate? I have to say that comment took me by surprise.
 

winjer

Gold Member
I don't know these guys very well. Is this really just Nvidia hate? I have to say that comment took me by surprise.

No, they don't hate Nvidia. But they weren't pushing the RT and DLSS angle in the first couple of years the way sites like DF did. And this pissed off a lot of Nvidia fanboys.
They had a more nuanced approach, evaluating the tech as it was. So during the first couple of years, when most games didn't use RT, they gave it little consideration. The same for DLSS, aside from some special features focused on one particular tech.

On the other side, we have sites like DF, especially Alex, who is a hardcore Nvidia fanboy and has admitted to only ever having used Nvidia cards in his personal rigs.
The kind of fanboy who would praise DLSS 1 when all other sites and gamers said it was crap. One who said DLSS 2.0 was better than native in Control, despite its many issues with ghosting and blurring in motion.
The kind of fanboy who would praise RT even in games where it made no difference at all, except for the huge drop in frame rate.
 

Buggy Loop

Member
I don't know these guys very well. Is this really just Nvidia hate? I have to say that comment took me by surprise.

It's too long a story, but they did campaign hard on how RT and DLSS aren't worth it, how 6+ GB of VRAM would be "fine wine", etc. Nvidia also played dirty with them by refusing to send them cards after the RT/DLSS bashing, but they went and cried a river to other tech YouTubers and, of course, drama followed. As if they're owed cards anyway.

One of my old posts on the subject of HWUB and the narrative they push:
I don't think anyone is saying that HU did not touch a single RT game or DLSS, but yeah, the narrative is quite skewed.

In the 3080 review, there was only Wolfenstein: Youngblood, a whopping 50-second segment to go through. He even goes on to say:

"So really, for the most part, it looks like the 3080 is faster at stuff like ray tracing, because it is a faster gpu, not because the 2nd gen RT cores are massively improved"
Which is total bullshit, like not even more than 5 mins research. A mere 90 Mhz base clock & 70 Mhz boost clock seperate the 3080 and the 2080 Ti, but we see a 42% increase in performances for ray tracing and 37% when considering DLSS, but again, these amazing "tech" youtubers did not bother showcasing the Wolfenstein youngblood ASYNC patch either, even though their techspot site did, showing a 54% increase in performances. Not only is their assumptions on 2nd gen RT cores wrong, but with this game being the only one benchmarked at HU, we don't really see the full potential of these new gen cores with games that will really stress it, like full path tracing games.

In the 3070 review, he spends time setting the table for Nvidia's marketing on 2nd-gen RT, and then says: "all of that should mean significantly improved performance... we see that for DLSS the 3070 is no better than the 2080 Ti..."

Uh, buddy, the 2080 Ti has 68 RT cores and 544 tensor cores; the 3070 has 46 RT cores and 184 tensor cores. Why is the narrative that the architectural changes to the RT cores and tensor cores aren't affecting performance?

It's one thing not to care about RT, really, I'm fine with that. But then don't start spreading FUD about the architectural changes if you're not going to do a deep dive and test a wide array of RT games.

In the 6800 XT review: "personally I care little for ray tracing support right now; there are few games where I feel it's worth enabling". OK, sure, a personal opinion. I also personally think 4K monitors are fucking useless given the performance penalty versus 1440p, but I'd assume reviewers would want to inform everyone of all the possible reasons to upgrade and dish out >$700 on a card? He then goes on to showcase 2 RT games, with RT contact shadows, the bare minimum of RT effects...

And the cherry on the cake: "a new feature that I find way more exciting than ray tracing, at least in the short term, is Smart Access Memory...", a feature that at the time of the review was locked to 500-series motherboards and Zen 3 only.

They did the same with the Zen 2 reviews. Look, I've been on Ryzen since the 1600, then a 5600X and now a 5800X3D, but they did everything in those reviews to paint Intel in a worse light than reality.

The data is solid per se, but the discussion is skewed.

Like GHG said, they don't have the same discussion on price for the 4080 and the 7900 XTX. If the 4080 is bad value (and it is, in my opinion, compared to the silicon cut percentages of previous gens), then the same applies to the 7900 XTX flagship too. These cards are easily $300-400 too expensive. I thought for sure that Nvidia had priced the 4080 to liquidate Ampere stock and would lower it for AMD's flagship release, but seeing the results now... yikes, I think we're stuck with this pricing scheme for a while.
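For what it's worth, a quick sanity check of the clock-vs-RT-gain point in the quoted post; the 2080 Ti base clock is an assumed round figure, and the delta and percentages are taken from the post itself:

```python
# Can a ~90 MHz clock bump alone explain a 42% RT uplift? Clearly not.
base_2080ti_mhz = 1350   # assumed reference base clock for the 2080 Ti
clock_delta_mhz = 90     # 3080 vs 2080 Ti base clock gap, from the post
observed_rt_gain = 0.42  # measured RT uplift, from the post

clock_only_gain = clock_delta_mhz / base_2080ti_mhz
print(f"clock-only uplift: {clock_only_gain:.1%}")    # ~6.7%
print(f"observed RT uplift: {observed_rt_gain:.0%}")  # 42%
# The gap has to come from architecture: more FP32 throughput per SM,
# faster memory, and/or the improved 2nd-gen RT cores.
```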
 

Mister Wolf

Gold Member
It's too long a story, but they did campaign hard on how RT and DLSS aren't worth it, how 6+ GB of VRAM would be "fine wine", etc. Nvidia also played dirty with them by refusing to send them cards after the RT/DLSS bashing, but they went and cried a river to other tech YouTubers and, of course, drama followed. As if they're owed cards anyway.


Hardware Unboxed are AMD shills. Always have been. I'm not saying other outlets aren't shills for Nvidia.
 

ToTTenTranz

Banned
OK, so it's been a while and the dust has settled, so here are my 3 cents (2 cents + inflation is a bitch).



1 - First and foremost: if anything can be learned from the 7900 XT(X) reviews, it's that no one needs to spend >$800 on a GPU nowadays. And my point isn't "no one needs to play PC games"; it's that even for those playing on 4K TVs/monitors, an RX 6800 XT or an RTX 3080 is more than fine for the job in 99.9% of the games out there. Looking at all those bar graphs, we can see how these new GPUs are complete overkill for almost every game available up to the end of 2022.
Almost all games are still using PS4-era engines with PS4-era assets, so we're looking at charts going over 150 FPS at native 4K with maxed-out settings, which is just nonsense.
The higher-end RDNA2 and Ampere GPUs can run all these games at well over 60 FPS. And in the rare cases they don't, FSR2 and DLSS2 can be used down to balanced mode (~0.59x resolution scale per axis) with virtually no discernible visual difference.
The only outliers here are the handful of RTX-branded games and tech demos with post-development ray tracing plastered on, with very questionable visual results (especially considering the 50-70% performance hit). I would hardly call that worth the money, and I'd bet even the majority of RTX 4090 users aren't toggling on those "extreme ray tracing" modes.
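To make the balanced-mode numbers concrete, a small sketch assuming FSR2's Balanced factor of 1.7x per axis (DLSS Balanced is similar, at 0.58x):

```python
# Internal render resolution when upscaling to 4K in 'balanced' mode.
target_w, target_h = 3840, 2160  # 4K output
per_axis_scale = 1 / 1.7         # FSR2 Balanced, ~0.588 per axis

render_w = int(target_w * per_axis_scale)
render_h = int(target_h * per_axis_scale)
print(render_w, "x", render_h)                           # ~2258 x 1270
print(f"pixel load vs native: {per_axis_scale**2:.0%}")  # ~35% of native 4K
```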


2 - It looks like the driver isn't making use of RDNA3's dual-issue FP32 units and a bunch of other new features. RDNA3 is a much bigger departure from RDNA2 than RDNA2 was from RDNA1, and AMD did say we should expect considerable performance increases from later drivers. It's still a shame that a compiler that optimizes for the dual-FP32 ALU throughput isn't ready, showing once again that on AMD's side, software development isn't on par with the hardware teams. The same goes for the high power consumption in low-activity tasks like idling and video decoding.
Regardless, this seems to be a product that will indeed get better over time, which to me is all the more reason not to buy one right now, because GPU prices are definitely coming down during 2023.
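For context on why the dual-issue compiler matters, a rough sketch of the paper math; the shader count and clock are the commonly quoted 7900 XTX figures, used here as approximations:

```python
# Theoretical FP32 throughput of the 7900 XTX, single- vs dual-issue.
stream_processors = 6144  # commonly quoted 7900 XTX shader count
boost_clock_ghz = 2.5     # approximate boost clock
flops_per_alu = 2         # one FMA = 2 FLOPs per cycle

single_issue_tflops = stream_processors * boost_clock_ghz * flops_per_alu / 1000
dual_issue_tflops = single_issue_tflops * 2  # second FP32 op per cycle, if paired
print(f"single-issue: ~{single_issue_tflops:.0f} TFLOPS")  # ~31
print(f"dual-issue:   ~{dual_issue_tflops:.0f} TFLOPS")    # ~61
# If the compiler can't find co-issuable instruction pairs, real throughput
# sits much closer to the single-issue figure.
```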


3 - The 7900 XTX is way too expensive, though it's simply following the RTX 4080's ridiculous price.


4 - The 7900 XT exists mostly to upsell the 7900 XTX, and it's really a shame that AMD is using these tactics on consumers.

5 - The "OMG they're using A0 revision and making us their beta testers" and "OMG the cooler is so bad" internet dramas are just noise that people shouldn't buy into. If their card really throttles down due to temperatures then just send it back to AMD because they already said they'd replace it.

6 - The reference GPUs seem to be made to please the 3rd parties more than anything. The 3rd-party cards really do increase performance significantly with overclocking, by feeding more power to the chip without hurting the performance/power ratio. AMD could have made the reference XTX a 3x 8-pin card and increased its base performance by ~10-12%, but they simply chose not to. It's nice to see AMD playing ball with the 3rd parties, after seeing how Nvidia does the opposite with their FE cards.
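The 3x 8-pin point is mostly connector-spec arithmetic; a minimal sketch (these are spec ceilings, not actual draw):

```python
# Board power budget by connector count, per PCIe spec limits.
pcie_slot_w = 75   # watts available from the PCIe slot
eight_pin_w = 150  # watts per 8-pin PCIe connector

reference_xtx = pcie_slot_w + 2 * eight_pin_w  # 2x 8-pin: 375 W ceiling
aib_typical = pcie_slot_w + 3 * eight_pin_w    # 3x 8-pin: 525 W ceiling
print(reference_xtx, aib_typical)
# That extra ~150 W of headroom is what lets the 3rd-party cards chase
# the ~10-12% overclocked gains mentioned above.
```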



Conclusion: prices are stupid and the performance is overkill. Don't buy these cards (7900 XT(X) or RTX 4080/4090). When actual 9th-gen games start coming out, then people should think about upgrading, which will hopefully happen after AMD and Nvidia get a dose of reality and their GPU revenue starts falling short, forcing them to drastically reduce prices. Until then, just get a discounted RX 6xxx or RTX 3xxx for a much lower price and basically the same experience.
 

Topher

Identifies as young
No, they don't hate Nvidia. But they weren't pushing the RT and DLSS angle in the first couple of years the way sites like DF did. And this pissed off a lot of Nvidia fanboys.
They had a more nuanced approach, evaluating the tech as it was. So during the first couple of years, when most games didn't use RT, they gave it little consideration. The same for DLSS, aside from some special features focused on one particular tech.

On the other side, we have sites like DF, especially Alex, who is a hardcore Nvidia fanboy and has admitted to only ever having used Nvidia cards in his personal rigs.
The kind of fanboy who would praise DLSS 1 when all other sites and gamers said it was crap. One who said DLSS 2.0 was better than native in Control, despite its many issues with ghosting and blurring in motion.
The kind of fanboy who would praise RT even in games where it made no difference at all, except for the huge drop in frame rate.

Seems there is some history there, as you and Buggy Loop have pointed out. I'm not going to weigh in on all that, as I'd need to review it all myself, and I'm not going to do that. I don't think HU does themselves any favors with the Nvidia crowd, however, by insinuating that better ray tracing performance is essentially paid for by Nvidia. That's the kind of stuff I hear from console warriors. Other than that, his analysis seemed fine for the most part.
 

GHG

Gold Member
Seems there is some history there, as you and Buggy Loop have pointed out. I'm not going to weigh in on all that, as I'd need to review it all myself, and I'm not going to do that. I don't think HU does themselves any favors with the Nvidia crowd, however, by insinuating that better ray tracing performance is essentially paid for by Nvidia. That's the kind of stuff I hear from console warriors. Other than that, his analysis seemed fine for the most part.

Their previous narrative was that ray tracing doesn't matter, back when AMD didn't have ray tracing at all (and subsequently had dire RT performance with the 6xxx series).

Let's just say they're an interesting outlet to follow.
 

GreatnessRD

Member
Also strange that he included Call of Duty MW2 twice in the "60 game" average.
I didn't even peep that. Yikes!
Yeah, the 13900K is the fastest gaming CPU on the market, except for perhaps some outliers, so it's what he should use for GPU testing. It won't make too much of a difference at 1440p/4K either way.
Yeah, that's true. You'll see an uplift, but nothing too crazy at 1440p and 4K.
2 - It looks like the driver isn't making use of RDNA3's dual-issue FP32 units and a bunch of other new features. RDNA3 is a much bigger departure from RDNA2 than RDNA2 was from RDNA1, and AMD did say we should expect considerable performance increases from later drivers. It's still a shame that a compiler that optimizes for the dual-FP32 ALU throughput isn't ready, showing once again that on AMD's side, software development isn't on par with the hardware teams. The same goes for the high power consumption in low-activity tasks like idling and video decoding.
Regardless, this seems to be a product that will indeed get better over time, which to me is all the more reason not to buy one right now, because GPU prices are definitely coming down during 2023.

4 - The 7900 XT exists mostly to upsell the 7900 XTX, and it's really a shame that AMD is using these tactics on consumers.

Conclusion: prices are stupid and the performance is overkill. Don't buy these cards (7900 XT(X) or RTX 4080/4090). When actual 9th-gen games start coming out, then people should think about upgrading, which will hopefully happen after AMD and Nvidia get a dose of reality and their GPU revenue starts falling short, forcing them to drastically reduce prices. Until then, just get a discounted RX 6xxx or RTX 3xxx for a much lower price and basically the same experience.
2 - Feels like AMD needs to throw some more money at the driver-side developers, as this shouldn't happen. I do think they'll improve the drivers down the line, but we shouldn't have to wait to get solid performance across the board.

4 - I am truly disgusted that AMD went this route with the 7900 XT. It just doesn't make any sense. It should have been named the 7800 XT, with its 26% performance uplift over the 6800 XT, and sold for $750-$800. It would be flying off the shelves just like the XTX model. With the 7900 XTX being on par with the 4080, slightly cheaper (depending on your model) and taking a few slight wins against the 4090, that thing was going to sell out regardless, especially with it launching at the 6900 XT's price. (And they should've just called the 7900 XTX the 7900 XT.) AMD coughed up the ball on this one.

Prices are outrageous, but the target audience for these cards isn't the average consumer; it's people with "fuck it" money to blow. With that said, I'm not confident about what prices will look like for the mid-range from either AMD or Nvidia under the current pricing model. But as long as people continue to buy these cards at stupid prices, they won't care, lol.
 

//DEVIL//

Member
Did the same with Fortnite, Witcher 3, F1 2022 and some others, although those seem to be cases of differing APIs.


Their previous narrative was that ray tracing doesn't matter, back when AMD didn't have ray tracing at all (and subsequently had dire RT performance with the 6xxx series).

Let's just say they're an interesting outlet to follow.

I mean, how else is he going to make an AMD card look competitive?

It's a well-known fact that these guys root for the underdog, and I like them as a channel; nothing wrong with supporting AMD. But yeah, the chart is clearly in AMD's favor, and on top of that, he didn't do an average ray tracing chart because, well... AMD would get destroyed.


When spending $1000+ on a GPU, I am paying to get the most premium product, and an AMD card will never be that while their ray tracing is so far off.

If the 7900 XTX were $800 vs. the $1200 4080, then yeah, maybe you could say: sure, its ray tracing is worse, but the 4080 is 50% more expensive, not worth it. But not at $1000 vs. $1200 (correction: $1100 for the cheapest AIB card, since the reference model has hardware flaws, vs. the $1200 4080).
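Sketching the price-gap arithmetic from that last paragraph:

```python
# How much more expensive is the 4080, depending on which XTX price you use?
def premium(base: float, other: float) -> float:
    """Fractional premium of `other` over `base`."""
    return (other - base) / base

print(f"{premium(800, 1200):.0%}")   # 50%, at a hypothetical $800 XTX
print(f"{premium(1000, 1200):.0%}")  # 20%, at the $999 reference (if you find one)
print(f"{premium(1100, 1200):.0%}")  # ~9%, against an $1100 AIB card
# At a ~9% gap, the RT deficit is much harder to argue away.
```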
 

winjer

Gold Member
Their previous narrative was that ray tracing doesn't matter, back when AMD didn't have ray tracing at all (and subsequently had dire RT performance with the 6xxx series).

Let's just say they're an interesting outlet to follow.

But in that, they were right. During Turing there were almost no games with RT. And even then, only Control and BF5 showed real improvements, and those came with a huge drop in performance.
So for 2 years, RT was almost pointless. And DLSS 1 was crap, not even worth considering.

During Ampere, things started to change, with improvements in performance, and DLSS finally became really good, giving back part of the performance lost to RT.
But games still lagged behind, with only a handful of titles where RT was really worth it.
So RT was now a nice feature to have, although far from essential.
But the big feature during this time was DLSS 2.2 to 2.4: very high-quality upscaling and a big performance boost.

Now in the age of Ada Lovelace, things are much worse for AMD. A lot more games are releasing with RT.
UE5 has full support for it, and it's going to be the main engine of this generation. Software Lumen is amazing, but RT Lumen is even better.
So having support for RT is becoming like having graphics acceleration in 1996.

The problem is that GPUs are still very expensive. Even Ampere cards are still above normal prices. And Ada Lovelace has only launched at the ultra high end.
So a lot of people can't upgrade to be able to enjoy RT at good frame rates.
And by good, I don't mean that 30 fps crap we see on consoles.
 

ToTTenTranz

Banned
Prices are outrageous, but the target audience for these cards isn't the average consumer; it's people with "fuck it" money to blow.
I don't think that's the case. Given their public statements as of late, I think Nvidia is trying to creep prices up significantly across their whole range and have their revenue take a much larger share of PC gamers' disposable income.
Jensen & Co. saw how PC gamers eventually bent the knee and bought GPUs at ridiculous prices during the COVID lockdowns, and he's trying to hold onto that inflated value perception to see if it sticks indefinitely. We're also living in an era where FOMO gets hold of many young people with too much of daddy's money, and honestly that's probably a big factor in all those 4090s being sold.
AMD is just trying to ride the same winds, but given their currently abysmal dGPU market share, I wonder if this move won't end with their GPU division exiting the PC discrete card business entirely. They won't survive without sales.

the chart is clearly in AMD's favor, and on top of that, he didn't do an average ray tracing chart because, well... AMD would get destroyed.
If they showed a chart with only RT games to show "AMD bEiNg dEstRoYeD", then they'd also need to show a rasterization-only chart showing the 4080 bEiNg dEstRoYeD, and then the exact same people would complain about that second graph existing at all.

Truth is, the 7900 XT/X are competent enough in almost all RT-enabled games and score just some 17-20% below the 4080.
Save for the last-minute patches and demos, like the Cyberpunk 2077 über mode and Portal's let's-just-bounce-all-the-rays-a-gazillion-times RTX, all of which released just in time for the RTX 4090 reviews (and were undoubtedly part of the review "guidelines"... the ones that will put reviewers on a blacklist if they're not followed).
 