
RX 7900XTX and 7900XT review thread

GreatnessRD

Member
They had to eventually sell out of that launch shipment. 7900XTX seems to already be gone for the most part.
Yeah, I guess you're right. And the 7900 XTX sold really well. And sold out fast everywhere. That's a good sign for AMD. I'm trying to get like you Master Racers with the big money. I just couldn't imagine ever spending 1k+ on a GPU, lol.
 

MikeM

Gold Member
Apparently the 7900 XTX will be restocked most days at 10am EST.

The 7900 XT, 6950 XT and 6750 XT are still available direct from AMD.
 
Yeah, I guess you're right. And the 7900 XTX sold really well. And sold out fast everywhere. That's a good sign for AMD. I'm trying to get like you Master Racers with the big money. I just couldn't imagine ever spending 1k+ on a GPU, lol.

Neither can I. I'm a budget gamer really.

I just monitor these higher end launches because it gives you an idea of what might be coming in the classes of cards that most people buy.
 

poppabk

Cheeks Spread for Digital Only Future
Neither can I. I'm a budget gamer really.

I just monitor these higher end launches because it gives you an idea of what might be coming in the classes of cards that most people buy.
Or what you'll buy in two years' time when the prices have dropped.
 

GreatnessRD

Member
Apparently the 7900 XTX will be restocked most days at 10am EST.

The 7900 XT, 6950 XT and 6750 XT are still available direct from AMD.
I just have zero idea who the 7900 XT caters to. I think it's the 6800 XT replacement, but they'll have to drop the price to $750 for it to make sense to me. The 7900 XT is like 26% better than the 6800 XT, so I just can't see them making a 7800 XT that'll be like 10% better than its predecessor, lol.
 

MikeM

Gold Member
I just have zero idea who the 7900 XT caters to. I think it's the 6800 XT replacement, but they'll have to drop the price to $750 for it to make sense to me. The 7900 XT is like 26% better than the 6800 XT, so I just can't see them making a 7800 XT that'll be like 10% better than its predecessor, lol.
AMD went off the deep end with the naming. The 7900 XT absolutely should have been the 7800 XT.
 

poppabk

Cheeks Spread for Digital Only Future
I had to buy a new case, and the model with a good offer had a glass panel. I already dread the room becoming a fucking carnival of Rio with all those fucking LEDs in sight...
Same. Even just the motherboard's LEDs indicating that stuff is working are annoying. I don't understand why people add all these extra LED fans and cables; you can even get RAM with RGB stuff on it for some reason.
 

Crayon

Member
Yeah, I guess you're right. And the 7900 XTX sold really well. And sold out fast everywhere. That's a good sign for AMD. I'm trying to get like you Master Racers with the big money. I just couldn't imagine ever spending 1k+ on a GPU, lol.

I had to go used on my last upgrade to keep the budget in check. A lot of the AIB cards are really good quality, so one in good condition should have more than enough life left.
 

HeisenbergFX4

Gold Member
Literally everything that's new from Nvidia and AMD should be 20-25% cheaper but we are where we are. This is the current economic environment unfortunately.
Demand sets the price and as you know good and well people will pay these prices :)
 

GHG

Gold Member
Demand sets the price and as you know good and well people will pay these prices :)

Well, not only that: a lot of people still have cash lying around from the biggest money-printing operation in human history.

Going to take a while before we start to see a significant demand shock, and even then, historically speaking, premium products have been least affected during times of recession. This is precisely the reason why both AMD and Nvidia are in no rush to release their products positioned lower down in the stack; at that point you're mostly attempting to sell to people who are possibly already feeling the effects of this economy in a significant way.
 

Crayon

Member
These prices might push some people away from PC gaming. Most people run cheaper cards, but there might be no such thing in a few years if this fuckery continues. Hopefully AMD and Intel get serious about the graphics allocation on APUs. AMD is showing Phoenix soon. Fingers crossed.
 
Wish I could afford this custom model.



 
These prices might push some people away from PC gaming. Most people run cheaper cards, but there might be no such thing in a few years if this fuckery continues. Hopefully AMD and Intel get serious about the graphics allocation on APUs. AMD is showing Phoenix soon. Fingers crossed.

That will be the end result, no question. The manufacturers might ultimately be fine with, or even prefer, selling fewer cards at higher margins, but down the road the field of players shrinks significantly in that scenario.

With that said though, it simply isn't true that these companies can charge whatever they want. The 4080 has seen significantly reduced demand compared to the 3080, so buyers are reaching their limits.
 

eNT1TY

Member
I'm tempted to buy an EVGA 3090 Ti I have access to for $800. I wonder how it compares in RT performance to the 7900 XTX; I know the XTX is faster at raster by a good margin.
 

hlm666

Member
I had to buy a new case, and the model with a good offer had a glass panel. I already dread the room becoming a fucking carnival of Rio with all those fucking LEDs in sight...
Paint the back of the glass panel to match the case. Another option, which I did when building a PC for my nephew, was to mount a picture on the inside glass panel. He's into LoL, or was at the time, so I got one of the characters' artwork, had it printed, then stuck it on the inside of the glass like a photo frame.
 
I'm tempted to buy an EVGA 3090 Ti I have access to for $800. I wonder how it compares in RT performance to the 7900 XTX; I know the XTX is faster at raster by a good margin.

Should be somewhat similar. Most of the trustworthy YT reviewers have included the 3090/Ti in their XTX benchmarks, so quite a bit of data should be out there.
 

poppabk

Cheeks Spread for Digital Only Future
Damn, the 7900 XT was so bad that Nvidia is considering releasing the 4070 Ti at $899 as well.



GOD DAMMIT AMD YOU HAD ONE JOB AND YOU FAILED!!!

Wow, at the exact same price as the 4080 12GB was gonna launch at, what an amazing coincidence. They aren't even charging you extra for the badges they had to switch out.
 

hinch7

Member
The GPU market is so bad that the best value you can probably get rn is older-gen hardware or the 4090 (if you bring RT and DLSS into it), with the older cards being the 6000 series on sale and used Ampere cards. Everything else this gen looks like pricing just scaling linearly with performance. Which is a total shit metric, but GPU manufacturers like to do it anyway, especially after the prices people are/were willing to spend pre- and post-mining boom.

[chart: performance per dollar, 2560x1440]


And in the mid-range... if the RTX 4070 Ti is $900, then the 7800 XT will be $850, or $800 if we're lucky.

What a shit gen, tbh, when only the flagship Nvidia card is worth the cost. And that's a $1600 graphics card, which is insanity in itself, but it is a halo SKU, and people have shown they are willing to spend a lot on getting the best as an enthusiast card.
 

PaintTinJr

Member
You misunderstand 6nm vs 5nm.
6nm is slightly optimised 7nm; 5nm is nearly twice the transistor density of 6nm. The MCDs are around 55 million transistors per mm2. The GCD is 138 million transistors per mm2.
Now obviously, SRAM will not scale with new nodes as well as logic, so you are sort of right that it won't be that big of a shrink, but it would still be substantial.

But also, each of the MCDs and the GCD has to waste die area on chip-to-chip interconnects, which take up a fair bit of space.
So in actuality, if N31 were monolithic it would be around 450mm2. Which is rather small, especially considering the amount of space taken up by cache.
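A rough back-of-envelope check of that ~450mm2 figure. This is a sketch, not from the post: the GCD ~304mm2 and per-MCD ~37.5mm2 are commonly reported die sizes, the 55 and 138 MTr/mm2 densities come from the post above, and the ~35% SRAM-limited shrink is an assumed figure.

```cpp
#include <cstdio>

int main() {
    const double gcd_area_mm2 = 304.0;  // GCD, already on 5nm (assumed size)
    const double mcd_area_mm2 = 37.5;   // one MCD on 6nm (assumed size)
    const int    mcd_count    = 6;
    const double total_mcd    = mcd_count * mcd_area_mm2;   // ~225 mm^2

    // Optimistic lower bound: MCD content scales like logic, 55 -> 138 MTr/mm^2
    const double naive_shrunk = total_mcd * (55.0 / 138.0); // ~90 mm^2

    // More realistic: SRAM-heavy MCDs only shrink ~35% on 5nm (assumption)
    const double sram_limited = total_mcd * 0.65;           // ~146 mm^2

    printf("optimistic monolithic: %.0f mm^2\n", gcd_area_mm2 + naive_shrunk);
    printf("SRAM-limited estimate: %.0f mm^2\n", gcd_area_mm2 + sram_limited);
    // The second figure lands right around the ~450 mm^2 estimate, before
    // even crediting the area saved by dropping the chip-to-chip PHYs.
    return 0;
}
```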

The fact that the silicon is buggy and draws way too much current at a given voltage to maintain its clock speeds is a completely separate physical-design issue. It has absolutely nothing to do with the architecture itself, which is perfectly fine.
This is evidenced by AIB card reviews: if you add more power, you can get an extra 15-20% more performance just by lifting average clock frequencies from 2600MHz to 3200MHz.
The potential is there. You can see what AMD was aiming for, but they fell short of their targets. Which means, in simple terms, their silicon execution was not good enough.
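A quick sanity check on those overclocking numbers (the clock figures are taken from the claim above; the power relation is the standard dynamic-power approximation, not anything measured on N31):

```cpp
#include <cstdio>

int main() {
    const double base_mhz = 2600.0, oc_mhz = 3200.0;
    // ~23% clock uplift vs the reported 15-20% performance gain
    printf("clock uplift: %.0f%%\n", (oc_mhz / base_mhz - 1.0) * 100.0);
    // Performance landing below the clock uplift is expected: memory
    // bandwidth and other non-core limits don't scale with shader clock.
    // Dynamic power, meanwhile, grows roughly as C * V^2 * f, so chasing
    // clocks through extra voltage is disproportionately costly, which is
    // consistent with the "draws way too much current" complaint.
    return 0;
}
```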



I don't know about Nvidia schooling AMD engineers hard. AMD and Nvidia went for completely different strategies.
And to clear a few things up:
  1. AMD do have dedicated RT hardware; they just aren't spending as much of their transistor budget on it as Nvidia.
  2. AMD don't have fixed-function ML accelerators because it doesn't matter that much for gaming. Yes, FSR2 isn't quite as good as DLSS in image quality, but it's damn close and is hardware agnostic. And if you think DLSS3's hit to image quality is an acceptable way to improve framerate, then you have absolutely no right to complain about FSR2's image quality.
Nvidia is indeed in a league of their own with the resources they have. Using ML to micro-optimise transistor layout to maximise performance and minimise power is exceptional stuff. However, you also need to understand that Lovelace is nothing remarkable from them as far as architecture is concerned. Every GPU architecture Nvidia has made since Volta is just a small incremental update on Volta: Turing is Volta + RT cores and a dedicated INT pipe; Ampere is Turing + double FP32; Lovelace is a die-shrunk Ampere with a jacked-up RT core. If you actually look at the structure of the SM (streaming multiprocessor), it hasn't dramatically changed since the shift from Pascal to Volta. Small incremental updates. Not unlike what AMD was doing with GCN, just executed much more effectively and from a much better starting point.

RDNA2 from AMD was successful because it was basically RDNA1 on steroids: optimised physical design, fixed hardware bugs, and clocked to insanity.
RDNA3 is effectively a completely new architecture in every possible way. The CUs are completely redesigned. The individual vALUs have been completely redesigned. The front end and geometry have been redesigned. The command processor has been streamlined and shifted from a hardware scheduler to a software scheduler (iirc), like Nvidia. On top of this, they have disaggregated the last-level cache and memory controllers. Very ambitious in a number of different ways. They aimed big and they failed.
If Nvidia schooled AMD at anything, it's execution, not necessarily architecture.

But more than their hardware, their true strength is software. Software is the reason AMD doesn't make larger chips: they have no answer to Nvidia's software.
Let me be clear: AD102 is not designed for gamers. Nvidia absolutely do not give a shit about gamers. The fact that AD102 is blazing fast at gaming is a nice side bonus for them. AD102's true target is semi-professionals and professionals. Their RTX 6000 Lovelace is where Nvidia really makes money, and that is all built on Nvidia's software: CUDA and everything that plugs into it, OptiX for rendering, etc.



AMD doesn't have Nvidia's market incumbency to be able to dictate the way the market moves. DXR was built by Microsoft and Nvidia together, but RTX is essentially a black box; it's proprietary software. Nvidia is clever and sends software engineers to developers to help build RTX titles that most efficiently use their hardware. Even now, AMD could dedicate a shitload of transistors to RT like Intel does, but that is no guarantee of reaching the same level of performance. AMD at present does not have the resources to do what Nvidia does, which is why they open-source a lot of their stuff. They make up for their lack of resources by providing a huge amount of detailed documentation and information so developers can do things easily themselves. However, at the end of the day, nothing is easier than having some guy from Nvidia come and do it for you.

And to be clear, I'm fairly sure AMD could make a huge 600mm2 GPU and cram in a whole bunch of transistors for RT. They could borrow the matrix/ML cores from their CDNA products. They could toss all of that into a huge GPU. But without the software stack it would be pointless. As I said before, Nvidia can justify that massive 600mm2 AD102 GPU because they intend to sell most of it to pros in the form of the $1600+ 4090 and the $8000 RTX 6000. That helps them recover the money.
Now tell me, who the fuck would buy a $1600 AMD GPU if it doesn't have a fully functional CUDA alternative or an OptiX alternative?
Would you?
No, you would expect them to charge a lower price, even if it performed exactly the same as Nvidia in gaming, let alone in all the pro use cases. So how can they make back the money spent on such an expensive GPU? They can't spend all that money and then sell it at a loss or on thin margins. It's not sustainable, and it won't help them compete long-term.

AMD's software is shit, and that is where the real problem is. Their hardware engineers are doing just fine, barring this blip with N31. The problem with software is that it takes time to develop. ROCm is progressing, but it's still far behind CUDA. HIP exists, but it's still just shy of CUDA in rendering. HIP-RT is in development with Blender, but it's still far from release. Once that software stack is up and running and able to deliver something useful to professionals, then and only then will AMD start to actually dedicate valuable leading-edge silicon to stuff like AI and RT.
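For context on what HIP actually is: it deliberately mirrors the CUDA API almost one-for-one, so porting kernels is nearly mechanical. A minimal vector-add sketch (illustrative only, using standard public HIP API calls; error handling omitted):

```cpp
#include <hip/hip_runtime.h>
#include <cstdio>
#include <vector>

// Same kernel body you would write in CUDA; only the header differs.
__global__ void vec_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n);
    float *da, *db, *dc;
    hipMalloc(&da, n * sizeof(float));  // s/hip/cuda/ and this compiles as CUDA
    hipMalloc(&db, n * sizeof(float));
    hipMalloc(&dc, n * sizeof(float));
    hipMemcpy(da, a.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(db, b.data(), n * sizeof(float), hipMemcpyHostToDevice);
    vec_add<<<(n + 255) / 256, 256>>>(da, db, dc, n);  // CUDA-style launch
    hipMemcpy(c.data(), dc, n * sizeof(float), hipMemcpyDeviceToHost);
    printf("c[0] = %.1f\n", c[0]);      // expect 3.0
    hipFree(da); hipFree(db); hipFree(dc);
    return 0;
}
```

The kernel language isn't the moat; the moat is everything layered on top of CUDA (OptiX, cuDNN, years of tooling and vendor hand-holding), which is exactly the point being made here.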

You can talk about Intel, but Intel is a massive company that's even bigger than Nvidia. In fact, they've been doing hardware RT since before even Nvidia, with Larrabee. And they also have a ton of very talented software engineers on payroll building out oneAPI for their compute stack. Again, you're high if you think Arc is designed for gamers. Arc exists as a platform to build Intel's image in GPUs so they can enter that space in datacentre and pro use cases. See: Ponte Vecchio.

The market situation is really not that simple.

By the way, I'm not making excuses for AMD or the 7900 family. They're mediocre products that are only better value than the 4080 because the 4080 is a terrible product. You really should not be buying either. But if you go ahead and buy a 4080 because you're disappointed by the 7900 XTX, then you really have no right to complain about prices or the state of competition. You're just feeding the beast, and nothing will ever change so long as you keep feeding it.



The market is what gamers made it. We are past the point of complaining now; we engineered these circumstances.
I know I'm guilty of being impressed with AMD/ATi GPUs in the past, congratulating them on a job well done and then going ahead and buying an Nvidia GPU when they inevitably discounted it.
If, instead of offering empty platitudes about Radeon products being great value for money, the PC gaming community (myself included) had actually bought Radeon products, maybe AMD wouldn't have been so cash-starved during the whole of GCN and would have been able to compete better against Nvidia.
But it's too late now.
I made a mistake. Collectively, as a community, we made a mistake giving Nvidia 80% of the market. And now we have to pay.

AMD have realised that they too must make money to compete. So why would they slash prices, take less profit per unit sold, and also sell fewer units than Nvidia? That's a great way to go out of business. Now they'll sit content being second best, riding Nvidia's coat-tails and positioning themselves 10-20% below Nvidia's prices. Because why not?

The only way we can change anything is to stop being a bunch of lunatic magpies and just not buy any new GPU.
You've packed a lot of detail in there, with a take I agree with. The only things I would dispute are that AMD have a professional graphics line that's slightly more successful than your post gives off, and that them not following Nvidia - and now Intel - with dedicated hardware for ML/RT is a deliberate design philosophy: expecting, and wanting, a return to more generalised graphics programming - like writing a renderer from scratch in software on a CPU - as discussed by Carmack and Sweeney over a decade ago, when Intel's Larrabee was claiming to be amazing for RT and Carmack said they needed to show something great in software terms.
 

M1chl

Currently Gif and Meme Champion
Crazy times, when the only card actually worth buying is the 4090. The 4080 and the Radeons are really meh; it doesn't feel like they should be priced the way they are.
 

//DEVIL//

Member
Crazy times, when the only card actually worth buying is the 4090. The 4080 and the Radeons are really meh; it doesn't feel like they should be priced the way they are.
Good luck finding a 4090 at MSRP.

If you are in the market for a high-end GPU, grab whatever 7900 XTX you can get your hands on at MSRP.

Or get a 3090, or a 4070 Ti if you want DLSS3.
 

poppabk

Cheeks Spread for Digital Only Future
Crazy times, when the only card actually worth buying is the 4090. The 4080 and the Radeons are really meh; it doesn't feel like they should be priced the way they are.
We are in a weird situation where we are falling between two stools. 4K60 at ultra settings without RT can be hit even by last gen's top-of-the-range cards. Then with RT, even the 4090 can struggle, so with all the other cards you are immediately looking at compromises.
I don't recall ever having a graphics toggle with such a profound effect on performance.
 

M1chl

Currently Gif and Meme Champion
Good luck finding a 4090 at MSRP.

If you are in the market for a high-end GPU, grab whatever 7900 XTX you can get your hands on at MSRP.

Or get a 3090, or a 4070 Ti if you want DLSS3.
True, but given that there is almost no difference here between the 4080 and the 4090 (because, for some reason, only the more expensive version of the 4080 is available), the 4080 would be a bad buy. And the 7900 XTX is around 4080 prices here; the markup the shops add is disgusting. I hope the EU hammer hits them soon.
 
Damn, the 7900 XT was so bad that Nvidia is considering releasing the 4070 Ti at $899 as well.



GOD DAMMIT AMD YOU HAD ONE JOB AND YOU FAILED!!!


Doesn't really indicate much though. The fact that the card was continuously available for the first few weeks of its life was a lot more telling.

AMD is selling whatever they put on shelves; I doubt they had much to do with Nvidia's sales. LOL
 

GymWolf

Gold Member
We are in a weird situation where we are falling between two stools. 4K60 at ultra settings without RT can be hit even by last gen's top-of-the-range cards. Then with RT, even the 4090 can struggle, so with all the other cards you are immediately looking at compromises.
I don't recall ever having a graphics toggle with such a profound effect on performance.
They can barely hit 4K60 in some games made for cheap 2013 boxes, they are hardly a safe option for future heavy/broken games, and AMD goes way lower with RT.

The only 4K60 future-proof GPU is a 4090, with the 7900/4080 16GB/3090 Ti a tier lower.

Whoever was making a serious build for 4K60 to endure the next 3-4 years got his ass fucked so deeply that you can smell AMD/Nvidia's dick on our breath.

The situation is bleak for enthusiast PC gamers like me unless you really don't give a fuck about money, especially European gamers, who have to pay the euro tax on top of already overpriced products; the tax is between 300 and 500+ euros.

A vanilla 3090 is still 1300-1500 euros over here...
 

twilo99

Gold Member
Literally everything that's new from Nvidia and AMD should be 20-25% cheaper but we are where we are. This is the current economic environment unfortunately.

Well, but that should also mean that people have less disposable income for $1000+ GPUs to play video games..
 

Dural

Member
There are people kind of rationalizing the $900 4070 Ti by blaming AMD, huh? Lol, gd guys.

They're both to blame: Nvidia with the 4080 pricing, and AMD with the 7900 XTX and 7900 XT pricing. If the 4080 were priced like last gen it would be $700-$800, and AMD would then have to rename the 7900 XTX to 7800 XT and sell it at a similar price point. AMD knew they didn't have a 90-series competitor but still gave it that moniker so they could sell it at $999 and have reviewers praise them for not increasing pricing over the previous gen.
 

hinch7

Member
dear god, imagine the 60 tiers....
7600xt for 450, 7600 for 400, 7700 for 650
and the 4060s gonna be fucking 500 dollars.
this 6650xt better not conk out on me!!!
Meant to say $750-800 for the 7800 XT. And yeah, that's what I expect as well.

And worse, the GPUs further down the stack are getting more cut down, meaning better marketing and more profit for the manufacturers and smaller gains for buyers at the lower end of the stack. They already raised the bar last gen with RDNA 2; it's going to go up again.
They're both to blame: Nvidia with the 4080 pricing, and AMD with the 7900 XTX and 7900 XT pricing. If the 4080 were priced like last gen it would be $700-$800, and AMD would then have to rename the 7900 XTX to 7800 XT and sell it at a similar price point. AMD knew they didn't have a 90-series competitor but still gave it that moniker so they could sell it at $999 and have reviewers praise them for not increasing pricing over the previous gen.
To be fair, the XTX isn't a bad product; it's just inferior in almost every way to its Nvidia counterparts, including the 4080, going by performance, features and efficiency. The XT is much, much worse: it's the equivalent of Nvidia's 4070 Ti, or the previously 'unlaunched' 4080 12GB, in that it's so cut down it may as well be a 7800 XT in all but name. If they had priced both GPUs $100 less it wouldn't be so bad, but now that we can see the numbers and how much these GPUs cost... Nvidia's cards look way more appealing than they have any right to, because of how close AMD priced their own.

In any case, the XT should've been around $750-800 to make sense, taking inflation into consideration. But now, with the 7900 series launch, Nvidia can justify charging the full $900 for the 4070 Ti if it comes in at or around the 7900 XT, and people will accept a card at that price point. Just rebranding from an 80-class to a 70-class, reflashing the BIOSes and calling it a day :/ Both companies are scummy tbh. That's what you get with a two-horse race and a duopoly. Both will price-fix and keep prices high.
 