
AMD deprioritizing flagship gaming GPUs

RCX

Member
But people buy down the product stack because you have the halo product that's the king of the hill. So many people buy nvidia because they have the 4090s of the world, even if they're ripping you off with the -60 and -70 tier cards.

Bruh

This is exactly it. The halo card is aspirational and they know it. I have no idea how many 4090s are in the hands of gamers, but at the price it retails for it can't be millions. But Nvidia owns people's perceptions of power, and that's where they make serious inroads with the 60 and 70 series cards.

AMD should just make one top-tier SKU and strangle its supply to make it as scarce as possible. They won't lose too much money, they'd at least look like they're trading blows with team green, and it would help drive demand for their lower tiers.
 

Buggy Loop

Member
Be real. No one is buying a -60 or -70 level card and turning on max path tracing in games.

Okay?

Depends on resolution. 4K is a minority.

Vast majority of people turn on RT once, see that it runs like dog shit, then turn it off again.

On AMD maybe

People are buying nvidia based off of the reputation created by the halo products.

You're thinking like the pre-RTX generations, when it was rasterization versus rasterization. Nowadays there are plenty of reasons the price difference makes it favorable for Nvidia.
But also a reputation for drivers.

I can already hear the roar of the dozen AMD fans here coming to correct me that they're good now,



A few years ago I suggested my brother buy a 5700 XT; pretty good value for the money, right? Two years of black screens, no solution. He wasn't alone; people kept complaining on forums and Reddit, with the r/AMD cult replying "well, I don't have the problem!", real helpful. AMD eventually fixed it, but I think by then RDNA 2 was about to launch, so yeah. Or the 7000 series VR performance at launch being worse than the 6000 series for eight months. "But who cares about VR!?" said the cult. During the peak of the Fallout show, Fallout 3 and New Vegas were unplayable on AMD. Etc. Not saying Nvidia is perfect, far from it, but they don't throw a whole segment of entertainment (VR) under the bus for nearly a year.

And in the end, it's not like you can just say, well, the 60 and 70 series are bad at RT, so let's pick AMD. AMD for what? Rasterization? They're nearly on par. If AMD had thrown RT under the bus (and they kind of did) in exchange for an absolute monster in rasterization that Nvidia couldn't come close to, then yes, that would have been quite the strategy and would have driven a lot more RDNA 2 and RDNA 3 sales, but ~2% is not enough. Especially since comparing 2022 and 2024 drivers shows Nvidia out-progressed AMD, so revisiting those old launch reviews probably means AMD's mid-tier rasterization advantage is gone.


Reminder that I was ATI/AMD for two decades, 1996 to 2016, and I want them to compete, so my advice to them is simple: make an FSR version that takes advantage of your cards and your cards only. Just MAKE IT GOOD. Fuck the open-source bullcrap, it is not the time. Nobody gives a shit about open-source upscalers. What has even come out of FSR being open source so far? There isn't a single branch of derivatives out of it. Devs barely manage hex edits to support ultrawide and you expect them to work with open upscaler source code? Ridiculous.
 

hinch7

Member
Reminder that I was ATI/AMD for two decades, 1996 to 2016, and I want them to compete, so my advice to them is simple: make an FSR version that takes advantage of your cards and your cards only. Just MAKE IT GOOD. Fuck the open-source bullcrap, it is not the time. Nobody gives a shit about open-source upscalers. What has even come out of FSR being open source so far? There isn't a single branch of derivatives out of it. Devs barely manage hex edits to support ultrawide and you expect them to work with open upscaler source code? Ridiculous.
They just need to make GPUs affordable to the masses again. So many people are still stuck on older graphics cards because the barrier to entry is so high now, as low-end cards are essentially dead. Or at least progress there is.

No doubt they'll piggyback off Sony and their work on PSSR, and integrate that into FSR come RDNA 4 or beyond.
 

Buggy Loop

Member
They just need to make GPUs affordable to the masses again. So many people are still stuck on older graphics cards because the barrier to entry is so high now, as low-end cards are essentially dead. Or at least progress there is.

No doubt they'll piggyback off Sony and their work on PSSR, and integrate that into FSR come RDNA 4 or beyond.

Hopefully something comes from PSSR. Apparently Sony also has a neat way of doing RT efficiently, so maybe the PS5 Pro will help AMD a lot for RDNA 4. Even if it's not flagships, if they nail down a new upscaler and make a huge leap in RT compared to past RDNA gen-to-gen jumps, they could have a winner. It's a bit worrying that the solution would come from a customer, Sony in this case, when nobody should understand the tech better than the supplier of the product, but whatever, if it helps.
 

Pagusas

Elden Member
This is exactly it. The halo card is aspirational and they know it. I have no idea how many 4090s are in the hands of gamers, but at the price it retails for it can't be millions.
And you'd be wrong; even the informal Steam survey from 8 months ago shows there are over 3 million 4090s out there. There are more well-off people out there than you know or realize.
 

Three

Gold Member
The same old story when they get demolished by Nvidia. I think this might be the fourth or fifth time they are not targeting the high end. Always the same excuse about how they want to target the average gamer, or how nobody buys flagship GPUs.
Less than 10% of the addressable market buy flagship GPUs as he said. They're going to target the bang for your buck category in "80%" of the addressable market. Now it depends how much influence the flagship has on other lower end GPUs but he's not wrong in saying only a small minority buy flagship high end cards.
 

ByWatterson

Member
I can't tell - does he mean consoles are more profitable or that it's not worth investing in good console tech?

I know he's mostly talking PC here, but wasn't clear on his PS5 comments.
 

Three

Gold Member
And you'd be wrong; even the informal Steam survey from 8 months ago shows there are over 3 million 4090s out there. There are more well-off people out there than you know or realize.
Is this true? The Steam survey doesn't give totals, and isn't it based on the percentage of users sampled in a time window?
 

Zathalus

Member
Less than 10% of the addressable market buy flagship GPUs as he said. They're going to target the bang for your buck category in "80%" of the addressable market. Now it depends how much influence the flagship has on other lower end GPUs but he's not wrong in saying only a small minority buy flagship high end cards.
And yet it never helps AMD targeting the mid range. They attempted it with HD 3000, Polaris, and RDNA 1. Nvidia simply countered with a full product stack in response. The 5700 XT was a really good card, but Nvidia simply released the 2060 Super which was nearly as good and in the long term actually turned out to be the better buy.

They only try and spin this in PR as them focusing on where the majority of the market lies, because they fail to compete at the high end.
 

RCX

Member
And you'd be wrong; even the informal Steam survey from 8 months ago shows there are over 3 million 4090s out there. There are more well-off people out there than you know or realize.
Holy shit, that's really surprising. I figured they'd sell well in America but not necessarily elsewhere, due to the high sticker price and higher energy costs.
 

Three

Gold Member
I can't tell - does he mean consoles are more profitable or that it's not worth investing in good console tech?

I know he's mostly talking PC here, but wasn't clear on his PS5 comments.
He's saying it's better to cater and sell to a bigger audience at a lower price for greater developer support (like they already do with PS5) than to create an expensive product that only 10% of the total addressable market buys and results in no significant marketshare shift.
 

Buggy Loop

Member
He's saying it's better to cater and sell to a bigger audience at a lower price for greater developer support (like they already do with PS5) than to create an expensive product that only 10% of the total addressable market buys and results in no significant marketshare shift.

And he's right

It's also Intel's strategy, if they don't fold under the fuck-ups they have to manage on the CPU side.

To sell to a bigger audience, though, they can't rely on the old $50 discount with a worse software suite and a 2% rasterization advantage. I don't think they realize how difficult it will be to win a bigger audience, and they shouldn't expect Nvidia not to open the war chest and drop prices accordingly.
 

Gaiff

SBI’s Resident Gaslighter
And yet it never helps AMD targeting the mid range. They attempted it with HD 3000, Polaris, and RDNA 1. Nvidia simply countered with a full product stack in response. The 5700 XT was a really good card, but Nvidia simply released the 2060 Super which was nearly as good and in the long term actually turned out to be the better buy.

They only try and spin this in PR as them focusing on where the majority of the market lies, because they fail to compete at the high end.
Because we all know it's pure cope. If AMD had better offerings in the mid-range, there's no reason this wouldn't apply to the high-end because the big cards are simply larger versions of the smaller ones. They have the same architectures and work the same way.

As you say, he's just spinning this because they've been getting trounced for over a decade. "Oh, yeah, it's not worth it to compete anymore" suggests they're willingly bowing out when they just got TKO'd.
 

Xyphie

Member
It's just cope to prepare the market for the fact that their top SKU will only match a 5070 Ti or so, because they cancelled their high-end products. Next generation they'll be back with a full stack.
 

ap_puff

Banned
Because we all know it's pure cope. If AMD had better offerings in the mid-range, there's no reason this wouldn't apply to the high-end because the big cards are simply larger versions of the smaller ones. They have the same architectures and work the same way.

As you say, he's just spinning this because they've been getting trounced for over a decade. "Oh, yeah, it's not worth it to compete anymore" suggests they're willingly bowing out when they just got TKO'd.
Nah, I don't think it's because they're bowing out; I think they'd rather use their wafer allocation on the more profitable segments. Their cards don't sell unless they make basically little to no margin on them, and with TSMC raising prices they're going to try to cash in on the AI bubble as long as possible. They actually have some server-grade stuff that can be somewhat competitive, since the hyperscalers write their own software and don't have to deal with AMD's shitty stuff.
 

Gaiff

SBI’s Resident Gaslighter
Nah, I don't think it's because they're bowing out; I think they'd rather use their wafer allocation on the more profitable segments. Their cards don't sell unless they make basically little to no margin on them, and with TSMC raising prices they're going to try to cash in on the AI bubble as long as possible. They actually have some server-grade stuff that can be somewhat competitive, since the hyperscalers write their own software and don't have to deal with AMD's shitty stuff.
It's cope. Their cards don't sell in the mid-range either. Their most popular GPU is still the old RX 580. No one buys AMD GPUs. The profit margins in the high-end are much higher to offset the lower volumes. If they could compete with NVIDIA, they 1000% would, but they can't, so they put a nice PR spin on it.
 

FireFly

Member
And yet it never helps AMD targeting the mid range. They attempted it with HD 3000, Polaris, and RDNA 1. Nvidia simply countered with a full product stack in response. The 5700 XT was a really good card, but Nvidia simply released the 2060 Super which was nearly as good and in the long term actually turned out to be the better buy.

They only try and spin this in PR as them focusing on where the majority of the market lies, because they fail to compete at the high end.
The HD 3870 was just the die shrunk R600 (2900 XT), which was a big power hungry performance failure. It lost handily to the 8800 GT. Polaris was far behind Pascal in performance per watt, so was seen as the hot power hungry alternative. And with RDNA 1, AMD was at a big feature deficit vs Turing.

So I would say AMD has not actually launched any "great" mid range GPUs for a long time. But they seemed to do pretty decently with the 5850/5870 and 4850/4870.
 

ByWatterson

Member
He's saying it's better to cater and sell to a bigger audience at a lower price for greater developer support (like they already do with PS5) than to create an expensive product that only 10% of the total addressable market buys and results in no significant marketshare shift.

Makes sense.

Also makes sense from the dev side: the things that drive graphics innovation at a fast pace also drive up costs more quickly than automation can keep up with.

We've been in diminishing returns for a long time. Perhaps the best way forward for all is to slow down innovation for the sake of market scale and share.
 

kevboard

Member
So if Nvidia charges $500 for an xx70 gpu you want AMD to charge $300 for the same performance?

Man Nvidia fanboys are something else...
That's the only way they can compete. They already lack RT performance and lack DLSS.

Having "the same" performance isn't even really possible for AMD due to these two things.
To get 4060 levels of RT performance on AMD, you need to buy a card that is one or two tiers above the 4060.
To get 4060 levels of image quality, you have to do basically the same.

If the top-class AMD GPU on the market gets outperformed by a 4060 in any capacity, it's hard to sell such an AMD card.
And this isn't theoretical. Play Cyberpunk with RT and a 4060 will destroy every AMD card. Even without RT it can compete subjectively, precisely because of DLSS and its forward-thinking feature set.
If you need to push 3 to 4 times as many pixels to compete with the image quality of your competition, you're quite frankly just fucked... like, how can you compete in such a scenario?
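Rough pixel math on that last point (the per-axis render scales below are the commonly cited upscaler presets, used purely for illustration; image quality is obviously about more than raw pixel counts):

Code:
# Back-of-envelope: native 4K vs. upscaled internal render resolutions.
# Per-axis scale factors are assumed, commonly cited presets, not vendor specs.
target_w, target_h = 3840, 2160          # 4K output
native_pixels = target_w * target_h

presets = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for name, scale in presets.items():
    internal = round(target_w * scale) * round(target_h * scale)
    print(f"{name:<12} internal {internal / 1e6:.2f} MP -> native pushes {native_pixels / internal:.1f}x as many pixels")

# Prints roughly 2.2x (Quality), 3.0x (Balanced), 4.0x (Performance).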
 
Is this true? The Steam survey doesn't give totals, and isn't it based on the percentage of users sampled in a time window?
Steam Hardware Survey works like any survey in this sense, yes. This is also why polls about who people will vote for as the next President don't sample all ~330 million Americans either.

Game devs treat the Steam data as gospel and build games targeting what the survey says people use. If that's not good enough for you, then I can't be of any further help.
 
BTW, the real reason for this (per the rumors) is that AMD was/is working on a multi-chiplet solution and it missed a deadline, so they killed it.

Might be back with RDNA 5.
 

Three

Gold Member
Steam Hardware Survey works like any survey in this sense, yes. This is also why polls about who people will vote for as the next President don't sample all ~330 million Americans either.

Game devs treat the Steam data as gospel and build games targeting what the survey says people use. If that's not good enough for you, then I can't be of any further help.
That's not what I mean. What I mean is that Valve only gives percentages in those surveys. So 0.96% of those surveyed had a 4090. How does that turn into 3 million? What are they using as the total number surveyed? It can't be the entire all-time Steam userbase; 300M seems very far-fetched for all-time users, let alone those surveyed.
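For what it's worth, turning that survey share into a unit count only works if you multiply it by some estimate of Steam's active user base, and that estimate is the shaky part. A quick sketch (the user-base totals below are assumptions for illustration, not Valve figures):

Code:
# Back-of-envelope: survey share -> estimated number of cards.
share_4090 = 0.0096  # 0.96% of surveyed systems reporting an RTX 4090

for assumed_users in (120e6, 200e6, 300e6):   # assumed active Steam users
    estimated_cards = share_4090 * assumed_users
    print(f"{assumed_users / 1e6:.0f}M users -> ~{estimated_cards / 1e6:.2f}M 4090s")

# 120M -> ~1.15M, 200M -> ~1.92M, 300M -> ~2.88M cards.
# "Over 3 million" needs an active-user base well north of 300M.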
 

Cryio

Member
Overall, who cares.

AMD wants to do a 3090 Ti / 4070 Ti Super-class raster GPU, maybe something close in ray tracing, for $400-$500? That's fine.

AMD has had good success when they executed good mid-rangers.

ATI 9500/9600. HD 3000. HD 4000. HD 5770. HD 6850 / 6870. HD 7850 / 7870. The entire Polaris generation, RX 400 and 500. 5700 XT. 6600 XT / 6650 XT / 6700 / 6700 XT / 6750 XT / 6800 straight bangers.

7600 XT > 4060 > ARC 750/770 IMO also.

Let's see RDNA4 with RX 8000.

PS: owner of AMD RX 7900 XTX
 

hinch7

Member
Hopefully something comes from PSSR. Apparently Sony also has a neat way of doing RT efficiently, so maybe the PS5 Pro will help AMD a lot for RDNA 4. Even if it's not flagships, if they nail down a new upscaler and make a huge leap in RT compared to past RDNA gen-to-gen jumps, they could have a winner. It's a bit worrying that the solution would come from a customer, Sony in this case, when nobody should understand the tech better than the supplier of the product, but whatever, if it helps.
From the leaked screenshots, PSSR does look promising. Even if it's on par with XeSS (XMX) or slightly behind DLSS in image quality, that would be good enough for the most part. They just need to get RT performance under control, which, to be fair, does sound like something they may be addressing with RDNA 4.

Being behind even Intel, whose first foray into dGPUs nearly managed to sort it out... is a bad look for AMD. In any case, hopefully that means we get better upscaling all around and better optimisations from console to PC, and vice versa.
 

IDKFA

I am Become Bilbo Baggins
It's not excellent, it's less competition for Nvidia

This. Without AMD competing at the very top end, Nvidia will have more leeway to set pricing and determine the direction of the market.

What's the prediction for the RRP of the 5090? I'm guessing over £2000 easy.
 

SolidQ

Member
Without AMD competing at the very top end,
They're going to UDNA, so their top card is gonna compete with the 6090.

That's the interesting part:
Tom's Hardware [TH], Paul Alcorn: So, with UDNA bringing those architectures back together, will all of that still be backward compatible with the RDNA and the CDNA split?

Jack Huynh [JH], AMD: So, one of the things we want to do is ...we made some mistakes with the RDNA side; each time we change the memory hierarchy, the subsystem, it has to reset the matrix on the optimizations. I don't want to do that.

So, going forward, we’re thinking about not just RDNA 5, RDNA 6, RDNA 7, but UDNA 6 and UDNA 7. We plan the next three generations because once we get the optimizations, I don't want to have to change the memory hierarchy, and then we lose a lot of optimizations. So, we're kind of forcing that issue about full forward and backward compatibility. We do that on Xbox today; it’s very doable but requires advanced planning. It’s a lot more work to do, but that’s the direction we’re going.
 

SmokedMeat

Gamer™
I know there’s a minuscule segment that wants $1500 GPUs, but personally I’d love to see it end.

We’re at a point where bigger and badder technology just means longer development times, rising prices, and safe sequels. I’m beyond fine with slowing things down, and the reality is the mid range is where the vast majority of your sales are.
 
They're going to UDNA, so their top card is gonna compete with the 6090.

That's the interesting part:
Tom's Hardware [TH], Paul Alcorn: So, with UDNA bringing those architectures back together, will all of that still be backward compatible with the RDNA and the CDNA split?

Jack Huynh [JH], AMD: So, one of the things we want to do is ...we made some mistakes with the RDNA side; each time we change the memory hierarchy, the subsystem, it has to reset the matrix on the optimizations. I don't want to do that.

So, going forward, we’re thinking about not just RDNA 5, RDNA 6, RDNA 7, but UDNA 6 and UDNA 7. We plan the next three generations because once we get the optimizations, I don't want to have to change the memory hierarchy, and then we lose a lot of optimizations. So, we're kind of forcing that issue about full forward and backward compatibility. We do that on Xbox today; it’s very doable but requires advanced planning. It’s a lot more work to do, but that’s the direction we’re going.
LOL, uh, so they are going to try and do HBM on consumer GPUs again? I'm not sure what he's implying here, because the MI300X uses HBM and the consumer products use GDDR.
 

simpatico

Member
IMHO the issue with AMD isn't that they don't try to compete with the XX90s, it's that NVIDIA has many better solutions.
Tensor Cores, DLSS etc.
If AMD were to have actual alternatives, not ones just on paper but in quality and performance too, many more would buy AMD even if they wouldn't have XX90 competitors.
The mainstream sentiment is DLSS, ray tracing, etc., where NVIDIA is currently king down the whole stack of GPUs, not just at the top.
Which is why AMD must offer the value prop. Dollar for dollar they're sunk, because people will just go with DLSS and faster RT. If they could crank out xx80s for $399, though, maybe the economies of scale would make up for the borderline nano margins. If they could release something like the 4770. That card was $119 and shat on both consoles with any mid-range CPU. Even back then, if you could afford Nvidia, that's where you went: PhysX and much better drivers. Pretty much the same as always. If AMD had been priced the way it is now relative to Nvidia back then, I don't even think they'd be alive today in the gaming GPU market.
 
And yet it never helps AMD targeting the mid range. They attempted it with HD 3000, Polaris, and RDNA 1. Nvidia simply countered with a full product stack in response. The 5700 XT was a really good card, but Nvidia simply released the 2060 Super which was nearly as good and in the long term actually turned out to be the better buy.

They only try and spin this in PR as them focusing on where the majority of the market lies, because they fail to compete at the high end.

Depends on how you look at it. The 480/580 and 5700XT were relevant cards that were at least in the conversation for most buyers at the price point. It seems possible that the 480s or 580s sold more than the entirety of the 6000 series lineup, at least going by how low the 6000 series numbers have been on the Steam survey (not sure if there are hard numbers available on that).
 

CuNi

Member
Which is why AMD must offer the value prop. Dollar for dollar they're sunk, because people will just go with DLSS and faster RT. If they could crank out xx80s for $399, though, maybe the economies of scale would make up for the borderline nano margins. If they could release something like the 4770. That card was $119 and shat on both consoles with any mid-range CPU. Even back then, if you could afford Nvidia, that's where you went: PhysX and much better drivers. Pretty much the same as always. If AMD had been priced the way it is now relative to Nvidia back then, I don't even think they'd be alive today in the gaming GPU market.
To be honest, I don't know what advice one could give AMD to get out of the corner in the GPU market, except to get on par with the technologies and be competitive with DLSS, ray tracing, etc.
I am very optimistic about their change to UDNA. They really should have done it a gen earlier, when Stable Diffusion started to gain traction, because AMD was known for offering more VRAM than Nvidia; if they had performed somewhat close to Nvidia and its CUDA, they'd have been the clear winner on that front, as those models consume insane amounts of VRAM, and that is where the money currently is.

But whether their idea with UDNA pans out remains to be seen over the coming five years.
I fear that Nvidia will be more or less untouched for the 5000 series, and most likely the 6000 series too.
 

CuNi

Member
We're gonna see this with RDNA5/CDNA4

Didn't they say they had RDNA5/CDNA4 already in the pipeline but want to start the unification process after them?

Edit:

They said this:
So, going forward, we’re thinking about not just RDNA 5, RDNA 6, RDNA 7, but UDNA 6 and UDNA 7.

And this was also said:
PA: So, this merging back together, how long will that take? How many more product generations before we see that?

JH: We haven’t disclosed that yet.

To me this sounds like RDNA 5, 6, and 7 are all already being developed, but they'll start with UDNA 6 at the earliest, and it sounds like UDNA will replace CDNA first and then merge RDNA into it after RDNA 7.
At least that's what I would read out of that statement, especially looking at the second quote, where they don't disclose how long the merger will take. If it were already this or next GPU gen, they'd definitely mention it.
 

hinch7

Member
This. Without AMD competing at the very top end, Nvidia will have more leeway to set pricing and determine the direction of the market.

What's the prediction for the RRP of the 5090? I'm guessing over £2000 easy.
Let's be honest, there's only a tiny niche willing to spend over £500 on an AMD graphics card. AMD just doesn't have the tech or the means to compete, at least not yet.

$2,000 minimum in the US; not sure for us. Maybe £1,900, if we're lucky. People were willing to spend that on their premium-model 4090s.
 

RoboFu

One of the green rats
Low, mid, or high end... if they can't get their RT up, it doesn't matter. I personally would love to see a Ryzen moment happen in the GPU space. It needs to happen to bring prices down.
 

Loxus

Member
Didn't they say they had RDNA5/CDNA4 already in the pipeline but want to start the unification process after them?

To me this sounds like RDNA 5, 6, and 7 are all already being developed, but they'll start with UDNA 6 at the earliest, and it sounds like UDNA will replace CDNA first and then merge RDNA into it after RDNA 7.
At least that's what I would read out of that statement, especially looking at the second quote, where they don't disclose how long the merger will take. If it were already this or next GPU gen, they'd definitely mention it.
Sounds to me that UDNA6 & 7 will exist alongside RDNA6 & 7 but not CDNA.

But the fact that they're starting at UDNA 6 would suggest to me that RDNA 5 is the last of the RDNA architecture.
It's still too early to tell.

One good thing from this is that gaming will get the Matrix Cores from CDNA.
From the article:
What precisely will UDNA change compared to the current RDNA and CDNA split? Huynh didn't go into a lot of detail, and obviously there's still plenty of groundwork to be laid. But one clear potential pain point has been the lack of dedicated AI acceleration units in RDNA. Nvidia brought tensor cores to the entire RTX line starting in 2018. AMD only has limited AI acceleration in RDNA 3, basically accessing the FP16 units in a more optimized fashion via WMMA instructions, while RDNA 2 depends purely on the GPU shaders for such work.

Our assumption is that, at some point, AMD will bring full stack support for tensor operations to its GPUs with UDNA.
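To make "tensor operations" concrete: matrix/tensor units accelerate small, fixed-size mixed-precision multiply-accumulates. A minimal software sketch of that operation (the 16x16 tile shape and FP16-in/FP32-accumulate types are a typical example, not a statement of what UDNA will ship):

Code:
import numpy as np

# Illustrative only: the core op a tensor/matrix unit accelerates is
# D = A @ B + C on a small fixed-size tile, FP16 inputs, FP32 accumulation.
M = N = K = 16
A = np.random.rand(M, K).astype(np.float16)   # FP16 input tile
B = np.random.rand(K, N).astype(np.float16)   # FP16 input tile
C = np.zeros((M, N), dtype=np.float32)        # FP32 accumulator tile

D = A.astype(np.float32) @ B.astype(np.float32) + C  # one hardware instruction on matrix cores
print(D.shape, D.dtype)                               # (16, 16) float32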
 

IDKFA

I am Become Bilbo Baggins
Let's be honest, there's only a tiny niche willing to spend over £500 on an AMD graphics card. AMD just doesn't have the tech or the means to compete, at least not yet.

$2,000 minimum in the US; not sure for us. Maybe £1,900, if we're lucky. People were willing to spend that on their premium-model 4090s.

I'm going to be honest. I don't know much about GPUs. With that in mind, are Nvidia overcharging on their high-end GPUs?
 

SolidQ

Member
RDNA5/CDNA4 already in the pipeline but want to start the unification process after them?
Funnily enough RDNA5 and CDNA4 will re-unify the architectures. (c) KeplerL2

There was also a rumor that RDNA 5 is a new arch. The interesting thing is that the PS4 was GCN and the PS6 may be UDNA, which should give a massive boost to consoles.
 
I could swear I've read that AMD is deprioritizing GPU flagships for several generations now...
I'm going to be honest. I don't know much about GPUs. With that in mind, are Nvidia overcharging on their high-end GPUs?
Nvidia is sitting on ~80% margins right now and has pretty much just adopted the ridiculous scalper prices from the Corona era as the new norm, go figure.
 

IDKFA

I am Become Bilbo Baggins
They are all overcharging on GPUs. Nvidia flagship even more so because they don't even have competition.

I could swear I've read that AMD is deprioritizing GPU flagships for several generations now...

Nvidia is sitting on ~80% margins right now and has pretty much just adopted the ridiculous scalper prices from the Corona era as the new norm, go figure.

As expected. It sucks, but without meaningful competition, they have their customers by the balls.
 

Yamato

Neo Member
I wish more people would give AMD a chance, but tbh it's not a surprise that things are as they are. I recently got a 7900 GRE, and I honestly think that at that price it's an excellent deal. Anyone shopping at $500 has good reason to go AMD. At most other price ranges, though, eh...

It's a shame. But the truth is that from almost any angle, Nvidia's doing better. You don't hear about Nvidia drivers crapping their pants a couple of times a year. DLSS is the best upscaling tech around, RT performance is significantly better, and Nvidia has that momentum as a brand that really sped up when they launched Pascal. If AMD had managed to get something right with, say, a theoretical 5800 XT to compete against Nvidia when the 20 series was controversial, things would've been different. But the 5700 XT came late, seemed more like an experiment than a finished product, and driver issues were apparently a real pain. Nobody here seems to have mentioned it, but I think the 6000 series was a good effort. RT and upscaling aside, performance was good against Nvidia's 30 series while using less power and offering more VRAM. The 7000 series has kind of been a disaster from the outset. Bad marketing pre-launch, a lack of significant improvement in multiple features, and at least a few minor driver issues still persisting have caused bad optics for sure. Where I think they really screwed up was the 7600/XT being such a non-improvement. If it had been faster and priced well, it could have been another Polaris, because most people buying those are entry-level buyers who care mostly about the numbers, as in howfast.exe and howexpensive.exe. Nevertheless, I stand by the $500 segment being impressive. It is good progress that 6950 XT performance is now available for under $550.

AMD is like one of those directors who made a great movie but never understood why it was great in the first place. Why did Ryzen sell well? It definitely wasn't because of the early memory problems and the mediocre single-core performance. It was because they at least edged out Intel in something: cores and multicore performance at good value. They didn't have to beat Intel in everything to get going. They mainly had to do something right and give buyers some confidence in the purchase. I think the current problem AMD has in the GPU space is not that their GPUs are bad or that Nvidia completely destroys them. They aren't, and the difference isn't massive. But from almost any angle, the scales tip in Nvidia's favor, and that gives great optics and confidence to buyers. It's something I'm sure a marketing expert could explain quite simply. Usually, the brand that's ahead sells X units, and the brand that's behind sells X/2 or less. Just look at the previous console gen. If AMD had fixed their drivers and had priced their GPUs properly to actually beat Nvidia's rasterization performance/$, I think the 7000 series would have sold incomparably better than it actually has.

So I don't think this is a bad decision on AMD's part. Besides, we're out of the era of constant upgrades in consumer computing, especially in GPUs. Like people in the thread pointed out, Kepler didn't age well; my GTX 670 was already struggling in 2016. But the 970 lasted pretty long, and the 1080 and 1080 Ti lasted even longer. If AMD can deliver cards that are good value up to the common 1440p use case, I think they'll do fine. In the end, they don't actually have to beat Nvidia. The PC market is huge as a whole. 10-20% market share is still big (per the Steam survey), and they don't need an RX 8999 at the top. The people using those are in the single-digit percentage range, and you only really need cards like that if you're at 4K (again, a very small share) or if you need the computational power for non-gaming applications, where there are other enterprise options.
 