RDNA3 rumor discussion

Nvidia has priced the 4080 so high, I think they fucked themselves on 4060 and 4070 pricing. Who the hell wants to pay $600 for a 60 series card??

Only thing is I don't see FSR 3.0 coming anywhere close to DLSS 3.
 

AMD Radeon RX 7000 "RDNA 3" GPU Lineup Rumor: 2x Faster Raster & Over 2x Ray Tracing Performance Versus RDNA 2, Power Efficiency Looks Amazing!




  • 5nm Process Node
  • Advanced Chiplet Packaging
  • Rearchitected Compute Unit
  • Optimized Graphics Pipeline
  • Next-Gen AMD Infinity Cache
  • Enhanced Ray Tracing Capabilities
  • Refined Adaptive Power Management
  • >50% Perf/Watt vs RDNA 2


 
For all the talk of the 4090, I feel most people are forgetting that over 90% of card sales are the 50/60 midrange models.

Having the best is prestigious, but having more sales can bring in more revenue.

It was the lack of DLSS and ray tracing that kept AMD back. If they can compete now, Nvidia will need to cut prices.

Nvidia dominates the mid range as well; it's not even a competition. AMD is like a theoretical alternative to Nvidia, but Nvidia keeps eating away at the market share. They have the high end AND mid range sales dominance for discrete GPUs.

At this point, except for a minority of loud AMD fans, people are cheering for AMD to be competitive so that Nvidia drops prices and they can buy Nvidia cards. Almost nothing AMD can do will get them out of this market share hole, sadly for them. They know it; they didn't produce a whole lot of RDNA 2 cards, as even the 3090 outsold their entire range of cards. It was the very definition of a paper launch, even though AMD marketing hyped the crowd into believing it would not be the case and that they would surpass Nvidia in availability. It was a bunch of bullshit. Same for their MSRP: you had to be incredibly lucky to score one on amd.com, and their AIBs had disgusting prices compared to Ampere.

Discrete GPUs are no longer (and never were) AMD's bread and butter. To compete with Nvidia in production output, they would have to eat away at their silicon availability for better-margin products like APUs for consoles and CPUs.

So at this point, why would AMD eat away at their already small margins if they know it'll only trigger a price war with a company that has way deeper pockets? They're not running a charity here.

Edit - you're also maybe a tad too hopeful that they not only match or exceed Nvidia in rasterization, but also match them in ray tracing and ML? You do know that an MCM solution will already require a lot of work to split the workload and remain invisible to the API while also trying to somehow keep latencies low? All these ray tracing / ML solutions require super fast small transfers of data, which an MCM crossbar will slow down as the data needs to hop from node to node. The more chiplets, the slower it becomes. CPU tasks are typically unaffected by chiplets, while GPUs take big hits. Even Apple's insane 2.5TB/s link (vs AMD MI200's 800 GB/s) didn't help there: the CPU performance was effectively doubled while the GPU gained only ~50%. That's with 2 chiplets. More than that and yikes..
 
Nobody is going to buy AMD GPUs because look:

1) At the higher end, people even considering spending that sort of money probably want NVIDIA features, and besides, once you get to/around $1000, you're already in "I have money to burn" or "this is for work so I don't give a shit either way" territory.

2) At the mid-range, if AMD prices their products like $50 below NVIDIA and it has slightly worse raytracing, reviewers are probably going to say "Well it's only $50... may as well just get NVIDIA for the ray tracing".

There we go. So nobody ends up recommending/wanting RDNA cards, as usual, because AMD GPUs at this point are more of a "protest vote" than a solid purchasing decision in isolation. The same cannot be said for Ryzen CPUs, where there is usually a compelling reason to go for them over Intel.

So........ yeah I think we'll end up with the same old "I hope AMD is competitive so NVIDIA lowers its prices and I can buy NVIDIA for less money". If we don't (as a market) reward AMD for competing, they will stop competing and will just price themselves the same as NVIDIA to take what they can get. If that happens, NVIDIA can keep raising prices knowing that AMD will just mirror them and effectively offer zero competition while sort of offering competition.
 
I personally think RDNA 3 is not really about competing with NVIDIA.

I personally think AMD and Intel are just tired of NVIDIA. Their iGPUs/APUs incorporating ARC/RDNA3 are going to be AMD's and Intel's blue ocean strategy from now on.
 
https://wccftech.com/amd-rdna-3-rad...er-2x-rt-performance-amazing-tbp-aib-testing/

2x RT performance compared to RDNA 2 while also having 2x the rasterization performance is not good news: it suggests they're using the same type of RT accelerator or, if it is better, it's being held back by MCM latencies. This would put their RT performance at 3090 level. That's not something to hang your hat on.

But you know, rumours and all, it's pretty much all crap until it's benched by sites.
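For what it's worth, the reasoning in numbers, as a trivial sketch (using the rumoured 2x figures from the article, nothing confirmed):

```python
# If the rumoured RT uplift only matches the rumoured raster uplift, the RT
# accelerator per CU probably hasn't changed much; a ratio above 1.0 would
# point to genuinely improved RT hardware. Rumoured figures, not confirmed.
raster_gain = 2.0   # claimed raster uplift vs RDNA 2
rt_gain = 2.0       # claimed RT uplift vs RDNA 2

print(rt_gain / raster_gain)  # 1.0 -> RT merely keeps pace with raster
```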
 
https://wccftech.com/amd-rdna-3-rad...er-2x-rt-performance-amazing-tbp-aib-testing/

2x RT performance compared to RDNA 2 while also having 2x the rasterization performance is not good news: it suggests they're using the same type of RT accelerator or, if it is better, it's being held back by MCM latencies. This would put their RT performance at 3090 level. That's not something to hang your hat on.

But you know, rumours and all, it's pretty much all crap until it's benched by sites.
He literally says more than 2X improvement in RT in the linked tweet. Even AMD have stated that their CUs have "enhanced ray-tracing capabilities".
 
Actually RDNA3 is rumored to only have 96MB infinity cache, less than 128MB.
But each cache block is halved in size, meaning there are twice as many cache blocks for the same capacity and therefore twice the bandwidth per MB. So while Navi 31 has 3/4 of Navi 21's capacity (96MB vs 128MB), it's working at 2 × 3/4 = 1.5x the total bandwidth, i.e. 50% higher bandwidth clock-for-clock. Navi 21 had ~2TB/s on Infinity Cache, so Navi 31 should have ~3TB/s.
If clocks are higher then the bandwidth scales up yet again.

Plus, there's supposed to be a model with stacked 16MB 3D cache (like Zen3/4 X3D) on top of each of the 6 MCDs. That model (radeon 7950?) would have a total of 192MB L3 for the GPU.
Coupled with 384bit GDDR6 24Gbps this should make Navi 31 a bandwidth monster.


By AMD's own numbers, 96MB (same as Navi 22) still accounts for a ~53% hit rate at 4K, as per this slide from AMD, edited by Locuza:

(Locuza on Twitter: modified official AMD slide showing the Infinity Cache hit rate for each capacity; rough guidance only.)



With 384-bit of 20Gbps GDDR6, Navi 31 is getting 960GB/s (0.938TB/s), which is then mixed with a maximum of ~3TB/s from Infinity Cache.
At 4K we're then looking at 3.0*0.53 + 0.938*0.47 = ~2.03TB/s
At 1440p it's 3.0*0.69 + 0.938*0.31 = ~2.36TB/s
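If you want to poke at those figures yourself, here's the same arithmetic as a quick Python sketch (every input is a rumoured number from this post, nothing confirmed):

```python
# Rumoured Navi 31 Infinity Cache bandwidth vs Navi 21, plus the blended
# "effective" bandwidth once GDDR6 fills in the cache misses.
NAVI21_CACHE_MB = 128
NAVI21_CACHE_TBPS = 2.0        # ~2 TB/s on Navi 21's Infinity Cache
NAVI31_CACHE_MB = 96
SLICE_BW_MULTIPLIER = 2.0      # halved slices -> twice the bandwidth per MB (rumour)

navi31_cache_tbps = NAVI21_CACHE_TBPS * (NAVI31_CACHE_MB / NAVI21_CACHE_MB) * SLICE_BW_MULTIPLIER
print(navi31_cache_tbps)       # 3.0 TB/s clock-for-clock

GDDR6_TBPS = 0.938             # 384-bit bus at 20 Gbps ~= 960 GB/s

def effective_bandwidth(hit_rate: float) -> float:
    """Weight cache and VRAM bandwidth by the Infinity Cache hit rate."""
    return navi31_cache_tbps * hit_rate + GDDR6_TBPS * (1.0 - hit_rate)

print(effective_bandwidth(0.53))   # ~2.03 TB/s at 4K (53% hit rate)
print(effective_bandwidth(0.69))   # ~2.36 TB/s at 1440p (69% hit rate)
```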

With FSR 2 and above becoming widespread, I'm guessing most games will be able to render natively at 1440p for a 4K display, so the latter should be the most used.
The 1-Hi 3D stacked cache version, if it ever comes out, could be a very 8K-capable card.
 
Actually RDNA3 is rumored to only have 96MB infinity cache, less than 128MB.
RGT did say they were using a new generation of Infinity Cache which is apparently much better, hence the smaller size.

The higher clocks of RDNA 3 will also mean more cache bandwidth.
 
Both him and MLID get a lot of shit but having followed them for years, on graphics cards they've been spot on.
LOL. MLiD said just last month that Intel will close their GPU division because it was not successful.

Let's not forget that he originally said Intel was only making GPUs for servers and the rest is marketing. Competition in the market is good, but people like MLiD dream of AMD performing some magic that will destroy all their competition (blah blah infinity cache). And there are people who fell for that fan marketing.
Update: (image attachment)
 
How many people here who are hoping for AMD to be competitive in performance and aggressive in price are actually planning on buying a Radeon GPU?

Or are you just hoping AMD bring the heat so that you can buy a GeForce GPU at a discount?

If it's the latter, I've got a 4080 12GB shaped bridge to sell you.
 
How many people here who are hoping for AMD to be competitive in performance and aggressive in price are actually planning on buying a Radeon GPU?

Or are you just hoping AMD bring the heat so that you can buy a GeForce GPU at a discount?

If it's the latter, I've got a 4080 12GB shaped bridge to sell you.
Me.
 
Why would you think that?
Ada and RDNA3 are on the same node.
AMD doesn't have the node advantage anymore.
Both are gonna be boosting to 3000MHz, so what could AMD possibly do to actually take the lead from Nvidia?

Beating the RTX 4090 even in pure raster will be a tall order.
Maybe in those AMD-favoring titles like Valhalla and Far Cry, but in general and on average the 4090 isn't losing this battle.
Because of where I think they will be investing their money, I don't see brand new tech like DLSS 3 here; I see it going to raw perf.
 
I love the tension of these wars. There's always the undercurrent of lurking console fans eyeing up the AMD offering, even though it won't possibly fit - in terms of size or power draw or cost - into a console product.
In terms of consoles, I don't see them using RDNA 3; I see them using RDNA 4, especially if the refreshes don't come out till 2024.
 
How many people here who are hoping for AMD to be competitive in performance and aggressive in price are actually planning on buying a Radeon GPU?

Or are you just hoping AMD bring the heat so that you can buy a GeForce GPU at a discount?

If it's the latter, I've got a 4080 12GB shaped bridge to sell you.
The 7900 XT will be in my first PC, so me I guess.
 
Nvidia has priced the 4080 so high, I think they fucked themselves on 4060 and 4070 pricing. Who the hell wants to pay $600 for a 60 series card??

Only thing is I don't see FSR 3.0 coming anywhere close to DLSS 3.
FSR 3.0 could be better than DLSS 3.0 when used at lower framerates.
 
Because of where I think they will be investing there money on i don't see brand new tech like dlss 3 here i see it going to raw perf
So effectively RDNA2 all over again?
They might match and even beat Ada in some games no doubt.

But we know how big the GCD and MCDs are, we know the nodes being used.
We have a good guess of how high the thing can boost.
It has 3x 8-pins, so that's approx 450-500W TDP.
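Quick sanity check on that, as a sketch assuming the standard PCIe power limits (150W per 8-pin connector plus 75W from the slot):

```python
# Spec ceiling implied by the rumoured 3x 8-pin layout.
# Assumes standard PCIe limits: 150 W per 8-pin connector, 75 W from the slot.
EIGHT_PIN_W = 150
PCIE_SLOT_W = 75

def board_power_ceiling(num_8pin: int) -> int:
    """Maximum in-spec power delivery for a card with this connector layout."""
    return num_8pin * EIGHT_PIN_W + PCIE_SLOT_W

print(board_power_ceiling(3))  # 525 W ceiling, so a 450-500 W TBP is plausible
```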

At the very best, in pure raster, it will match a 4090 in synthetics.
RT performance will hopefully be on par with xx70 Ampere, maybe even a 3080 10G.
A few AMD favoring games will give the edge to the 7900XT too.....but beyond that, the 7900XT is once again going to be a borderline nonexistent card.

Rather look at the 76/77/78XT and hope they are priced competitively cuz Nvidia got no answer to those right now.

At the top end, forget about it.
 
Wait, which one is Navi 33? Sounds kinda weird if they are going with 8GB VRAM for the 7700 XT.

N33 will be 7600XT for this very reason.

N32 will be in 7800XT with 4 MCDs and 16GB ram as well as a cut version in the 7700XT with 3 MCDs and 12GB ram.

The maths works pretty well for this: 2 N5 wafers and 1 N6 wafer can make about 600 GPUs, which is more than 3 N7 wafers of N22 could make, and probably at comparable cost, with uncut N32 having far, far better margins for AMD than a $650 6800XT.
 
So effectively RDNA2 all over again?
They might match and even beat Ada in some games no doubt.

But we know how big the GCD and MCDs are, we know the nodes being used.
We have a good guess of how high the thing can boost.
It has 3x 8-pins, so that's approx 450-500W TDP.

At the very best, in pure raster, it will match a 4090 in synthetics.
RT performance will hopefully be on par with xx70 Ampere, maybe even a 3080 10G.
A few AMD favoring games will give the edge to the 7900XT too.....but beyond that, the 7900XT is once again going to be a borderline nonexistent card.

Rather look at the 76/77/78XT and hope they are priced competitively cuz Nvidia got no answer to those right now.

At the top end, forget about it.
We will see, but I'd be more surprised if the 7900 XT loses in any rasterized game that isn't just straight Nvidia-favored.
 
So effectively RDNA2 all over again?
They might match and even beat Ada in some games no doubt.

But we know how big the GCD and MCDs are, we know the nodes being used.
We have a good guess of how high the thing can boost.
It has 3x 8-pins, so that's approx 450-500W TDP.

At the very best, in pure raster, it will match a 4090 in synthetics.
RT performance will hopefully be on par with xx70 Ampere, maybe even a 3080 10G.
A few AMD favoring games will give the edge to the 7900XT too.....but beyond that, the 7900XT is once again going to be a borderline nonexistent card.

Rather look at the 76/77/78XT and hope they are priced competitively cuz Nvidia got no answer to those right now.

At the top end, forget about it.

At 450W with 1.5x perf/watt, the uplift will be 2.25x the 6900XT, which would be 20% faster than a 4090 in raster.

The perf per watt was said to be >50%, and the last time AMD claimed that it was actually 64%, going from the 5700XT to the 6900XT. So in the very best case, 450W with 1.64x perf per watt is 2.46x the 6900XT, and about 30% faster than the 4090 in raster.

Personally I think 375W is more likely and it will probably roughly match the 4090 in raster, but let's wait 2 weeks and we will know.
 
How many people here who are hoping for AMD to be competitive in performance and aggressive in price are actually planning on buying a Radeon GPU?

Or are you just hoping AMD bring the heat so that you can buy a GeForce GPU at a discount?

If it's the latter, I've got a 4080 12GB shaped bridge to sell you.

I always thought this take was really cynical, but now I think there may be more than a kernel of truth to it. Like, I see so many mentions along the lines of "we need competition to bring prices down". Reading between the lines a bit, I must ask - do we need "competition" to get lower prices, or do we just need someone to release cards at better prices to get better prices?

Me, I'm very much more likely to buy Radeon, but I'm well invested because I use Linux exclusively at home. Ever since the drivers went open source, the experience has been fantastic. Some stuff comes late, like the 6000 series not working right for a few months (never experienced it firsthand; I buy mid-low range and it was worked out by then), but once things are sorted I practically forget that drivers even exist. O_O I have not installed a GPU driver even once in the last 4 years. The reason I use Linux in the first place is the low maintenance, so I'm all about it. Nvidia actually works well on Linux, tho. So I'm not completely locked in, I just had a great experience with the drivers rolled in and want to stick with that.
 
https://wccftech.com/amd-rdna-3-rad...er-2x-rt-performance-amazing-tbp-aib-testing/

2x RT performance compared to RDNA 2 while also having 2x the rasterization performance is not good news: it suggests they're using the same type of RT accelerator or, if it is better, it's being held back by MCM latencies. This would put their RT performance at 3090 level. That's not something to hang your hat on.

But you know, rumours and all, it's pretty much all crap until it's benched by sites.

This kind of mentality is the reason Nvidia wants $900 as the standard price for mid-range cards. The trash 4080 12GB is not even half as fast as the 4090 in some games, so effectively Nvidia is offering overpriced, underperforming trash and Nvidia shills are OK with it.
 
At 450W with 1.5x perf/watt, the uplift will be 2.25x the 6900XT, which would be 20% faster than a 4090 in raster.

The perf per watt was said to be >50%, and the last time AMD claimed that it was actually 64%, going from the 5700XT to the 6900XT. So in the very best case, 450W with 1.64x perf per watt is 2.46x the 6900XT, and about 30% faster than the 4090 in raster.

Personally I think 375W is more likely and it will probably roughly match the 4090 in raster, but let's wait 2 weeks and we will know.
20 to 40% faster than the 4090?
Are you drunk?
 
I'll always hope for AMD to be competitive, as it majorly helps the market. I'd love to see Nvidia in full-on compete mode. The 4090 is insane, and that's without them really feeling any pressure from AMD. Imagine what we could get if there was real competition?

On the flip side, AMD very much has a Sonic cycle going on with some users here. Do not get hopes up for a 30% increase over the 4090 lol. Go in hoping AMD has something within 90% of the performance of a 4090 for $999. THAT would be amazing.
 
This kind of mentality is the reason Nvidia wants $900 as the standard price for mid-range cards. The trash 4080 12GB is not even half as fast as the 4090 in some games, so effectively Nvidia is offering overpriced, underperforming trash and Nvidia shills are OK with it.

Fo real. :/

So AMD is supposed to give everyone the exact same performance, for way less money? Sorry, no.

Right now, you can get a 6650 XT for like 280 bucks new. It's as fast as a 3060 Ti in raster (edit: thought the 3060 had more memory, but I was thinking of the 6700 lol), has FSR2 instead of DLSS, which is barely less effective, and the raytracing isn't good (as if it's really so useful on the 3060 Ti anyway).

Getting better price-to-performance is a trade-off. You can't expect EVERYTHING to be as good or better at a significantly lower price ffs. If that's what it takes to even consider AMD, then they aren't serious about this competition stuff.
 
How many people here who are hoping for AMD to be competitive in performance and aggressive in price are actually planning on buying a Radeon GPU?

Or are you just hoping AMD bring the heat so that you can buy a GeForce GPU at a discount?

If it's the latter, I've got a 4080 12GB shaped bridge to sell you.
I will certainly buy an AMD GPU if the price and performance are better.

My budget for a new GPU is £900-1000. I'll just buy whatever can offer me the best performance for that price.
 
Gonna laugh if Nvidia was able to brute force its way to outperforming AMD's MCM.



And AMD drivers are probably going to have more issues than usual, given it's their first MCM design.
 
All AMD has to do is price better than Nvidia.
Gonna laugh if Nvidia was able to brute force its way to outperforming AMD's MCM.



And AMD drivers are probably going to have more issues than usual, given it's their first MCM design.

What's your criteria?

Is it just if amd cannot beat the 4090?
Is it in categories? High, medium, low?
Is it using dlss and fsr or not?
Is it price to performance?
 
Are you drunk or just illiterate? I said 20-30% on the basis of 450W and a lower/upper perf per watt improvement. It is just maths. I also said I don't think 450W will happen, so around 4090 level at 375W feels more likely.
20 - 30% faster than a 4090 at 450W....PCB leaks have shown the 7900XT with 3x8pin which translates to approx 450W.
The board at least has a 450W rating, likely the power slider will be unlocked up to 450W regardless of what the shipped state is.

That would put the 7900XT at like 180fps 13 game average at 4K.

Yeah I want whatever you are on.
 
I always thought this take was really cynical, but now I think there may be more than a kernel of truth to it. Like, I see so many mentions along the lines of "we need competition to bring prices down". Reading between the lines a bit, I must ask - do we need "competition" to get lower prices, or do we just need someone to release cards at better prices to get better prices?

Me, I'm very much more likely to buy Radeon, but I'm well invested because I use Linux exclusively at home. Ever since the drivers went open source, the experience has been fantastic. Some stuff comes late, like the 6000 series not working right for a few months (never experienced it firsthand; I buy mid-low range and it was worked out by then), but once things are sorted I practically forget that drivers even exist. O_O I have not installed a GPU driver even once in the last 4 years. The reason I use Linux in the first place is the low maintenance, so I'm all about it. Nvidia actually works well on Linux, tho. So I'm not completely locked in, I just had a great experience with the drivers rolled in and want to stick with that.
It's not cynical, it's very much a reality.

The PCMR has been using AMD, and ATi before them, as a lever to keep Nvidia prices in check.
Radeon GPUs have been better value for money for literally decades, and at every turn they've been spurned in favour of Nvidia. The driver problems of early-to-mid-2000s ATi have been memed so badly it's turned into a self-fulfilling prophecy.
When RV770 and Cypress launched around the time of Tesla and Fermi, ATi had their highest market share, which was barely 50%. But they had parity in market share at a significantly lower ASP, so they never really made any profit. No profit means software support takes longer.

It's not any of the consumer's business to care about that, of course, but a lot of the problems were overstated by the press and in forums like the original GAF and Beyond3D. There are even reports of actual paid shills arranged by Nvidia in partnership with AEG to push the agenda. All of this hamstrung competition by starving Radeon of the volume needed to compete.

Right now people are suddenly acting shocked and disturbed at Nvidia's monopolistic tendencies as if they hadn't been doing this since their inception as a company. And now they're longing for AMD to come to the rescue yet again.
"Please compete so we can have sane prices in the GPU market".

Well I'm sorry, but corporations exist to make money. AMD are not your friend. Why should they undercut Nvidia by any significant margin? Why should they leave money on the table? It has never worked for them before. 9700 Pro was an excellent card, didn't win ATi any meaningful market share. HD4870 was half the price of the GTX 280. Hardly anyone cared. HD5870 absolutely mopped the floor with Fermi. Nobody cared. The R9 290X offered GTX Titan performance for half the price - nobody bought it. The Fury X offered 980 Ti performance at a steep discount. Nobody bought one. The RX480 and RX580 were, are and remain to this day a better product than the GTX 1060 (did we all forget that the 1060 3GB was also a cut down version of the 6GB in more than just VRAM lol?) - the 1060 still absolutely murdered the RX480 in terms of marketshare and mindshare.

At all turns people applauded Radeon for "Being good value", but then turned around and bought Nvidia anyway.

The market is in this state because of a lack of competition. That has been fuelled, by and large, by PCMR elitism and hilariously bizarre loyalty to a company that has openly shown contempt for its fans for decades.

Apologies for the sanctimonious rant, but I have a sinking feeling AMD won't play ball any more. We had a chance to support them to keep Nvidia in check many times before, but that time has passed. AMD wants their profits and ASPs up. They have no incentive to engage in the rat race to the bottom, because it never achieved anything for them. So be prepared to just get shafted. PC gaming will become more and more niche as shit just keeps getting jacked up in price.
 
It's not cynical, it's very much a reality.

The PCMR has been using AMD, and ATi before them, as a lever to keep Nvidia prices in check.
Radeon GPUs have been better value for money for literally decades, and at every turn they've been spurned in favour of Nvidia. The driver problems of early-to-mid-2000s ATi have been memed so badly it's turned into a self-fulfilling prophecy.
When RV770 and Cypress launched around the time of Tesla and Fermi, ATi had their highest market share, which was barely 50%. But they had parity in market share at a significantly lower ASP, so they never really made any profit. No profit means software support takes longer.

It's not any of the consumer's business to care about that, of course, but a lot of the problems were overstated by the press and in forums like the original GAF and Beyond3D. There are even reports of actual paid shills arranged by Nvidia in partnership with AEG to push the agenda. All of this hamstrung competition by starving Radeon of the volume needed to compete.

Right now people are suddenly acting shocked and disturbed at Nvidia's monopolistic tendencies as if they hadn't been doing this since their inception as a company. And now they're longing for AMD to come to the rescue yet again.
"Please compete so we can have sane prices in the GPU market".

Well I'm sorry, but corporations exist to make money. AMD are not your friend. Why should they undercut Nvidia by any significant margin? Why should they leave money on the table? It has never worked for them before. 9700 Pro was an excellent card, didn't win ATi any meaningful market share. HD4870 was half the price of the GTX 280. Hardly anyone cared. HD5870 absolutely mopped the floor with Fermi. Nobody cared. The R9 290X offered GTX Titan performance for half the price - nobody bought it. The Fury X offered 980 Ti performance at a steep discount. Nobody bought one. The RX480 and RX580 were, are and remain to this day a better product than the GTX 1060 (did we all forget that the 1060 3GB was also a cut down version of the 6GB in more than just VRAM lol?) - the 1060 still absolutely murdered the RX480 in terms of marketshare and mindshare.

At all turns people applauded Radeon for "Being good value", but then turned around and bought Nvidia anyway.

The market is in this state because of a lack of competition. That has been fuelled, by and large, by PCMR elitism and hilariously bizarre loyalty to a company that has openly shown contempt for its fans for decades.

Apologies for the sanctimonious rant, but I have a sinking feeling AMD won't play ball any more. We had a chance to support them to keep Nvidia in check many times before, but that time has passed. AMD wants their profits and ASPs up. They have no incentive to engage in the rat race to the bottom, because it never achieved anything for them. So be prepared to just get shafted. PC gaming will become more and more niche as shit just keeps getting jacked up in price.

Now I want to make a poll here about whether people would buy a 6800XT or choose a 3070ti for just a liiiiitle more.
O_O Kinda dread to see the results.
 
Now I want to make a poll here about whether people would buy a 6800XT or choose a 3070ti for just a liiiiitle more.
O_O Kinda dread to see the results.

Why bother making a poll? Look at the Steam hardware survey.

6800 XT 0.16%
3070 Ti 1.07%

Whatever neogaf thinks, this is the reality. Ampere dominated sales.
 
Why bother making a poll? Look at the Steam hardware survey.

6800 XT 0.16%
3070 Ti 1.07%

Whatever neogaf thinks, this is the reality. Ampere dominated sales.

goddam you are right.
.
.
.

The wait is over everyone! You can get 3080 performance for a 3070ti price!

They did it! AMD saved us!

 
Nvidia dominates the mid range as well; it's not even a competition. AMD is like a theoretical alternative to Nvidia, but Nvidia keeps eating away at the market share. They have the high end AND mid range sales dominance for discrete GPUs.

At this point, except for a minority of loud AMD fans, people are cheering for AMD to be competitive so that Nvidia drops prices and they can buy Nvidia cards. Almost nothing AMD can do will get them out of this market share hole, sadly for them. They know it; they didn't produce a whole lot of RDNA 2 cards, as even the 3090 outsold their entire range of cards. It was the very definition of a paper launch, even though AMD marketing hyped the crowd into believing it would not be the case and that they would surpass Nvidia in availability. It was a bunch of bullshit. Same for their MSRP: you had to be incredibly lucky to score one on amd.com, and their AIBs had disgusting prices compared to Ampere.

Discrete GPUs are no longer (and never were) AMD's bread and butter. To compete with Nvidia in production output, they would have to eat away at their silicon availability for better-margin products like APUs for consoles and CPUs.

So at this point, why would AMD eat away at their already small margins if they know it'll only trigger a price war with a company that has way deeper pockets? They're not running a charity here.

Edit - you're also maybe a tad too hopeful that they not only match or exceed Nvidia in rasterization, but also match them in ray tracing and ML? You do know that an MCM solution will already require a lot of work to split the workload and remain invisible to the API while also trying to somehow keep latencies low? All these ray tracing / ML solutions require super fast small transfers of data, which an MCM crossbar will slow down as the data needs to hop from node to node. The more chiplets, the slower it becomes. CPU tasks are typically unaffected by chiplets, while GPUs take big hits. Even Apple's insane 2.5TB/s link (vs AMD MI200's 800 GB/s) didn't help there: the CPU performance was effectively doubled while the GPU gained only ~50%. That's with 2 chiplets. More than that and yikes..

Nvidia released the 1080 Ti and 3080 at $699 due to worrying about what AMD had coming. We've now gone from a $700 3080 to a $1200 4080. And Nvidia came to their senses about releasing a $900 4070, or maybe 4070 Ti. I believe Nvidia had to release at higher prices to not have their partners get crushed on their 3000 series inventory. Rumor was that before the 4000 series launch reveal, there was over a billion dollars in inventory of 3000 series cards.

AMD doesn't seem to have as much stock to sell through and could sell cards at reasonable prices. They don't even need to match or exceed the Nvidia models: 90% of the performance for 20-30% less. Maybe even a much lower power draw? DLSS and raytracing were winners for Nvidia and made spending extra worth it in many people's eyes. We don't know what AMD has in store, but we're heading into a major market downturn and people will be tightening their belts. People are going nuts over the 4090, but some people are always crazy and blow money just to have the best. Chip manufacturing will have more free capacity as long as China doesn't try moving in on Taiwan. People as a whole will be less likely to drop as much money on a new GPU as they were during the two-year drought. The used market will also affect pricing; you can get a used RTX 3080 for under $500 now, or a used RTX 3070 for just over $300.
 
Nvidia released the 1080 Ti and 3080 at $699 due to worrying about what AMD had coming. We've now gone from a $700 3080 to a $1200 4080. And Nvidia came to their senses about releasing a $900 4070, or maybe 4070 Ti. I believe Nvidia had to release at higher prices to not have their partners get crushed on their 3000 series inventory. Rumor was that before the 4000 series launch reveal, there was over a billion dollars in inventory of 3000 series cards.

AMD doesn't seem to have as much stock to sell through and could sell cards at reasonable prices. They don't even need to match or exceed the Nvidia models: 90% of the performance for 20-30% less. Maybe even a much lower power draw? DLSS and raytracing were winners for Nvidia and made spending extra worth it in many people's eyes. We don't know what AMD has in store, but we're heading into a major market downturn and people will be tightening their belts. People are going nuts over the 4090, but some people are always crazy and blow money just to have the best. Chip manufacturing will have more free capacity as long as China doesn't try moving in on Taiwan. People as a whole will be less likely to drop as much money on a new GPU as they were during the two-year drought. The used market will also affect pricing; you can get a used RTX 3080 for under $500 now, or a used RTX 3070 for just over $300.
AMD don't have as much Radeon stock for a few reasons:
1) Nobody buys them - their marketshare has been hard capped at like 20% for the better part of a decade.
2) They have a limited number of wafers to build out their entire tech portfolio including Radeon, Instinct, CPUs, Xilinx, APUs and consoles. Out of those things, CPUs are the most profitable. Simple as.

As I've said before. They have no incentive to play the game to try and win marketshare because the market has proven time and time again they don't care. Heck we've got tech journalists unironically trying to shill Intel ARC, when AMD and Radeon have been present and competing on a shoestring budget forever.

If they do decide to try, because of Nvidia's unique vulnerability this generation, I will be pleasantly surprised. But this is the last chance. If nobody buys in, the market is fucked forever. Intel might be relevant, but I doubt they'll do much better than Radeon.
 
This rumor discussion is not turning out as fun as I thought. >:|

Glimmer of hope or dystopian future: graphics-focused APUs inherit the sub-$400 GPU market.
 
How many people here who are hoping for AMD to be competitive in performance and aggressive in price are actually planning on buying a Radeon GPU?

Or are you just hoping AMD bring the heat so that you can buy a GeForce GPU at a discount?

If it's the latter, I've got a 4080 12GB shaped bridge to sell you.
I'm just hoping that they can continue to be competitive with Nvidia and bring solid prices. If the new cards are just absolutely insane, then I'll purchase the 7000 series. I am happy with my 6800 XT purchase thus far and that was my first AMD GPU purchase. I wish everyone would hope AMD and Intel do well in the GPU space. That will mean more options, better tech and prices for us... the consumers. Anything else is clown behavior, imo.
 
Why would you think that?
Ada and RDNA3 are on the same node.
AMD doesn't have the node advantage anymore.
Both are gonna be boosting to 3000MHz, so what could AMD possibly do to actually take the lead from Nvidia?

Clock speeds are not merely a result of the node, but also of pipeline design. That's why you see very different clock speed levels in GPUs and CPUs.

AMD increased effective (sustained) clocks from RDNA 1 to RDNA 2 from 1.9 to 2.6 GHz within the same node (TSMC 7nm) through pipeline rebalancing alone.
 
20 - 30% faster than a 4090 at 450W....PCB leaks have shown the 7900XT with 3x8pin which translates to approx 450W.
The board at least has a 450W rating, likely the power slider will be unlocked up to 450W regardless of what the shipped state is.

That would put the 7900XT at like 180fps 13 game average at 4K.

Yeah I want whatever you are on.

It is fucking maths.

The 6900XT has a 300W TBP. A card designed to hit 450W with a 50% perf/watt gain has 1.5x the power and 1.5x the performance per watt, equating to a 2.25x overall improvement.

The 6900XT gets 77fps so 2.25x that is 173.25 fps.

If the perf/watt improvement is the same as 5700XT to 6900XT, which was 1.64x, then the maths is 1.5x power * 1.64x performance per watt for 2.46x performance, which is 77 * 2.46 = 189.42 fps. This is an extreme upper bound though.

Now there are obvious caveats. 1) Maybe TBP won't be 450W, so the 1.5x power multiplier is wrong. 2) Maybe the design goal was 375W but the clocks are getting pushed up the v/f curve, so at 450W the perf/watt advantage falls off a cliff. 3) Maybe the baseline is the reference 6950XT instead at 335W, again changing the power multiplier.

Now the way AMD have advertised their perf/watt gains in the past was prior flagship vs new SKU. For RDNA it was Vega64 vs 5700XT and for RDNA2 it was 5700XT vs 6800XT at 1.54x perf/watt and 5700XT vs 6900XT at 1.64x perf/watt.

As such I expect the >1.5x perf/watt claim is vs the 6900XT or 6950XT and using one of the top N31 SKUs.

So TLDR: the maths is just maths, and AMD's perf/watt claims have historically been a bit conservative, meaning AMD will need a 375-400W TBP to match the 4090, and higher TBPs (provided that was the initial design goal and not a late-in-the-day juicing of the clock speeds) will be faster than a 4090.

Edit: just to be clear I am talking pure raster performance here, no clue on how RT will shake out.
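For anyone who wants to poke at the numbers, here's the same back-of-the-envelope as a runnable sketch (the 300W / 77fps 6900XT baseline and the 1.5x / 1.64x perf-per-watt figures are from the post above; the TBPs are the hypotheticals being argued over):

```python
# Back-of-the-envelope raster projection from a claimed perf/watt uplift.
# Baseline: 6900XT at 300 W TBP and 77 fps in the quoted 13-game 4K average.
# TBP values and perf/watt uplifts are rumoured/assumed, not confirmed.
BASE_TBP_W = 300.0
BASE_FPS_4K = 77.0

def projected_fps(new_tbp_w: float, perf_per_watt_gain: float) -> float:
    """Performance scales with the power increase times the perf/watt uplift."""
    return BASE_FPS_4K * (new_tbp_w / BASE_TBP_W) * perf_per_watt_gain

print(projected_fps(450, 1.50))  # ~173 fps: 450 W with the ">50%" claim taken literally
print(projected_fps(450, 1.64))  # ~189 fps: the 5700XT -> 6900XT style upper bound
print(projected_fps(375, 1.50))  # ~144 fps: 375 W, roughly where the post pegs the 4090
```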
 
How many people here who are hoping for AMD to be competitive in performance and aggressive in price are actually planning on buying a Radeon GPU?

Or are you just hoping AMD bring the heat so that you can buy a GeForce GPU at a discount?

If it's the latter, I've got a 4080 12GB shaped bridge to sell you.
Do people really think that if AMD is aggressive with prices, Nvidia is gonna lower their prices the day after?

How would they justify that with PR talk??

And has something similar even happened before? AMD having aggressive prices and Nvidia eating their pride and lowering their prices a day/week after?
 
but people like MLiD dream of AMD performing some magic that will destroy all their competition (blah blah infinity cache). And there are people who fell for that fan marketing.
Based on what I've watched (I might have missed something), MLID has consistently said that RDNA3 will be good but that it will not beat Lovelace.
If anything he's said that RDNA3 might match Lovelace in raster performance but will be worse in RT.
Plus, in the most recent long-form video with Gordon from PCWorld, they were discussing at length how Radeon will struggle to steal much market share even if RDNA3 came out with better overall performance than NVIDIA.

RedGamingTech is way worse in terms of hyping the shit out of AMD GPUs. He's one of those people who will put out a video every single day even when there's nothing to talk about.

On Intel, I think it's obvious that Arc is either cancelled or significantly toned down in scale (as business at least for the DIY GPU segment) given the absolute shitshow of a marketing campaign.
 
Do people really think that if AMD is aggressive with prices, Nvidia is gonna lower their prices the day after?

How would they justify that with PR talk??

And has something similar even happened before? AMD having aggressive prices and Nvidia eating their pride and lowering their prices a day/week after?

The 4870 was about 85-90% of the GTX 280 in performance but was half the price. NV dropped the 280 by around $200 two weeks after launch.

AMD still didn't really gain much and would have been better off making a bigger die with 50% more execution units and clearly taking the performance crown for the same price as the GTX280.
 
AMD don't have as much Radeon stock for a few reasons:
1) Nobody buys them - their marketshare has been hard capped at like 20% for the better part of a decade.
2) They have a limited number of wafers to build out their entire tech portfolio including Radeon, Instinct, CPUs, Xilinx, APUs and consoles. Out of those things, CPUs are the most profitable. Simple as.

As I've said before. They have no incentive to play the game to try and win marketshare because the market has proven time and time again they don't care. Heck we've got tech journalists unironically trying to shill Intel ARC, when AMD and Radeon have been present and competing on a shoestring budget forever.

If they do decide to try, because of Nvidia's unique vulnerability this generation, I will be pleasantly surprised. But this is the last chance. If nobody buys in, the market is fucked forever. Intel might be relevant, but I doubt they'll do much better than Radeon.

I suppose one could make the argument that AMD (through Radeon) is competing in the GPU space for three reasons:
1) They need to keep up to date so that they have a compelling offering for Sony and Microsoft when they decide to build out their next-gen consoles (assuming it won't all be cloud-based).
2) They have an interest in holding Nvidia's proverbial feet to the proverbial fire to sort of keep them occupied so that they don't encroach further on any of AMD's other businesses (i.e., look at Nvidia's big push into datacenters over the last decade or so)
3) If the business is profitable then there's no point closing it just because they don't have a large market share, so they're probably happy to keep chugging along.
 