AMD Next-Generation RDNA 5 GPU Lineup Leaked: Up to 184 CUs / 128GB GDDR7, Top Gaming Model at 154 CUs / 36GB GDDR7 with Performance at 2.64x the RTX 4080

2.64x the performance of the 4080 is most certainly not happening. That's close to twice the performance lead the 5090 has over the 4080.
 
And the source is MLID, ffs.


 
Gonna chalk this one up as not likely. Reminds me of the initial RDNA rumors, when everyone was talking about it being way better than Nvidia, and that never happened. Granted, the RDNA stuff was pretty good, just not in the realm of imagination land.
 
It could be for professional/workstation cards. 128GB or even 64GB are pointless for gaming systems.
The VRAM capacity isn't the issue. 2x the performance of the 5090 is not believable, workstation/pro card or not, unless the performance is strictly in reference to AI.
 
This would mean that 5090 is 1.32x the performance of 4080. Are you possibly confusing 4080 with 4090?
No, but I misspoke. I meant that it would be almost twice the performance the 5090 has over the 4080. 5090 is 1.7x the performance of the 4080. 1.7x the performance of the 5090 gives us 2.89x the performance of the 4080, not that far from 2.64x.
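The ratio math above, as a quick sanity check (the 1.7x figure is the poster's own estimate, not an official benchmark):

```python
# Relative-performance ratios from the discussion above.
# All figures are the posters' estimates, not measured benchmarks.
r_5090_vs_4080 = 1.7   # assumed 5090 performance relative to the 4080
r_leak_vs_4080 = 2.64  # leaked claim for the new card relative to the 4080

# "Almost twice the lead": applying the 5090's lead twice over
print(round(r_5090_vs_4080 ** 2, 2))              # 2.89

# What the leaked claim would mean relative to a 5090
print(round(r_leak_vs_4080 / r_5090_vs_4080, 2))  # 1.55
```

So the leaked 2.64x would put the card at roughly 1.55x a 5090 in whatever metric the leak is using.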
 
36 gig VRAM? I'm in

Honestly, for what though? At this point bandwidth is more important than the actual RAM amount. Texture quality is already really good with 16GB, so what is 36GB necessary for? It's not like there is going to be some huge push past 4K resolution anytime soon.

I could see moving to 24GB so you have more headroom for advanced ray tracing, but 36GB seems so unnecessary and will just make the GPU more expensive than it needs to be.
 
More RAM would be great for local AI models. A 48GB or 64GB card with, say, 5090-equivalent performance or a bit higher would be pretty great.
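As a rough illustration of why capacity matters for local models (back-of-envelope numbers, not benchmarks):

```python
# Back-of-envelope VRAM needed just for model weights at a given precision.
# Real usage is higher: KV cache, activations, and runtime overhead all add up.
def weights_gb(params_billion: float, bits_per_weight: int) -> float:
    return params_billion * bits_per_weight / 8  # GB of weights

print(weights_gb(70, 4))  # 35.0 -> a 70B model at 4-bit fits a 48GB card, not 32GB
print(weights_gb(30, 8))  # 30.0 -> a 30B model at 8-bit just squeezes into 32GB
```

Hence the appeal of a 48GB or 64GB consumer card for local inference.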
 
Not sure how they're calculating performance. If they're talking FP32 TFLOPS, the 4080 is 49 TFLOPS; 2.64x that would be 129 TFLOPS. The 5090 is 104 for comparison. So that would be ~1.25x the 5090.

Not saying I believe it, just trying to put it in perspective because "2.64x the performance of 4080" is a confusing way to put it.
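Redoing that TFLOPS arithmetic with the round numbers from the post (49 and 104 are approximate spec-sheet figures, not measured performance):

```python
# FP32 TFLOPS comparison using the post's round numbers.
tflops_4080 = 49
tflops_5090 = 104

claimed = 2.64 * tflops_4080            # the leak's 2.64x claim in TFLOPS
print(round(claimed))                   # 129
print(round(claimed / tflops_5090, 2))  # 1.24
```

Which is why, read as raw FP32 throughput, the claim lands at roughly a quarter more than a 5090 rather than double.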
 
Considering what's hot right now, the whole "2.64x the performance of the 4080" claim is probably about AI performance.
 
The Radeon cycle:

- wildly optimistic rumors about how much ass the next Radeon is going to kick <- WE ARE HERE

- new Radeon launches and falls short of the hype

- "yeah but just wait til we get better drivers/games are optimized for AMD because of consoles/etc"

- new GeForce launches and increases the gap

- "it's not fair to compare AMD's previous gen with Nvidia's new gen, just wait til the next Radeon launches"

- repeat



I swear this exact pattern happened with the last 4 Radeon generations at least.
 
hd 7970 ?
 
The 9070XT was already beating the 4080 in a few games (RDR2, Flight Sim, Forza 5), and that was before the latest driver squeezed out another 10% of performance.
I wouldn't be surprised if these benchmarks were at least 90% accurate; the bigger question will be whether AMD can actually stock them at "MSRP".
 
So was the 7900 XTX, but that was hushed up by Reddit/YouTube; it was 2~20% in favour of the XTX at 1440p/4K. Love how this news/leak has angered Nvidia shills into attacking Moore's Law Is Dead, despite favouring him when it was Nvidia leaks/news. Because they can't handle that a 10900 XTX = 5090 Super priced at $900.
 
hd 7970 ?
Man that was 13 years ago.

But SO MANY times we have heard rumors about how Radeon++ is going to kick soooooo much ass, only for it to

a. Have really impressive sounding specs on paper but deliver underwhelming gaming performance (Vega/Radeon VII)

b. Get downgraded before launch (RDNA3)

or c. Get canceled altogether like high-end RX 9000


I take all next-gen Radeon rumors with an especially huge grain of salt.
 
Those latest drivers squeezing an extra 10% were not reproduced by other channels outside of HU as far as I'm aware.
 
Funny how the "you're a stupid retard if you bought Nvidia over AMD" people never mention raytracing or DLSS. Let me guess, you don't care about those and neither should anybody else.

And you heard it here first folks: 10900XTX = "5090 Super" for $900. Suck it Nvidia shills!
 
Those latest drivers squeezing an extra 10% were not reproduced by other channels outside of HU as far as I'm aware.

Actually, PCGamesHardware was the first to report it.


And it was not just drivers, but also game updates and maybe even Windows updates. PCGH and HU retested with everything updated and compared against the launch-day results.
That is why some channels didn't see the same gains: they tested only the new drivers, with the games kept on the same current version.
 
Check it out: I can get high settings, native 1080p, and at least 60fps in 95% of new releases on a used FB Marketplace $200 6750 XT. So, for an additional $1,000, I can play Chinese UE5 games and Silent Hill 2 without settling for FSR. IDK if this is the most value-packed upgrade, bros. I remember thinking PhysX was costly, but in comparison the worst of the PhysX era is like the very best of the RT era when it comes to efficiency. And honestly the difference with PhysX was way more noticeable. I didn't even need to go to the forums to see which nooks and crannies got more realistic bounce lighting.
 
They were let down by their upscaling tech.

Last gen, any of their GPUs (7900 GRE, 7800 XT, 7900 XTX) paired with FSR4 would have made for a great combination.
 
Upscaling and ray tracing. And around the time it released, Cyberpunk 2077 with path tracing was by far the most demanding game out there (and pretty much needed frame gen as well to be playable). Whereas for non-ray-traced games there weren't any killer apps with much benefit to be gained over the RTX 3000/RX 6000 series.

That's why the "RX 7000 is just as good** as RTX 4000 and you're stupid for buying Nvidia (**except for upscaling and ray tracing)" argument always seemed ridiculous to me. You're basically saying that RX 7000 is just as good in the scenarios where it hardly matters, but falls far short in the scenarios that would actually drive you to upgrade.
 
Got excited till you guys reminded me that the source is that clown MLID; the guy had to delete so many of his prediction/leak YT vids because they were wildly inaccurate and full of BS, so basically it's a nothing burger.
32 gigs of VRAM would be nice tho xD
 
Anyway I'm taking the rumors with a huge grain of salt as always. But I'll be the first to buy one + recommend it to others if this actually happens and releases at a reasonable price.
 
Trees in Wuchang were tanking my RTX 4070 again. Hope these new cards can speed up the trees.
That's UE5 being a crap engine that's difficult to optimize for, plus possibly inexperienced devs who just don't understand how to optimize properly. The devs said they were satisfied with the performance when using the recommended specs (RTX 2070).
 