[MLID] Zen 6 Magnus Leak: AMD's MASSIVE APU for next gen console (+ Medusa Point Specs)

Can you remember when Kepler leaked the PS5 Pro and RDNA4 Ray Tracing?


And now he's giving us insights into RDNA5, which is what the PS6/next Xbox will use.

And in the end, did we get even one title that shows a big improvement from the RT capabilities of the PS5 Pro? I can't find one.
 
How do you know it's only 50%?

"3x RT performance" does not mean "game with RT has 3x higher fps".

Just because people take numbers out of context doesn't mean that Kepler is unreliable.
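To put rough numbers on that (every figure below is made up purely for illustration): if RT work is only a slice of the frame budget, tripling RT speed buys far less than 3x fps.

```python
# Illustrative frame-time math; all numbers are assumptions for the example.
frame_ms = 16.7              # ~60 fps frame budget on the base console
rt_ms = 5.5                  # assumed share of the frame spent on RT work
other_ms = frame_ms - rt_ms  # rasterization, shading, post, etc.

rt_speedup = 3.0             # the advertised "3x RT performance"
new_frame_ms = other_ms + rt_ms / rt_speedup
print(frame_ms / new_frame_ms)  # ~1.28x overall fps, nowhere near 3x
```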

[attached slides: ac1.jpg, ac2.jpg]
 
I still don't understand how they managed to design such an unbalanced console that they somehow crippled the main strength of the new architecture down to a marginal 50-60% improvement, when we know RDNA4 is AMD's most significant leap in real-time RT. I mean, you have to actually put in effort to gimp things this badly. I would legit like to know from a tech expert like KeplerL2 what the main bottleneck is.

Even by Sony's own extremely conservative marketing material, RT was the only wow point, and 3-4x RT performance it is not. The only way I can rationalize it is that they neutered their premium console so they could keep RDNA2 as a base and take the easy approach to avoid any compatibility headaches. Even the PS4 Pro, which was bandwidth-starved, retained all of the benefits of the architectural leaps and even punched above its weight thanks to its loaded ROPs.

I remember being disappointed when Heisenberg told us it was around a 4070 [yes, I know he was later told that was in CoD], and disappointed looking back at the Xbox One X (now that was a real Pro console), but damn, Cerny, wtf happened. I mean, I can hold on to unsubstantiated hope that this particular workload performed below expectations on their engine, but let's be real..
 
I do remember how people were expecting 3x RT performance from the PS5 Pro, and in the end it's barely +50%.
Everything being leaked should be taken with a grain of salt until we see the actual performance and features.
That's because it's still Zen2 and RDNA2.
 
That's because it's still Zen2 and RDNA2.
Not exactly. We were told it was a Frankenstein abomination with the RDNA4 RT pipeline on an RDNA2 base, plus a few RDNA3 features, but in real-world scenarios it seems to perform basically like RDNA2+ or RDNA3 at best.
 
I mean, you just said it's primarily RDNA2, so the tech isn't really there, no? By that logic, and by real-world benchmarks, the RDNA4 features are insignificant in practice.
What I'm saying is that developers are not fully utilizing the improved RT.

And the reason it doesn't perform better in other games is that it doesn't benefit from IPC gains, because it's still RDNA2.

Is Nvidia your preferred GPU?
 
What I'm saying is that developers are not fully utilizing the improved RT.

And the reason it doesn't perform better in other games is that it doesn't benefit from IPC gains, because it's still RDNA2.

Is Nvidia your preferred GPU?
I mean, could it be optimized better? Sure. But we have RT benchmarks shared by both Ubisoft and id comparing the PS5 Pro, base consoles and PC GPUs, and Assassin's Creed especially is probably the best PS5 Pro showcase so far, and we see the gains are nowhere near what was advertised, nor near RDNA4. Better optimization isn't going to make performance jump from a ~60% improvement to RDNA4 levels in RT.

Depends; I don't really have a preferred GPU maker. I appreciated Nvidia's offerings when they delivered legendary pro-consumer GPUs like the brilliant 8800 GT, and I appreciated AMD when they delivered value monsters like the HD 5870. That said, I am more partial to AMD because they are the underdogs, and I want Nvidia to be toppled so we can have proper competition. Still, it's a fact that Nvidia is currently the de facto pioneer in GPU tech, delivers the best performance and tech (DLSS4), and still holds a large edge in path tracing. Unfortunately, Nvidia has become arrogant and brutally anti-consumer... and sadly, AMD is happy to price-gouge as well. They are content with Nvidia's table scraps instead of targeting market share. So, outside of just wanting to help AMD stay up, I find it hard to justify buying AMD GPUs, even RDNA4, because you can't just ignore the best visual tech [PT] and say it's almost equal outside of that. That said, I really hope UDNA kicks ass and catches Nvidia lacking.
 
Bojji, I don't think this is saying what you think it is saying. I've seen it circulated quite a bit without context, and we don't fully know the variables that are not stated on the slide. Are these numbers like-for-like, or for the target graphics settings of each individual system? The fact that the Series S numbers are presented at 900p suggests these are not like-for-like comparisons.

You don't go from no ray tracing to ray tracing with just 50% performance bumps.

See the quote from Ubisoft's own technical architect:

However, with the announcement of the PS5 Pro last year, we knew we had all the ingredients to build the ultimate console version of AC Shadows. By itself, the PS5 Pro's faster GPU helps performance without much effort from developers, but the game changing technology is the improved ray tracing capabilities of the PS5 Pro that BVH8 (BVH stands for 'bounding volume hierarchy', more below) offered over BVH4. With our implementation of BVH8 support on PS5 Pro, we were able to speed up GPU tasks involving hardware ray tracing usage by about 300%. Those 2 critical upgrades, the faster GPU and support for BVH8 made it possible for us to realistically target a stable 60 FPS while showcasing our new generation ray traced lighting engine.
https://blog.playstation.com/2025/0...dows-ubisoft-deep-dives-into-ps5-pro-updates/

Why would they claim 300% here and show 50% in their slides, unless there are other factors at play? For example, the numbers may be for RT diffuse while RT reflections are also being processed in parallel for the Pro, while only measuring RT diffuse for the base PS5. I don't know, I'm just trying to reconcile both claims.
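As a side note on the BVH4 vs. BVH8 point in that quote: a wider branching factor makes the tree shallower, so each ray takes fewer dependent traversal steps. A minimal sketch of that intuition, assuming a made-up scene size (this is not Ubisoft's actual implementation):

```python
# A BVH with branching factor b over n primitives is roughly log_b(n) deep,
# so wider nodes mean fewer sequential node visits per ray.
import math

n_primitives = 10_000_000  # assumed scene size, for illustration only
for branching in (4, 8):
    depth = math.ceil(math.log(n_primitives, branching))
    print(f"BVH{branching}: ~{depth} levels of traversal per ray")
# BVH4: ~12 levels vs. BVH8: ~8 levels -> fewer dependent memory fetches,
# which is a big part of why the wide-BVH hardware path is faster.
```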
 
And in the end, did we get even one title that shows a big improvement from the RT capabilities of the PS5 Pro? I can't find one.

AC: Shadows uses RTGI at 60 fps on Pro; that's 2x the frame rate of the base console.

Fortnite and Silent Hill 2 Remake (despite a very poor PSSR implementation) use hardware Lumen for lighting, AO and reflections instead of the cheaper software solution used on PS5.


F1 24 on Pro uses RT for reflections, shadows and ambient occlusion during actual gameplay, whereas RT is used only in replays on PS5.

Spider-Man 2 includes RT distant shadows not present on PS5. It also uses full-resolution reflections on Pro (they're half-resolution on the base console).


Alan Wake 2 on Pro uses RT reflections in Quality Mode which are not present on any other console.

These are a few examples of big improvements in RT capabilities, I would say.
 
Bojji, I don't think this is saying what you think it is saying. I've seen it circulated quite a bit without context, and we don't fully know the variables that are not stated on the slide. Are these numbers like-for-like, or for the target graphics settings of each individual system? The fact that the Series S numbers are presented at 900p suggests these are not like-for-like comparisons.

You don't go from no ray tracing to ray tracing with just 50% performance bumps.

See the quote from Ubisoft's own technical architect:


https://blog.playstation.com/2025/0...dows-ubisoft-deep-dives-into-ps5-pro-updates/

Why would they claim 300% here and show 50% in their slides, unless there are other factors at play? For example, the numbers may be for RT diffuse while RT reflections are also being processed in parallel for the Pro, while only measuring RT diffuse for the base PS5. I don't know, I'm just trying to reconcile both claims.

PS5 has RT lighting in its 40 fps mode. It takes that 50% to go from there to 60 fps on Pro. And that 60 fps is not exactly stable.

I think the final results match the graphs more than the PR talk from devs.
 
AC: Shadows uses RTGI at 60 fps on Pro; that's 2x the frame rate of the base console.

Fortnite and Silent Hill 2 Remake (despite a very poor PSSR implementation) use hardware Lumen for lighting, AO and reflections instead of the cheaper software solution used on PS5.


F1 24 on Pro uses RT for reflections, shadows and ambient occlusion during actual gameplay, whereas RT is used only in replays on PS5.

Spider-Man 2 includes RT distant shadows not present on PS5. It also uses full-resolution reflections on Pro (they're half-resolution on the base console).


Alan Wake 2 on Pro uses RT reflections in Quality Mode which are not present on any other console.

These are a few examples of big improvements in RT capabilities, I would say.
F1 and AC I would call decent updates, but for a supposed 2-3x improvement in RT, still not what many expected. So I also wouldn't make the same mistake and expect a 160-bit, 160 W PS6 to introduce path tracing as standard, even if on paper it will have another 3x improvement over the Pro.
 
PS5 has RT lighting in its 40 fps mode. It takes that 50% to go from there to 60 fps on Pro. And that 60 fps is not exactly stable.

I think the final results match the graphs more than the PR talk from devs.
Bojji, you're a professional Pro hater. A 3x increase in performance in one part of the pipeline doesn't translate to 3x performance overall, and you know it. For that, the whole system would need 3x the specs at a bare minimum. We're talking here, what, 4080 performance? You all sure want that €2,000 Pro. But fear not, Microsoft is on its way…
 
Bojji, you're a professional Pro hater. A 3x increase in performance in one part of the pipeline doesn't translate to 3x performance overall, and you know it. For that, the whole system would need 3x the specs at a bare minimum. We're talking here, what, 4080 performance? You all sure want that €2,000 Pro. But fear not, Microsoft is on its way…

I know that a 2-3x speed improvement in just RT won't translate to a 2-3x speed improvement overall. RT is just part of what the GPU renders in any given frame.

I told people this months before the console launched and I was called a "hater". Looks like nothing changed, lol.
 
F1 and AC I would call decent updates, but for a supposed 2-3x improvement in RT, still not what many expected. So I also wouldn't make the same mistake and expect a 160-bit, 160 W PS6 to introduce path tracing as standard, even if on paper it will have another 3x improvement over the Pro.


I think Path Tracing on next gen consoles might be a little bit too ambitious, I guess it will depend on how much AMD can improve RT and AI capabilities for RDNA5 but it's hard to ignore the fact that they're lagging behind Nvidia by years in those key areas, so we'll see. In any case I do expect next gen consoles to be able to display several RT features simultaneously like the PC does and unlike current gen consoles, which usually focus on just one RT effect (mainly RT reflections, RT shadows and RTGI) and even that comes with a lot of compromises.
 
Bojji, I don't think this is saying what you think it is saying. I've seen it circulated quite a bit without context, and we don't fully know the variables that are not stated on the slide. Are these numbers like-for-like, or for the target graphics settings of each individual system? The fact that the Series S numbers are presented at 900p suggests these are not like-for-like comparisons.

You don't go from no ray tracing to ray tracing with just 50% performance bumps.

See the quote from Ubisoft's own technical architect:


https://blog.playstation.com/2025/0...dows-ubisoft-deep-dives-into-ps5-pro-updates/

Why would they claim 300% here and show 50% in their slides, unless there are other factors at play? For example, the numbers may be for RT diffuse while RT reflections are also being processed in parallel for the Pro, while only measuring RT diffuse for the base PS5. I don't know, I'm just trying to reconcile both claims.
Cerny said that "ray calculations" were sped up by 2X-3X, so they're likely using a synthetic measurement. AMD themselves advertise a 2X "ray traversal" improvement for RDNA 4, but that doesn't mean a 2X RT performance improvement if you are bound by other factors such as shading or memory bandwidth.

Nvidia significantly improve the throughput of the RT cores each generation for only relatively minor performance improvements in the real world. If those RT benchmarks are like-for-like then Ada is already only ~2X faster in their application than RDNA 2, at a given performance level. So there isn't even room to be more than 2X faster without exceeding Ada.
 
PS5 has RT lighting in 40fps mode. It requires that 50% to go from it to 60fps on Pro. And that 60fps is not exactly very stable.

I think final results match graphs more than PR talk from devs.
So we are now comparing a 40 fps mode tuned for PS5 with a 60 fps mode tuned for Pro and throwing in 50% because 40 + 20 = 60? You know you are more analytical than that. The 60 fps stability may have nothing to do with RT at all and may be CPU, memory or bandwidth related.

Can we at least agree the slide is representative of quality mode? If they profiled the game, the two different RT features would be running in parallel. It makes perfect sense that one console with just RTGI and the other with RTGI plus RT reflections does not show a 300% boost in frame times when only one component is being measured. The slide was meant to show frame times of their technique on different target platforms, not to benchmark the target platforms against each other. If that were their intention, they wouldn't run XSS at 900p. We are using that slide to prove a point the slide wasn't meant for.
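To illustrate that with made-up numbers (a sketch of the hypothesis above, not measured data): if the slide only times the RTGI pass while a second RT pass contends for the same hardware on Pro, the measured uplift shrinks.

```python
# Hypothetical figures only, to show how a parallel pass can mask the uplift.
ps5_rtgi_ms = 6.0                  # assumed RTGI pass time on base PS5
raw_rt_speedup = 3.0               # the claimed 3x RT throughput on Pro
pro_rtgi_alone_ms = ps5_rtgi_ms / raw_rt_speedup   # 2.0 ms in isolation

# If RT reflections run concurrently and contend for RT units / bandwidth,
# the measured wall time of the RTGI pass stretches out:
contention_factor = 2.0            # assumed slowdown from the overlapping pass
pro_rtgi_measured_ms = pro_rtgi_alone_ms * contention_factor  # 4.0 ms
print(ps5_rtgi_ms / pro_rtgi_measured_ms)  # slide would show only 1.5x
```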

I'm afraid confirmation bias is at play here, as you have already concluded what the RT capabilities actually are and are not open to addressing a potential issue with the evidence you are presenting. I'm skeptical of the 3x as well, but there is no reason for a third-party technical architect to cite any number at all unless they have some reason for it. Unless, of course, Sony bribed him as part of their "collaboration". That's just tinfoil talk, though. Given how every other team they collaborated with gives vague-ass responses when asked what they have done for the Pro, I don't see why 300% should be mentioned at all. There might be something here that we aren't privy to. Just advising caution before sharing it with your conclusions, as there is a chance (I don't know the likelihood, but it's not 0) that it could mislead people. At least quote it with a caveat that we may not be seeing the whole picture.
 
So we are now comparing a 40 fps mode tuned for PS5 with a 60 fps mode tuned for Pro and throwing in 50% because 40 + 20 = 60? You know you are more analytical than that. The 60 fps stability may have nothing to do with RT at all and may be CPU, memory or bandwidth related.

Can we at least agree the slide is representative of quality mode? If they profiled the game, the two different RT features would be running in parallel. It makes perfect sense that one console with just RTGI and the other with RTGI plus RT reflections does not show a 300% boost in frame times when only one component is being measured. The slide was meant to show frame times of their technique on different target platforms, not to benchmark the target platforms against each other. If that were their intention, they wouldn't run XSS at 900p. We are using that slide to prove a point the slide wasn't meant for.

I'm afraid confirmation bias is at play here, as you have already concluded what the RT capabilities actually are and are not open to addressing a potential issue with the evidence you are presenting. I'm skeptical of the 3x as well, but there is no reason for a third-party technical architect to cite any number at all unless they have some reason for it. Unless, of course, Sony bribed him as part of their "collaboration". That's just tinfoil talk, though. Given how every other team they collaborated with gives vague-ass responses when asked what they have done for the Pro, I don't see why 300% should be mentioned at all. There might be something here that we aren't privy to. Just advising caution before sharing it with your conclusions, as there is a chance (I don't know the likelihood, but it's not 0) that it could mislead people. At least quote it with a caveat that we may not be seeing the whole picture.

I have no idea what the real RT abilities of the Pro are; I just posted slides from the developers. One of the few that actually utilized the RT advantage on Pro.

Returning to the comparison of balanced mode vs. performance mode: yeah, that 50% really is that simple, outside of the fact that performance mode on Pro had hair strands disabled and balanced mode on the regular PS5 most likely runs at a higher resolution (Olivier didn't count pixels here).

I think F1 is the most impressive Pro game so far when it comes to RT. New Capcom games will be interesting; both RE9 and Pragmata were shown with RT reflections at Gamescom.

Edit: yesterday someone posted graphs from id Software about the RT performance of consoles, but I can't find this shit...
 
Cerny said that "ray calculations" were sped up by 2X-3X, so they're likely using a synthetic measurement. AMD themselves advertise a 2X "ray traversal" improvement for RDNA 4, but that doesn't mean a 2X RT performance improvement if you are bound by other factors such as shading or memory bandwidth.

Nvidia significantly improve the throughput of the RT cores each generation for only relatively minor performance improvements in the real world. If those RT benchmarks are like-for-like then Ada is already only ~2X faster in their application than RDNA 2, at a given performance level. So there isn't even room to be more than 2X faster without exceeding Ada.
Yeah, we are in full agreement there. Real-world performance in terms of FPS cannot be equated to the RT performance uplift. The slides are not presenting overall performance improvement anyway; it's a very specific component of the overall budget, so it actually won't be that different from synthetic benchmark numbers (assuming no other bottlenecks, like you mentioned). Technically, we should indeed be seeing Cerny's claimed 2x-3x numbers here. So something is amiss. Assuming, of course, Cerny isn't lying, the Ubisoft technical architect isn't lying, and the leaked developer guide isn't lying either. I see nothing for them to gain from lying, though. I don't know if anyone is buying the Pro because there is a theoretical 3x RT, or 1.5x, or 1.75x. We can assume that the declared numbers need to be true at least on paper.

According to the slide, the 4080 is only 4-5x the RT of a base PS5? At what settings, i.e. how many bounces? What RT options running in parallel? It's not as clear as it seems, so I don't think we should be treating any of it as like-for-like.

I think F1 is the most impressive Pro game so far when it comes to RT. New Capcom games will be interesting; both RE9 and Pragmata were shown with RT reflections at Gamescom.
Yeah, John had a glowing review of the Pro for RE9. Seems it's running both RT reflections and RTGI at 60 fps. How that compares to the base might give more insight.

Edit: yesterday someone posted graphs from id Software about the RT performance of consoles, but I can't find this shit...
Would love to see that if you find it. Just want to ensure we have the right facts; not trying to defend the Pro.
 
I mean, could it be optimized better? Sure. But we have RT benchmarks shared by both Ubisoft and id comparing the PS5 Pro, base consoles and PC GPUs, and Assassin's Creed especially is probably the best PS5 Pro showcase so far, and we see the gains are nowhere near what was advertised, nor near RDNA4. Better optimization isn't going to make performance jump from a ~60% improvement to RDNA4 levels in RT.

Depends; I don't really have a preferred GPU maker. I appreciated Nvidia's offerings when they delivered legendary pro-consumer GPUs like the brilliant 8800 GT, and I appreciated AMD when they delivered value monsters like the HD 5870. That said, I am more partial to AMD because they are the underdogs, and I want Nvidia to be toppled so we can have proper competition. Still, it's a fact that Nvidia is currently the de facto pioneer in GPU tech, delivers the best performance and tech (DLSS4), and still holds a large edge in path tracing. Unfortunately, Nvidia has become arrogant and brutally anti-consumer... and sadly, AMD is happy to price-gouge as well. They are content with Nvidia's table scraps instead of targeting market share. So, outside of just wanting to help AMD stay up, I find it hard to justify buying AMD GPUs, even RDNA4, because you can't just ignore the best visual tech [PT] and say it's almost equal outside of that. That said, I really hope UDNA kicks ass and catches Nvidia lacking.
You don't have to type all that just to say you prefer Nvidia.

No wonder you dislike the idea that AMD is making progress.
 
You don't have to type all that just to say you prefer Nvidia.

No wonder you dislike the idea that AMD is making progress.
Wow, Jesus Christ, take your blinkers off. You only draw selective conclusions based on your emotions, which certainly doesn't say much for your reading comprehension. I literally told you I prefer AMD and have been partial to them. I was excitedly following the RDNA4 launch past midnight, actively hoping they would beat the 5080/5070 Ti at each level. I have bought every Sony console, so I can only hope they knock it out of the park with UDNA so I can get an impressive console.

If they both offered comparable products, I would choose AMD just to help them and improve competition. That said, I am not emotionally invested in either organization, and neither should you be; given the chance, both will fleece consumers, as has been happening the past few generations. I have followed ATI since the 9700 Pro days (a legendary card) and can only hope they get back to that level.
 
Wow, Jesus Christ, take your blinkers off. You only draw selective conclusions based on your emotions, which certainly doesn't say much for your reading comprehension. I literally told you I prefer AMD and have been partial to them. I was literally following the RDNA4 launch at midnight, actively hoping they would beat the 5080/5070 Ti at each level. I have bought every Sony console, so I can only hope they knock it out of the park with UDNA so I can get an impressive console.

If they both offered comparable products, I would choose AMD just to help them and improve competition. That said, I am not emotionally invested in either organization, and neither should you be; given the chance, both will fleece consumers, as has been happening the past few generations. I have followed ATI since the 9700 Pro days (a legendary card) and can only hope they get back to that level.
It's ok dude.
Everyone has a preference, just don't downplay the other brand.
 
I still don't understand how they managed to design such an unbalanced console that they somehow crippled the main strength of the new architecture down to a marginal 50-60% improvement, when we know RDNA4 is AMD's most significant leap in real-time RT. I mean, you have to actually put in effort to gimp things this badly. I would legit like to know from a tech expert like KeplerL2 what the main bottleneck is.

Even by Sony's own extremely conservative marketing material, RT was the only wow point, and 3-4x RT performance it is not. The only way I can rationalize it is that they neutered their premium console so they could keep RDNA2 as a base and take the easy approach to avoid any compatibility headaches. Even the PS4 Pro, which was bandwidth-starved, retained all of the benefits of the architectural leaps and even punched above its weight thanks to its loaded ROPs.

I remember being disappointed when Heisenberg told us it was around a 4070 [yes, I know he was later told that was in CoD], and disappointed looking back at the Xbox One X (now that was a real Pro console), but damn, Cerny, wtf happened. I mean, I can hold on to unsubstantiated hope that this particular workload performed below expectations on their engine, but let's be real..
It's bandwidth-starved because it doesn't have any of the cache improvements of RDNA4, and memory bandwidth was only a minor increase.
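A quick back-of-envelope with the commonly cited spec-sheet figures (approximate numbers, my own sketch rather than anything from the post above) shows how the compute-to-bandwidth ratio actually got worse:

```python
# Commonly cited figures, treated as approximations for a ratio check.
ps5_tflops, ps5_bw_gbs = 10.23, 448.0   # base PS5
pro_tflops, pro_bw_gbs = 16.7, 576.0    # PS5 Pro

print(pro_tflops / ps5_tflops)   # ~1.63x compute
print(pro_bw_gbs / ps5_bw_gbs)   # ~1.29x bandwidth

# Bandwidth available per unit of compute drops ~21% vs. the base PS5,
# which is exactly the gap RDNA4's bigger caches paper over on PC cards.
print((pro_bw_gbs / pro_tflops) / (ps5_bw_gbs / ps5_tflops))  # ~0.79
```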
 
It's bandwidth-starved because it doesn't have any of the cache improvements of RDNA4, and memory bandwidth was only a minor increase.
So it doesn't have the cache improvements to actually make use of the improved RT cores, and its base is RDNA2, so it barely gets any major performance improvements outside of RT workloads. This console seems designed to excel at nothing, really, outside of pushing ML/AI, which is also unpolished. Was carrying over general architectural improvements really such a difficult task? As far as I remember, the Xbox One X and PS4 Pro had no problem carrying over the architectural improvements of newer architectures without messing up backward compatibility with the base console. It just seems shocking that they would basically forgo all of the main benefits of the newer architecture.

Also, thank you for the responses. It's good to have someone with in-depth expertise in gaming tech to draw upon, since with a surface-level understanding we can do little more than speculate.
 
It's ok dude.
Everyone has a preference, just don't downplay the other brand.
I'm not, though. AMD is just behind currently, and I hope they get back ahead to make things competitive. I remember that at the Xbox 360 launch, its GPU was significantly superior to the 7800 series, and the 5870 was a much more efficient and better-value product than even the later GTX 480; hell, the R9 290 beat the Titan. Currently, AMD has nothing that is a true market-shaker that can compete with Nvidia on equal terms, or even beat it in key areas.
 
So it doesn't have the cache improvements to actually make use of the improved RT cores, and its base is RDNA2, so it barely gets any major performance improvements outside of RT workloads. This console seems designed to excel at nothing, really, outside of pushing ML/AI, which is also unpolished. Was carrying over general architectural improvements really such a difficult task? As far as I remember, the Xbox One X and PS4 Pro had no problem carrying over the architectural improvements of newer architectures without messing up backward compatibility with the base console. It just seems shocking that they would basically forgo all of the main benefits of the newer architecture.

Also, thank you for the responses. It's good to have someone with in-depth expertise in gaming tech to draw upon, since with a surface-level understanding we can do little more than speculate.
It would have been too complicated a project to bring RDNA4 architectural changes into what is primarily an RDNA2 console. The tape-out, development, testing and release timing would have made it a non-starter. It would be like releasing the PS6, but three years sooner.

So instead they just brute-forced whatever money could buy. The Pro is clearly not a balanced console, but trying to balance it out would have erased it out of existence.
 
According to the new MLID video, RDNA5/UDNA is only 10% better in performance than RDNA4? I am not expecting RDNA5 to be a huge leap over RDNA4, *except* for ray tracing/path tracing?!
 
I know that a 2-3x speed improvement in just RT won't translate to a 2-3x speed improvement overall. RT is just part of what the GPU renders in any given frame.

I told people this months before the console launched and I was called a "hater". Looks like nothing changed, lol.
War. War never changes.
 