
[MLiD] XBOX Magnus RDNA 5 Finalized

Thing is, 99.99% of games don't use Mesh Shaders, precisely because PS5 is the lead dev environment, and since it lacked them, most devs didn't bother learning.
99.99% isn't true at all. UE5 uses them extensively. XSX just wasn't performing that much better in a lot of games that did use them, either. I remember seeing comparisons for Fortnite where the geometry looked better on PS5 too.

PS5 actually held back Series X more than Series S.

Who would've thought?
because PS5 was literally stopping games from releasing on XSX like XSS was?

PS5 wasn't holding back XSX unless you think AMD were releasing RDNA2-4 cards with mesh shaders that were going unused. The inclusion of "Mesh Shaders" went largely unnoticed simply because ultimately they weren't that big a deal. RT, on the other hand, is something you can point to and see, and if you tried the software alternatives the performance drop was very noticeable in comparison.
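To make the hardware-vs-software RT point concrete, here's a minimal D3D12 sketch of the check games actually do; my own illustration, not from any shipping title, and ChooseReflections / ReflectionPath are made-up names:

```cpp
// Minimal sketch: ask D3D12 whether the GPU has hardware RT (DXR) and pick
// a reflection technique accordingly. ChooseReflections is hypothetical.
#include <d3d12.h>

enum class ReflectionPath { HardwareRT, ScreenSpace };

ReflectionPath ChooseReflections(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
    {
        return ReflectionPath::HardwareRT;  // Turing / RDNA2 and newer
    }
    // No RT hardware: fall back to screen-space reflections, which is
    // exactly the visibly-worse software alternative described above.
    return ReflectionPath::ScreenSpace;
}
```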
 
Chris Farley Idk GIF

Does He Know GIF
 
Out of curiosity:
Will there even be XBox Magnus branded games? Or will there only be normal PC versions of the games?
If Magnus can play both dedicated Xbox games and the PC versions of those games, wouldn't they run differently?

It's all a little confusing.
As far as pricing is concerned: 699-799€/$ for the PS6 and 1099-1299€/$ for Magnus would be my guess
 
Bojji is just doing what he normally does, being reductive when it comes to PS. Both XS and PS5 are custom RDNA. "RDNA1+RT" is closer in important features to RDNA2 than it is to RDNA1, IMO. You only need to look at XS and PS5 Pro having so-called "full RDNA2" but not offering all that much more than the base PS5 in terms of the impact of those additional features. Now imagine having no hardware RT at all, like in actual RDNA1. The really important things in actual (not custom) RDNA2 AMD cards were hardware RT and maybe the introduction of Infinity Cache, and even then neither PS5 nor XS implemented the latter, because they're both custom: custom SoCs built around each machine's clocks, GDDR memory bandwidth and everything else (like those "cache scrubbers"), implementing features that make sense for them.

It's about getting the most power out of a cheap, well-designed system. You can mitigate a lot of the problems real RDNA2 tried to solve and XSX/PS5 lacked by doing other things instead: clock higher, use a different memory config with a different bus size instead of Infinity Cache, even do things in software like VRS. You're just trying to get the most performance for the cheapest price, and PS5 punched well above its weight in that regard. It was cheaper to produce than the XSX yet went toe to toe with it and often outperformed it. Nobody really cares if you want to call it RDNA1+RT or RDNA2-minus-X. There is no need when it's clearly neither, best described as custom, and the real-world differences aren't all that important anyway.

Keep fighting the good fight...

That RDNA1/RDNA2 discussion has happened a few times over the years, and people found evidence in drivers etc. PS5 doesn't have anything that makes RDNA2 RDNA2. For Cerny and AMD PR it was enough that they added RT late in development, but the GPU itself was clearly designed in 2018 or even earlier, when AMD was still hoping that its proposal for DX12 FL 12_2 would become the norm. Nvidia won, and AMD modified their architecture between RDNA1 and 2 to match that (Xbox has most of those features).

PS5 actually held back Series X more than Series S.

Who would've thought?
99.99% isn't true at all. UE5 uses them extensively. XSX just wasn't performing that much better in a lot of games that did use them, either. I remember seeing comparisons for Fortnite where the geometry looked better on PS5 too.


because PS5 was literally stopping games from releasing on XSX like XSS was?

PS5 wasn't holding back XSX unless you think AMD were releasing RDNA2-4 cards with mesh shaders that were going unused. The inclusion of "Mesh Shaders" went largely unnoticed simply because ultimately they weren't that big a deal. RT, on the other hand, is something you can point to and see, and if you tried the software alternatives the performance drop was very noticeable in comparison.

UE5 doesn't use mesh shaders. Nanite is pure software.

And yeah, BOTH PS5 and older PC architectures held back DX12U software advancements. All the things that Turing introduced in 2018 (and RDNA2 in 2020) are still rare in modern games (mesh shaders, hardware RT, VRS, SFS) - 8 years later. That's because the dominant console and most PC GPUs didn't support any of them outside of RT (older PC GPUs didn't support even that), and even that RT support is slow anyway, so the vast majority of games don't have it.
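For anyone curious, this is roughly how a PC title probes the DX12 Ultimate features listed above; a minimal sketch, with QueryDx12U and the struct being my own hypothetical names:

```cpp
// Minimal sketch: query the DX12 Ultimate feature tiers (mesh shaders,
// VRS, sampler feedback). Pre-Turing / pre-RDNA2 GPUs report "not
// supported" on all of these, which is why fallback paths still exist.
#include <d3d12.h>

struct Dx12UFeatures { bool meshShaders = false, vrsTier2 = false, samplerFeedback = false; };

Dx12UFeatures QueryDx12U(ID3D12Device* device)
{
    Dx12UFeatures f;
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 o6 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 o7 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &o6, sizeof(o6))))
        f.vrsTier2 = o6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2;
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &o7, sizeof(o7))))
    {
        f.meshShaders     = o7.MeshShaderTier      >= D3D12_MESH_SHADER_TIER_1;
        f.samplerFeedback = o7.SamplerFeedbackTier >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;
    }
    return f;
}
```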
 
I don't usually use laugh emojis, but this is peak comedy.

You've got to love these dramatic last stands from Xbox warriors — "yeah, sure, PS5 held back Xbox." If that's the narrative you need, go for it.

- Series S held back games when it comes to how much memory they can use; it barely has more than last-gen consoles
- PS5 held back adoption of features like SFS, VRS (maybe this one is actually good?) and int8 ML

Both of those statements are true.
 
Round 2 of: is PS6 full RDNA5 like XSX2?

RDNA4 + AI hardware + Cerny magic?
Cerny magic is simply discarding features he deems not truly essential to gaming performance, based on how he predicts the coming generation's software development will play out, in order to save die space, gain clock speed, etc. and keep the SoC as efficient as possible.

Because PlayStation ends up being the dominant development platform, what he deemed non-essential ends up not being used widely, not because he predicted correctly but because the platform's dominance made it "correct"...
 
- Series S held back games when it comes to how much memory they can use; it barely has more than last-gen consoles
- PS5 held back adoption of features like SFS, VRS (maybe this one is actually good?) and int8 ML

Both of those statements are true.
Ironically, SFS aka Sampler Feedback Streaming would've solved the Series S memory issue as well. It was supposedly something like 100x bandwidth efficiency, fitting 300 GB worth of data in 3 GB of memory, something like that. And Mesh Shaders bring up to 500% efficiency to the GPU, so had the devs used SFS in conjunction with Mesh Shaders, the Series S would never have had any issues doing 1080/60 up to 1440/60.

But it seems next-gen tech like Universal Compression is another method to solve the bandwidth and memory issues, and Magnus, Orion and Canis should all have it from the start.
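Whatever the real multiplier is, the core idea behind SFS is simple enough to sketch without the D3D12 plumbing: the GPU records which texture tiles were actually sampled, and the streamer loads only those instead of whole mip levels. All names below are hypothetical; real SFS goes through D3D12 sampler feedback maps:

```cpp
// Conceptual sketch of feedback-driven streaming, the idea behind SFS.
// The GPU's sampler feedback tells us which tiles were touched; we only
// page those in, so resident memory tracks what's visible, not asset size.
#include <cstdint>
#include <set>
#include <tuple>
#include <vector>

struct Tile { uint32_t mip, x, y; };
bool operator<(const Tile& a, const Tile& b)
{
    return std::tie(a.mip, a.x, a.y) < std::tie(b.mip, b.x, b.y);
}

// feedback = tiles the sampler actually read last frame (from the GPU).
std::set<Tile> TilesToStreamIn(const std::vector<Tile>& feedback,
                               const std::set<Tile>& resident)
{
    std::set<Tile> wanted;
    for (const Tile& t : feedback)
        if (resident.count(t) == 0)
            wanted.insert(t);  // load only what was sampled and isn't resident
    return wanted;             // everything untouched stays on disk
}
```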
Cerny magic is simply discarding features he deems not truly essential to gaming performance as he predicts the coming generation's software development will play out, in order to save die space, clock speed, etc. to keep the SoC efficient as possible.
Cerny definitely miscalculated regarding Mesh Shaders. Sony went ahead with production early because they were trying to have enough stock for launch. Times were uncertain during COVID. Maybe he knew PS4 cross gen wasn't going away so Mesh Shaders could be delayed.
 
Cerny definitely miscalculated regarding Mesh Shaders. Sony went ahead with production early because they were trying to have enough stock for launch. Times were uncertain during COVID. Maybe he knew PS4 cross gen wasn't going away so Mesh Shaders could be delayed.
At the end of the day, 3rd party software didn't "miss" mesh shaders on PS5, since it became the lead platform of this console gen. No one bothered to utilize mesh shaders on the XSX versions of their games, I'll bet.
 
This is an interesting discussion, but I can't help but think the specs may still change. Even if the leaks are coming directly from AMD, MS, and Sony, launch is anywhere from 20 months to 2 years away.

When would initial tapeout happen, in the next few months? And even after that, something might be modified depending on results.
 
UE5 doesn't use mesh shaders. Nanite is pure software.

Yes it does. And it also supports Primitive Shaders and the traditional Geometry Shader pipeline.
Mesh shaders are a way to generate geometry. Nanite is a form of rasterization. They are different parts of the rendering pipeline, though they are related.
 
Cerny definitely miscalculated regarding Mesh Shaders. Sony went ahead with production early because they were trying to have enough stock for launch. Times were uncertain during COVID. Maybe he knew PS4 cross gen wasn't going away so Mesh Shaders could be delayed.
Mark Cerny didn't "miscalculate" anything. The PS5 wasn't designed to win a spec-sheet war — it was designed to be efficient, balanced, and cost-effective at scale.

With the PlayStation 5, Sony went for a smaller ~308 mm² die and a high-frequency approach instead of chasing brute CU count like the Xbox Series X. Call it RDNA 0.2, call it custom RDNA 2 — the label doesn't really matter. What matters is the result: a leaner chip that stayed competitive in real-world performance while being significantly cheaper to manufacture per unit.

At PS5's scale, even small silicon savings translate into billions over the generation.
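A back-of-envelope version of that claim, with loudly assumed numbers (the widely reported ~$17k 7nm wafer price circa 2020, perfect yield, ~308 mm² vs ~360 mm² dies), just to show the order of magnitude:

```cpp
// Back-of-envelope only: illustrates the scale of the claim, not Sony's
// actual costs. Assumes ~$17k per 7nm wafer and perfect yield.
#include <cmath>
#include <cstdio>

double DiesPerWafer(double dieAreaMm2, double waferDiaMm = 300.0)
{
    const double kPi = 3.14159265358979;
    double r = waferDiaMm / 2.0;
    // Standard approximation: usable wafer area minus edge loss.
    return kPi * r * r / dieAreaMm2
         - kPi * waferDiaMm / std::sqrt(2.0 * dieAreaMm2);
}

int main()
{
    const double waferCost = 17000.0;              // assumed $/wafer
    double ps5 = waferCost / DiesPerWafer(308.0);  // roughly $89 per die
    double xsx = waferCost / DiesPerWafer(360.0);  // roughly $105 per die
    std::printf("~$%.0f saved per unit, ~$%.1fB over 80M consoles\n",
                xsx - ps5, (xsx - ps5) * 80e6 / 1e9);  // on the order of $1.3B
}
```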

The same design philosophy was present in the PlayStation 4. Both PS4 and PS5 are tightly balanced machines — no bloated specs, no waste, no obvious bottlenecks that cripple developers. Just efficient architectures tuned for real-world performance.

Price-to-performance wise, it's genuinely difficult to find better value propositions in consumer hardware at launch. They weren't built to look dominant on paper — they were built to win over a full generation, which they did.
 
This is an interesting discussion, but I can't help but think the specs may still change. Even if the leaks are coming directly from AMD, MS, and Sony, launch is anywhere from 20 months to 2 years away.

When would initial tapeout happen, in the next few months? And even after that, something might be modified depending on results.
According to a new video by MLiD, Magnus, Orion and Canis have all taped out.
 
Is anyone expecting Magnus to reach 10M with the prices that have been rumoured?
This is exactly the core problem with the Nextbox. On paper, it might be stronger than the PS6, but with a laughable install base of only 4-7 million, developers simply won't bother putting in the effort. Most will optimize for the PS6 and just brute-force the Xbox version, slapping it on without proper tuning because it's not economically worth it.

In the end, the PS6 could actually perform better, not because it's technically superior, but because it has enough players to justify proper optimization and polish.
 
Yes it does. And it also supports Primitive Shaders and the traditional Geometry Shader pipeline.
Mesh shaders are a way to generate geometry. Nanite is a form of rasterization. They are different parts of the rendering pipeline, though they are related.

I see conflicting reports on this. It seems they added hardware acceleration to Nanite, but it still uses a software fallback on architectures without mesh shaders?

If that were the case, the 5700 XT would have to underperform vs. the 6600, because it doesn't have mesh shaders:

[attached: 5700 XT vs. 6600 benchmark charts]


vs. how shit it performed in AW2 without MS support:

[attached: Alan Wake 2 1080p benchmark chart]


Half the performance of the 6600 XT...

Mark Cerny didn't "miscalculate" anything. The PS5 wasn't designed to win a spec-sheet war — it was designed to be efficient, balanced, and cost-effective at scale.

With the PlayStation 5, Sony went for a smaller ~308 mm² die and a high-frequency approach instead of chasing brute CU count like the Xbox Series X. Call it RDNA 0.2, call it custom RDNA 2 — the label doesn't really matter. What matters is the result: a leaner chip that stayed competitive in real-world performance while being significantly cheaper to manufacture per unit.

At PS5's scale, even small silicon savings translate into billions over the generation.

The same design philosophy was present in the PlayStation 4. Both PS4 and PS5 are tightly balanced machines — no bloated specs, no waste, no obvious bottlenecks that cripple developers. Just efficient architectures tuned for real-world performance.

Price-to-performance wise, it's genuinely difficult to find better value propositions in consumer hardware at launch. They weren't built to look dominant on paper — they were built to win over a full generation, which they did.

"No obvious bottlenecks" is funny when PS5 Pro is crippled by low memory bandwidth.

This is exactly the core problem with the Nextbox. On paper, it might be stronger than the PS6, but with a laughable install base of only 4-7 million, developers simply won't bother putting in the effort. Most will optimize for the PS6 and just brute-force the Xbox version, slapping it on without proper tuning because it's not economically worth it.

In the end, the PS6 could actually perform better, not because it's technically superior, but because it has enough players to justify proper optimization and polish.

Same architecture; optimizations done on PS6 will benefit Xbox as well.
 
And btw, PS6 isn't full RDNA5 either.

This was expected given how heavily Sony customize their chips. The question is, if it isn't full RDNA 5, does that mean it could swing either way? Maybe it'll have features plucked from further down the AMD roadmap.

How important was that "efficiency" when PS5 was more often than not performing the same as XSX, though? Not all that important.
Same with things like hardware VRS vs software VRS. Now compare actual RDNA1 running software RT vs "custom RDNA" (RDNA1+RT / RDNA2-minus-X) and you will see how much of a difference that is.

Sure, you got more performance-per-watt with the "full" thing, but PS5 clocked higher and had a system built around that. To counter the downsides of the higher clocks they added the "cache scrubbers". These are custom systems where you can't just compare the custom SoC's "version" to determine the importance of things. Some people keep trying though, for obvious reasons. It's like they refuse to accept that the pudding tastes the same or sometimes better, and instead laser-focus on the ingredient differences they read about.

It's important to note that the PS5 RDNA 1 debate was mainly sparked by Microsoft themselves, who claimed Series X was "full RDNA 2", and fanboys pushed the narrative to show Series X's alleged hardware superiority. But given how things have played out years into this gen, and how the PS5 has offered performance parity, this whole debate is starting to become somewhat pointless.

Cerny definitely miscalculated regarding Mesh Shaders. Sony went ahead with production early because they were trying to have enough stock for launch. Times were uncertain during COVID. Maybe he knew PS4 cross gen wasn't going away so Mesh Shaders could be delayed.

There is nothing in the PS5 hardware which stops it from using Mesh Shaders. Even if you go along with the whole RDNA 1 argument, RDNA 1 was lacking software and driver support, but the necessary hardware for Mesh Shaders was still present. Mesh Shaders are DX12-specific and compile down to Primitive Shader code on all AMD hardware, even on Series X.

This is in line with what developers have said about using Mesh Shaders on the consoles. Remedy developers even mentioned that there were no platform-specific optimizations done for PS5 when handling geometry, which could mean it was using Mesh Shaders, or maybe Primitive Shaders, which offered the same features and performance you'd get with Mesh Shaders. This was also the case on Avatar: Frontiers of Pandora.

The point being that Mesh Shader adoption has not at all been stunted or stopped by PS5's alleged lack of support, and there was certainly no "miscalculation" from Cerny.
 
I see conflicting reports on this. It seems they added hardware acceleration to Nanite, but it still uses a software fallback on architectures without mesh shaders?

If that were the case, the 5700 XT would have to underperform vs. the 6600, because it doesn't have mesh shaders:

[attached: 5700 XT vs. 6600 benchmark charts]


vs. how shit it performed in AW2 without MS support:

[attached: Alan Wake 2 1080p benchmark chart]


Half the performance of the 6600 XT...

There is no hardware acceleration for Nanite, because current hardware rasterizers on Nvidia, AMD and Intel work on a 2x2 pixel grid.
Nanite rasterizes in software to avoid this bottleneck when using very dense geometry.
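The 2x2 quad point is easy to put numbers on. An idealized illustration that ignores real-world details like quad merging and fixed-function costs:

```cpp
// Idealized illustration of quad overdraw. Hardware rasterizers shade in
// 2x2 quads so pixel shaders can compute derivatives; a triangle covering
// a single pixel still pays for four invocations. Nanite's compute-shader
// rasterizer shades only the covered pixels of its micro-triangles.
#include <cstdio>

int main()
{
    const double pixelsCovered   = 1.0;  // a Nanite-scale micro-triangle
    const double quadInvocations = 4.0;  // the full 2x2 quad gets shaded
    std::printf("shading waste: %.0fx\n", quadInvocations / pixelsCovered); // 4x
    // With pixel-sized triangles across the whole frame, that's up to ~4x
    // redundant shading work -- the bottleneck the software path avoids.
}
```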

UE5 does support Primitive Shaders. And they're pretty much as fast as Mesh Shaders; the two are very similar at a high level. The main difference is that Primitive Shaders do not support the Amplification stage.

RDNA2 also does not support Mesh Shaders natively. It still uses Primitive Shaders, though a more advanced version than RDNA1's, and the driver maps and compiles Mesh Shaders onto Primitive Shaders.
AMD never did that same translation for RDNA1.

Alan Wake 2 originally only supported Mesh Shaders. And a more recent patch added support for the traditional geometry pipeline.
 
It doesn't seem like anything major is missing from Orion; the only thing I can see is the NPU, but even MLiD was unsure if that's missing.
MLiD likes to drip feed his viewers.
That said, this is an old mockup of the Xbox Magnus I did a while ago.

The most logical place I can see the NPU going is that large rectangular box next to the CPU cluster.
[attached: Xbox Magnus die mockup images]


It matches the NPU Compute shape and size on Strix Point.
[attached: Strix Point die image]


On the PS6 mockup done by him, we can see the same large rectangular box next to the CPU cluster.
[attached: PS6 die mockup image]


Same with Canis.
[attached: Canis die mockup image]


Why would he add it to the Orion and Canis mockups?
Who knows, this is just speculation on my part.
 
Will there even be XBox Magnus branded games? Or will there only be normal PC versions of the games?
Interesting question.
In my eyes Magnus is just a PC, running Windows 11 but with an Xbox UI (like the Xbox Ally X).
With SteamOS as some kind of "USP" for Magnus, it's plausible that the normal PC versions will run on Magnus.
But I've no clue.

I would buy a Magnus and not a PS6 if it is a PC and capable of running the PC versions of PS6 games.
But only if Magnus isn't too much more expensive than the PS6. I would pay $200-300 more for a Magnus if the graphics are noticeably better.
But KeplerL2's guess in October 2025 was $600 for the PS6 vs $1200 for Magnus, and I'm not sure I'm willing to pay twice as much...
 
This is exactly the core problem with the Nextbox. On paper, it might be stronger than the PS6, but with a laughable install base of only 4-7 million, developers simply won't bother putting in the effort. Most will optimize for the PS6 and just brute-force the Xbox version, slapping it on without proper tuning because it's not economically worth it.

In the end, the PS6 could actually perform better, not because it's technically superior, but because it has enough players to justify proper optimization and polish.

There's no "Xbox version" anymore....

There's a PC version that runs on 2000+ different PC configurations, including this one....

Anyone thinking otherwise is delusional
 
With DDR6, Magnus is going to be mighty expensive. Now I understand Kepler's PS6 $600 / Magnus $1200 guesswork.

Also, how are they going to emulate Xbox Series games with 2 pools of memory (DDR6 and GDDR7)?
 
What DDR6? Both consoles will use GDDR7.
MLiD likes to drip feed his viewers.
That said, this is an old mockup of the Xbox Magnus I did a while ago.

The most logical place I can see the NPU going is that large rectangular box next to the CPU cluster.
[attached: Xbox Magnus die mockup image]


It matches the NPU Compute shape and size on Strix Point.


On the PS6 mockup done by him, we can see the same large rectangular box next to the CPU cluster.


Same with Canis.


Why would he add it to the Orion and Canis mockups?
Who knows, this is just speculation on my part.
Based on Loxus' mockup. Loxus, why have you added DDR6 on the CPU side?
 
Why are we still having this discussion? It's been confirmed from AMD leaks, drivers, and the PS5 hardware itself that the PS5 was basically RDNA1 + RT. No trace of mesh shaders or VRS anywhere. Not that it mattered: primitive shaders and mesh shaders are extremely similar, and VRS is basically a dud, with the upscaling part now mostly handled by temporal scalers.

The same design philosophy was present in the PlayStation 4. Both PS4 and PS5 are tightly balanced machines — no bloated specs, no waste, no obvious bottlenecks that cripple developers. Just efficient architectures tuned for real-world performance.
No obvious bottlenecks? The PS4 had a miserable CPU at the time. And the PS5's RT capabilities (as well as the XSX's) are anemic. No ML-powered upscaling either.

Bang for buck, the consoles are both extremely well designed, but tradeoffs had to be made, especially shackled to AMD as Microsoft and Sony are. But the limitations are pretty obvious.
 
We are in the middle of the PS5 gen.

We are at the end of the Series generation.

Xbox desperately needs new hardware ASAP to try and regain any semblance of marketshare.

PS5 meanwhile continues with solid momentum and potentially on the way to 150 million+ units.

Sony can afford to wait and see what, if anything, MS launches hardware-wise and proceed with next gen at their convenience. There is absolutely no rush for them. Better for devs to target 100 million plus PS5s; why rush into another device cycle when there won't be a huge bump up from the PS5 Pro? In 2030 they can provide a decent visual bump at a reasonable price.
As Kepler posted earlier, it's too late to make big changes to a hardware project, so it doesn't matter if the PS6 releases in 2027 or 2029, we will get the same hardware.
 
No obvious bottlenecks? The PS4 had a miserable CPU at the time. And the PS5's RT capabilities (as well as the XSX's) are anemic. No ML-powered upscaling either.

Bang for buck, the consoles are both extremely well designed, but tradeoffs had to be made, especially shackled to AMD as Microsoft and Sony are. But the limitations are pretty obvious.
Yeah. They didn't teleport you to your office either. Those aren't bottlenecks, they're limitations of the available technology of the era.
 
Yeah. They didn't teleport you to your office either. Those aren't bottlenecks, they're limitations of the available technology of the era.

That may be true for Jaguar (there was no other CPU available from AMD outside of the ridiculous Bulldozer). But the memory BW problem of the PS5 Pro could have been easily fixed by a 320-bit memory bus (it would have allowed for more memory as well).
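For reference, the arithmetic behind that suggestion (PS5 Pro ships 16 GB of 18 Gbps GDDR6 on a 256-bit bus):

```cpp
// GDDR bandwidth = per-pin rate (Gbps) x bus width (bits) / 8.
// A 320-bit bus also means ten chips instead of eight, so 2 GB modules
// would give 20 GB instead of 16 GB.
#include <cstdio>

double BandwidthGBs(double gbpsPerPin, int busBits)
{
    return gbpsPerPin * busBits / 8.0;
}

int main()
{
    std::printf("256-bit @ 18 Gbps = %.0f GB/s\n", BandwidthGBs(18.0, 256)); // 576
    std::printf("320-bit @ 18 Gbps = %.0f GB/s\n", BandwidthGBs(18.0, 320)); // 720
}
```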
 
MLiD likes to drip feed his viewers.
That said, this is an old mockup of the Xbox Magnus I did a while ago.

The most logical place I can see the NPU going is that large rectangular box next to the CPU cluster.
[attached: Xbox Magnus die mockup images]


It matches the NPU Compute shape and size on Strix Point.
[attached: Strix Point die image]


On the PS6 mockup done by him, we can see the same large rectangular box next to the CPU cluster.
[attached: PS6 die mockup image]


Same with Canis.
[attached: Canis die mockup image]


Why would he add it to the Orion and Canis mockups?
Who knows, this is just speculation on my part.
Xbox Magnus: how many CUs, 64 or 68? I saw 64 CUs in this chart!
 
That may be true for Jaguar (there was no other CPU available from AMD outside of the ridiculous Bulldozer). But the memory BW problem of the PS5 Pro could have been easily fixed by a 320-bit memory bus (it would have allowed for more memory as well).
But we were talking about PS5 and SX. You know I agree with the Pro BW problem.
 
Ironically, SFS aka Sampler Feedback Streaming would've solved the Series S memory issue as well. It was supposedly something like 100x bandwidth efficiency, fitting 300 GB worth of data in 3 GB of memory, something like that. And Mesh Shaders bring up to 500% efficiency to the GPU, so had the devs used SFS in conjunction with Mesh Shaders, the Series S would never have had any issues doing 1080/60 up to 1440/60.

But it seems next-gen tech like Universal Compression is another method to solve the bandwidth and memory issues, and Magnus, Orion and Canis should all have it from the start.

Cerny definitely miscalculated regarding Mesh Shaders. Sony went ahead with production early because they were trying to have enough stock for launch. Times were uncertain during COVID. Maybe he knew PS4 cross gen wasn't going away so Mesh Shaders could be delayed.
Now I'm 100% sure, hello Shitjutsu.
 
At the end of the day, 3rd party software didn't "miss" mesh shaders on PS5, since it became the lead platform of this console gen. No one bothered to utilize mesh shaders on the XSX versions of their games, I'll bet.
Not even Xbox studios themselves, who initially had an exclusives policy? That means their games could have used it. Tell me why Flight Sim didn't use it, or why FH didn't use it then? It was an in-house engine.

The games came to PS5 later, with little to no impact on performance or visuals, when Xbox switched strategy at the end.

Like VRS, Mesh Shaders just weren't important/efficient enough to bother with, and you could even implement compute shaders or software VRS instead, which not only gave you support on older graphics cards and other consoles, but more freedom for optimisation too (look at what CoD did, for example).

Suggesting it was being held back by PS5, when PS5 was totally out of the equation with Xbox exclusives, tells you everything you need to know. Either Xbox themselves chose not to use it, or they did and the alternative ports worked just as well anyway.
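For context on how little effort the hardware path takes compared to rolling your own: with D3D12 Tier 1 VRS it's literally one call on the command list. A minimal sketch, assuming a Tier 1-capable device and an open command list:

```cpp
// Minimal sketch of hardware VRS (D3D12 Tier 1): one call drops the
// shading rate for everything drawn afterwards. The "software VRS"
// mentioned above re-creates this effect in compute shaders instead,
// which is more work but runs on older GPUs and other consoles too.
#include <d3d12.h>

void DrawWithCoarseShading(ID3D12GraphicsCommandList5* cmd)
{
    cmd->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr); // 1 shade per 2x2
    // ... draw calls that tolerate reduced shading detail ...
    cmd->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr); // back to full rate
}
```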
 
No obvious bottlenecks? The PS4 had a miserable CPU at the time. And the PS5's RT capabilities (as well as the XSX's) are anemic. No ML-powered upscaling either.
$399 consoles.
One launched in 2013. The other in 2020.

Context matters.

By that logic, I could look at an RTX 5090 in 2050 and say:
"Wow, the core count is miserable, the clocks are abysmal, the RAM is laughable, and it can't even handle 8K/144 path tracing. What a bottlenecked disaster."

Hardware should be judged against its era, its price point, and its design goals — not against future standards or fantasy expectations.
 
Are you sure it isn't you who is still fighting this in your head? You seem to pop up in nearly every single Microsoft-related thread with something negative to say, ever since the ActiBlizz acquisition thread days.
He might still be butthurt after all the truth bombs FOSSpatents famously dropped on the likes of him.
 
What's funny to me is that a certain fanbase keeps bragging about Cerny magic.

In reality it's an "as basic as it gets" console.

It's PC and Xbox that have the extra magic sauce that goes underutilised.
 
Xbox Magnus: how many CUs, 64 or 68? I saw 64 CUs in this chart!
I worked out 72 CUs.

But the AT2 full die is a mystery to me.
MLiD has it as a 70 CU / 4 SE full die.

At one time the AnandTech forum was discussing 40 CUs the new way / 80 CUs the old way.
[attached: AnandTech forum screenshot]


I came across these at one point on Chiphell too; they're outdated with some wrong info but look logical.
[attached: Chiphell spec charts]


Then this from MLiD again has AT2 with 64 CUs, possibly with a full die of 72 CUs / 4 SE and 2 CUs disabled per SE (72 - 2x4 = 64).
[attached: MLiD AT2 chart]

Or 64 CUs could be the full die.

Lots of confusing information all round imo.
 