
[MLiD] AMD Magnus APU Full Leak: RDNA 5, Zen 6, 110 TOPS NPU = XBOX Next-Gen Console!

That Magnus SOC looks massive. That can't be cheap.
And that's why being a modular chiplet design helps save costs for both AMD and MS. They can use the 5 dies for various products, including discrete graphics. Less waste; everything they create becomes dual-purpose.

I think it will turn out like this:

Magnus with AT4 = Xbox Laptops and Xbox Handhelds
Magnus with AT3 = Xbox S console and bulkier Gaming Xbox Laptops
Magnus with AT2 = Xbox X console and Xbox PCs
Magnus with AT1 = Xbox PCs
Magnus with AT0 = Xbox Cloud; they will pair AT0 with 2-4 Magnus SOCs each, and run 2 instances of the X profile for Ultimate users at 4K, and 4 instances of the S profile for Premium and Essential users at 1440p/1080p

Then they can also use AT2, AT1, and AT0 for discrete graphics: basically 5080, 5090, and beyond tiers, or 6070/6080/6090-tier discrete GPUs, to compete with Nvidia's lineup.

AMD and Xbox co-engineered this solution for the long term for both companies: 5 GPU dies designed for 5 form factors, all using the same CPU SOC.


They mention a portfolio of devices multiple times.
 

Sounds super interesting.
 
I don't see how the PS6 can match Magnus: it has fewer CPU cores, a lower CPU frequency, fewer CUs, fewer ROPs, a lower GPU frequency, and less cache and memory bandwidth. It's not a huge difference, but Magnus should have better performance in 100% of games, unlike this gen where it's more of a 50/50.
I expect the Xbox CPU to be higher clocked. But the XSX CPU is also higher clocked vs the PS5. The GPU could be higher clocked on the PS6 thanks to better cooling. And Magnus also has Zen 6c cores to help it cool properly.
 
52 CUs × 4096 ops/cycle (INT8/FP8) × 2 (sparsity) × ~2.8 GHz

Isn't the 5090 only ~3,300 NV TOPS, i.e. ×2 with sparsity and ×2 with FP4?

So if we take FP8, it's more like ~800 TOPS, meaning Magnus would crush the 5090 in AI graphics... of course, AMD's TOPS come from within the CUs, while Nvidia's come from separate tensor cores?
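Working that arithmetic through (a rough sketch; every input is this post's rumored or marketing figure, and treating the 5090's quoted number as FP4-with-sparsity is this post's assumption, not a confirmed spec):

```python
# Back-of-the-envelope TOPS math using the figures stated in this post.
magnus_cus = 52                # rumored CU count
ops_per_cu_per_cycle = 4096    # INT8/FP8 ops per CU per cycle (post's figure)
clock_ghz = 2.8                # assumed GPU clock

magnus_dense = magnus_cus * ops_per_cu_per_cycle * clock_ghz / 1000  # TOPS
magnus_sparse = magnus_dense * 2                                     # 2x sparsity

# RTX 5090 marketing number, treated here (per the post) as FP4 with sparsity.
rtx5090_fp4_sparse = 3300
rtx5090_fp8_sparse = rtx5090_fp4_sparse / 2   # halve: FP4 -> FP8
rtx5090_fp8_dense = rtx5090_fp8_sparse / 2    # halve again: remove sparsity

print(f"Magnus FP8 dense  ~{magnus_dense:.0f} TOPS")        # ~596
print(f"Magnus FP8 sparse ~{magnus_sparse:.0f} TOPS")       # ~1193
print(f"5090 FP8 sparse   ~{rtx5090_fp8_sparse:.0f} TOPS")  # ~1650
print(f"5090 FP8 dense    ~{rtx5090_fp8_dense:.0f} TOPS")   # ~825
```

On those numbers the ~800 TOPS figure is the 5090's dense FP8 value; a like-for-like sparse-vs-sparse FP8 comparison would be roughly 1,200 vs 1,650, so "crush" only holds if Magnus's sparse number is set against the 5090's dense one.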
 
Maybe. Or maybe not.
The biggest difference between PS4 and PS5 is 30 vs 60 fps; PS5 to PS6 will be a different story. I expect the gap to quickly become much bigger, incentivizing people to migrate and, as a result, a faster transition to next-gen only.

there was a big difference in capabilities between the PS4 and PS5. the SSD speeds and the RT acceleration alone created a clear divide.

PS5 and PS6 will have no such dividing factor. just one will be better at RT than the other. ML Reconstruction can be substituted with "dumb" reconstruction.


The PS handheld will most likely have a 1080p screen, so for it a base resolution of 1080p with an internal resolution of around 400p is normal, because it's not a 55" screen but a 10" screen.
400p scaled to 4K would be awful.

400p scaled to 1080p also looks awful. the target resolution is less important than the input resolution. generally, anything below 900p internal res is suboptimal already.
 
The PS6 will be using 2nm chips, which is a huge jump from 7nm... except Sony has chosen to use a down-spec tier.

In the last two gens, Sony chose an x70-class GPU; from rumors, the PS6 could be using an x60 tier.

7850 -> 3070 -> 10060x
 
there was a big difference in capabilities between the PS4 and PS5. the SSD speeds and the RT acceleration alone created a clear divide.

PS5 and PS6 will have no such dividing factor. just one will be better at RT than the other. ML Reconstruction can be substituted with "dumb" reconstruction.
It depends on how far you want to go with the tech. If you decide to build your whole game on path tracing and not bake any lighting or build a software lumen-like fallback into your engine, your game would literally not be playable on PS5 gen.

Same with ML/AI. Given how overprovisioned next gen seems to be on TOPS, it would be a shame if devs don't go beyond upscaling and RR and build agentic AI into their games. If all your NPCs/racers/gameplay systems are driven by neural networks, you will have to design a different game to make cross gen viable.

The impact of that would be far more "generational" than faster load times and some half measure RT imo. So how big a leap PS6 is depends on to what extent devs are creative and to what extent they want to avoid the financial risk by continuing to support cross gen. The best case scenario is actually a bigger leap than PS4 to PS5.
 
It depends on how far you want to go with the tech. If you decide to build your whole game on path tracing and not bake any lighting or build a software lumen-like fallback into your engine, your game would literally not be playable on PS5 gen.
Can't they just do that? I mean, software Lumen already works on current-gen consoles, and it'll save a lot of install space and dev time. That way you can have a next-gen-featured game with a fallback to this gen that isn't time-consuming to build. That way everyone can have the realistic, photorealistic-looking games the 80s kid in us thought we wanted, but we're going to realize pretty quickly how boring it is not to have an art direction in lighting.
 
Can't they just do that? I mean, software Lumen already works on current-gen consoles, and it'll save a lot of install space and dev time. That way you can have a next-gen-featured game with a fallback to this gen that isn't time-consuming to build. That way everyone can have the realistic, photorealistic-looking games the 80s kid in us thought we wanted, but we're going to realize pretty quickly how boring it is not to have an art direction in lighting.
Depends on the engine. UE5, Anvil, CryEngine, and id Tech will all have fallbacks. Most of Sony's own studios don't seem to have one and will have to build it in.

Art direction and path tracing are two different things. You don't replace art direction with path tracing; you do it along with it. Like in real life: if natural lighting doesn't look right, you would still place fake lights as needed. The difference will be how the light behaves afterwards, and art direction will take that into consideration.

Path tracing and photorealism are also not the same thing. Not all CGI is photorealistic, yet it has all been using path tracing forever. Arcane was path traced, right?
 
It depends on how far you want to go with the tech. If you decide to build your whole game on path tracing and not bake any lighting or build a software lumen-like fallback into your engine, your game would literally not be playable on PS5 gen.

that's not really an issue no. the only scenario where this would be an obstacle is if the game is truly, fully pathtraced, like Quake 2 RTX or Portal RTX.

any game that uses partial pathtracing with rasterised graphics that get pathtraced lighting added, can be easily changed to just use cheap raytracing.

any and all pathtraced reflections can even be entirely replaced with SSR (which is disgusting, and IMO SSR as a whole shouldn't even exist, but it's a cheap way to replace PT Reflections).

AO can also be replaced by SSAO. lots of the GI can be replaced by SSGI with a cheap RT pass on top (which is how Lumen works, even in hardware Lumen mode).

so, the fallback is just a shittier version of pathtracing. aka low quality RT + Screen Space tracing. neither of those need any real additional work.


Same with ML/AI. Given how overprovisioned next gen seems to be on TOPS, it would be a shame if devs don't go beyond upscaling and RR and build agentic AI into their games. If all your NPCs/racers/gameplay systems are driven by neural networks, you will have to design a different game to make cross gen viable.

The impact of that would be far more "generational" than faster load times and some half measure RT imo. So how big a leap PS6 is depends on to what extent devs are creative and to what extent they want to avoid the financial risk by continuing to support cross gen. The best case scenario is actually a bigger leap than PS4 to PS5.

that's a lot of wishful thinking. I doubt it's gonna happen.
especially when it comes to AAA titles, the design principle is to be as generic as possible so as not to ruffle any feathers. and with how massive AAA dev teams are, there's no real place for innovation either.
 
The only way to hit back super hard is to take a hit on hardware, make sure it outclasses ps6 in every way system wise, and services wise.

Make sure Gta6 runs the best on your specific platform.

Sell it for 400/500.

You win the generation.
 
The only way to hit back super hard is to take a hit on hardware, make sure it outclasses ps6 in every way system wise, and services wise.

Make sure Gta6 runs the best on your specific platform.

Sell it for 400/500.

You win the generation.
That's not realistic at all; they raised the price of the Series S/X.
 
that's not really an issue no. the only scenario where this would be an obstacle is if the game is truly, fully pathtraced, like Quake 2 RTX or Portal RTX.

any game that uses partial pathtracing with rasterised graphics that get pathtraced lighting added, can be easily changed to just use cheap raytracing.

any and all pathtraced reflections can even be entirely replaced with SSR (which is disgusting, and IMO SSR as a whole shouldn't even exist, but it's a cheap way to replace PT Reflections).

AO can also be replaced by SSAO. lots of the GI can be replaced by SSGI with a cheap RT pass on top (which is how Lumen works, even in hardware Lumen mode).

so, the fallback is just a shittier version of pathtracing. aka low quality RT + Screen Space tracing. neither of those need any real additional work.
So you are telling me there isn't a generational difference between a partially path traced cyberpunk on a 5080 and the current console counterpart? And that's a game that wasn't even built from the ground up with path tracing in mind.

that's a lot of wishful thinking. I doubt it's gonna happen.
My argument is that the capability is there, which means the marketing perception can be built around it. Not whether devs will actually do it. That, like I said, depends on how far they want to go with what's available on paper. We can probably agree that most will maintain the status quo. Gameplay has barely evolved since the PS2 or PS3 anyway.
 
Does this mean the PS6 also has an NPU?
I know Kepler already mentioned it not having an NPU.

So if it's not an NPU, what could it be?
Something bespoke like the I/O Complex?

Sadly, I think Cerny's recent comments confirm that those days are over.



It's a shame we will likely never see the full potential of the PS5 I/O tech, because the economics of building a game engine around it aren't feasible, even for Sony first-party studios who need to consider the PC audience. Until proven otherwise, I will contend that the reason we haven't seen a Demon's Souls PC port is because Bluepoint developed the game at a fundamental level in alignment with Cerny's vision, literally loading in 3GB+ of assets as the player is turning a corner. I'm still somewhat excited about the PS6 and its RT capabilities in particular, but it does kinda suck that bespoke hardware will very soon be a thing of the past.
 
luckily for me, I couldn't care less what a console's lifetime sales are. I didn't think about it then and I don't now. As long as it's an awesome product and worth the money to me, I will buy it and enjoy it.

From these leaks, the Xbox is so much more capable than the next PS that, if you still want to own consoles and have the best third-party experience, it will be on the Xbox by the looks of things. That, with Steam rumoured to run on it. If they can sort those things, it would be a perfect bedroom device for me.

I bought a ROG Ally day one because I played on one and thought it was cool as fuck. It probably sold half as many units as the Steam Deck did, but I really enjoyed it. I'll get the ROG Xbox Ally X next year if the AI gets switched on and it can run FSR 4.

I like to buy stuff that I think is cool.

Pretty much where I stand.

The real lede here is the console project slipping into 2027. We already knew this would be an uber-powerful, uber-expensive machine; that part is not news.

2026 was the targeted date a lot of leakers mentioned prior, including our own @HeisenbergFX4 iirc. Launching with GTA6 was a big goal too iirc.

There were no credible leaks that confirmed a 2026 target, and the bit around GTA 6 doesn't make sense, since GTA 6 was targeted at Fall 2025. The delay was only announced in May 2025.
 
Until proven otherwise, I will contend that the reason we haven't seen a Demon's Souls PC port is because Bluepoint developed the game at a fundamental level in alignment with Cerny's vision, literally loading in 3GB+ of assets as the player is turning a corner. I'm still somewhat excited about the PS6 and its RT capabilities in particular, but it does kinda suck that bespoke hardware will very soon be a thing of the past.

Rift Apart is on PC…
 
The only way to hit back super hard is to take a hit on hardware, make sure it outclasses ps6 in every way system wise, and services wise.

Make sure Gta6 runs the best on your specific platform.

Sell it for 400/500.

You win the generation.

Microsoft can't afford to do this. Ironically, the more valuable company is the one saddled with the higher opportunity cost (i.e. AI projects/investments).
 
I think it would be nice if someone compiled a leak vs reality table or graphic and it was required that all threads of leaks of future systems include this graphic for reference.
 
So you are telling me there isn't a generational difference between a partially path traced cyberpunk on a 5080 and the current console counterpart? And that's a game that wasn't even built from the ground up with path tracing in mind.

you cannot make a game with pathtracing in mind that can't also be downgraded easily. pathtracing is just raytracing but more accurate.

pathtracing can always be substituted with low quality raytracing and even screen space effects, unless it's something like Quake 2 RTX.
and there won't be something like Quake 2 RTX, because doing full pathtracing is basically impossible in complex modern games, even on high end PC hardware.

the only difference between a game that has pathtraced lighting, and one that only has RTGI + SSR + SSAO is that the latter will look worse. but the devs won't have to adjust much to get there.


My argument is that the capability is there, which means the marketing perception can be built around it. Not whether devs will actually do it. That, like I said, depends on how far they want to go with what's available on paper. We can probably agree that most will maintain the status quo. Gameplay has barely evolved since the PS2 or PS3 anyway.

hence, there will be an infinite cross-gen period this time. there won't be a technical reason to not release on PS5.
 

Rift Apart was THE game used to tout the PS5's SSD and fast asset loading. Even more than Demon's Souls. The game's main gimmick is built around rapidly loading new areas after traversing portals.

Ergo, if Rift Apart runs fine on PC, Demons Souls should. And the reason we haven't seen it on PC is because Sony's taking their time…

How can you not follow something this simple?

 
Rift Apart was THE game used to tout the PS5's SSD and fast asset loading. Even more than Demon's Souls. The game's main gimmick is built around rapidly loading new areas after traversing portals.

Ergo, if Rift Apart runs fine on PC, Demons Souls should. And the reason we haven't seen it on PC is because Sony's taking their time…

How can you not follow something this simple?

I think the reason we haven't seen it is either because it's built on PS3 engine code from the original Demon's Souls, or because they somehow view it as a system seller (they did release it at launch). But yeah, everyone hyped up Rift Apart like you said.
 
Rift Apart was THE game used to tout the PS5's SSD and fast asset loading. Even more than Demon's Souls. The game's main gimmick is built around rapidly loading new areas after traversing portals.

So what? That doesn't change the fact that Demon Souls world design was much more dependent on PS5 storage system than Ratchet was.

Ergo, if Rift Apart runs fine on PC, Demons Souls should. And the reason we haven't seen it on PC is because Sony's taking their time…

 
So what? That doesn't change the fact that Demon Souls world design was much more dependent on PS5 storage system than Ratchet was.



World design? It's a PS3 game. The textures and assets are nice, but nice enough to require that much bandwidth? Even if it does, Spider-Man 2 can stream in a ton of data, and it's not really a problem as long as you have a decent processor and SSD.
 
So what? That doesn't change the fact that Demon Souls world design was much more dependent on PS5 storage system than Ratchet was.




?

If the game that was touted as 'the' showcase for the SSD can work, a more standard, linear one will obviously work as well.
 
the only difference between a game that has pathtraced lighting, and one that only has RTGI + SSR + SSAO is that the latter will look worse. but the devs won't have to adjust much to get there.
IF the current gen console can handle RTGI in the game or if the engine supports some other form of realtime GI as fallback, then we agree, there isn't much dev work to support cross gen. UE 5 games would likely be indefinitely cross gen, as a result, until resolutions get so low that it becomes pointless. We already seem to be getting there, so I shudder at what resolutions cross gen UE5 games will be at. Lol. Or if they don't need any of these features and simply bake all lighting, then sure.

hence, there will be an infinite cross-gen period this time. there won't be a technical reason to not release on PS5.

I can even see GT8 having trouble supporting current gen if they push Sophy too hard next gen. The lackluster current gen CPU can only do so much. My wishful thinking isn't that far fetched when we have a game right now that is doing so much with AI, even without AI hardware. But yeah, I certainly have more unrealistic expectations on devs, given their output so far.
 
So what? That doesn't change the fact that Demon Souls world design was much more dependent on PS5 storage system than Ratchet was.




Weren't Rift Apart and Spider-Man 2 the "true" showcases of the I/O power of the PS5?


Or are you saying/implying that Demon's Souls Remake is technically more demanding on the I/O systems than Rift Apart and Spider-Man 2, which is the reason it hasn't been ported?
 
I expect the Xbox CPU to be higher clocked. But the XSX CPU is also higher clocked vs the PS5. The GPU could be higher clocked on the PS6 thanks to better cooling. And Magnus also has Zen 6c cores to help it cool properly.
Judging by Xbox's history of engineering wider GPUs with lower clocks to make backwards compatibility with older systems more seamless, they'll likely stick with that approach for this gen as well, with the PS6 likely going for a narrower GPU pipeline with higher clocks enabled by RDNA 5's redesigned front end, larger caches, and more efficient lithography: 54 CUs at variable clocks up to 3GHz and 68 CUs at 2.3 to 2.4 GHz for the PS6 and Magnus respectively. They'll be close in power this way, while using approaches similar to those used in this gen's designs.

Where they'll most likely differ is in CPUs, with Magnus poised to have 3 Zen 6c cores plus 8 Zen 6 cores, and the PS6 rumoured to use a slightly different config, combining Zen 6 LP cores for background tasks and the OS with 7-8 Zen 6 cores for game logic (although those are preliminary and not set in stone for either console). Another thing Sony will definitely build upon is the bespoke AI unit they built for the PS5 Pro to handle PSSR; it doesn't make sense to abandon it, so they'll more than likely add to it. And the RAM advantage will likely go to Magnus as well, with a larger capacity and a wider memory bus. Keep in mind, those are rumoured specs, and for all we know, none of them might actually be the final chip.
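A quick compute comparison of those rumoured configs (a sketch only: it assumes an RDNA-style CU with 64 FP32 lanes doing one FMA per cycle, ignores dual-issue, and uses the CU counts and clocks above, none of which are confirmed):

```python
# Theoretical FP32 throughput for the rumored PS6 and Magnus GPU configs.
def tflops(cus: int, clock_ghz: float, lanes_per_cu: int = 64, flops_per_lane: int = 2) -> float:
    """CUs x 64 FP32 lanes x 2 flops (FMA) x clock in GHz, expressed in TFLOPS."""
    return cus * lanes_per_cu * flops_per_lane * clock_ghz / 1000

print(f"PS6    54 CU @ 3.0 GHz: ~{tflops(54, 3.0):.1f} TFLOPS")  # ~20.7
print(f"Magnus 68 CU @ 2.4 GHz: ~{tflops(68, 2.4):.1f} TFLOPS")  # ~20.9
print(f"Magnus 68 CU @ 2.3 GHz: ~{tflops(68, 2.3):.1f} TFLOPS")  # ~20.0
```

On paper that is near parity, which is the "close in power" point above.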
 
Judging by Xbox's history of engineering wider GPUs with lower clocks to make backwards compatibility with older systems more seamless, they'll likely stick with that approach for this gen as well, with the PS6 likely going for a narrower GPU pipeline with higher clocks enabled by RDNA 5's redesigned front end, larger caches, and more efficient lithography: 54 CUs at variable clocks up to 3GHz and 68 CUs at 2.3 to 2.4 GHz for the PS6 and Magnus respectively. They'll be close in power this way, while using approaches similar to those used in this gen's designs.

Where they'll most likely differ is in CPUs, with Magnus poised to have 3 Zen 6c cores plus 8 Zen 6 cores, and the PS6 rumoured to use a slightly different config, combining Zen 6 LP cores for background tasks and the OS with 7-8 Zen 6 cores for game logic (although those are preliminary and not set in stone for either console). Another thing Sony will definitely build upon is the bespoke AI unit they built for the PS5 Pro to handle PSSR; it doesn't make sense to abandon it, so they'll more than likely add to it. And the RAM advantage will likely go to Magnus as well, with a larger capacity and a wider memory bus. Keep in mind, those are rumoured specs, and for all we know, none of them might actually be the final chip.
Kepler has stated both specs are accurate and finalized. Also, both he and MLiD agree Magnus will not run at less than 3 GHz on the GPU. MS has learned their lesson with the Series consoles; they're going with much more powerful devices. They can afford to do it because the devices will be more expensive and much chonkier. I expect two consoles, named Fat Man and Little Boy, on the Xbox side.
 
That Magnus SOC looks massive. That can't be cheap.
It's only massive because it's chiplets, so ~50mm² is wasted. If it were a monolithic APU, it'd be around ~360-370mm², like the XB1, XB1X, and Series X.

And that's why being a modular chiplet design helps save costs for both AMD and MS. They can use the 5 dies for various products, including discrete graphics. Less waste; everything they create becomes dual-purpose.

I think it will turn out like this:

Magnus with AT4 = Xbox Laptops and Xbox Handhelds
Magnus with AT3 = Xbox S console and bulkier Gaming Xbox Laptops
Magnus with AT2 = Xbox X console and Xbox PCs
Magnus with AT1 = Xbox PCs
Magnus with AT0 = Xbox Cloud; they will pair AT0 with 2-4 Magnus SOCs each, and run 2 instances of the X profile for Ultimate users at 4K, and 4 instances of the S profile for Premium and Essential users at 1440p/1080p

Then they can also use AT2, AT1, and AT0 for discrete graphics: basically 5080, 5090, and beyond tiers, or 6070/6080/6090-tier discrete GPUs, to compete with Nvidia's lineup.

AMD and Xbox co-engineered this solution for the long term for both companies: 5 GPU dies designed for 5 form factors, all using the same CPU SOC.


They mention a portfolio of devices multiple times.

The Magnus SOC with its NPU and CPU is already way too power hungry for a handheld: 30+ watt TDP.
The handheld, if it comes out, will use monolithic off-the-shelf Zen 6 + RDNA 5 APUs such as the Medusa Halo Mini.
 
The Magnus SOC is no more impressive than the XB1X / Series X APU in terms of "effective" silicon area and clocks. It'll feel more impressive because AMD cooked on RDNA5 and it's the closest they have been to Nvidia since 2013.

As for the PS6, PlayStation wants break-even or a small profit at ~$599, so no more massive console, high clocks, liquid metal, and 200W+ TDP. Back to the PS4 design philosophy.
 
The PS6 will be using 2nm chips, which is a huge jump from 7nm... except Sony has chosen to use a down-spec tier.

In the last two gens, Sony chose an x70-class GPU; from rumors, the PS6 could be using an x60 tier.

7850 -> 3070 -> 10060x
The technology they've discussed is going to put it way above the RX 9070XT, so if that is 10060x level, it will pretty much be diminishing returns versus those cards above IMO.
 
Sadly, I think Cerny's recent comments confirm that those days are over.

It's a shame we will likely never see the full potential of the PS5 I/O tech, because the economics of building a game engine around it aren't feasible, even for Sony first-party studios who need to consider the PC audience. Until proven otherwise, I will contend that the reason we haven't seen a Demon's Souls PC port is because Bluepoint developed the game at a fundamental level in alignment with Cerny's vision, literally loading in 3GB+ of assets as the player is turning a corner. I'm still somewhat excited about the PS6 and its RT capabilities in particular, but it does kinda suck that bespoke hardware will very soon be a thing of the past.
As if Sony cares about PC gamers. Recent ports are quite bad, and selling games to PC players is not even a priority now; even mediocre sales justify the porting costs. "Buy a better PC (with an SSD, a lot of RAM and a good CPU)" will be the universal answer to any woes.
In my take, it just takes a lot of time to adjust engines and pipelines to the new paradigm, and as games take as long as they do to develop, it's a very slow process.
 
The technology they've discussed is going to put it way above the RX 9070XT, so if that is 10060x level, it will pretty much be diminishing returns versus those cards above IMO.

your comment made me realise that AMD absolutely drove themselves into a corner with that 90XX naming scheme lol...

they can't name a next generation card RX 10070 XT, that would look awful on the packaging and is just too many zeros. which means they will probably have to start over with a completely fresh naming scheme... maybe go back to 10X or something... but that then has the issue that the momentum of the 90XX series, which has decent momentum for AMD standards, can't be leveraged as well as a product line that is named as a clearly identifiable successor... like how the RTX4090 to RTX5090 jump shows a clear lineage.

super off topic... but... It really made me wonder just now...
 
The PS6 will be using 2nm chips, which is a huge jump from 7nm... except Sony has chosen to use a down-spec tier.

In the last two gens, Sony chose an x70-class GPU; from rumors, the PS6 could be using an x60 tier.

7850 -> 3070 -> 10060x

I really doubt the PS6 SoC will be made on TSMC's N2 node.
This node is already being tapped for AI chips, which have much better profit margins.
I doubt there will be any wafers left for making a console.
 
your comment made me realise that AMD absolutely drove themselves into a corner with that 90XX naming scheme lol...

they can't name a next generation card RX 10070 XT, that would look awful on the packaging and is just too many zeros. which means they will probably have to start over with a completely fresh naming scheme... maybe go back to 10X or something... but that then has the issue that the momentum of the 90XX series, which has decent momentum for AMD standards, can't be leveraged as well as a product line that is named as a clearly identifiable successor... like how the RTX4090 to RTX5090 jump shows a clear lineage.

super off topic... but... It really made me wonder just now...
IMHO it is no different from when the ATI 9700/9800/9800 Pro/FireGL X2-256 made a huge splash (with Doom 3) before they were all made obsolete by the 360's Xenos, and even more so by the HDR 10-bit RSX and ATI PC cards; it feels cyclical to me.
 
The Magnus SOC is no more impressive than the XB1X / Series X APU in terms of "effective" silicon area and clocks. It'll feel more impressive because AMD cooked on RDNA5 and it's the closest they have been to Nvidia since 2013.

As for the PS6, PlayStation wants break-even or a small profit at ~$599, so no more massive console, high clocks, liquid metal, and 200W+ TDP. Back to the PS4 design philosophy.
Sony going with the PS4 design philosophy but charging $599 and potentially only breaking even doesn't really make sense. $599 feels expensive for a console, to be honest; at $450-$500, yes, then I would agree.
 
Sony going with the PS4 design philosophy but charging $599 and potentially only breaking even doesn't really make sense. $599 feels expensive for a console, to be honest; at $450-$500, yes, then I would agree.

It's inflation + TSMC monopoly + AI boom. $600 in 2027 will feel like less money than $500 did in 2020.

Since 2013, inflation has caused a cumulative price increase of approximately 39% in the U.S., meaning that $1,000 in 2013 has the same purchasing power as about $1,390.71 today. The average annual inflation rate between 2013 and 2025 was about 2.79%, but this period saw significant variation, including high inflation rates of around 7% in 2021 and 6.5% in 2022.
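Checking that compounding (a minimal sketch using the quoted figures):

```python
# ~2.79% average annual inflation compounded over 2013-2025 (12 years).
years = 2025 - 2013
rate = 0.0279
value = 1000 * (1 + rate) ** years
print(f"${value:,.2f}")  # ~$1,391, i.e. the ~39% cumulative increase quoted above
```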
 
IMHO it is no different from when the ATI 9700/9800/9800 Pro/FireGL X2-256 made a huge splash (with Doom 3) before they were all made obsolete by the 360's Xenos, and even more so by the HDR 10-bit RSX and ATI PC cards; it feels cyclical to me.

the issue I see is that the 9070 XT was seemingly finally an AMD card with lots of positive PR and momentum, only for them to then have to switch the entire naming scheme immediately after that.

it's like: "look! the 9070 XT! we finally have decent RT! We finally have decent ML Reconstruction!" and then they have to start over with a new naming concept... imagine if Nintendo was essentially forced to name the Switch 2 something other than Switch 2! that's how this feels to me lol. I don't envy the person who has to come up with the marketing and naming strategy for RDNA5.

maybe they should just stop with the numbers, and call it the AMD RDNA5 6, RDNA5 6XT, RDNA5 7, RDNA5 7XT and so on. because I feel like RDNA is an established name at this point, and that way they have enough for at least 5 more generations lol.
 
The only current GPUs that would be equal to or more powerful than the PS6 would be the 4090, 5080, and 5090. All current AMD GPUs would be obsolete. VRAM is another story; the only GPUs that would be fine are the 4090 or 5090. That's assuming the current leaks regarding 2.5x PS5 raster performance, better RT than Blackwell, and 30GB of VRAM are true. If the RT isn't that significant over Blackwell, then the 4080s would be competitive as well, barring the limited VRAM. Magnus is another story; the only current GPU better is the 5090, but who is really going to bet on Xbox?

Future cards are unknown, obviously the higher RDNA5 cards would be better and likely the 6070+ tier cards as well, depending on what Nvidia does with VRAM.
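For a rough sense of scale, here is what "2.5x PS5 raster" works out to in raw FP32 terms (a sketch only; raster performance doesn't scale linearly with TFLOPS, and the 2.5x multiplier is the leak's figure, not a confirmed spec):

```python
# PS5's quoted FP32 figure scaled by the leaked 2.5x raster multiplier.
ps5_tflops = 10.28               # 36 CUs @ 2.23 GHz
ps6_estimate = ps5_tflops * 2.5
print(f"~{ps6_estimate:.1f} TFLOPS")  # ~25.7 TFLOPS
```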
 
Sony going with the PS4 design philosophy but charging $599 and potentially only breaking even doesn't really make sense. $599 feels expensive for a console, to be honest; at $450-$500, yes, then I would agree.
I would agree. Despite having newer hardware, the PS6 has pretty much the same CU count as the PS5 Pro and a narrower memory bus, so I can honestly see the PS6 hitting its ceiling quicker than Magnus when it comes to games, and therefore Magnus distancing itself further from the PS6, especially when the new tech comes to Xbox.
 
It's inflation + TSMC monopoly + AI boom.
Yeah, I understand inflation, but at $599 the console could be a difficult sell for casuals. It'll be day 1 for me regardless. Does $599 include a disk drive? I think it has to at that price. Also, how much do you think the new Xbox would cost?
 
Kepler has stated both specs are accurate and finalized. Also, both he and MLiD agree Magnus will not run at less than 3 GHz on the GPU. MS has learned their lesson with the Series consoles; they're going with much more powerful devices. They can afford to do it because the devices will be more expensive and much chonkier. I expect two consoles, named Fat Man and Little Boy, on the Xbox side.
If they're aiming for a 300W power ceiling, then they're likely to avoid clocking at 3GHz with that wide a GPU. And there are no posts or reports online showing Kepler and MLiD agreeing that they'd clock the GPU no lower than 3GHz, at least none that I could find, and I sifted through their posts on X and MLiD's video on YouTube, which Kepler reposted, and that probably doesn't confirm anything. If it is true, then the delta between the two next-gen systems is ≈23% on the GPU side, lower than the delta between the SX and PS5. Still, I doubt they'll clock that high, in order to make backwards compatibility with older systems more seamless.
 
I would agree. Despite having newer hardware, the PS6 has pretty much the same CU count as the PS5 Pro and a narrower memory bus, so I can honestly see the PS6 hitting its ceiling quicker than Magnus when it comes to games, and therefore Magnus distancing itself further from the PS6, especially when the new tech comes to Xbox.
I think neural radiance caches and neural arrays will be there for RDNA5 and the Xbox hardware, and I suspect all the functionality of Radiance Cores will be available via software, using all the other subsystems together via async, but I have my doubts that hardware-accelerated Radiance Cores will be in the first set of RDNA5 cards and the Xbox, based on that Amethyst video.

I think that will be a PS6 feature, with 4 WGPs taken from a 28-WGP GPU to use exclusively for the neural nets for the Neural Radiance Caches in the hardware Radiance Cores, which I think are going to be satellite processors rather than async computation, like the original NRC Sony patent has listed in the title IIRC (from the Ars Technica article about the 2021 US patent).

This difference would lend itself to the view that the PS6 would outperform or equal a 32-WGP Xbox in path tracing, and maybe exceed it when the PS6 was using low-latency PSSR2 at a lower native resolution and the Xbox was using off-the-peg FSR.
 
If they're aiming for a 300W power ceiling, then they're likely to avoid clocking at 3GHz with that wide a GPU. And there are no posts or reports online showing Kepler and MLiD agreeing that they'd clock the GPU no lower than 3GHz, at least none that I could find, and I sifted through their posts on X and MLiD's video on YouTube, which Kepler reposted, and that probably doesn't confirm anything. If it is true, then the delta between the two next-gen systems is ≈23% on the GPU side, lower than the delta between the SX and PS5. Still, I doubt they'll clock that high, in order to make backwards compatibility with older systems more seamless.
I could see the 3GHz clock being reserved for the Radiance Cores and maybe any 1 of 6 neural arrays at any one time, moving heat around like a hot potato to keep two high-performance features for path tracing and PSSR, with all the older PS5 raster/geometry techniques using lower PS5 clocks.
 
Yeah, I understand inflation, but at $599 the console could be a difficult sell for casuals. It'll be day 1 for me regardless. Does $599 include a disk drive? I think it has to at that price. Also, how much do you think the new Xbox would cost?

No disk drive included. $799 minimum for the next Xbox.
 
No disk drive included. $799 minimum for the next Xbox.
Looking at the leaks, there seems to be a bigger gap than $200. The Xbox has a bigger chip plus an NPU, full Zen 6 cores, and more memory. Taking all that into account, I think $1,000 at least; don't forget they're charging $800 for the Series X. The only way it's $800 is if Microsoft eats some of the cost, but going by recent actions, they won't.
 