
[MLiD] XBOX Magnus RDNA 5 Finalized

I think it helped devs get better at optimisation.

So it actually helped them.

No amount of optimisation can help with missing features though.

 
$399 consoles.
One launched in 2013. The other in 2020.

Context matters.

By that logic, I could look at an RTX 5090 in 2050 and say:
"Wow, the core count is miserable, the clocks are abysmal, the RAM is laughable, and it can't even handle 8K/144 path tracing. What a bottlenecked disaster."

Hardware should be judged against its era, its price point, and its design goals — not against future standards or fantasy expectations.

Don't mean to speak on his behalf but I'm pretty sure Zathalus *is* talking about the devices relative to their times.

The PS4/XBO famously shipped with weak CPUs that were underpowered and outdated even before they released.
 
He might still be butthurt after all the truth bombs FOSSpatents famously dropped on the likes of him.
That's just silly man, I don't think anyone was/is that invested.

Naw man, I personally think the XSS was a big mistake. I understand why they did it, trying to catch sales from all the people who don't want to spend as much on a console, but they should have stuck to their guns, put out only the XSX, and increased the value there somehow.

Sadly I think both MS and Sony are going to put out multiple SKUs next gen aiming for the same thing, even if they're dressed up as mobile variants for convenience. It's gonna hold the full console experience back.

I'd much rather they put out a separate platform entirely, like the PSP/Vita, or continue to focus on improving a streaming model like the Portal, or just streaming to phones/tablets, etc.
 
No getting away from that.

Even underpowered consoles like Switch 2 / Series S go underutilised in a lot of games.

Devs are more held back by budget than hardware these days.
 
So surely Xbox first-party games should have unlocked "the magic", right? Or were they already optimizing for PS5 way back at the start of the gen?
 
As Kepler posted earlier, it's too late to make big changes to a hardware project, so it doesn't matter if the PS6 releases in 2027 or 2029; we will get the same hardware.
Nothing to stop Sony from saying, given the state of the market (Xbox shat the bed, Switch 2 is a Switch Pro) and the AI RAM bubble: scrap Orion as we know it and regroup in 2028 to get a more performant design that may push PS5 owners to upgrade.
 

SoloKingRobert mistakenly exposing himself as Florian was one of the all-time highlight events during my time on GAF. Funny thing is that he was completely wrong about everything. Riky was the one who correctly said the powers that be in the UK government would pressure the CMA to back off and let the acquisition go through. Florian was the guy who said Microsoft could just "waive" having to get CMA approval. Just proves that corporations will pay shills as long as they can pretend to know what they are talking about.

Fond memories, indeed
 
Not even Xbox studios themselves, who initially had an exclusives policy? Which means their games could have used it. Tell me why Flight Sim didn't use it, or why FH didn't use it then? Those were in-house engines.

The games came to PS5 with little to no impact on performance or visuals later, when Xbox switched strategy at the end.

Like VRS, mesh shaders just weren't all that important/efficient to bother with, and you could even implement the equivalent with compute shaders or software VRS, which not only gave you support on older graphics cards and other consoles but more freedom for optimisation too (look at what CoD did, for example).
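
To put rough numbers on why the VRS gains were so small, here's a back-of-the-envelope sketch; every figure in it is a made-up illustration, not a measurement from any game:

```python
# Upper bound on VRS savings: pixels shaded at a 2x2 coarse rate cost ~1/4
# the pixel-shader work. The 30% coverage figure is made up for illustration.
coarse_coverage = 0.30   # fraction of the screen shaded at the coarse rate

relative_cost = (1 - coarse_coverage) + coarse_coverage / 4   # 0.775

# Pixel shading is only part of frame time. If it's, say, 40% of the frame
# (hypothetical), the whole-frame saving shrinks to ~9%, and real coverage
# is usually lower still, which is how you end up with low-single-digit fps.
frame_fraction = 0.40
frame_saving = frame_fraction * (1 - relative_cost)
print(f"shading cost: {relative_cost:.1%}, whole-frame saving: ~{frame_saving:.1%}")
```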

Suggesting it was being held back by PS5 when PS5 was totally out of the equation with Xbox exclusives tells you everything you need to know. Either Xbox themselves chose not to use it, or they did and the alternative ports worked just as well anyway.
Damn man... did mesh shaders rape your dog or something? Hate is strong!
 
Even underpowered consoles like Switch 2 / Series S go underutilised in a lot of games.
Developers (like The Coalition) fighting for their lives for months/a year to make games (and The Matrix demo) work on Series S.

Meanwhile Vaibhavpisal:


What's funny to me is a certain fanbase keeps bragging about Cerny magic.

In reality it's an "as basic as it gets" console.

It's PC and Xbox that have the extra magic sauce that goes underutilised.
Nobody is really bragging about anything. What's actually funny is that you continuously defend the most "basic of consoles" like the Series S while talking about "Xbox extra magic sauce" that amounts to nothing. Roll back to when these released and you were even going against PC with it:

Everyone's favourite little console is under the spotlight again, I see.

Anyways, one of my relatives has built a PC that costs 3 times a Series S (2060 Super).

It's targeting 1080p and I can't see a difference between the two. A win for Series S if you ask me.

But I have seen almost all videos from Elanaldebits involving Series S/X comparisons. I was super sceptical before getting the Series S as I thought I would miss out.

On a 1080p screen, the Series S maxes it out. I wouldn't gain much by using more powerful hardware. (Except supersampling, which I am yet to see in action.)

Do you still believe this (when the XSS is missing RT lighting or runs at half the framerate?), or will you be claiming to see "all the difference" now when the expensive Xbox releases and you no doubt get it?

It was the Xbox fanbase most of the time talking about "secret sauce" when in reality it amounted to nothing. People like you and Riky. This is the sort of shit you and Riky were peddling (of course you liked it) when people were telling you the Series S would struggle even more as the gen progressed.

It's the people who were jumping up and down repeating their marketing terms like sustained clocks, SFS, VRS, Mesh Shaders, Velocity Architecture, 'Smart Delivery', and all that other nonsense, and they were ultimately humbled in the end when actual games/results came. The Series S didn't even live up to MS's own claims, yet here you are still defending it.
 
Damn man... did mesh shaders rape your dog or something? Hate is strong!
What kind of reply is this? Stating that Mesh shaders had little impact even in exclusive games where other consoles weren't holding back in-house engines at Playground and Asobo is not "hate". It's reality. PS5 simply wasn't holding back things like mesh shaders.

If it had raped my dog it would have had more impact on my gaming than it currently does.
 
What's funny to me is a certain fanbase keeps bragging about Cerny magic.

It's PC and Xbox that have the extra magic sauce that goes underutilised.
It ain't magic if it's not being used in most games. Also, as another member mentioned, even Xbox exclusives didn't use it.
 

You went so far back yet somehow missed my recent posts where I mention it's struggling in recent games.

It does have the feature set. It's underutilised, but it has it nonetheless. Mentioning it is completely fair.

If PS5 had it, you bet it would have been put to use. So not having it is a bummer.

Mesh shaders are a new technique that devs would have gotten better at using over time. That's what learning hardware is, in the traditional sense.
It ain't magic if it's not being used in most games. Also, as another member mentioned, even Xbox exclusives didn't use it.
The only Xbox studio that is tech competent is id Software. And The Coalition to some extent.

You really cannot use Grounded as an example to prove anything.
 
Yeah. They didn't teleport you to your office either. But those are not bottlenecks but limitations of the available technology of the era.
They are absolutely bottlenecks. Yes, it is limited by AMD as I pointed out, but they are still absolutely bottlenecks for both systems.

Context matters.

Hardware should be judged against its era, its price point, and its design goals — not against future standards or fantasy expectations.
Context does matter. And I'm judging each console by that context. Considering the price point and the specifications at the time, the PS4's CPU performance was dog shit: it was not much of an upgrade over the previous generation and was outperformed by slow dual-core CPUs on PC. It was one of the biggest bottlenecks of the PS4 generation. One of the reasons quality/performance modes were not common was the CPU simply not being capable of 60fps. Performance modes that did happen struggled to reach a stable 60.
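
The frame-budget arithmetic behind that is simple; the simulation cost below is a hypothetical figure for illustration, not a measured one:

```python
# A 60fps target leaves ~16.67ms per frame. If CPU-side simulation alone
# exceeds that, no amount of GPU power or resolution drop gets you to 60.
target_fps = 60
frame_budget_ms = 1000 / target_fps      # ~16.67ms

cpu_sim_ms = 22.0                        # hypothetical Jaguar-era sim cost
cpu_bound_fps = 1000 / cpu_sim_ms        # ~45fps ceiling, GPU irrelevant

print(f"budget at {target_fps}fps: {frame_budget_ms:.2f}ms; "
      f"CPU-bound ceiling: {cpu_bound_fps:.0f}fps")
```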

The PS5's RT capabilities speak for themselves, outperformed by almost-two-year-old entry-level GPUs. The RT capabilities in RDNA2 are a joke, especially without Infinity Cache, and barely better than GPUs that had zero hardware RT acceleration. And RT was obviously a design goal of the PS5. It was advertised over and over.
 
Thing is, 99.99% of games don't use Mesh Shaders, precisely because PS5 is the lead dev environment, and since it lacked them, most devs didn't bother learning.

So PS5 not having full RDNA2 held things back from advancing. We couldn't really see the differences Mesh Shaders would bring to AAA games.

For example, that Space Marine 2 comparison that the Apple 🍎 posted, if true.

It doesn't seem like anything major is missing from Orion; the only thing I can see is the NPU, but even MLID was unsure if that's missing.
Actually Space Marine 2 does not use mesh shaders.


In this case you can see that a 5700XT performs like a 2070 Super. If mesh shaders were in use, the 2070S would far outperform it, just like in Alan Wake 2.

Mesh Shaders are not even used by Xbox's own studios. Sadly it has been a complete nothingburger.

There is no hardware acceleration for Nanite, because current hardware rasterizers on Nvidia, AMD, and Intel work on a 2x2 pixel grid.
Nanite rasterizes in software to avoid this bottleneck when using very dense geometry.
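
A back-of-the-envelope for that 2x2 quad penalty (this assumes best-case packing of pixels into quads; the triangle sizes are just illustrative):

```python
import math

# Hardware rasterizers shade whole 2x2 quads: any quad with coverage runs
# all 4 lanes, so pixel-sized triangles waste most of the shading work.
# Big triangles amortize the waste away, which is why software raster
# only pays off for very dense geometry.
def lanes_shaded(pixels_covered: int) -> int:
    """Lanes launched, assuming best-case packing of pixels into quads."""
    return math.ceil(pixels_covered / 4) * 4

for tri in (1, 2, 3, 4, 64):
    shaded = lanes_shaded(tri)
    print(f"{tri:>2}px triangle: {shaded:>2} lanes shaded, "
          f"{1 - tri / shaded:.0%} helper-lane waste")
```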

UE5 does support Primitive Shaders. And it's pretty much as fast as Mesh Shaders. They are very similar at a high level. The main difference is that Primitive Shaders do not support the Amplification Stage.

RDNA2 also does not support Mesh Shaders natively. It still uses Primitive Shaders, though a more advanced version than RDNA1's. The driver maps and compiles Mesh Shaders onto the Primitive Shader pipeline.
AMD never did the same translation of Mesh Shaders for RDNA1.

Alan Wake 2 originally only supported Mesh Shaders. And a more recent patch added support for the traditional geometry pipeline.
Nanite does support mesh shading, for geometry that is bigger than a pixel. However it is completely useless; I have measured no performance differences, and there are also no performance differences in games. RDNA2 has full mesh shading support; IDK why you say it still uses primitive shading. Primitive shading is not exposed on PC at all, only on PS5 through their own APIs. You can also compare a 5700XT to a 2070 Super and see there's no performance difference in UE5 games.
 
Sony only went truly custom GPU-wise on the Pro, so it checks out.

PlayStation 5 went custom on other parts, like the SSD etc.

PlayStation 6's GPU will probably be custom like the 5 Pro's, if I had to guess.
 
Nanite does support mesh shading, for geometry that is bigger than a pixel. However it is completely useless; I have measured no performance differences, and there are also no performance differences in games. RDNA2 has full mesh shading support; IDK why you say it still uses primitive shading. Primitive shading is not exposed on PC at all, only on PS5 through their own APIs. You can also compare a 5700XT to a 2070 Super and see there's no performance difference in UE5 games.

I didn't say Nanite doesn't support Mesh Shaders. I said they are different techs, working at different stages of the rendering pipeline.
And it was confirmed by AMD themselves that RDNA2 supports Primitive Shaders and maps and emulates Mesh Shader code onto the Primitive Shader pipeline.
So no, RDNA2 does not have full support for Mesh Shaders.
 



Chapters:
0:00 XBOX Magnus Silicon Finalized
4:14 AMD RDNA 5 dGPUs are Ready for 2027!
8:05 Intel Nova Lake TDP & Die Size Rumors
14:20 Titan Lake Launches w/ Nvidia Graphics in 2029!
19:46 I still expect Panther Lake's iGPU to Age Well
21:35 Nova Lake AX Leak – RTX 5080 Performance?


Make Xbox Sell Again. I know how salesmen behave, like they force you to buy something without knowing what it really is, and these guys are really good at their job. From my point of view it's an orchestrated crime scene.
 
You don't think that there will be games that use the PS6's memory to the fullest extent?
No. Developers can effortlessly bloat VRAM usage to 30GB for very little return (uncompressing assets, etc). But they likely won't meaningfully use it, because the PS6 is simply not likely to be a primary target.
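
For a sense of how cheap that bloat is to produce, compare block-compressed vs raw texture footprints. BC7's 1 byte/pixel and RGBA8's 4 bytes/pixel are standard format sizes; the texture count is made up:

```python
# "Uncompressing assets" in numbers: BC7 stores 1 byte/pixel, raw RGBA8
# stores 4 bytes/pixel, and a full mip chain adds ~33% on top.
MIP_OVERHEAD = 4 / 3

def texture_gib(side_px: int, bytes_per_pixel: float) -> float:
    return side_px * side_px * bytes_per_pixel * MIP_OVERHEAD / 2**30

n_textures = 200  # hypothetical number of resident 4K textures
bc7 = n_textures * texture_gib(4096, 1)   # ~4.2 GiB compressed
raw = n_textures * texture_gib(4096, 4)   # ~16.7 GiB uncompressed

print(f"BC7: {bc7:.1f} GiB vs raw: {raw:.1f} GiB for identical on-screen content")
```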

Sony first party developers will support PS5 for a very long time. And that has 9GB VRAM. They can add nothing that they can't turn off for PS5.

For 3rd party developers, PC/Mobile/PS5/NS2 and PS6 Handheld are much bigger targets.
They will scale up to the PS6 rather than down from it.

PC and mobile are not so suicidal as to follow Cerny's madness. Even Radeon RDNA5 dGPUs are 18/15/12 GB, up from 16/16/12 GB on RDNA4.

Interesting how, despite your VRAM mongering about needing hundreds of dollars of commodities to play next-generation games KeplerL2 , you never commented on MLID's RDNA5 SKU list showing that RDNA5 dGPUs were not even planning to increase VRAM on 2027/2028 dGPUs, even before the shortage. Goes against the agenda?

30GB is a foolish mistake. A grave miscalculation by Cerny and his worst fumble in history.

He has created a toy that gamers can't afford to buy and he can't afford to sell. This is an example BoM at current SSD/RAM prices, even with scorched-earth BoM reductions on other line items. At current prices he can barely afford to sell the PS6 at $899 with no controller.

(I originally made it / sent it to a friend. So don't take the bozo/womb womb notes personally)
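
The table image doesn't reproduce here, so as a stand-in, this is the shape of that BoM sanity check; every number below is a placeholder for illustration, not a figure from the original table or any real source:

```python
# Shape of a console BoM sanity check. ALL figures are placeholders for
# illustration only; none come from the original table or any real source.
bom_usd = {
    "SoC (placeholder)": 220.0,
    "30GB GDDR7 (placeholder $9/GB)": 30 * 9.0,
    "2TB SSD (placeholder)": 110.0,
    "PSU / cooling (placeholder)": 45.0,
    "board, assembly, enclosure (placeholder)": 70.0,
    "packaging / shipping (placeholder)": 30.0,
}
bom = sum(bom_usd.values())        # $745 with these placeholders
retail = 899.0
net_to_platform = retail * 0.85    # placeholder retailer/channel cut

print(f"BoM ${bom:.0f} vs net ${net_to_platform:.0f} -> headroom "
      f"${net_to_platform - bom:.0f} (before controller, R&D, support)")
```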
 
At current prices he can barely afford to sell the PS6 at $899 with no controller.

With controller, $1000. God bless you. 😊
 
It's been 5 years and the PS5 and Series X are still trading blows. That's all that matters, especially for consumers: which one runs games better. We can talk about real advantages, like maybe Series X being able to use FSR4, which would be a game changer. Other than that, the Series X is very disappointing for being touted as 30% more powerful than the PS5.
 
PlayStation 6's GPU will probably be custom like the 5 Pro's, if I had to guess.
"Big chunks of RDNA 5, or whatever AMD ends up calling it, are coming out of engineering I am doing on the project… And again, this is coming out of trying to move things forward. There are no restrictions on the way any of it can be used."
- Mark Cerny

That reads like Cerny's work designing the PS6 is directly helping AMD shape RDNA5's architecture, and AMD is free to use those improvements everywhere.

It could be a situation where Sony stops developing the PS6 GPU, but AMD continues to add more stuff, like with the PS5 and RDNA2.
 
You went so far back yet somehow missed my recent posts where I mention it's struggling in recent games.

It does have the feature set. It's underutilised, but it has it nonetheless. Mentioning it is completely fair.
I went back to show you claiming it wouldn't struggle in future games because you put too much faith in "secret sauce".
The point is you put too much weight on this "feature set" simply because it is (or at least was) an Xbox-specific "secret sauce", when in practical terms it is basic iteration and amounts to very little. For example, is implementing something like hardware VRS that important when it offers poorer image quality with blockiness for like a 2% fps increase? Meanwhile PS5/XSX were offering a literal 100% fps boost in games, and you felt you weren't missing out on much by getting a Series S back then? You think PS5 was what was holding back the gen rather than the XSS? Why did you favour the S and its "feature set" rather than the overall better system?

This is laser-focused concentration on inconsequential things just for that "favourable" comparison to PS5, when it's not really favourable in practical terms at all. Like Riky and his XSS "in the dust" claims.

Akin to ignoring the overall worse taste of the pudding and claiming that if the chef had used the Himalayan pink salt that was available in his spice rack instead of regular table salt, the pudding would have been very different. But it wouldn't have been that different at all. You just favour the chef.
 

I gotta question you on some things like cheaper assembly and wifi chips over time. 🤔
 
Why do you assume Sony would see the same % increases in RAM as us?
 

Biggest bullshit I have ever read here. And that table. Lol.

You just sound like: "Sony, please don't future-proof the PS6; bottleneck it like the Switch 2."
 
So the next-gen games after Series are basically PC games, right? The PC games will utilize Magnus's extra CPU/GPU/NPU, RAM, etc., but those PC games can obviously run on Nvidia and Intel. Magnus is just there to salvage Xbox as a transition step to full-blown PC gaming. PC gaming will pretty much have a console-like experience with a console-like UI, and can transition back to full-blown Windows 11 desktop productivity.

I hope I have this correct.
 
Why do you assume Sony would see the same % increases in RAM as us?
This is assuming that they get 15% off the spot price, and assuming they can lower the cost of everything else substantially. It's basically me showing how difficult 30GB of GDDR7 is to sell at these prices.

Sony is buying high-end memory here (24Gbit 32Gbps GDDR7). It's not the cheap 8Gbit 14Gbps GDDR6 they use in the PS5.
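
For what that config implies: the chip density and pin speed are from the post above; the one-32-bit-channel-per-chip layout is the standard GDDR assumption, not a confirmed PS6 spec:

```python
# 30GB out of 24Gbit (3GB) GDDR7 parts at 32Gbps/pin. Each GDDR7 chip
# presents a 32-bit interface, so 10 chips implies a 320-bit bus.
chip_gb = 24 / 8          # 3GB per 24Gbit chip
pin_gbps = 32             # per-pin data rate
bits_per_chip = 32        # interface width per device

chips = int(30 / chip_gb)                   # 10 chips
bus_bits = chips * bits_per_chip            # 320-bit bus
bandwidth_gbs = pin_gbps * bus_bits / 8     # 1280 GB/s peak

print(f"{chips} chips, {bus_bits}-bit bus, ~{bandwidth_gbs:.0f} GB/s peak")
```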
 
Is the FUD machine gearing up early for this one?

The PS5 absolutely benefited from the RDNA2 pipeline improvements, even though its GPU is a custom design that retains RDNA1 elements.

- Has higher IPC per TFLOP than RDNA1
- Hardware ray accelerators
- Improved geometry pipeline
- Better cache hierarchy
- Improved shader scheduling
- Better wave32 utilization
- Higher clock scaling efficiency
 
Hard to tell due to Nvidia's insane markup.

But looking at the 9070XT and 5080 launch and current prices, it seems like only DDR memory has really seen the big increases.

I'm not even talking about consumer stuff. Nvidia buys incredible amounts of GDDR7 and HBM memory for data centers.
 
No, we don't have 100% full confirmation either way yet; however, IMO it will most likely be the status quo of console SKUs for the Xbox ecosystem. Series consoles aren't going anywhere. So those same GDKX-created SKUs would be optimized for Magnus. For the Epic and Steam ecosystems, devs could optionally optimize or create presets for the device.
 
Biggest Bullshit I have ever read here. And that table. Lol
None of it is bullshit. And it's not future-proof as is.

Think about it: RTX 5070 + ~20GB is basically the PS6 GPU. It's a different space-time tradeoff target.

It doesn't have that much compute.

We have yet to see Nvidia's next gen. That will be Nvidia's vision for the future of gaming. And consoles lost to Nvidia last gen.
 
Do you think they will get cheaper memory than Nvidia?
Absolutely not.

Nvidia is the Tier 0 customer. They are bigger than Apple at this point. And their gaming GDDR orders are bundled with Rubin CPX orders for massive volume.

And Nvidia has much higher margins and free cash, so they can outbid. The issue with Nvidia is that their AI side is a memory black hole.
 
I'm not even talking about consumer stuff. Nvidia buys incredible amounts of GDDR7 and HBM memory for data centers.
I'm using the consumer stuff as an example to show that AMD, Nvidia, and Sony aren't going to be paying what we are paying, since they are buying straight from the manufacturers.
 
This is assuming that they get 15% off the spot price, and assuming they can lower the cost of everything else substantially. It's basically me showing how difficult 30GB of GDDR7 is to sell at these prices.

Sony is buying high-end memory here (24Gbit 32Gbps GDDR7). It's not the cheap 8Gbit 14Gbps GDDR6 they use in the PS5.
Sony wouldn't be paying spot prices though.

Sony would be paying whatever deal they manage to contract with RAM manufacturers, since they would be buying huge quantities.
 
Sony first party developers will support PS5 for a very long time. And that has 9GB VRAM. They can add nothing that they can't turn off for PS5.
What are you smoking? PS5 has a unified memory pool, so you can't state a VRAM amount like that, unless you're pulling some kind of average out of your ass.
 
Hold up, then it might be possible to run this like an Xbox and not need the Ubisoft app?
Never thought I would be pulling for the 'console' way.
Steam might be in trouble with those games that require a separate app. I would buy the Xbox version of the game in those instances.
I might actually buy GTA6 on Xbox just to avoid having to use the Rockstar app.

edit- March Climber What do you think? I remember you saying you buy the games that have separate store apps on the PS5, and most everything else on PC.
I'm going to buy the new Xbox as my primary hybrid PC and still keep the same process. Even if there is any improvement to the process of launching games on the new Xbox, it still runs into my second issue, which is that I simply don't trust the longevity of these launchers, and thus I wouldn't trust the longevity of my purchases on those launchers.
 
What are you smoking? PS5 has a unified memory pool, so you can't state a VRAM amount like that, unless you're pulling some kind of average out of your ass.
I actually can state VRAM just like that.

PS5 has 12.5GB exposed to developers. 3.5GB for CPU usage is a reasonable lowball; games like BG3 will exceed it. None will be able to give the GPU anything even around 10GB. The UMA benefit is not needing to copy and manage two memory pools; the CPU and GPU still work on different things.

It's necessary to talk about VRAM like that, because mystical UMA talk prompted people to say bullshit like you needed 12GB of VRAM to match PS5 on PC, when PS5 barely has 12.5GB exposed in total (LMFAO).
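
His split as plain arithmetic: the 16GB total is the PS5's known spec, while the 12.5GB exposed and 3.5GB CPU-side figures are the claims from the post above:

```python
# The argument in numbers: PS5's unified 16GB, minus the OS reserve,
# minus CPU-side game data, leaves what a dGPU would call "VRAM".
total_gb = 16.0                       # PS5 GDDR6 pool (known spec)
exposed_gb = 12.5                     # claimed developer-visible share
cpu_side_gb = 3.5                     # claimed lowball for CPU-side game data

os_reserve = total_gb - exposed_gb    # 3.5GB implied OS reserve
gpu_share = exposed_gb - cpu_side_gb  # ~9GB effective "VRAM"

print(f"OS reserve: {os_reserve}GB, effective GPU share: ~{gpu_share}GB")
```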
 
Sony wouldn't be paying spot prices though.
I said 15% off spot. And again, it's illustrative of how cost-prohibitive current prices are.

It's not just memory that increased. Remember that NAND went up a lot.

Sony needs memory to be dirt cheap for the PS6 design to make sense at all. And right now that seems like a questionable gamble.
 
Cerny magic is simply discarding features he deems not truly essential to gaming performance, as he predicts how the coming generation's software development will play out, in order to save die space, clock speed, etc. and keep the SoC as efficient as possible.

Because PlayStation will end up the dominant development platform, what he deemed non-essential ends up not being used widely; not because he predicted correctly, but because the platform's dominance made it "correct"...

I see Kepler likes this post... Hmm, Cerny Tesco sauce then.
 
I actually can state VRAM just like that.
Thanks for confirming, full of shit.
Tales from the ass.
 