I think it helped devs get better at optimisation.
So it actually helped them.
No amount of optimisation can help with missing features though.
$399 consoles.
One launched in 2013. The other in 2020.
Context matters.
By that logic, I could look at an RTX 5090 in 2050 and say:
"Wow, the core count is miserable, the clocks are abysmal, the RAM is laughable, and it can't even handle 8K/144 path tracing. What a bottlenecked disaster."
Hardware should be judged against its era, its price point, and its design goals — not against future standards or fantasy expectations.
Now I'm 100% sure, hello Shitjutsu.
Get the fuck out with Kingsolorobert or whatever he was called here.
He might still be butthurt after all the truth bombs FOSSpatents famously dropped on the likes of him.
That's just silly man, I don't think anyone was/is that invested.
No getting away from that.
Naw man, I personally think XSS was a big mistake. I understand why they did it, trying to catch sales from all the people that don't want to spend as much on a console, but they should have just stuck to their guns, only put out the XSX, and increased the value there somehow.
Sadly I think both MS and Sony are going to put out multiple SKUs next gen aiming for the same thing, even if they are dressed up as a mobile variant for convenience. It's gonna hold the full console experiences back.
I'd much rather they put out a separate platform entirely like the PSP/Vita, or continue to focus on improving a streaming model like the Portal, or just streaming to phones/tablets etc.
So surely Xbox first party games should have unlocked "the magic", right? Or were they already optimizing for PS5 way back at the start of the gen?
No getting away from that.
Even underpowered consoles like Switch 2 / Series S go underutilised in a lot of games.
Devs are more held back by budget than hardware these days.
We look back with fond memories, exactly.
You greenrats are truly having flashbacks and PTSD.
Nothing to stop Sony from saying, given the state of the market (Xbox shat the bed, Switch 2 is a Switch Pro) and the AI RAM bubble: scrap Orion as we know it and regroup in 2028 to get a more performant design that may push PS5 owners to upgrade.
As Kepler posted earlier, it's too late to make big changes in a hardware project, so it doesn't matter if PS6 releases in 2027 or 2029, we will get the same HW.
Damn man... did mesh shaders rape your dog or something? Hate is strong!
Not even Xbox studios themselves, who initially had an exclusives policy? Which means games could have used it. Tell me why Flight Sim didn't use it, or why FH didn't use it then? It was an in-house engine.
The games came to PS5 with little to no impact on performance or visuals later when xbox switched strategy at the end.
Like VRS, Mesh shaders just weren't all that important/efficient to bother with and you could even implement compute shaders or software VRS which not only gave you support on older graphics cards and other consoles but more freedom for optimisation too (look at what CoD did for example).
Suggesting it was being held back by PS5 when PS5 was totally out of the equation with Xbox exclusives tells you everything you need to know. Either Xbox themselves chose not to use it or they did and the alternative ports worked just as well anyway.
What's funny to me is a certain fanbase keeps bragging about Cerny magic.
In reality it's an "as basic as it gets" console.
It's PC and Xbox that have extra magic sauce that goes underutilised.
Everyone's favourite little console is under the spotlight again, I see.
Anyways, one of my relatives has built a PC that costs 3 times a Series S (2060 Super).
It's targeting 1080p and I can't see a difference between the two. A win for Series S if you ask me.
But I have seen almost all videos from Elanaldebits involving Series S/X comparisons. I was super sceptical before getting a Series S as I thought I would miss out.
On a 1080p screen, Series S maxes it out. I wouldn't gain much by using more powerful hardware (except supersampling, which I have yet to see in action).
What kind of reply is this? Stating that Mesh Shaders had little impact even in exclusive games, where other consoles weren't holding back in-house engines at Playground and Asobo, is not "hate". It's reality. PS5 simply wasn't holding back things like mesh shaders.
It ain't magic if it's not being used in most games. Also, as another member mentioned, even Xbox exclusives didn't use it.
Developers (like The Coalition) fighting for their lives for months/a year to make games (and The Matrix demo) work on Series S.
Meanwhile Vaibhavpisal:
Nobody is really bragging about anything. What's actually funny is that you continuously defend the most "basic of consoles" like the Series S while talking about "xbox extra magic sauce" that amounts to nothing. Roll back to when these released and you were even going against PC with it (quoted above).
Do you still believe this (when XSS is missing RT lighting or runs at half the framerate), or will you be claiming to see "all the difference" now when the expensive Xbox releases and you no doubt get it?
It was the Xbox fanbase most of the time talking about "secret sauce" when in reality it amounted to nothing. People like you and riky. This is the sort of shit you and riky were peddling (of course you liked it) when people were telling you Series S would struggle even more as the gen progressed.
It's the people who were jumping up and down repeating their marketing terms like sustained clocks, SFS, VRS, Mesh Shaders, Velocity Architecture, 'Smart Delivery', and all that other nonsense who were ultimately humbled in the end when actual games/results came. Series S didn't even live up to MS's own claims, yet here you are still defending it.
They are absolutely bottlenecks. Yes, it is limited by AMD as I pointed out, but they are still absolutely bottlenecks for both systems.
Yeah. They didn't teleport you to your office either. But those are not bottlenecks, but limitations of the available technology of the era.
Context does matter. And I'm judging each console by that context. Considering the price point and the specifications at the time, the PS4 CPU performance was dog shit, as it was not much of an upgrade over the previous generation and was outperformed by slow dual-core CPUs on PC. It was one of the biggest bottlenecks of the PS4 generation. One of the reasons quality/performance modes were not common was the CPU simply not being capable of 60fps. Performance modes that did happen struggled to reach a stable 60.
Actually Space Marine 2 does not use mesh shaders.
Thing is, 99.99% of games don't use Mesh Shaders, precisely because PS5 is the lead dev environment, and since it lacked them, most devs didn't bother learning.
So PS5 not having full RDNA2 held things back from advancing. We couldn't really see the differences Mesh Shaders would bring to the AAA games.
The more I think about it, the more I'm going to pick one up for my lady; the Steam Machine is just too weak.
Primitive shaders are pretty much the same between RDNA1 and 2. In PC space RDNA1 just lacks API support, something that Sony did provide with PS5.
So how much more efficient is the PS5, really? The reality is that there are games that perform better on PS5, games that perform better on Xbox, and games that are more or...
For example, the Space Marine 2 one that the Apple posted, if true.
It doesn't seem like anything major is missing from Orion; the only thing I can see is the NPU, but even MLID was unsure if that's missing.
There is no hardware acceleration for Nanite, because current hardware rasterizers on Nvidia, AMD and Intel work on a 2x2 pixel grid.
Nanite rasterizes in software to avoid this bottleneck, when using very dense geometry.
UE5 does support Primitive Shaders. And it's pretty much as fast as Mesh Shaders. They are very similar at a high level. The main difference is that Primitive Shaders do not support the Amplification stage.
RDNA2 also does not support Mesh Shaders. It still uses Primitive Shaders, though a more advanced version than RDNA1. The driver then maps and compiles Mesh Shaders onto the Primitive Shader.
AMD never did the same translation of Mesh Shaders for RDNA1.
Alan Wake only supported Mesh Shaders at first; a more recent patch added support for the traditional geometry pipeline.
Nanite does support mesh shading, for geometry that is bigger than a pixel. However, it is completely useless: I have measured no performance differences, and there are no performance differences in games either. RDNA2 has full mesh shading support; IDK why you say it still uses primitive shading. Primitive shading is not exposed on PC at all, only on PS5 through their own APIs. You can also compare a 5700 XT to a 2070 Super and see there's no performance difference in UE5 games.
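For what it's worth, here is a rough, illustrative sketch of the 2x2-quad overhead described above, assuming the worst case where every tiny triangle lands in its own quad; the numbers are made up for illustration, not measurements from any game or GPU.

```python
# Hardware rasterizers shade pixels in 2x2 quads (needed for derivative math),
# so a triangle covering only a few pixels still pays for 4 shaded lanes per quad.
# Worst-case assumption: each small triangle occupies its own quad.

def shading_overhead(pixels_covered: float) -> float:
    """Shaded lanes per visible pixel under the one-quad-per-triangle assumption."""
    lanes_per_quad = 4
    return lanes_per_quad / pixels_covered

for pixels in (4, 2, 1):
    print(f"{pixels} px per triangle -> ~{shading_overhead(pixels):.0f}x shading work per visible pixel")

# 1 px per triangle -> ~4x work: this is the case a software rasterizer like
# Nanite's is designed to avoid for pixel-sized geometry.
```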
No. Developers can effortlessly bloat VRAM usage to 30GB for very little returns (uncompressing assets, etc). But they likely won't meaningfully use it because PS6 is simply likely not going to be a primary target.
You don't think that there will be games that use the PS6's memory to the fullest extent?
He has created a toy that gamers can't afford to buy and he can't afford to sell. This is an example BoM at current SSD/RAM prices, even with scorched-earth BoM reductions on other line items. At current prices he can barely afford to sell the PS6 at $899 with no controller.
(I originally made it / sent it to a friend. So don't take the bozo/womb womb notes personally)
It's been 5 years and PS5 and Series X are still trading blows. That's all that matters, especially for consumers: what runs games better. We can talk about real advantages, like maybe Series X could use FSR4, which would be a game changer; other than that, Series X is very disappointing for being touted as 30% more powerful than PS5.
You went so far back yet somehow missed my recent posts where I mention it's struggling in recent games.
It does have the feature set. It's underutilised but it has it nonetheless. Mentioning it is completely fair.
If PS5 had it, you bet it would have been put to use. So not having it is a bummer.
Mesh shaders are a new technique that devs would have gotten better at using over time. That's what learning the hardware is, in the traditional sense.
The only Xbox studio that is tech competent is id Tech. And The Coalition to some extent.
You really cannot use Grounded as an example to prove anything.
"Big chunks of RDNA 5, or whatever AMD ends up calling it, are coming out of engineering I am doing on the project… And again, this is coming out of trying to move things forward. There are no restrictions on the way any of it can be used."Sony only went truly custom GPU wise on the Pro so it checks.
PlayStation 5 went custom on other parts like the SSD etc.
PlayStation 6 GPU will probably be custom like the 5 Pro if I had to guess.
I went back to show you claiming it wouldn't struggle on future games because you put too much faith in "secret sauce".
No. Developers can effortlessly bloat VRAM usage to 30GB for very little returns (uncompressing assets, etc). But they likely won't meaningfully use it because PS6 is simply likely not going to be a primary target.
Sony first party developers will support PS5 for a very long time. And that has 9GB VRAM. They can add nothing that they can't turn off for PS5.
For 3rd party developers, PC/Mobile/PS5/NS2 and PS6 Handheld are much bigger targets.
They will scale up to the PS6 rather than down from it.
PC and Mobile are not so suicidal as to follow Cerny's madness. Even Radeon RDNA5 dGPU is 18/15/12 GB from 16/16/12 GB RDNA4.
Interesting how, despite your VRAM mongering about needing hundreds of dollars of commodities to play next generation games, KeplerL2, you never commented on MLID's RDNA5 SKU list showing that RDNA5 dGPUs were not even planning to increase VRAM on 2027/2028 dGPUs, even before the shortage. Goes against the agenda?
30GB is a foolish mistake. A grave miscalculation by cerny and his worst fumble in history.
Why do you assume Sony would see the same % increases in RAM as us?
This is assuming that they get 15% off the spot price. And assuming they can lower the cost of everything substantially. It's basically me showing how difficult 30GB of GDDR7 is to sell at these prices.
Do you think they will get cheaper memory than Nvidia?
Hard to tell due to Nvidia's insane markup.
But looking at the 9070XT and 5080 launch and current prices, it seems like only DDR memory has really seen the big increases.
No, we don't have 100% full confirmation either way yet; however, IMO it will most likely be the status quo of console SKUs for the Xbox ecosystem. Series consoles aren't going anywhere. So those same GDKX-created SKUs would be optimized for Magnus. For the Epic and Steam ecosystems, devs could optionally optimize or create presets for the device.
So the next-gen games after Series are basically PC games, right? The PC games will utilize Magnus's extra CPU/GPU/NPU, RAM etc., but those PC games can obviously run on NVIDIA, Intel. Magnus is just there to salvage Xbox as a transition step to full-blown PC gaming. PC gaming will pretty much have a console-like experience with a console-like UI, and can transition back to full-blown Windows 11 desktop productivity.
I hope I have this correct.
None of it is bullshit. And it's not future-proof as is.
Biggest bullshit I have ever read here. And that table. Lol
Absolutely not.
Magnus RAM is 36 GB GDDR7, but with possible 48 GB SKUs by OEMs, like an Elite Xbox.
Possibly a silly question:
Is Magnus's RAM expandable/replaceable? Because then it won't be GDDR memory, right?
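For anyone wondering how those capacities pencil out, here is a quick back-of-the-envelope using standard GDDR7 module densities; the per-module channel width and the resulting bus width are assumptions that fall out of the rumour, not confirmed specs.

```python
# Assumptions (not confirmed specs): 24 Gbit (3 GB) and 32 Gbit (4 GB) GDDR7
# modules, one 32-bit channel per module, no clamshell mode.

module_3gb = 24 / 8  # 3 GB per 24 Gbit module
module_4gb = 32 / 8  # 4 GB per 32 Gbit module

modules_36 = 36 / module_3gb
print(f"36 GB -> {modules_36:.0f} x 3 GB modules -> {modules_36 * 32:.0f}-bit bus")

# A 48 GB OEM SKU could keep the same bus width by swapping in 4 GB modules:
modules_48 = 48 / module_4gb
print(f"48 GB -> {modules_48:.0f} x 4 GB modules -> {modules_48 * 32:.0f}-bit bus")
```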
No chance.
I'm using the consumer stuff as an example to show that AMD, Nvidia and Sony aren't going to be paying what we are paying, since they are buying straight from the manufacturers.
I'm not even talking about consumer stuff. Nvidia buys incredible amounts of GDDR7 and HBM memory for data centers.
Sony wouldn't be paying at spot prices though.
This is assuming that they get 15% off the spot price. And assuming they can lower the cost of everything substantially. It's basically me showing how difficult 30GB of GDDR7 is to sell at these prices.
Sony is buying high end memory here (24Gbit 32Gbps GDDR7). It's not the cheap 8Gbit 14Gbps GDDR6 they use on PS5.
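To make the "15% off spot" point concrete, here is the arithmetic with a placeholder price; the $/GB figure is purely an assumption for illustration, not a quoted market price, so plug in whatever number you believe.

```python
capacity_gb = 30
assumed_spot_usd_per_gb = 15.0   # placeholder assumption, NOT a real quote
contract_discount = 0.15         # the "15% off the spot price" assumption above

memory_cost = capacity_gb * assumed_spot_usd_per_gb * (1 - contract_discount)
print(f"30 GB @ ${assumed_spot_usd_per_gb:.2f}/GB spot, 15% off -> ${memory_cost:.2f} of GDDR7 per console")
```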
What are you smoking? PS5 has a unified memory pool, so you can't state a VRAM amount like that, unless you're pulling some kind of average out of your ass.
Sony first party developers will support PS5 for a very long time. And that has 9GB VRAM. They can add nothing that they can't turn off for PS5.
I'm going to buy the new Xbox as my primary hybrid PC and still keep the same process. Even if there is any improvement to the process of launching games on the new Xbox, it still runs into my second issue, which is that I simply don't trust the longevity of these launchers, and thus I wouldn't trust the longevity of my purchases on those launchers.
Hold up, then it might be possible to run this like an Xbox and not need the Ubisoft app?
Never thought I would be pulling for the 'console' way.
Steam might be in trouble with those games that require a separate app. I would buy the Xbox version of the game in those instances.
I might actually buy GTA6 on Xbox just to avoid having to use the Rockstar app.
edit: March Climber, what do you think? I remember you saying you buy the games that have separate store apps on the PS5, and most everything else on PC.
I actually can state VRAM just like that.
I said 15% off spot. And again, it's illustrative of how cost-prohibitive current prices are.
Cerny magic is simply discarding features he deems not truly essential to gaming performance, based on how he predicts the coming generation's software development will play out, in order to save die space, clock speed, etc. and keep the SoC as efficient as possible.
Because PlayStation ends up the dominant development platform, what he deemed non-essential ends up not being used widely; not because he predicted correctly, but because the platform's dominance made it "correct"...
Thanks for confirming, full of shit.
I actually can state VRAM just like that.
PS5 has 12.5GB exposed to developers. 3.5GB for CPU usage is a reasonable lowball; games like BG3 will exceed it. None will be able to give the GPU anything near 10GB. The UMA benefit is not needing to copy and manage two memory pools; the CPU and GPU still work on different things.
It's necessary to talk about VRAM like that, because mystical UMA talk prompted people to say bullshit like you needing 12GB of VRAM to match PS5 on PC (LMAO).
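For reference, here is the arithmetic behind that shorthand, using the post's own figures; the 3.5GB CPU-side number is the "reasonable lowball" above, not an official breakdown.

```python
total_pool_gb = 16.0    # PS5 unified GDDR6 pool
game_visible_gb = 12.5  # exposed to developers per the post (rest is OS-reserved)
cpu_side_gb = 3.5       # game code / sim / audio data, the lowball estimate above

effective_vram_gb = game_visible_gb - cpu_side_gb
print(f"Roughly {effective_vram_gb} GB left for GPU resources")  # ~9 GB of effective "VRAM"
```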