I think it helped devs get better at optimisation.
So it actually helped them.
No amount of optimisation can help with missing features though.
$399 consoles.
One launched in 2013. The other in 2020.
Context matters.
By that logic, I could look at an RTX 5090 in 2050 and say:
"Wow, the core count is miserable, the clocks are abysmal, the RAM is laughable, and it can't even handle 8K/144 path tracing. What a bottlenecked disaster."
Hardware should be judged against its era, its price point, and its design goals — not against future standards or fantasy expectations.
Now I'm 100% sure, hello Shitjutsu.
Get the fuck out with Kingsolorobert or whatever he was called here.
He might still be butthurt after all the truth bombs FOSSpatents famously dropped on the likes of him.
That's just silly man, I don't think anyone was/is that invested.
No getting away from that.
Naw man, I personally think the XSS was a big mistake. I understand why they did it, trying to catch sales from all the people who don't want to spend as much on a console, but they should have just stuck to their guns, only put out the XSX, and increased the value there somehow.
Sadly I think both MS and Sony are going to put out multiple SKUs next gen aiming for the same thing, even if they are dressed up as a mobile variant for convenience. It's gonna hold the full console experiences back.
I'd much rather they put out a separate platform entirely, like the PSP/Vita, or continue to focus on improving a streaming model like the Portal, or just streaming to phones/tablets etc.
So surely Xbox first-party games should have unlocked "the magic", right? Or were they already optimizing for PS5 way back at the start of the gen?
No getting away from that.
Even underpowered consoles like the Switch 2 / Series S go underutilised in a lot of games.
Devs are more held back by budget than hardware these days.
We look back with fond memories, exactly.
You greenrats are truly having flashbacks and PTSD...
Nothing to stop Sony from saying, given the state of the market (Xbox shat the bed, Switch 2 is a Switch Pro) and the AI RAM bubble, scrap Orion as we know it and regroup in 2028 to get a more performant design that may push PS5 owners to upgrade.
As Kepler posted earlier, it's too late to make big changes to a hardware project, so it doesn't matter if the PS6 releases in 2027 or 2029, we will get the same hw.
Damn man... did mesh shaders rape your dog or something? Hate is strong!
Not even Xbox studios themselves, who initially had an exclusives policy? Which means their games could have used it. Tell me why Flight Sim didn't use it, or why FH didn't use it then? It was an in-house engine.
The games came to PS5 with little to no impact on performance or visuals later, when Xbox switched strategy at the end.
Like VRS, mesh shaders just weren't all that important/efficient to bother with, and you could even implement compute shaders or software VRS instead, which not only gave you support on older graphics cards and other consoles but also more freedom for optimisation (look at what CoD did, for example).
Suggesting it was being held back by PS5 when PS5 was totally out of the equation with Xbox exclusives tells you everything you need to know. Either Xbox themselves chose not to use it, or they did and the alternative ports worked just as well anyway.
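For anyone wondering what "use mesh shaders or fall back to compute / software VRS" actually looks like in practice, here is a minimal sketch of the kind of capability check a multiplatform D3D12 renderer might do at startup. It assumes a Windows/D3D12 environment; the fallback labels are illustrative only and aren't taken from any particular engine mentioned in this thread.

```cpp
// Minimal sketch: query mesh shader / VRS support and pick a code path.
// Build on Windows and link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    // Mesh shader support is reported through OPTIONS7.
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    bool meshShaders =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7))) &&
        opts7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;

    // Hardware VRS support is reported through OPTIONS6.
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    bool hardwareVrs =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &opts6, sizeof(opts6))) &&
        opts6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2;

    // A multiplatform renderer keeps both paths alive, which is part of why the
    // "new" features often go unused: the fallback has to ship anyway.
    std::printf("Geometry path    : %s\n", meshShaders ? "mesh shaders" : "compute culling + classic vertex pipeline");
    std::printf("Shading-rate path: %s\n", hardwareVrs ? "hardware VRS (tier 2)" : "software VRS in the shaders");
    return 0;
}
```

The point the quoted post is making is that once a fallback like this exists and performs fine everywhere, the hardware-only path stops being much of a selling point.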
Whats funny to me is certain fanbase keep bragging about Cerny magic.
In reality it's an "as basic as it gets" console.
It's PC and Xbox that have the extra magic sauce that gets underutilised.
Nobody is really bragging about anything. What's actually funny is that you continuously defend the most "basic of consoles" like the Series S while talking about "Xbox extra magic sauce" that amounts to nothing. Roll back to when these released, and you were even going against PC with it:
Everyone's favourite little console is under the spotlight again, I see.
Anyways, one of my relatives has built a PC that costs 3 times the Series S (2060 Super).
It's targeting 1080p and I can't see a difference between the two. A win for Series S if you ask me.
But I have seen almost all the videos from Elanaldebits involving Series S/X comparisons. I was super sceptical before getting the Series S as I thought I would miss out.
On a 1080p screen, the Series S maxes it out. I wouldn't gain much by using more powerful hardware (except supersampling, which I have yet to see in action).
Do you still believe this (when the XSS is missing RT lighting or runs at half the framerate), or will you be claiming to see "all the difference" now, when the expensive Xbox releases and you no doubt get it?
It was the Xbox fanbase most of the time talking about "secret sauce" when in reality it amounted to nothing. People like you and riky. This is the sort of shit you and riky were peddling (of course you liked it) when people were telling you the Series S would struggle even more as the gen progressed.
It's the people who were jumping up and down repeating their marketing terms like sustained clocks, SFS, VRS, Mesh Shaders, Velocity Architecture, "Smart Delivery", and all that other nonsense who were ultimately humbled in the end when the actual games/results came. The Series S didn't even live up to MS's own claims, yet here you are still defending it.
What kind of reply is this? Stating that mesh shaders had little impact even in exclusive games, where other consoles weren't holding back the in-house engines at Playground and Asobo, is not "hate". It's reality. PS5 simply wasn't holding back things like mesh shaders.
It ain't magic if it's not being used in most games. Also, as another member mentioned, even Xbox exclusives didn't use it.
Developers (like The Coalition) fighting for their lives for months/a year to make games (and The Matrix demo) work on Series S.
Meanwhile Vaibhavpisal:
The only Xbox studio that is tech competent is id Tech, and The Coalition to some extent.
They are absolutely bottlenecks. Yes, it is limited by AMD, as I pointed out, but they are still absolutely bottlenecks for both systems.
Yeah. They didn't teleport you to your office either. But those are not bottlenecks, they're limitations of the available technology of the era.
Context does matter. And I'm judging each console by that context. Considering the price point and the specifications at the time, the PS4's CPU performance was dog shit, as it was not much of an upgrade over the previous generation and was outperformed by slow dual-core CPUs on PC. It was one of the biggest bottlenecks of the PS4 generation. One of the reasons quality/performance modes were not common was the CPU simply not being capable of 60fps. The performance modes that did happen struggled to reach a stable 60.
Actually, Space Marine 2 does not use mesh shaders.
Thing is, 99.99% of games don't use Mesh Shaders, precisely because PS5 is the lead dev environment, and since it lacked them, most devs didn't bother learning.
So PS5 not having full RDNA2 held things back from advancing. We couldn't really see the differences Mesh Shaders would bring to the AAA games.
The more I think about it, the more I'm going to pick one up for my lady; the Steam Machine is just too weak.
Primitive shaders are pretty much the same between RDNA1 and 2. In PC space RDNA1 just lacks API support, something that Sony did provide on PS5.
So how much more efficient is the PS5 really? The reality is that there are games that perform better on PS5, games that perform better on Xbox, and games that are more or...
For example, that Space Marine 2 one that the Apple posted, if true.
It doesn't seem like anything major is missing from Orion; the only thing I can see is the NPU, but even MLID was unsure if that's missing.
Nanite does support mesh shading, for geometry that is bigger than a pixel. However, it is completely useless: I have measured no performance differences, and there are no performance differences in games either. RDNA2 has full mesh shading support; I don't know why you say it still uses primitive shading. Primitive shading is not exposed on PC at all, only on PS5 through Sony's own APIs. You can also compare a 5700 XT to a 2070 Super and see there's no performance difference in UE5 games.
There is no hardware acceleration for Nanite, because current hardware rasterizers on Nvidia, AMD and Intel work on a 2x2 pixel grid.
Nanite rasterizes in software to avoid this bottleneck, when using very dense geometry.
UE5 does support Primitive Shaders, and they're pretty much as fast as Mesh Shaders. The two are very similar at a high level; the main difference is that Primitive Shaders do not support the Amplification stage.
RDNA2 also does not support Mesh Shaders. It still uses Primitive Shaders, though a more advanced version than RDNA1, and the driver maps and compiles Mesh Shaders onto Primitive Shaders.
AMD never did the same translation of Mesh Shaders for RDNA1.
Alan Wake initially only supported Mesh Shaders; a more recent patch added support for the traditional geometry pipeline.
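As an aside on the "2x2 pixel grid" point above: pixel shading is launched in 2x2 quads (needed for derivatives), so a triangle that only covers a pixel or two still pays for whole quads. The toy C++ below just counts covered pixels versus shaded quad lanes for a hypothetical micro-triangle; it is an illustration of the arithmetic, not how any real rasterizer is implemented.

```cpp
// Toy model of quad-granularity pixel shading: count pixels a tiny triangle
// actually covers vs. the lanes the hardware shades because work is issued
// in full 2x2 quads. Purely illustrative numbers.
#include <cstdio>
#include <set>
#include <utility>

int main() {
    // Pretend a micro-triangle covers these two pixels, straddling a quad boundary.
    std::set<std::pair<int, int>> covered = { {10, 7}, {11, 8} };

    // Each covered pixel lands in a 2x2 quad; the whole quad gets shaded.
    std::set<std::pair<int, int>> quads;
    for (const auto& p : covered)
        quads.insert({ p.first / 2, p.second / 2 });

    const std::size_t shadedLanes = 4 * quads.size();
    std::printf("covered pixels : %zu\n", covered.size());
    std::printf("shaded lanes   : %zu\n", shadedLanes);
    std::printf("utilisation    : %.0f%%\n", 100.0 * covered.size() / shadedLanes);
    // With pixel-sized triangles utilisation trends toward 25% or worse, which is
    // the waste Nanite sidesteps by rasterising dense clusters in a compute shader.
    return 0;
}
```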
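And on the "Primitive Shaders don't have an Amplification stage" point: the amplification (task) stage is basically a front end that culls meshlets and decides how many mesh workgroups get launched. Here is a rough CPU-side sketch of that two-stage idea; the struct, the 1-D "view range" test and the names are made up for illustration, not taken from any real engine or API.

```cpp
// CPU-side sketch of the amplification -> mesh two-stage pattern:
// stage 1 culls meshlets, stage 2 runs once per surviving meshlet.
#include <cstdio>
#include <vector>

struct Meshlet { float centerX; float radius; };  // 1-D stand-in for a bounding volume

// "Amplification" stage: keep only meshlets overlapping a 1-D view range.
std::vector<std::size_t> amplify(const std::vector<Meshlet>& meshlets,
                                 float viewMin, float viewMax) {
    std::vector<std::size_t> survivors;
    for (std::size_t i = 0; i < meshlets.size(); ++i) {
        const Meshlet& m = meshlets[i];
        if (m.centerX + m.radius >= viewMin && m.centerX - m.radius <= viewMax)
            survivors.push_back(i);  // each survivor would become one mesh workgroup
    }
    return survivors;
}

int main() {
    const std::vector<Meshlet> meshlets = { {0.f, 1.f}, {5.f, 1.f}, {40.f, 1.f}, {90.f, 1.f} };
    const auto visible = amplify(meshlets, -10.f, 50.f);

    // "Mesh" stage: emit geometry for each surviving meshlet.
    for (std::size_t idx : visible)
        std::printf("mesh workgroup launched for meshlet %zu\n", idx);
    std::printf("%zu of %zu meshlets survived culling\n", visible.size(), meshlets.size());

    // Primitive Shaders cover the per-meshlet geometry work but not this
    // launch-deciding front end, which is the gap the post above describes.
    return 0;
}
```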