
DF Weekly: If Xbox Series X is more powerful, why do some PS5 games run better? We finally have some answers.

Mr Moose

Member
20% is not considerably more powerful. It's the difference between 1440p and 1728p average dynamic resolution in an ideal scaling scenario; unless you consider that difference considerable, disregard my comment. Besides, even ignoring potential bottlenecks, compute does not scale linearly, and even if it did, in the best-case scenario the SX would get a 20% higher average dynamic resolution, which again is not a considerable difference.
That's 28%.
Edit: Wait, I might've done this wrong.
Edit: It's 44%?
2560 × 1440
3072 × 1728
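For anyone double-checking the poster's edits, a quick sketch of the arithmetic: the 20% figure is the per-axis scaling, the 44% figure is the total pixel count.

```python
# Compare 2560x1440 and 3072x1728 both per-axis and by total pixels.
base = 2560 * 1440      # 3,686,400 pixels
target = 3072 * 1728    # 5,308,416 pixels

linear_gain = 3072 / 2560 - 1   # per-axis (and per-line) scaling
area_gain = target / base - 1   # total pixel count

print(f"linear: {linear_gain:.0%}, area: {area_gain:.0%}")
# linear: 20%, area: 44%
```

So both numbers in the thread are "right": 1728p is 20% more lines than 1440p, but 44% more pixels.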
 
Last edited:

diffusionx

Gold Member
More people are finally realizing that DF doesn't have that much technical knowledge, especially about hardware.
They do some cool side-by-side comparisons, and if a dev tells them what is going on, they can write about it.
But they don't produce quality hardware analysis.
DF wasn't wrong with their initial analysis of the specs. They lacked a holistic picture of the situation but they're not devs, and devs were fine with letting their games do the talking. Yes, Cerngod's approach was right but for obvious reasons DF should not have taken what he said at face value.

If DF made any sort of editorial mistake, it was that they didn't write this article 3 years ago, and instead chose to continue to shill for MS and say BS like it's the tools.

People act like this is a huge win for PS5, but what I read still implies the Series X having games that run just as well, or better, while supporting a higher-level API with all the advantages this leads to in terms of compatibility.

This is a reason why we get more PC/Xbox titles, and why BC works so well on Xbox. This is also the reason why the next Xbox will have excellent BC. I wouldn't bet on the PS6 being BC, given how specific some aspects of the PS5 are; it already looks like a more difficult job.
What PC titles are on Xbox but not on PS5, outside of the ones made by Microsoft? PS4 games also work fine on PS5; it turned out much better than people feared before release.

There's no reason why the PS6, iterating upon the PS5, cannot be fully backwards compatible.
 
Last edited:

winjer

Member
DF wasn't wrong with their initial analysis of the specs. They lacked a holistic picture of the situation but they're not devs, and devs were fine with letting their games do the talking. If DF made any sort of editorial mistake, it was that they didn't write this article 3 years ago, and instead chose to continue to shill for MS and say BS like it's the tools.

But they were wrong with their initial analysis of the specs, because they only looked at the TFLOPs.
Meanwhile, there were people talking about the rasterization, culling and geometry throughput of the PS5.
 

Lysandros

Member
Anyone still looking to those culpable propagandist clowns for an 'explanation' of this entirely made-up "mystery" at this point (four years into the generation) is the very reason this retracted, PR-fuelled lie persists to this day. The explanations were in plain sight from the first hour, and we have been writing about them for four years on this very site.
 
Last edited:

Taycan77

Member
Does make me curious about the level of performance Sony can get from the PS5 Pro with PSSR and the 'secret sauce' Cerny is working on. Increasingly it is all about software solutions and efficiency.

I can see Alex getting quite flustered when he sees what PC hardware is required to match the PS5 Pro!
 

Imtjnotu

Member
Sony's Design
[celebratory "Birthday Love" GIF]



Xbox Design
[GIF]
 

JackMcGunns

Member
Better compiler and higher clock speeds as well. More than just "tools", which has also been said all along.

The higher clock rate we knew about; that's not an advantage, it's a trade-off. Fewer compute units = less heat => higher clock, vs. more compute units = more heat => lower clock.

Higher clocks can help certain engines, while having more compute units could help in others, such as Unreal Engine 5. It's a wash, really, when it comes to whether a higher clock or more compute is better, but the real-world differentiator is the more efficient compiler; that's key in making a difference.
 

IDWhite

Member
Does make me curious about the level of performance Sony can get from the PS5 Pro with PSSR and the 'secret sauce' Cerny is working on. Increasingly it is all about software solutions and efficiency.

I can see Alex getting quite flustered when he sees what PC hardware is required to match the PS5 Pro!

It will be around the 45% performance improvement that we saw in the leaked document. PSSR won't make much of a difference versus FSR 2.X, 3.X, or other custom TAA implementations from a performance perspective. It has a hefty 2 ms frametime cost to reconstruct a 1080p image to 2160p, when FSR 3.0's cost is in the low 1.X ms on a GPU of the same performance level. It is clearly a solution for obtaining better image quality at similar performance to other techniques.

And whatever 'secret sauce' Sony is doing, it will only be useful if they are trying to alleviate potential bottlenecks with hardware customization and fixed-function automation; otherwise devs won't care if they add more optional dedicated processors to offload some tasks.
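To put those frametime numbers in context, a quick sketch of what each reconstruction cost eats out of a frame budget. The 2 ms and ~1.2 ms figures are the poster's numbers, not official ones.

```python
# Share of the frame budget consumed by a fixed reconstruction cost.
def budget_share(cost_ms: float, fps: int) -> float:
    frame_ms = 1000 / fps          # total frame budget in milliseconds
    return cost_ms / frame_ms      # fraction of the frame spent reconstructing

for fps in (30, 60):
    print(fps, f"PSSR {budget_share(2.0, fps):.1%}",
          f"FSR3 {budget_share(1.2, fps):.1%}")
# 30 PSSR 6.0% FSR3 3.6%
# 60 PSSR 12.0% FSR3 7.2%
```

At 60 fps the cited 2 ms cost is roughly 12% of the frame, which is why a heavier upscaler has to buy its keep in image quality rather than raw performance.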
 

Topher

Gold Member
The higher clock rate we knew about; that's not an advantage, it's a trade-off. Fewer compute units = less heat => higher clock, vs. more compute units = more heat => lower clock.

Higher clocks can help certain engines, while having more compute units could help in others, such as Unreal Engine 5. It's a wash, really, when it comes to whether a higher clock or more compute is better, but the real-world differentiator is the more efficient compiler; that's key in making a difference.

Mark Cerny talked about the advantages higher clocks bring to the table in Road to PS5. Now DF and the devs they have talked to are echoing what Cerny said. So no, it isn't a wash, according to people who would know.

timestamped
 
Last edited:
DF are disingenuous hacks....:

DF when comparing PC GPUs: "Don't just look at TFLOPs because these are BS theoretical sustained floating point operations that are almost never seen in real software workloads."

Also DF, when comparing XSX with PS5: "On paper, the XSX is a considerably more powerful console [because it has 18% higher SP GPU FLOPS]"

Make up your mind, DF. According to you, max. theoretical single precision FLOPs only seem to matter when Xbox has a bigger number.
 

diffusionx

Gold Member
But they were wrong with their initial specs, because they only looked at the TFLOPs.
Meanwhile there people talking about rasterization, culling and geometry throughput of the PS5.
Like I said, the people who knew let the games do the talking. The problem for DF was that they pulled out the immature-tools excuse almost immediately. It's impossible to find anything a few years old on the internet anymore, but The Verge wrote an article saying as much that leans heavily on DF's analysis:

 

Mr.Phoenix

Member
The higher clock rate we knew about; that's not an advantage, it's a trade-off. Fewer compute units = less heat => higher clock, vs. more compute units = more heat => lower clock.

Higher clocks can help certain engines, while having more compute units could help in others, such as Unreal Engine 5. It's a wash, really, when it comes to whether a higher clock or more compute is better, but the real-world differentiator is the more efficient compiler; that's key in making a difference.
And you are still making the same mistake DF made all those years ago: trying to compartmentalize everything into one neat, tidy box. Whereas, as DF should have at least known... consoles don't work that way.

A good way to look at consoles and PCs is that consoles are Formula One cars fitted with a 1.6L V6 engine that produces up to 850HP, while a PC is like a run-of-the-mill consumer vehicle that would need a V8-V12 engine weighing twice as much to produce the same amount of power.

The point here is that consoles are typically far more optimized, with every single thing designed and working together to make the whole. From the compute units, to the cache scrubbers, to the compiler, to the IO stack, to the SSD... even down to how the blocks or buses in RAM are addressed. Everything works to make the system what it is. You... and DF make the mistake of forgetting (or not knowing?) this. DF focused on the TFs. To them, it's simply more TFs = more performance. You... are focusing on the compiler...

What I am saying, and to be fair a lot of others have said the same on this forum over the years, is that it's a lot more than that. Everything works together. MS simply made a less efficient system, be that down to compute units, memory architecture, tools... whatever. They literally made a 12TF machine that performs like a 10TF machine, and Sony made a 10TF machine that can match or outperform a 12TF machine.
 

JackMcGunns

Member
Mark Cerny talked about the advantage higher clocks bring to the table in Road to PS5. Now DF and devs they have talked to are echoing what Cerny said. So no, it isn't a wash according to people who would know.

timestamped



Yes, we understand that, but it's been discussed that the higher compute unit count in the Series X does have its benefits over higher clocks in certain areas.

In an apples-to-apples comparison where both systems have the same theoretical performance, one with more compute units at lower clocks and the other with fewer compute units at higher clocks, it's without a doubt more beneficial to have the higher clock speeds. But in the specific case of PS5 versus Series X, there's an 18% gain on the Series X side because its considerably higher CU count doesn't just match the higher-clocked PS5, it exceeds it. So again, both systems will perform relatively the same; that's what I meant by it being a wash, not that the higher clock doesn't help. What is certain is that the biggest handicap on the Series X side is not having the best compiler.
 

Geometric-Crusher

"Nintendo games are like indies, and worth at most $19" 🤡


12 teraflops is the new 64 bit

Microsoft always follows strategies that are proven to fail. The rumored next Xbox will have reference specs for third-party manufacturers.
 
Last edited:

Topher

Gold Member
Yes, we understand that, but it's been discussed that the higher compute unit count in the Series X does have its benefits over higher clocks in certain areas.

In an apples-to-apples comparison where both systems have the same theoretical performance, one with more compute units at lower clocks and the other with fewer compute units at higher clocks, it's without a doubt more beneficial to have the higher clock speeds. But in the specific case of PS5 versus Series X, there's an 18% gain on the Series X side because its considerably higher CU count doesn't just match the higher-clocked PS5, it exceeds it. So again, both systems will perform relatively the same; that's what I meant by it being a wash, not that the higher clock doesn't help. What is certain is that the biggest handicap on the Series X side is not having the best compiler.

Fair enough. Not sure if it's the "crux", but the compiler and API are certainly both factors according to DF.
 

FireFly

Member
DF are disingenuous hacks....:

DF when comparing PC GPUs: "Don't just look at TFLOPs because these are BS theoretical sustained floating point operations that are almost never seen in real software workloads."

Also DF, when comparing XSX with PS5: "On paper, the XSX is a considerably more powerful console [because it has 18% higher SP GPU FLOPS]"

Make up your mind, DF. According to you, max. theoretical single precision FLOPs only seem to matter when Xbox has a bigger number.
TFs can be used to make ballpark comparisons within the same architecture, because the ratio between compute and the other theoretical rates is generally kept pretty constant. In this case the PS5 has a higher fillrate due to the much higher clock speed and also some architectural differences.
 

Lysandros

Member
TFs can be used to make ballpark comparisons within the same architecture, because the ratio between compute and the other theoretical rates is generally kept pretty constant. In this case the PS5 has a higher fillrate due to the much higher clock speed and also some architectural differences.
Not only fill rate; every single GPU fixed-function metric not tied to CUs, including geometry generation/culling and command processing, is higher on PS5. Even things like async compute scheduling are faster on it, not to mention lower-latency caches, higher-bandwidth L1, etc.
 
Last edited:
TFs can be used to make ballpark comparisons within the same architecture, because the ratio between compute and the other theoretical rates is generally kept pretty constant. In this case the PS5 has a higher fillrate due to the much higher clock speed and also some architectural differences.

Not really, not when the clockspeed difference is so significant.

Clockspeed impacts the entire GPU pipeline, e.g. the front-end, caches, etc., so just looking at TFLOPs (which only apply to the ALUs) and thinking that provides a meaningful picture of the combined GPU performance differential between the two systems in real games is misguided.
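As a rough illustration of that point, here is a sketch from the publicly quoted GPU specs (36 CUs at 2.23 GHz for PS5 vs. 52 CUs at 1.825 GHz for Series X; 64 ROPs each is the commonly cited figure, and RDNA 2 does 2 FLOPs per shader ALU per clock with 64 ALUs per CU). Compute favours the XSX, but clock-bound metrics like pixel fillrate favour the PS5.

```python
# Theoretical compute vs. a clock-bound metric (pixel fillrate)
# for the two consoles, from their public specs.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000   # 64 ALUs/CU, 2 FLOPs/clock

def fillrate(rops: int, clock_ghz: float) -> float:
    return rops * clock_ghz                  # Gpixels/s

ps5 = {"tf": tflops(36, 2.23),  "fill": fillrate(64, 2.23)}
xsx = {"tf": tflops(52, 1.825), "fill": fillrate(64, 1.825)}

print(f"PS5  {ps5['tf']:.2f} TF, {ps5['fill']:.1f} Gpix/s")
print(f"XSX  {xsx['tf']:.2f} TF, {xsx['fill']:.1f} Gpix/s")
# PS5  10.28 TF, 142.7 Gpix/s
# XSX  12.15 TF, 116.8 Gpix/s
```

So the same spec sheets give XSX an ~18% compute lead and PS5 a ~22% fillrate lead, which is exactly why a single TFLOPs number can't summarize the pipeline.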
 
The question should not be why the PS5 is well ahead in some games (like Hogwarts Legacy, Dragon's Dogma 2 or Elden Ring, all three using different engines) with a 20% smaller APU; it's why the XSX performs so badly compared to a similarly specced PC. And what do compilers mostly do? They optimize cache hits. Here the XSX is quite disadvantaged compared to the PS5 and RDNA2 GPUs. Maybe some games also "compile" better on PS5 because the cache architecture is better.
 

Embearded

Member
These tweets aged like milk.

These people are completely ignorant when it comes to computer architecture and programming.
And I said it before: DF is a nice pixel-comparing channel and that's it. They also don't understand how the machines work; they just repeat whatever they can find online.
They also don't have experience in game development, not most of them anyway, and surely not enough to call themselves experts.
 

Kenpachii

Member
Maybe it has more to do with developers focusing on the PS5 as the main platform for their games when it comes to consoles, while for Xbox they may port a downscaled PC version, as the two share a more similar API.
 
Last edited:

Three

Member
Do you know how much theoretical compute advantage a 4090 has over a PS5... and yet that doesn't stop console people from saying the games "look basically the same".... and in most cases they do... because they aren't designed to take advantage of the hardware... which has been my entire point.

Congrats on walking right into that. Perhaps you're the one who should take a break from this thread all things considered :messenger_grinning_sweat:
Look, people are saying the 20% teraflop difference is one of the smallest we have had in years, and that's just in simple teraflops, before you even get into the idea that more CUs are harder to keep busy than fewer CUs at higher clocks on the PS5. The PS5 was just well designed, and that "20%" rarely showed over 4 years because the higher GPU clock they went with gave advantages that beat the XSX at other things beyond just theoretical compute. They had a different philosophy, and it came with its own challenges, i.e. they had to spend money to cool the machine better (I remember the dumb fanboy FUD about PS5s running too hot), and they had to provide low-level access and break possible compatibility. It's OK to accept that Cerny got it right and concentrated on creating a better-balanced machine that's easier to take advantage of, because those fewer CUs are faster.

I'm not sure why you're dragging the 40% difference between the PS5 Pro and the SX into this either. That's double the difference, no? Why are you trying to make a completely theoretical "20%" seem huge, but a 40% advantage (plus a completely different/newer architecture) between the PS5 Pro and the SX seem unsubstantial?

You got a warning in the other PS5 Pro thread where you were talking about midrange GPUs being "far more powerful" than the PS6 and your GPU "smoking it" for years before it releases, then going on to say they shouldn't release the Pro because they should stick to one console for 7 years.

I would have given you the benefit of the doubt on both counts, claiming midgen cards will beat the PS6 and supporting a single console, if you weren't boasting about the Xbox One X pushing the envelope and doing things only today's "highest end cards" can do.

In thinking about a console with a fixed hardware spec, they'll be able to push the envelope with 6tflops and will do things that only todays highest end PCs can do.

That's when people can see through the bullshit.
 
Last edited:

Ev1L AuRoN

Member
You do realize that Series X & Series S games are the same games with a few lines of code changed, right?
Given that they want the XSS version to be as close as possible, I'd imagine that the bulk of the optimization time and QA is spent on the weaker machine, and they just expect the Series X to brute-force its way through, and by doing that they end up underutilizing their stronger machine. Not a surprise given that most Xbox users are on XSS.
 
Last edited:

Mr.Phoenix

Member
I don't know what his job is, but he posts dozens of tweets every day; he has more than 100K tweets, all about PlayStation.
And you still don't know what his job is?
[attached screenshots]



What a bunch of tools. 40-50 more frames per second. 30% instead of 18%. 9 TFLOPs instead of 10 TFLOPs. STAGGERING! Get the fuck out of here.
What I find crazy is that these fuckers would still jump on the next MS announcement with a straight face claiming it's the best thing since sliced bread. Not a chance they aren't on a payroll... no one can be that stupid.
 

marquimvfs

Member
DF are disingenuous hacks....:

DF when comparing PC GPUs: "Don't just look at TFLOPs because these are BS theoretical sustained floating point operations that are almost never seen in real software workloads."

Also DF, when comparing XSX with PS5: "On paper, the XSX is a considerably more powerful console [because it has 18% higher SP GPU FLOPS]"

Make up your mind, DF. According to you, max. theoretical single precision FLOPs only seem to matter when Xbox has a bigger number.
They're deliberately tipping the scale to favor Xbox. To me, the bias is obvious even in the way they write their pieces, deliberately downplaying PlayStation feats here and there.
See a little of what I mean here:
On paper, the Xbox Series X is a considerably more powerful piece of hardware than PlayStation 5, so why are we looking at a console generation where the Sony and Microsoft machines often deliver like-for-like results?
Now, take the same sentence written this way:
On paper, the Playstation 5 is a considerably less powerful piece of hardware than Xbox Series X, so why are we looking at a console generation where the Sony and Microsoft machines often deliver like-for-like results?
It certainly hits you differently when they both say the same thing, right? To me, one praises the Xbox's power, when the second could praise the PS5's feat of matching the same results.
Some might say that it is innocently done, that I'm overreacting and so on, but I say that when it is done repeatedly throughout a text, there's some steering of the narrative toward one of the competitors.
Take this other line as an example:
It's a far cry from the Xbox One X vs PS4 Pro face-off, where the Microsoft machine commanded an obvious advantage - or the PS4 vs Xbox One comparison, where Sony typically held a similarly noticeable lead.
Let's rewrite it to flip the bias:
It's a far cry from the Playstation 4 vs Xbox One face-off, where the Sony machine commanded an obvious advantage - or the Xbox One X vs Playstation 4 Pro comparison, where Microsoft typically held a similarly noticeable lead.
Can you spot the difference? Every win that the competition has over their "favourite" platform is, somehow, downplayed and buried under lesser adjectives. Every "win" and bigger spec of their preferred platform is hyped to hell, even if in the real world it means nothing, like the teraflops difference.

Now, don't get me wrong, I'm not against a publication having its own bias; certainly everyone out there has one. It's journalism 101 that "neutrality" doesn't exist; it's called the journalistic angle, and every publication, its editors and its sponsors have one, and it can certainly affect a piece written by someone. The sad part, and what bothers me most, is when a publication like that sells itself as neutral. You can 100% say that, no matter how fairly the comparisons are done, they will always present the results with bias, and that says a lot about their own bias. It would be better to present themselves as an Xbox-geared crowd, like many, many others do over the internet, be they Sony- or Microsoft-biased.

I don't have a problem reading something interesting in an Xbox publication, but when I know that it really is an Xbox publication, I can adjust my perception to balance the possible bias. It's only natural, for everyone.
 
Last edited:

SonGoku

Member
Nobody ever said it couldn't. What I'm saying is that games need to be made which prioritize doing more compute heavy aspects of visuals instead of pushing resolution on Xbox.... then you would get games on PS which actually do significantly drop resolution or visual quality compared to Xbox.
That's what I'm telling you: going by theoretical specs, in the best-case scenario for a game made from the ground up for the SX, all it would take for the PS5 to run the exact same game is to drop the resolution by 20%, which again is not a considerable difference.
then you would get games on PS which actually do significantly drop resolution or visual quality compared to Xbox.
A 20% compute advantage at best would translate into a 20% resolution advantage IF there is 100% CU utilization and scaling, and that's the best-case, ideal CU-utilization scenario.
1728p can feel like a 4k image (its not obvsly but it looks crisp) whereas 1440p always looks "soft"
That's subjective, and it's very arbitrary to draw the line where a resolution "feels" like 4K. Objectively it's only a 20% difference in resolution, and 20% is not considerable, especially after factoring in image reconstruction, upsampling and other post-process effects.
It's less than 1728p. You have to calculate from area, not vertical resolution.
You need 20% more pixels, not 20% more vertical resolution. That would be a little over 1576p.

So 1440p -> 1576p is a 20% bump in performance. Tough to notice that on a typical 65” display.

Edit: beaten by fafalada
Thanks, my bad. And that's the ideal HW utilization for the SX, which is lower in real game scenarios, where engines don't always enjoy perfect CU utilization and there are other bottlenecks to contend with. Even in the best-case scenario for Xbox, the spec gap is not considerable.
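The correction the posters settle on (20% more pixels, not 20% more lines) works out like this, assuming resolution scales with total pixel count:

```python
import math

# A 20% increase in pixel count only raises each axis by sqrt(1.2).
base_h = 1440
scaled_h = base_h * math.sqrt(1.2)
print(f"{scaled_h:.0f}p")   # 1577p
```

Which matches the "a little over 1576p" figure above: a best-case 20% compute advantage buys roughly 1440p to 1577p, not 1440p to 1728p.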
 
Last edited:

GHG

Member
Look, people are saying the 20% teraflop difference is one of the smallest we have had in years, and that's just in simple teraflops, before you even get into the idea that more CUs are harder to keep busy than fewer CUs at higher clocks on the PS5. The PS5 was just well designed, and that "20%" rarely showed over 4 years because the higher GPU clock they went with gave advantages that beat the XSX at other things beyond just theoretical compute. They had a different philosophy, and it came with its own challenges, i.e. they had to spend money to cool the machine better (I remember the dumb fanboy FUD about PS5s running too hot), and they had to provide low-level access and break possible compatibility. It's OK to accept that Cerny got it right and concentrated on creating a better-balanced machine that's easier to take advantage of, because those fewer CUs are faster.

I'm not sure why you're dragging the 40% difference between the PS5 Pro and the SX into this either. That's double the difference, no? Why are you trying to make a completely theoretical "20%" seem huge, but a 40% advantage (plus a completely different/newer architecture) between the PS5 Pro and the SX seem unsubstantial?

You got a warning in the other PS5 Pro thread where you were talking about midrange GPUs being "far more powerful" than the PS6 and your GPU "smoking it" for years before it releases, then going on to say they shouldn't release the Pro because they should stick to one console for 7 years.

I would have given you the benefit of the doubt on both counts, claiming midgen cards will beat the PS6 and supporting a single console, if you weren't boasting about the Xbox One X pushing the envelope and doing things only today's "highest end cards" can do.



That's when people can see through the bullshit.

Google Remij systemwars.

That's who you're dealing with.

[Oprah Winfrey GIF]
 

marquimvfs

Member
No console has sold and dominated like the PS2, yet multi platform games on Xbox and GameCube consistently performed better, hence I’m not really buying the whole “PS5 is the lead console therefore devs are optimising it better than XSX”.
Well, one could argue that those machines were substantially different, and that the PS2 also deserved proper ports instead of being left out to dry, because it sold way more.

One proof of this is that, sometimes, devs simply didn't bother to make GameCube versions. Or that the PS2 got ports of games from the newer generation, even if fundamentally different, like Wolverine. Even if it was hard, they couldn't ignore it.

Nowadays it's easier to make ports for everything; the machines are more similar and streamlined, so a dev can target everything they want without putting in as much effort as in the times you're referring to, and that lack of effort could cause some disparity.

I'm sure this is not the only factor (to claim that would be ridiculous), but I'm reasonable enough to recognize that it can be part of the big picture, for sure.
 
Last edited:
Look, people are saying the 20% teraflop difference is one of the smallest we have had in years, and that's just in simple teraflops, before you even get into the idea that more CUs are harder to keep busy than fewer CUs at higher clocks on the PS5. The PS5 was just well designed, and that "20%" rarely showed over 4 years because the higher GPU clock they went with gave advantages that beat the XSX at other things beyond just theoretical compute. They had a different philosophy, and it came with its own challenges, i.e. they had to spend money to cool the machine better (I remember the dumb fanboy FUD about PS5s running too hot), and they had to provide low-level access and break possible compatibility. It's OK to accept that Cerny got it right and concentrated on creating a better-balanced machine that's easier to take advantage of, because those fewer CUs are faster.

I'm not sure why you're dragging the 40% difference between the PS5 Pro and the SX into this either. That's double the difference, no? Why are you trying to make a completely theoretical "20%" seem huge, but a 40% advantage (plus a completely different/newer architecture) between the PS5 Pro and the SX seem unsubstantial?

You got a warning in the other PS5 Pro thread where you were talking about midrange GPUs being "far more powerful" than the PS6 and your GPU "smoking it" for years before it releases, then going on to say they shouldn't release the Pro because they should stick to one console for 7 years.

I would have given you the benefit of the doubt on both counts, claiming midgen cards will beat the PS6 and supporting a single console, if you weren't boasting about the Xbox One X pushing the envelope and doing things only today's "highest end cards" can do.



That's when people can see through the bullshit.
You're confused... I'm not the one saying 20% isn't substantial, and I'm certainly not saying 40% isn't substantial; quite the opposite. I can't even begin to understand how you came to that conclusion. The argument made against what I'm saying is "well, just drop the resolution, problem solved"... and yet that's not really the case.

Oh, was THAT the reason I got warned? Because some people can't stand to hear that I believe midrange GPUs of the time will completely smoke the PS6 when it releases, due to the self-imposed power limits of consoles? Is it really so hard to hear that and simply disagree, without claiming I'm being antagonistic? C'mon, for crying out loud.
 

solidus12

Member
That's what I'm telling you: going by theoretical specs, in the best-case scenario for a game made from the ground up for the SX, all it would take for the PS5 to run the exact same game is to drop the resolution by 20%, which again is not a considerable difference.

A 20% compute advantage at best would translate into a 20% resolution advantage IF there is 100% CU utilization and scaling, and that's the best-case, ideal CU-utilization scenario.

That's subjective, and it's very arbitrary to draw the line where a resolution "feels" like 4K. Objectively it's only a 20% difference in resolution, and 20% is not considerable, especially after factoring in image reconstruction, upsampling and other post-process effects.


Thanks, my bad. And that's the ideal HW utilization for the SX, which is lower in real game scenarios, where engines don't always enjoy perfect CU utilization and there are other bottlenecks to contend with. Even in the best-case scenario for Xbox, the spec gap is not considerable.
They always carefully choose their words to put Xbox in a positive light.

They don’t want to lose access to the perks that Xbox can offer, i.e., trip to Xbox HQ for exclusive access to One X and Series X among others.

That’s why they’re always downplaying PlayStation.
 
Last edited:

Fafalada

Fafracer forever
No console has sold and dominated like the PS2, yet multi platform games on Xbox and GameCube consistently performed better
The Xbox-to-PS2/GC 'power' delta was comparable to that of the X1X to the original PS4, but even more dramatic in some ways (HDD built in, etc.).
GC to PS2 is a different story, but between the lack of data (no one was doing real analysis back then), wildly different feature sets, and the power differential being skewed toward the PS2 more than the GC, it really says nothing about modern console comparisons.
 
Last edited: