> 20% is not considerably more powerful. It's the difference between 1440p and 1728p average dynamic resolution in an ideal scaling scenario; unless you consider that difference considerable, disregard my comment. Besides, even ignoring potential bottlenecks, compute does not scale linearly, and even if it did, in the best-case scenario the SX would get a 20% higher average dynamic resolution, which again is not a considerable difference.

That's 28%.
> More people finally realizing that DF doesn't have that much technical knowledge, especially about hardware.
> They do some cool side-by-side comparisons. And if a dev tells them what is going on, they can write about it.
> But they don't make quality hardware analysis.

DF wasn't wrong with their initial analysis of the specs. They lacked a holistic picture of the situation, but they're not devs, and devs were fine with letting their games do the talking. Yes, Cerngod's approach was right, but for obvious reasons DF should not have taken what he said at face value.
> People act like this is a huge win for PS5, but what I read still implies Series X has games that run just as well, or better, while supporting a higher-level API with all the advantages this leads to in terms of compatibility.
> This is a reason why we get more PC/Xbox titles, and why BC works so well on Xbox. This is also the reason why the next Xbox will have excellent BC. I wouldn't bet on the PS6 being BC with how specific some aspects of the PS5 are; it already looks like a more difficult job.

What PC titles are on Xbox but not on PS5, outside of the ones made by Microsoft? PS4 games also work fine on PS5; it turned out much better than people feared before release.
> If DF made any sort of editorial mistake, it was that they didn't write this article 3 years ago, and instead chose to continue to shill for MS and say BS like "it's the tools".

DF wasn't wrong with their initial analysis of the specs. They lacked a holistic picture of the situation, but they're not devs, and devs were fine with letting their games do the talking.
Better compiler and higher clock speeds as well. More than just "tools", which has also been said all along.
Does make me curious about the level of performance Sony can get from the PS5 Pro with PSSR and the 'secret sauce' Cerny is working on. Increasingly, it is all about software solutions and efficiency.

I can see Alex getting quite flustered when he sees what PC hardware is required to match the PS5 Pro!
The higher clock rate we knew about; that's not an advantage, it's a trade-off. Fewer compute units = less heat => higher clock, vs. more compute units = more heat => lower clock.

Higher clocks can help certain engines, while having more Compute Units could help in others, such as Unreal Engine 5. It's a wash, really, when it comes to whether higher clock or higher compute is better, but the real-world differentiator is the more efficient compiler; that's key in making a difference.
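To make the trade-off concrete, here's a rough sketch of two designs landing on the same theoretical compute with opposite CU/clock splits (the numbers are invented for illustration, not either console's real specs):

    # Two hypothetical CU/clock splits with identical theoretical FP32 compute.
    def tflops(cus, ghz):
        return cus * 64 * 2 * ghz / 1000  # 64 lanes/CU, 2 ops per cycle (FMA)

    wide_and_slow = tflops(50, 1.6)    # 10.24 TF
    narrow_and_fast = tflops(40, 2.0)  # 10.24 TF, same paper spec
    # The faster clock also speeds up everything outside the CUs
    # (rasterizer, caches, command processor), which the TF number hides.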
> But they were wrong with their initial specs, because they only looked at the TFLOPs.
> Meanwhile there are people talking about rasterization, culling, and geometry throughput of the PS5.

Like I said, the people who knew let the games do the talking. The problem for DF was that they pulled out the immature-tools excuse almost immediately. It's impossible to find anything a few years old on the internet anymore, but The Verge wrote an article saying as much that leans heavily on DF's analysis:
> The higher clock rate we knew about; that's not an advantage, it's a trade-off. Fewer compute units = less heat => higher clock, vs. more compute units = more heat => lower clock.
> Higher clocks can help certain engines, while having more Compute Units could help in others, such as Unreal Engine 5. It's a wash, really, when it comes to whether higher clock or higher compute is better, but the real-world differentiator is the more efficient compiler; that's key in making a difference.

And you are still making the same mistake DF made all those years ago: trying to compartmentalize everything into one neat, tidy box. Whereas, as DF should have at least known... consoles don't work that way.
Mark Cerny talked about the advantage higher clocks bring to the table in Road to PS5. Now DF and the devs they have talked to are echoing what Cerny said. So no, it isn't a wash according to people who would know.
Yes, we understand that, but it's been discussed that the higher compute unit count in Series X does have its benefits over higher clocks in certain areas.
In an apples-to-apples comparison with both systems having the same theoretical performance, where one has more compute units at lower clocks and the other has higher clocks with fewer compute units, it's without a doubt more beneficial to have the higher clock speeds. But in the specific case of PS5 vs. Series X, there's an 18% compute gain on the Series X side because its considerably higher CU count does more than just match the higher-clocked PS5; it exceeds it. So again, both systems will perform relatively the same; that's what I meant by it being a wash, not that the higher clock doesn't help. What is certain is that the major crux on the Series X side is not having the best compiler.
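For anyone who wants to sanity-check that 18% figure, here's the back-of-the-envelope math from the public specs (a rough sketch; the PS5's clock is a variable boost, so its number is a ceiling):

    # Public specs: XSX 52 CUs @ 1.825 GHz fixed; PS5 36 CUs @ up to 2.23 GHz.
    def tflops(cus, ghz):
        return cus * 64 * 2 * ghz / 1000  # 64 lanes/CU, 2 ops per cycle (FMA)

    xsx, ps5 = tflops(52, 1.825), tflops(36, 2.23)
    print(f"XSX {xsx:.2f} TF vs PS5 {ps5:.2f} TF -> +{xsx / ps5 - 1:.1%}")
    # XSX 12.15 TF vs PS5 10.28 TF -> +18.2%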
> DF are disingenuous hacks....:
> DF when comparing PC GPUs: "Don't just look at TFLOPs because these are BS theoretical sustained floating point operations that are almost never seen in real software workloads."
> Also DF, when comparing XSX with PS5: "On paper, the XSX is a considerably more powerful console [because it has 18% higher SP GPU FLOPS]"
> Make up your mind, DF. According to you, max theoretical single-precision FLOPs only seem to matter when Xbox has a bigger number.

TFs can be used to make ballpark comparisons within the same architecture, because the ratio between compute and the other theoretical rates is generally kept fairly constant. In this case, the PS5 has a higher fillrate due to the much higher clock speed, and also some architectural differences.
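To put a number on the fillrate point, a quick sketch assuming the commonly reported 64 ROPs on both GPUs (fixed-function rates scale with clock, not CU count):

    # Peak pixel fillrate = ROPs * clock (both parts commonly listed at 64 ROPs).
    ROPS = 64
    ps5_fill = ROPS * 2.23    # ~142.7 Gpixels/s
    xsx_fill = ROPS * 1.825   # ~116.8 Gpixels/s
    print(f"PS5 fillrate lead: +{ps5_fill / xsx_fill - 1:.0%}")  # ~+22%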
> TFs can be used to make ballpark comparisons within the same architecture, because the ratio between compute and the other theoretical rates is generally kept fairly constant. In this case, the PS5 has a higher fillrate due to the much higher clock speed, and also some architectural differences.

Not only fillrate: every single GPU fixed-function metric not tied to CUs, including geometry generation/culling and command processing, is higher on PS5. Even things like async compute scheduling are faster on it, not to mention lower-latency caches, higher-bandwidth L1, etc.
Yeah man, just wait... They run better because they're not using the new Series S with 1TB of hard drive space! Wait until the fall, then do the comparisons. These DF fools will be in for a rude awakening!
> Yeah man, just wait...

Been waiting for the past year. I gave up and bought a Nintendo Wii.
this crazy guy is still on it
What a bunch of tools. 40-50 more frames per second. 30% instead of 18%. 9 tflops instead of 10 tflops. STAGGERING! Get the fuck out of here.
> this crazy guy is still on it

His account is proof of mental illness.
> Do you know how much theoretical compute advantage a 4090 has over a PS5... and yet that doesn't stop console people from saying the games "look basically the same"... and in most cases they do... because they aren't designed to take advantage of the hardware... which has been my entire point.

Look, people are saying the 20% teraflop difference is one of the smallest we've had in years, and that's just in terms of simple teraflops, before you even get into the idea that more CUs are harder to keep busy than fewer CUs at higher clocks on PS5. The PS5 was just well designed, and that "20%" rarely showed in 4 years because the higher GPU clock Sony went with gave advantages that beat the XSX at other things beyond just theoretical compute. They had a different philosophy, and it came with its own challenges: they had to spend money to cool the machine better (I remember the dumb fanboy FUD about PS5s running too hot), and they had to provide low-level access and risk breaking compatibility. It's OK to accept that Cerny got it right and concentrated on creating a better-balanced machine that's easier to take advantage of because those fewer CUs are faster.
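To put a rough number on that quoted 4090 comparison (a sketch using Nvidia's boost-clock paper spec, which is exactly the kind of theoretical figure being argued about here):

    # RTX 4090 paper spec: 16384 shaders * 2 ops/cycle * ~2.52 GHz boost.
    rtx_4090 = 16384 * 2 * 2.52 / 1000  # ~82.6 TFLOPs FP32
    ps5 = 10.28                          # PS5 theoretical peak
    print(f"~{rtx_4090 / ps5:.1f}x the PS5's theoretical compute")  # ~8x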
> In thinking about a console with a fixed hardware spec, they'll be able to push the envelope with 6 tflops and will do things that only today's highest-end PCs can do.

Congrats on walking right into that. Perhaps you're the one who should take a break from this thread, all things considered.
> What's sad is people follow them.

I don't know what his job is, but he posts dozens of tweets every day; he has more than 100K tweets, all about PlayStation.
> You do realize that Series X & Series S games are the same games with a few lines of code changed, right?

Given that they want the XSS version to be as close as possible, I'd imagine that the bulk of the optimization time and QA is spent on the weaker machine, and they just expect the Series X to brute-force its way; by doing that, they end up underutilizing their stronger machine. Not a surprise, given that most Xbox users are on XSS.
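For what it's worth, "a few lines of code changed" usually means something like a per-target scalability table. A purely hypothetical sketch (invented names, not any real engine's config), where the game code is identical and only these values differ:

    # Hypothetical per-target quality table; illustrative only.
    QUALITY = {
        "series_s": {"render_scale": 0.50, "rt_reflections": False, "texture_pool_mb": 4096},
        "series_x": {"render_scale": 1.00, "rt_reflections": True,  "texture_pool_mb": 8192},
    }

    def settings_for(target: str) -> dict:
        # One lookup at boot is the only platform-specific branch.
        return QUALITY[target]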
> I don't know what his job is, but he posts dozens of tweets every day; he has more than 100K tweets, all about PlayStation.

And you still don't know what his job is?
What I find crazy is that these fuckers would still jump on the next MS announcement with a straight face, claiming it's the best thing since sliced bread. Not a chance they aren't on a payroll... no one can be that stupid.
> DF are disingenuous hacks....:
> DF when comparing PC GPUs: "Don't just look at TFLOPs because these are BS theoretical sustained floating point operations that are almost never seen in real software workloads."
> Also DF, when comparing XSX with PS5: "On paper, the XSX is a considerably more powerful console [because it has 18% higher SP GPU FLOPS]"
> Make up your mind, DF. According to you, max theoretical single-precision FLOPs only seem to matter when Xbox has a bigger number.

They're deliberately pushing the scale to favor Xbox. To me, the bias is obvious even in the way they're writing the pieces, deliberately downplaying PlayStation feats here and there.
> On paper, the Xbox Series X is a considerably more powerful piece of hardware than PlayStation 5, so why are we looking at a console generation where the Sony and Microsoft machines often deliver like-for-like results?

Now, have the same sentence written this way:

> On paper, the PlayStation 5 is a considerably less powerful piece of hardware than Xbox Series X, so why are we looking at a console generation where the Sony and Microsoft machines often deliver like-for-like results?

It certainly hits you differently when they both say the same thing, right? To me, one praises the Xbox's power, while the second could praise the PS5's feat of matching the same results.
> It's a far cry from the Xbox One X vs PS4 Pro face-off, where the Microsoft machine commanded an obvious advantage - or the PS4 vs Xbox One comparison, where Sony typically held a similarly noticeable lead.

Let's rewrite it to flip the bias:

> It's a far cry from the PS4 vs Xbox One face-off, where the Sony machine commanded an obvious advantage - or the Xbox One X vs PS4 Pro comparison, where Microsoft typically held a similarly noticeable lead.

Can you spot the difference? Every win the competition has over their "favourite" platform is somehow downplayed and buried under lesser adjectives. Every "win" and bigger spec of their preferred platform is played up to hell, even if in the real world it means nothing, like the teraflops difference.
> Nobody ever said it couldn't. What I'm saying is that games need to be made which prioritize more compute-heavy aspects of visuals instead of pushing resolution on Xbox... then you would get games on PS which actually do significantly drop resolution or visual quality compared to Xbox.

That's what I'm telling you: going by theoretical specs, in the best-case scenario for a game made from the ground up for SX, all it would take for the PS5 to run the exact same game is to drop the resolution by 20%, which again is not a considerable difference.
> ...then you would get games on PS which actually do significantly drop resolution or visual quality compared to Xbox.

A 20% compute advantage at best would translate into a 20% resolution advantage IF there is 100% CU utilization and scaling, and that's the best-case, ideal CU-utilization scenario.
> 1728p can feel like a 4K image (it's not, obviously, but it looks crisp), whereas 1440p always looks "soft".

That's subjective, and it's very arbitrary to draw the line where a resolution "feels" like 4K. Objectively it's only a 20% difference in resolution, and 20% is not considerable, especially after factoring in image reconstruction, upsampling, and other post-process effects.
It's less than 1728p. You have to calculate from area, not vertical resolution.
> You need to calculate 20% more pixels, not vertical resolution. That would be a little over 1576p.

Thanks, my bad. And that's the ideal hardware utilization for SX, which is less in real game scenarios, where engines don't always enjoy perfect CU utilization and there are other bottlenecks to contend with. Even in the best-case scenario for Xbox, the spec gap is not considerable.
So 1440p -> 1576p is a 20% bump in performance. Tough to notice that on a typical 65” display.
Edit: beaten by fafalada
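The arithmetic behind that, for anyone following along (perfect-scaling assumption, which real games rarely hit):

    import math

    base_h = 1440                    # 2560x1440
    scale = math.sqrt(1.20)          # +20% pixels -> sqrt(1.2) per axis
    print(round(base_h * scale))     # ~1577, i.e. "a little over 1576p"
    # By contrast, 1728p would need (1728/1440)**2 - 1 = 44% more pixels.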
Look, people are saying the 20% teraflop difference is one of the smallest we've had in years, and that's just in terms of simple teraflops, before you even get into the idea that more CUs are harder to keep busy than fewer CUs at higher clocks on PS5. The PS5 was just well designed, and that "20%" rarely showed in 4 years because the higher GPU clock Sony went with gave advantages that beat the XSX at other things beyond just theoretical compute. They had a different philosophy, and it came with its own challenges: they had to spend money to cool the machine better (I remember the dumb fanboy FUD about PS5s running too hot), and they had to provide low-level access and risk breaking compatibility. It's OK to accept that Cerny got it right and concentrated on creating a better-balanced machine that's easier to take advantage of because those fewer CUs are faster.

I'm not sure why you're dragging the 40% difference between the PS5 Pro and SX into this either. That's double the difference, no? Why are you trying to make a completely theoretical "20%" seem huge, but a 40% advantage (plus a completely different, newer architecture) between the PS5 Pro and SX seem unsubstantial?

You got a warning in the other PS5 Pro thread, where you were talking about midrange GPUs being "far more powerful" than the PS6 and your GPU "smoking it" for years before it releases, then going on to say they shouldn't release the Pro because they should stick to one console for 7 years.

Which I would have given the benefit of the doubt about, both for claiming midgen cards will beat the PS6 and for supporting a single console, if you weren't boasting about the Xbox One X pushing the envelope and doing things only today's "highest end cards" can do.

That's when people can see through the bullshit.
> What's sad is people follow them.

Noteworthy that the "new feat." was accessible on PS4 a decade ago (courtesy of GNM basically being 1:1 GPU assembly).
Google Remij systemwars.
That's who you're dealing with.
> No console has sold and dominated like the PS2, yet multi-platform games on Xbox and GameCube consistently performed better, hence I'm not really buying the whole "PS5 is the lead console therefore devs are optimising it better than XSX".

That's called moving the goalposts.
> No console has sold and dominated like the PS2, yet multi-platform games on Xbox and GameCube consistently performed better, hence I'm not really buying the whole "PS5 is the lead console therefore devs are optimising it better than XSX".

Well, one could argue that those machines were substantially different, and that the PS2 also deserved a proper port instead of being left out to dry, because it sold way more.
> Look, people are saying the 20% teraflop difference is one of the smallest we've had in years, and that's just in terms of simple teraflops, before you even get into the idea that more CUs are harder to keep busy than fewer CUs at higher clocks on PS5. The PS5 was just well designed, and that "20%" rarely showed in 4 years because the higher GPU clock Sony went with gave advantages that beat the XSX at other things beyond just theoretical compute. They had a different philosophy, and it came with its own challenges: they had to spend money to cool the machine better (I remember the dumb fanboy FUD about PS5s running too hot), and they had to provide low-level access and risk breaking compatibility. It's OK to accept that Cerny got it right and concentrated on creating a better-balanced machine that's easier to take advantage of because those fewer CUs are faster.
> I'm not sure why you're dragging the 40% difference between the PS5 Pro and SX into this either. That's double the difference, no? Why are you trying to make a completely theoretical "20%" seem huge, but a 40% advantage (plus a completely different, newer architecture) between the PS5 Pro and SX seem unsubstantial?
> You got a warning in the other PS5 Pro thread, where you were talking about midrange GPUs being "far more powerful" than the PS6 and your GPU "smoking it" for years before it releases, then going on to say they shouldn't release the Pro because they should stick to one console for 7 years.
> Which I would have given the benefit of the doubt about, both for claiming midgen cards will beat the PS6 and for supporting a single console, if you weren't boasting about the Xbox One X pushing the envelope and doing things only today's "highest end cards" can do.
> That's when people can see through the bullshit.

You're confused... I'm not the one saying 20% isn't substantial... and I'm certainly not saying 40% isn't substantial... quite the opposite. I mean, I can't even begin to understand how you came to that conclusion. The argument made against what I'm saying is "well, just drop the resolution, problem solved"... and yet that's not really the case.
> That's what I'm telling you: going by theoretical specs, in the best-case scenario for a game made from the ground up for SX, all it would take for the PS5 to run the exact same game is to drop the resolution by 20%, which again is not a considerable difference.
> A 20% compute advantage at best would translate into a 20% resolution advantage IF there is 100% CU utilization and scaling, and that's the best-case, ideal CU-utilization scenario.
> That's subjective, and it's very arbitrary to draw the line where a resolution "feels" like 4K. Objectively it's only a 20% difference in resolution, and 20% is not considerable, especially after factoring in image reconstruction, upsampling, and other post-process effects.
> Thanks, my bad. And that's the ideal hardware utilization for SX, which is less in real game scenarios, where engines don't always enjoy perfect CU utilization and there are other bottlenecks to contend with. Even in the best-case scenario for Xbox, the spec gap is not considerable.

They always carefully choose their words to put Xbox in a positive light.
> I did
> Now I have AIDS

Probably had it before.
> No console has sold and dominated like the PS2, yet multi-platform games on Xbox and GameCube consistently performed better

The Xbox-to-PS2/GC "power" delta was comparable to that of the X1X to the original PS4, but even more dramatic in some ways (HDD built in, etc.).
Well, it has been out of sight. No one's buying it.
> Google Remij systemwars.
> That's who you're dealing with.

lmao, attacking me and inciting others to do the same because I said a 20% compute advantage is substantial... wow