> My problems with your arguments are:

If the difference is native 4K vs upscaled, it is going to make people want native.
A lot of people apparently just brush over the XB1's pathetic marketing and the fact that it was significantly weaker AND more expensive.
Oh look, it's an irrelevant 1.3 vs 1.3 comparison.
Maybe try 1.3 vs 1.8 next time.
They were just parroting Cerny's talking points.
I guess we're at the "selectively choosing the direction of percentage calculations to downplay the difference" stage.
So again, the XBone was the superior, more powerful console. It had fewer CUs and higher clocks. Apparently developers just decided to render their multiplats at a significantly lower resolution because... reasons... I guess?
The PS5 is not going to sustain 10 TF; that's just a marketing number, which is theoretical anyway and has little to do with the real world. The power difference is still small.
All I care about is the noise. Sony should be called out if they can't make a quiet console. This is how many generations of noiseboxes now? Especially since MS clearly has the cooling tech sorted out.
> So again, the XBone was the superior, more powerful console. It had fewer CUs and higher clocks. Apparently developers just decided to render their multiplats at a significantly lower resolution because... reasons... I guess?

> They were just parroting Cerny's talking points.

> I guess we're at the "selectively choosing the direction of percentage calculations to downplay the difference" stage.
This is correct. But if all the XSX is doing is pushing 18% more pixels, then outside the console warriors most people just aren't going to care.
40% last gen was sometimes the difference between HD and sub-HD in multiplatform games, and tbh it wasn't really that big a deal for most people. Assuming that 10.3 TFLOPs is enough to do 4K at a stable frame rate, the extra 18% is going to be even less of a big deal.
Price and games will be far more important factors as we go into the next gen.
I'll simplify this for you.
If both consoles are pushing the exact same graphics, the XSX will be doing so at a near 20% higher resolution. If both consoles are pushing the same resolution, the XSX will be capable of doing so with more graphical bells and whistles.
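To put numbers on the percentage-direction point above: the same gap reads as ~18% one way and ~15% the other, and mapping it straight onto pixels gives a sense of scale. A back-of-envelope sketch using the commonly cited 12.15 / 10.28 TF figures; real games won't scale this cleanly:

```python
xsx_tf = 12.15  # 52 CUs @ 1.825 GHz
ps5_tf = 10.28  # 36 CUs @ 2.23 GHz (peak, variable clock)

# The same gap, stated in both directions:
xsx_over_ps5 = (xsx_tf / ps5_tf - 1) * 100   # "XSX is ~18% more powerful"
ps5_under_xsx = (1 - ps5_tf / xsx_tf) * 100  # "PS5 is only ~15% weaker"
print(f"XSX over PS5: +{xsx_over_ps5:.1f}%")
print(f"PS5 under XSX: -{ps5_under_xsx:.1f}%")

# If the compute gap mapped directly onto resolution:
pixels_4k = 3840 * 2160
print(f"{pixels_4k * xsx_tf / ps5_tf / 1e6:.1f}M vs {pixels_4k / 1e6:.1f}M pixels")
```

Both numbers describe the exact same hardware gap, which is why each side picks the direction that sounds better for them.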
My problems with your arguments are:
While stating the differences in resolution/frames/graphics between the One and the PS4, you only take the GPU into account. As someone already told you, the PS4 had 3x faster RAM to feed that GPU much better. Even if the One had an identical GPU to the PS4, you would still see differences. This time the RAM solutions are more complex and much closer overall; it's certainly not a 2x or 3x difference in speed. However, my wild take is that the PS4 would still be able to achieve 40% higher res even without the RAM factor (but not the other advantages), so in that regard I think you are correct. The problem comes with the following point:
The Xbox One GPU had higher clocks by like 6%, so around 4 times less than the clock difference between the PS5 and SeX, to compensate for a 1.5x difference in CUs between the GPUs. Even if the clocks are variable, we simply don't know how effective SmartShift is, only that it is clearly there to keep the GPU at max clock without changing the power consumption. Without SmartShift, the CPU could waste the power budget running at unnecessarily high clocks and force the GPU to downclock because of the overall power budget. Lowering clocks on one part reduces the power it consumes and lets the GPU's share of the power budget increase.
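AMD hasn't published how SmartShift actually decides, so here is only a toy model of the idea being described: with one shared power budget, watts freed by a lighter CPU load let the GPU hold its top clock, and only a few percent of GPU clock has to go when both parts are maxed. All wattages and the cubic power curve are illustrative assumptions, not PS5 figures:

```python
# Toy model of a shared CPU+GPU power budget (all numbers illustrative).
# Power rises super-linearly with clock; modeled here as f^3.
BUDGET_W = 200.0

def power(base_w, f, f_max):
    return base_w * (f / f_max) ** 3

def shift(cpu_f, gpu_f, cpu_fmax=3.5, gpu_fmax=2.23,
          cpu_w=60.0, gpu_w=160.0):
    """If the CPU doesn't need its top clock, the freed watts
    let the GPU hold its maximum clock; otherwise the GPU
    sheds a few percent of clock until the budget fits."""
    used = power(cpu_w, cpu_f, cpu_fmax) + power(gpu_w, gpu_f, gpu_fmax)
    while used > BUDGET_W:
        gpu_f *= 0.99  # small downclock steps, not a dramatic drop
        used = power(cpu_w, cpu_f, cpu_fmax) + power(gpu_w, gpu_f, gpu_fmax)
    return round(gpu_f, 2)

print(shift(cpu_f=3.0, gpu_f=2.23))  # lighter CPU load -> GPU holds 2.23
print(shift(cpu_f=3.5, gpu_f=2.23))  # full CPU load -> GPU drops to ~2.12
```

The cubic curve is why a small clock reduction frees a disproportionate amount of power, which is the mechanism Cerny described.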
Also, yes, sorry, but we need to take SSDs into account. Tell me "ESRAM LULZ" if you want, but we simply don't have examples of games designed around SSDs. Now, aside from the 2x difference in raw speed (which is around 4x in ideal conditions), we don't have much idea of how the SeX handles the SSD integration with all the other parts.
Cerny designed the SSD of the PS5 to be around 100% efficient, meaning 100x faster in specs and 100x faster in actual usage, while SSDs on PCs have bottlenecks that prevent them from actually using their true speed. I'm sure the SeX mitigates this, but by how much? We don't know. What if its SSD is 40x faster, but the data loading can only reach 20x? You would see a 4x to 8x difference there. Mind you, both Sony and MS stated that SSDs CAN increase RAM efficiency, and they are in fact using them for that. If the PS5 SSD has such a big advantage, you can already see how the advantage of those 10 GBs could be vastly mitigated. Also, like any other storage, SSDs can be used to load parts of the game on and off; if you can do it very fast, you can save GPU power, because the GPU does not need to consider all the data together but only some of it at any given time (same for the RAM, that's the point). If you have a far faster SSD, you can do it better.
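One way to see the "SSD increases RAM efficiency" claim both companies are making: at raw sequential spec speed, the PS5 can turn over about a third of its 16 GB of RAM every second. Raw figures from the public spec sheets; real I/O paths add overhead, and compressed throughput can be higher:

```python
ram_gb = 16.0  # both consoles ship 16 GB total

for name, gb_s in [("XSX SSD (raw)", 2.4), ("PS5 SSD (raw)", 5.5)]:
    refill_s = ram_gb / gb_s          # time to replace the entire RAM contents
    share = gb_s / ram_gb             # fraction of RAM replaceable per second
    print(f"{name}: full RAM refill in {refill_s:.1f} s "
          f"(~{share:.0%} of RAM replaceable per second)")
```

The faster the turnover, the less the console has to keep resident "just in case", which is the sense in which the SSD substitutes for extra RAM.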
And there is also all the audio offload that the SeX does, but it seems to be only for the CPU and RAM, while the PS5's Tempest also takes care of the GPU part. I'm not well informed about this.
I honestly have little doubt that the SeX will perform better, because third parties aren't going to put in all the effort, but if they want to, they have the tools to mitigate part of the differences. That's why I don't find "20% more power = 20% more of this and that" totally correct. However, my understanding of all of this is VERY BASIC; I'm not trying to correct you, I'm trying to consider all the factors.
> I'll simplify this for you.
Your argument makes no sense.
Fewer CUs don't give you better performance over a 1.84 TF GPU.
> Even if the clocks are variable, we simply don't know how effective SmartShift is

you wot???
You do realise that the XBone had fewer CUs and higher clocks than the PS4, right? You also realise the PS4 had more TFLOPs, yes?
So unless the XBone is all of a sudden better than the PS4 because clocks, your entire argument is moot.
> Performed as expected compared to PS4, which also had much, much better RAM aside from the GPU.

That's a whole lot of word salad to completely misplace your own arguments about SSDs and ESRAM.
Accept it or not, but Microsoft knew what they were doing with ESRAM on the XBone. That's why it wasn't rendered entirely useless while using extremely slow DDR RAM for the GPU, although obviously it was a workaround and the console would have been much better off with proper GDDR memory. At the end of the day, it performed as expected for its computational abilities.
Then you follow up by trying to use an even slower SSD to make up for memory speed differences. An SSD is not RAM, an SSD is not a GPU, and scenes aren't rendered on an SSD. Is it going to be better than a standard HDD? Obviously, but the PS5 isn't the only next-gen console getting an SSD.
I've already commented before on this pipe dream of streaming in assets as a player pans the camera. That might work for the super-slow-aim, super-slow-brain types that take 2 seconds to turn around, but for everyone else who can whip 180 degrees in 1/10th of a second, it just isn't going to be viable, unless you like to watch a scene render in front of you every single time you turn around.
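For scale, here's how much data each drive can actually deliver during that 1/10th-of-a-second flip. Raw sequential spec numbers; the HDD figure is a typical approximation, and engines also keep low-detail versions of off-screen assets resident rather than evicting everything behind the camera:

```python
flip_s = 0.1  # whipping the camera 180 degrees in a tenth of a second

for name, gb_s in [("HDD (~typical)", 0.1),
                   ("XSX SSD (raw)", 2.4),
                   ("PS5 SSD (raw)", 5.5)]:
    mb = gb_s * flip_s * 1000  # GB/s * s -> GB, then to MB
    print(f"{name}: ~{mb:.0f} MB deliverable during the flip")
```

Whether a few hundred MB per flip is enough depends entirely on how aggressively an engine evicts, which is exactly what's unproven about the streaming pitch.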
And that's where I stopped.
It did have fewer CUs.
So, what's your point?
It performed as expected compared to the PS4, which also had much, much better RAM aside from the GPU.
Yes, I know scenes aren't rendered on an SSD and an SSD is not RAM, but it's NOT my claim when I talk about increasing RAM efficiency with SSDs; both MS and Sony stated this. It's their plan instead of putting in 24 GBs of RAM while also sacrificing storage speed.
I'm not talking about loading the entire game when you turn around lol. I'm talking about selectively loading various elements on and off to make the GPU more efficient, which is actually what RAM does by default. Any storage with enough speed can do this. In theory, the PS5 should be able to focus more on what's in the FOV than the SeX without stressing the RAM.
Guess what else also has fewer CUs.
> Guess what else also has fewer CUs.

Yes, being in RAM is not equal to being rendered. That's my point.
Exactly?
ESRAM did what it was designed to do: make up for having bad main memory. It was still an objectively worse way to design a processor.
Being in RAM =/= currently being rendered. At best, the PS5 will be able to use its available but slower memory slightly more efficiently.
It's becoming less amusing reading this thread .. most of the posts are kinda pathetic.
Somehow numbers are no longer numbers...
I can see why America ended up with a president who can't accept reality .. or even accept what he said yesterday.
The day the GitHub leak became real... suddenly CU counts don't matter, unstable clock speed is a benefit, and unknown, unexplainable hidden technologies will easily overcome and outshine cold hard facts.
> It's becoming less amusing reading this thread ..

Weren't you banned a while ago? I see you are back.
So your entire argument is moot.The PS5 has less CUs than the XsX.
And your point?
> The day github became real ....suddenly Cu counts don't matter ..

Lol exactly... reminds me of the "hidden chip" inside the Xbox One... funny shit
> So your entire argument is moot.

And with this I agree anyway. I don't expect the PS5 to perform better in any case, especially not because of clocks.
10.2 is not going to be > 12.1 because clocks. Just stop already.
> Weren't you banned a while ago? I see you are back.

I was banned for trolling people who turned out to be snowflakes.
> I was banned for trolling people who turned out to be snowflakes.

Maybe stop trolling people before pretending that others stop being snowflakes.
These things have been stated by Digital Foundry.
You must have other information they don't know.
I'm sure you can't show it because you're not speaking based on facts.
Who said 10.2TF > 12TF?
At least take the time to read.
How is it his logic when it's a fact?
But it doesn't show the Xbox SX to be the most powerful overall when the PS5 has a rendering advantage from its higher clock rate.
DF seems fairly confident that the console with the better specs will be faster. ...
news at 11
> Maybe stop trolling people before pretending that others stop being snowflakes.

I have never lowered myself to the shame of having to ask a mod to step in on my behalf ... I am the hunter, never the prey.
You have been constantly trying to claim that the power difference doesn't exist because clocks.
That is you trying to rush to the defence of
> I have never lowered myself to the shame of having to ask a mod to step in on my behalf ... I am the hunter never the prey.
It looks like you're having trouble comprehending what's going on.
> I have never lowered myself to the shame of having to ask a mod to step in on my behalf ... I am the hunter never the prey.

Lol, the hunter? Do you think you are cool saying this? This is a forum, buddy, not the Hunger Games. Mods are here to mod; they can't be everywhere, so - get ready for this - people call them. Learn how to properly discuss something, that's the only thing here. You're no hunter if mods can fuck you up with a ban, get real.
You certainly are.
Your arguments of 1.3 vs 1.3 are entirely irrelevant.
Just stop already.
You replied to this post, and I responded:

> 10.2TF from a 36CU GPU clocked at 2.23GHz wouldn't get you the same result as 10.2TF from a 44CU GPU clocked at 1.825GHz
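For reference, those TF figures fall out of a simple formula: on GCN/RDNA-family GPUs, each CU has 64 shader ALUs doing 2 FLOPs per cycle (a fused multiply-add). Note that both configurations in the quote really do land on ~10.3 TF, which is the point being made:

```python
def tflops(cus, ghz, alus_per_cu=64, flops_per_alu=2):
    """Peak FP32 throughput: CUs x ALUs x 2 (FMA) x clock."""
    return cus * alus_per_cu * flops_per_alu * ghz / 1000

print(tflops(36, 2.23))   # PS5: ~10.28 TF (at peak clock)
print(tflops(44, 1.825))  # the hypothetical 44 CU part: ~10.28 TF
print(tflops(52, 1.825))  # actual XSX: ~12.15 TF
```

Identical peak TFLOPs can still behave differently in practice (cache, fill rate, and front-end rates scale with clock, not CU count), which is what the quoted claim is getting at.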
> If the difference is native 4K vs upscaled, it is going to make people want native.

Both will have "4K native". Next gen is different from last, as next gen will have technology for optimization of resolution and image quality.
> Then you follow up with trying to use an even slower SSD to try and make up for memory speed differences. An SSD is not RAM, an SSD is not a GPU, scenes aren't rendered on an SSD. Is it going to be better than a standard HDD? Obviously, but it's not the only next gen console to be getting an SSD.

SSDs allow you to improve image quality. And the faster the SSD, the more sophisticated things devs can do.
> 10.2 is not going to be > 12.1 because clocks. Just stop already.

The relevance of this difference might be much less than you imagine. This means that other parts (like audio or the SSD) might actually bring more to the graphics fidelity and image quality than pure paper specs.
> They seem fairly confident in the things you're trying to debunk, such as the SSD and CUs

If you're talking about this..
> You just continue to rant with no facts. It's better to be informed than completely ignorant.

Except "44CUs" is fictional, irrelevant nonsense again. Just like your 1.3 vs 1.3 rubbish.
Just stop already. Take your "but clocks" arguments to GameFAQs.
You just continue to rant with no facts. It's better to be informed than completely ignorant.
If you're talking about this..
They make sure to say "Sony's pitch" or "Cerny's claim" ... there is almost no editorial or fact checking..
They certainly in no way claim it's faster than the XSX.
I don't know what you're trying to achieve out of that lol.
> Good to know the XSX was downgraded to 44CU recently and that the XBone was always the superior, more powerful console.

Ray Tracing needs more CUs.
They are the "facts" you're trying to share.
Just stop already.
> Ray Tracing needs more CUs.

10/10 relevance.
You really need to do more research. lol
You guys should watch the following video from 14:59 to 20:42. It explains why Sony chose to sacrifice a higher number of CUs for I/O optimization.
1. Accessing and storing information is the biggest bottleneck in modern computer systems.
2. Performing arithmetic operations doesn't consume nearly as much energy and bandwidth as accessing and storing the information needed to perform the operations - and accessing and storing the information that results from the operations.
3. Hence, CU count and frequency, which together determine the number of operations a GPU can perform per second, are not as important as conveying and accessing the information associated with the computations that the CUs perform.
4. This is why Sony opted to allocate more of the budget for the PS5's design to its I/O system rather than the number of CUs in its GPU.
5. Hence, the XSX GPU will be hindered by the massive bandwidth consumption of the larger number of computations that its larger number of CUs perform per cycle, and it will consume much more power.
6. Hence, the greater computational capacity of the XSX's GPU will counteract itself due to its greater consumption of bandwidth and energy - and it will also be counteracted by the PS5's significantly faster I/O system.
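Points 1-3 are essentially the standard "roofline" observation: extra ALUs only help if memory can feed them. A rough sketch with the public XSX figures (12.15 TF FP32, 560 GB/s peak bandwidth); the 8 FLOPs-per-byte shader below is purely an illustrative guess, since real intensity is workload-dependent:

```python
peak_tflops = 12.15    # XSX peak FP32 compute (public spec)
peak_bw_gb_s = 560.0   # XSX peak memory bandwidth, 10 GB "GPU optimal" pool

# FLOPs the GPU must perform per byte fetched to stay compute-bound:
balance = peak_tflops * 1e12 / (peak_bw_gb_s * 1e9)
print(f"breakeven: ~{balance:.0f} FLOPs per byte")

# A hypothetical shader doing only ~8 FLOPs per byte it touches
# is bandwidth-bound: the extra CUs sit partly idle.
shader_intensity = 8
attainable_tf = min(peak_tflops, shader_intensity * peak_bw_gb_s / 1000)
print(f"attainable: ~{attainable_tf:.2f} TF")
```

This doesn't prove point 5 or 6 (the XSX also has more bandwidth than the PS5), but it does show why raw CU count alone is not a complete performance predictor.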