[Digital Foundry] PS5 uncovered

I was somewhere around the Doom part where he was playing it, if I remember right. I could have heard him wrong, though, as I was doing some admin work while listening to it. But that quote you just posted makes it sound like those clocks are fixed and will always be able to run at max frequencies whenever they feel like it. You either can run at full clocks or you can't; there is no in-between, basically.

That the CPU and GPU clocks jump up and down when needed isn't that interesting; that's just nice for your energy bill at the end of the day, and that's about it.



Debunk what I say, or stop quoting me. This is the third time now in a topic that you've started shitting on my posts with no arguments other than your flower-power feelings logic, which I couldn't give two shits about. If you have information I don't, share it and counter my post with facts. I'll even appreciate it. As of now, you are just wasting my time.



Nice argument. Got any actual factual information or tech details, instead of posts that say nothing but "you're wrong"?



Hey look, another one pops up. No argument, just straight-up shitposting. Nice job, mate.
Nothing you said makes sense lol.
Plus there are incorrect claims/assumptions in there too.
 
Performance-wise, RE3 runs better on the Pro than on the X... I mean, this is a fact given the numbers shown; performance as in frame rate. Why is this so hard for people to understand?
The X has the better resolution.
The Pro has better frame-rate performance, hence that version performs better.
The Pro version runs better because the Xbox One X version runs at 4K; if they both ran at the same resolution, the One X version would be superior.
Even a noob like me can grasp this simple concept.

It's just a screw-up from Capcom, who wanted to push 4K on the One X. If they had gone for 1800p, poof, the One X version would be better in both resolution and frame rate.
 
The Pro version runs better because the Xbox One X version runs at 4K; if they both ran at the same resolution, the One X version would be superior.
Even a noob like me can grasp this simple concept.

It's just a screw-up from Capcom, who wanted to push 4K on the One X. If they had gone for 1800p, poof, the One X version would be better in both resolution and frame rate.
Actually, neither version is 4K... they're both CBR (checkerboard rendering).
 
The Pro version runs better because the Xbox One X version runs at 4K; if they both ran at the same resolution, the One X version would be superior.
Even a noob like me can grasp this simple concept.

It's just a screw-up from Capcom, who wanted to push 4K on the One X. If they had gone for 1800p, poof, the One X version would be better in both resolution and frame rate.

That's quite the assumption considering both of them are CBR.

Edit: lol Ethomaz beat me to it.
 
The CPUs are virtually identical on both machines (~3% difference). Considering that on PS5 the CPU won't have to process audio or I/O at all (which won't be the case on XSX, contrary to what they want you to believe), the PS5 CPU may well be more potent in actual games.

And XSX memory bandwidth has constraints. But yeah, XSX has a bit more teraflops of compute power; there is no denying that.
The PS5 CPU in boost mode is almost identical (an 8-9% difference), and I'm not sure what you mean by the audio part, since the XSX has dedicated hardware for that.
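For reference, a minimal sketch of where these percentages come from, assuming the publicly stated clocks (PS5 up to 3.5 GHz with SMT; XSX 3.66 GHz with SMT, 3.8 GHz without):

```python
# Clock gap between the two CPUs under the stated figures.
ps5 = 3.5                       # GHz, PS5 cap (SMT enabled, variable)
xsx_smt, xsx_nosmt = 3.66, 3.8  # GHz, XSX with / without SMT

print(f"SMT vs SMT:    {(xsx_smt / ps5 - 1) * 100:.1f}%")    # ~4.6%
print(f"SMT off vs on: {(xsx_nosmt / ps5 - 1) * 100:.1f}%")  # ~8.6%
```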
 
Matt is not trustworthy at all.
Nobody is saying the ultra-fast SSD will give the PS5 an extra teraflop. Why would anyone say that? Teraflops don't matter much anyway. That 18% teraflop advantage will only result in a slightly higher output resolution that you can't notice even in YouTube comparison videos.

People are saying (including developers) that the PS5 SSD will allow unprecedented, unmatched, insane texture detail and more varied art. A developer explained this in detail in one of his tweets.
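As a sanity check on that 18% figure, here is the standard teraflop arithmetic using the announced CU counts and clocks (TFLOPS = CUs × 64 shaders × 2 ops per clock × GHz):

```python
# Back-of-the-envelope TFLOPS from CU count and clock speed.
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000  # 64 shaders/CU, 2 FLOP/clock (FMA)

ps5 = tflops(36, 2.23)   # ~10.28 TF
xsx = tflops(52, 1.825)  # ~12.15 TF
print(f"PS5 {ps5:.2f} TF, XSX {xsx:.2f} TF -> {(xsx/ps5 - 1)*100:.0f}% gap")
```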
What good is high texture detail if you don't have the resolution to display it? Which developers will massively staff up their art department so they can use that work output on exactly one platform? How many use cases actually exist where you want to swap out gigabytes of data in a second? It's basically what you see in Outer Wilds (trees appearing when you look away), the time-travel level in Titanfall 2, and the moment in Jedi: Fallen Order when you run to a door, turn around, and you're somewhere else. Yeah, it already works today, at 80 MB/s.

Even the XSX SSD won't get much use aside from loading times, because having a static part of memory is nice. Not having to constantly change memory addresses and flush content is nice.

Where the SSDs will be used is streaming assets in open-world games, where you no longer have the same restrictions on how fast your player can traverse the landscape. But that's limited by the slower platform. Devs will just load uncompressed assets on PS5 and (losslessly) compressed assets on XSX. You can't have players run or fly faster on one platform.
 
Indeed. Remember the crazy streaming techniques GTA V used on PS3/360, streaming simultaneously from disc and HDD to maximize bandwidth.
Rockstar's first next-gen-only game will be insane. GTA 6 might be cross-gen :/
Nah, GTA 6 is going to be full next-gen. That game is not coming out before 2022-2023. RDR2 was in development for 8-9 years with 1000+ people working on it (so almost nobody to spare for GTA 6 development), and GTA 6 is even more ambitious and a bigger brand.
There is no chance Rockstar is going to waste the new CPU/SSD stuff developing for 10+ year old machines.
 
What good is high texture detail if you don't have the resolution to display it? Which developers will massively staff up their art department so they can use that work output on exactly one platform? How many use cases actually exist where you want to swap out gigabytes of data in a second? It's basically what you see in Outer Wilds (trees appearing when you look away), the time-travel level in Titanfall 2, and the moment in Jedi: Fallen Order when you run to a door, turn around, and you're somewhere else. Yeah, it already works today, at 80 MB/s.

Even the XSX SSD won't get much use aside from loading times, because having a static part of memory is nice. Not having to constantly change memory addresses and flush content is nice.

Where the SSDs will be used is streaming assets in open-world games, where you no longer have the same restrictions on how fast your player can traverse the landscape. But that's limited by the slower platform. Devs will just load uncompressed assets on PS5 and (losslessly) compressed assets on XSX. You can't have players run or fly faster on one platform.

This post is the most wrong thing I've seen yet, LOLOLOLOL!!!! And neither console today streams data at 80 MB/s; it's closer to 50 MB/s.
 
Actually, neither version is 4K... they're both CBR (checkerboard rendering).
and yet the One X version is still superior in resolution output:
  • PS4 Pro - 2880x1620, checkerboard
  • Xbox One X - 2160p with reconstruction

It doesn't matter whether it's native or not. There is a reason consoles can't do 8K CBR: it's still heavy on performance. It's not free just because it's checkerboarded.
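For scale, multiplying out the two output grids listed above:

```python
# Pixel counts of the reconstructed output grids.
pro = 2880 * 1620   # PS4 Pro: 4,665,600 px
xox = 3840 * 2160   # One X:   8,294,400 px
print(f"One X outputs {xox / pro:.2f}x the pixels of the Pro")  # ~1.78x
```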
If you think that with resolution parity the inferior hardware can get better frame rates, I don't know what to say, other than lazy-ass devs not optimizing for the better hardware.

The PS5 vs. XSX differences are going to be far smaller because the gap is not that big, I'll give you that.
 
Why do you guys care so much about who has the "stronger" console? You've been arguing about this for weeks. The people who like PlayStation will buy the next one no matter what; the same goes for Xbox. Most of you are grown-ups with jobs who could buy both of them if you really wanted to.
Both consoles are far better than their predecessors, meaning the games you love will play and look even better than they did before. The OS on both consoles is going to be better as well.
 
What good is high texture detail if you don't have the resolution to display it?

We can definitely see CGI running at 1080p looking better than Fortnite or PUBG running at 4K. (An exaggeration, of course, but you get the point.)

Which developers will massively staff up their art department so they can use that work output on exactly one platform?

Assets can be created procedurally; not all of them have to be handcrafted. I believe I saw in the documentary about HZD that Guerrilla Games did just that. These generated assets wouldn't have to be cut down or reduced in size just to fit in RAM (including those pre-cached assets that may or may not be used).
 
Why do you guys care so much about who has the "stronger" console? You've been arguing about this for weeks.

Threads about the PS5 can discuss its power and advantages without comparisons. The problem is that a lot of "them" come in and make a lot of absurd claims and strawman arguments.

The next-gen speculation thread is a different matter, though. Just don't go there if you don't want to read the back-and-forth bickering.

Meanwhile, this kind of thread should be safe, if only some people wouldn't come here and spoil the fun.
 
The Pro version runs better because the Xbox One X version runs at 4K; if they both ran at the same resolution, the One X version would be superior.
Even a noob like me can grasp this simple concept.

It's just a screw-up from Capcom, who wanted to push 4K on the One X. If they had gone for 1800p, poof, the One X version would be better in both resolution and frame rate.
I see a lot of ifs, ands, and buts, but all I see is the current most powerful hardware failing to reach a better frame rate. Frame rate over resolution, always.
 
Even Nvidia is clearer and more transparent with this kind of power-shifting workaround tech.

[Nvidia GeForce Spring 2020 presentation, slides 14 and 15]
 
I see a lot of ifs, ands, and buts, but all I see is the current most powerful hardware failing to reach a better frame rate. Frame rate over resolution, always.
OK dude, think what you want. I'm not going to waste time with people who can't understand even simple facts or who are just straight-up fanboying/trolling.
Good day to you.
 
OK dude, think what you want. I'm not going to waste time with people who can't understand even simple facts or who are just straight-up fanboying/trolling.
Good day to you.
Facts? The fact of the matter is that the PS4 Pro version runs at a better frame rate, period. Is it a lower resolution? Duh, I've stated that. Somehow you oh-so-special people want to play fantasy with "well, if the PS4 Pro ran at a higher resolution, the frame rate would dive." Sure, and if my aunt had balls she'd be my uncle. But the Pro version doesn't, and in fact it has a HIGHER frame rate and PERFORMS better than the X version. Would you like me to teach you how to tie your shoes next?
 
Even Nvidia is clearer and more transparent with this kind of power-shifting workaround tech.

[Nvidia GeForce Spring 2020 presentation, slides 14 and 15]

That's similar to AMD SmartShift.

PS5 is different, although it also incorporates the AMD technology.

PS5 will throttle based on workloads, not based on thermals. The CPU and GPU will run at maximum frequency for normal workloads (which is 99% of the time). It will only throttle down by a few percent if the workload has hit its maximum for BOTH the CPU and GPU. If the workload has reached 100% utilization on the GPU only, it will not throttle, because the CPU can shift its power budget over.

It's like having the PS5 APU running at a constant boost without having to revamp its power and cooling budgets. There is no worst-case scenario that Sony has to account for, because those 10 milliseconds of 100% utilization on both CPU and GPU (which is very rare) will be compensated for by throttling the frequency back a few percent.

This is a smart and efficient design. The PS5 APU will be punching above its weight 99% of the time without Sony having to increase the power and cooling budget.
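To make the described scheme concrete, here's a toy sketch of a workload-based power governor. The budget and power coefficients are made up for illustration; this is not Sony's actual algorithm:

```python
# Toy model of workload-based (not thermal-based) clock control.
CPU_MAX, GPU_MAX = 3.5, 2.23   # GHz frequency caps
BUDGET_W = 265.0               # fixed total power budget (hypothetical)

def clocks_for(cpu_load: float, gpu_load: float):
    """Loads are 0..1 activity estimates derived from the workload."""
    cpu, gpu = CPU_MAX, GPU_MAX
    # Assume power scales with activity * frequency (very simplified).
    while 40 * cpu_load * cpu + 60 * gpu_load * gpu > BUDGET_W:
        cpu *= 0.98   # trim ~2% per step until the budget is met
        gpu *= 0.98
    return round(cpu, 2), round(gpu, 2)

print(clocks_for(0.7, 0.9))  # typical frame -> (3.5, 2.23), full clocks
print(clocks_for(1.0, 1.0))  # rare 100%/100% spike -> trimmed a few %
```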
 
Facts? The fact of the matter is that the PS4 Pro version runs at a better frame rate, period. Is it a lower resolution? Duh, I've stated that. Somehow you oh-so-special people want to play fantasy with "well, if the PS4 Pro ran at a higher resolution, the frame rate would dive." Sure, and if my aunt had balls she'd be my uncle. But the Pro version doesn't, and in fact it has a HIGHER frame rate and PERFORMS better than the X version. Would you like me to teach you how to tie your shoes next?
Dude, what are you talking about? Fantasy?
Do you think a higher resolution doesn't make the frame rate worse? It's basic graphics stuff... it's not an abstract concept, it's reality.

I don't need to defend this stuff, because it's only logical once you take off your Sony glasses and think about it for a moment.
 
We can definitely see CGI running at 1080p looking better than Fortnite or PUBG running at 4K. (An exaggeration, of course, but you get the point.)



Assets can be created procedurally; not all of them have to be handcrafted. I believe I saw in the documentary about HZD that Guerrilla Games did just that. These generated assets wouldn't have to be cut down or reduced in size just to fit in RAM (including those pre-cached assets that may or may not be used).

You can use machine learning for that at runtime; no need to waste resources there. And mipmapping won't go away: you still need multiple versions of textures, based on their distance from the player, to avoid shimmering.
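On that last point, a quick illustration of why full mip chains are cheap to keep around: the standard geometric series means the whole chain adds only about a third on top of the base level (4 bytes per pixel assumed here):

```python
# Total bytes of a texture plus its full mip chain.
def mip_chain_bytes(width: int, height: int, bpp: int = 4) -> int:
    total = 0
    while width >= 1 and height >= 1:
        total += width * height * bpp
        width //= 2
        height //= 2
    return total

base = 4096 * 4096 * 4
print(mip_chain_bytes(4096, 4096) / base)  # ~1.333: mips cost ~33% extra
```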
 
That's quite the assumption considering both of them are CBR.

Edit: lol Ethomaz beat me to it.
Yep, it's quite the assumption to say that the frame rate gets worse at a higher resolution. It's not like that's a well-known fact to anyone who understands even a little bit about graphics...

I usually set games on my PC to 4K when I want a better frame rate :messenger_sunglasses:
 
[...] know better than the hardware engineers designing the PS5. I've rarely seen so much hubris, even within this industry.

People have every right to have doubts and concerns after how the Pro ended up; that wasn't a well-designed system, to say the least. Cerny has a track record of designing two systems so far. One of them is the PS4, which was really great, but mostly because it wasn't bad or exotic: it was as simple as possible, basically a closed-spec PC, and devs loved that. It's probably the most balanced, most well-designed console along with the X360. He did a fantastic job there, no doubt. But his other console is the Pro, a "4K" console that failed to deliver what was promised. It didn't even come with a UHD Blu-ray drive, and the only change compared to the base model was basically that "butterfly" GPU, which (a) overheats like crazy, hence the jet-engine noise, and (b) caused some serious issues with base-model compatibility; without a patch, games run exactly the same as on the base model, and without MS showing how it's done with the X1X we would most likely never have received Boost Mode on the Pro. And his PS5 presentation and the recent DF interview don't sound confident or optimistic. So yeah, people have every right to question the PS5's design until proven otherwise.

[...] are making a 30 fps game. That's 600 MB of data that can be both requested and delivered every frame! Marvel's Spider-Man on PS4 streamed 1.5 MB per frame from the HDD. The difference between 600 MB per frame and 1.5 MB per frame is orders and orders of magnitude! In theory, the texture quality in PS5 games should be darn near movie-like given those numbers for a 30 fps game. Next-gen games are 100% guaranteed to look like this in real time....

Please, no more 30 fps BS....
 
We don't know for sure whether the 3.5 GHz speed on the CPU is with SMT disabled. If it is, then we are talking about more than just a 3% difference in CPU power. Given that 3.5 GHz is a best case for Sony with the variable clock, it could be a lot more. Tell us more about that "contrary to what they want you to believe" conspiracy comment, will you? Microsoft has stated that a custom decompression block reduces the texture-decompression I/O CPU usage down to 1/10 of a Zen 2 core. It sounds like Sony has done a bit better at offloading I/O from the CPU, but I doubt there will be much difference between them, as both have a custom approach.

The biggest differentiator, from what I can see, is going to be ray tracing. The PS5 appears a little inadequate due to the lower CU count, and as a result this may impact the visual fidelity of ray tracing on the PS5.
We do know SMT is always enabled on PS5; they never talked about deactivating it. That's FUD.

Even if the CPU is downclocked, in most cases it won't impact performance. Actually, the CPU is going to be downclocked whenever it can afford to without impacting performance (when the CPU is idle and waiting for the next vsync, for instance), so probably most of the time. But performance will in most cases be identical. Cerny has talked about this.

1/10 of a Zen 2 core for I/O on XSX? That's already ~1.5% of the total CPU power dedicated to I/O on XSX. The CPUs on both machines will be nearly identical in performance.
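A quick sanity check on that figure, just dividing a tenth of one core by the eight cores both consoles have:

```python
# Share of total CPU taken by decompression I/O under MS's claim.
cores = 8
overhead = 0.1 / cores   # a tenth of one Zen 2 core
print(f"{overhead:.2%}") # 1.25%, in the ballpark of the ~1.5% claimed
```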
 
You can use machine learning for that at runtime,

Sure. But whether procedural assets generated at runtime (which use GPU resources) or streaming insanely high-quality textures will produce the better visual result remains to be seen once the games start to appear. There's a lot to be said about Cerny's vision for the SSD. We'll have to wait and see if his call was right. I understand the doubts, though.

Now, if you want to get more technical and debate that, I can't give you that; I don't make games. All I have is developers saying it will make a difference, especially that one tweet by a dev with a very good explanation. And it's still wait-and-see for me, to be fair. At least, unlike that Dictator, who holds a strong opinion about it as though his opinion weighed a lot more than that of developers who make AAA games.
 
Dude, what are you talking about? Fantasy?
Do you think a higher resolution doesn't make the frame rate worse? It's basic graphics stuff... it's not an abstract concept, it's reality.

I don't need to defend this stuff, because it's only logical once you take off your Sony glasses and think about it for a moment.
Do you know how to read? I've stated that a higher resolution would make the frame rate janky. But it doesn't have a higher resolution than the X; it does, however, have a higher frame rate, and thus the game PERFORMS (performs, as in plays) better than the X version. No idea how the fact that the PS4 Pro version has a higher frame rate is a fanboy statement. Jesus, is this what I have to work with?
 
Do you know how to read? I've stated that a higher resolution would make the frame rate janky. But it doesn't have a higher resolution than the X; it does, however, have a higher frame rate, and thus the game PERFORMS (performs, as in plays) better than the X version. No idea how the fact that the PS4 Pro version has a higher frame rate is a fanboy statement. Jesus, is this what I have to work with?
But this was not my point. My point is that with resolution parity, 99% of the time the One X would perform better. Can we at least agree on that?
I know this is hypothetical, because the game is out and the PS4 Pro version has the better frame rate; I never said the contrary. But thinking that the better hardware at the same resolution would not be better is kind of absurd, don't you think?

I don't even have a One X or the game. Do you think I have any interest in defending the One X version by making stuff up? I'm mostly a Sony guy at the moment, because they have the better exclusives for my taste and I only need a PS5 for next gen, but that doesn't mean I can't recognize some merit in the One X.
 
People have every right to have doubts and concerns after how the Pro ended up; that wasn't a well-designed system, to say the least. Cerny has a track record of designing two systems so far. One of them is the PS4, which was really great, but mostly because it wasn't bad or exotic: it was as simple as possible, basically a closed-spec PC, and devs loved that. It's probably the most balanced, most well-designed console along with the X360. He did a fantastic job there, no doubt. But his other console is the Pro, a "4K" console that failed to deliver what was promised. It didn't even come with a UHD Blu-ray drive, and the only change compared to the base model was basically that "butterfly" GPU, which (a) overheats like crazy, hence the jet-engine noise, and (b) caused some serious issues with base-model compatibility; without a patch, games run exactly the same as on the base model, and without MS showing how it's done with the X1X we would most likely never have received Boost Mode on the Pro. And his PS5 presentation and the recent DF interview don't sound confident or optimistic. So yeah, people have every right to question the PS5's design until proven otherwise.

What? This entire paragraph is a mess. Also, MS released the 1X a year after the Pro; of course they were going to make up for where the Pro "disappointed". In the end, though, the Pro still outsold the 1X by itself, for being such a "failure to deliver".
 
It could still mean that the PS5 operates more smoothly than the XSX due to the RAM setup, with audio having its own RAM pool on the SPU-like unit, and with the fast SSD not using much CPU or RAM. And I know the XSX only uses 10 GB of RAM at full speed, plus another 3.5 GB at a much slower bus speed, whereas the PS5 can use more than 10 GB without sacrificing speed. So considering the CU count and RDNA 2, the PS5 is very much a capable system for playing in 4K.
 
I REALLY don't understand why Richard Leadbetter made the DF video so freaking terrible compared to their article. The article is 1000x better, and he WROTE the article! Like, check out this part about the SSD.



Do people here at GAF truly understand what this quote even means? Think about it for a second... data off the SSD can be BOTH requested AND delivered within one or two frames! NO... not one or two seconds, one or two frames! With GPU scrubbers in the mix (I'm still not sure whether the scrubbers are hardware-based or not), this completely changes the way data can and will be used relative to this generation.

Let's assume a dev has the data compressed using Kraken coming off the SSD at 18 GB/s, and they are making a 30 fps game. That's 600 MB of data that can be both requested and delivered every frame! Marvel's Spider-Man on PS4 streamed 1.5 MB per frame from the HDD. The difference between 600 MB per frame and 1.5 MB per frame is orders and orders of magnitude! In theory, the texture quality in PS5 games should be darn near movie-like given those numbers for a 30 fps game. Next-gen games are 100% guaranteed to look like this in real time....


I need to read this article, then... If there is an article, why was the video butchered? I hope there's information on the Geometry Engine too, and more on Tempest...
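For anyone who wants the arithmetic from that post spelled out (using the poster's own 18 GB/s assumption, which is not an official spec):

```python
# Per-frame streaming budget under the post's assumptions.
throughput_gb_s = 18.0               # assumed Kraken-decompressed rate
fps = 30
per_frame_mb = throughput_gb_s * 1000 / fps
print(per_frame_mb)                  # 600.0 MB per 33.3 ms frame

hdd_per_frame_mb = 1.5               # Spider-Man (PS4) figure cited above
print(per_frame_mb / hdd_per_frame_mb)  # 400.0x more data per frame
```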
 
Well, for one, Microsoft has the Xbox Series S and X to point to as proof that they learned their lessons on dissipating heat from a console. The PS4 Pro and base PS4 sound like rocket engines taking off. Microsoft has also let multiple outlets tear down the console and see the cooling system for themselves. That goes a long way toward putting any heating issues, or concerns about heating issues, to rest.
I don't think the question was about heat, but rather about whether they have SmartShift too and whether their stated clocks assume the same scenario.
 
That's similar to AMD SmartShift.

PS5 is different, although it also incorporates the AMD technology.

PS5 will throttle based on workloads, not based on thermals. The CPU and GPU will run at maximum frequency for normal workloads (which is 99% of the time). It will only throttle down by a few percent if the workload has hit its maximum for BOTH the CPU and GPU. If the workload has reached 100% utilization on the GPU only, it will not throttle, because the CPU can shift its power budget over.

It's like having the PS5 APU running at a constant boost without having to revamp its power and cooling budgets. There is no worst-case scenario that Sony has to account for, because those 10 milliseconds of 100% utilization on both CPU and GPU (which is very rare) will be compensated for by throttling the frequency back a few percent.

This is a smart and efficient design. The PS5 APU will be punching above its weight 99% of the time without Sony having to increase the power and cooling budget.

There is nothing efficient about having to overclock and run at max clocks in a game profile.
The PS5 APU will be forced to run above its weight.

Besides, the Nvidia slides indicate W(atts) = power/workload. Same thing, just not locked at an overclock; there is no need for that in a truly smart, efficient design.
 
I REALLY don't understand why Richard Leadbetter made the DF video so freaking terrible compared to their article. The article is 1000x better, and he WROTE the article! Like, check out this part about the SSD.



Do people here at GAF truly understand what this quote even means? Think about it for a second... data off the SSD can be BOTH requested AND delivered within one or two frames! NO... not one or two seconds, one or two frames! With GPU scrubbers in the mix (I'm still not sure whether the scrubbers are hardware-based or not), this completely changes the way data can and will be used relative to this generation.

Let's assume a dev has the data compressed using Kraken coming off the SSD at 18 GB/s, and they are making a 30 fps game. That's 600 MB of data that can be both requested and delivered every frame! Marvel's Spider-Man on PS4 streamed 1.5 MB per frame from the HDD. The difference between 600 MB per frame and 1.5 MB per frame is orders and orders of magnitude! In theory, the texture quality in PS5 games should be darn near movie-like given those numbers for a 30 fps game. Next-gen games are 100% guaranteed to look like this in real time....


To be honest, I didn't understand the full meaning until you explained it just now. That's why I'm actually on NeoGAF. I'm not even mad about it.



I need to read this article, then... If there is an article, why was the video butchered? I hope there's information on the Geometry Engine too, and more on Tempest...

It's because there are hidden messages everywhere. Maybe Cerny asked DF to be elusive and blunt in the video but on point in the article! Orrrrrr he MADE him do it! Imagine that: he has the ability to manipulate space and time!
 
Do people here at GAF truly understand what this quote even means? Think about it for a second... data off the SSD can be BOTH requested AND delivered within one or two frames! NO... not one or two seconds, one or two frames! With GPU scrubbers in the mix (I'm still not sure whether the scrubbers are hardware-based or not), this completely changes the way data can and will be used relative to this generation.

Let's assume a dev has the data compressed using Kraken coming off the SSD at 18 GB/s, and they are making a 30 fps game. That's 600 MB of data that can be both requested and delivered every frame! Marvel's Spider-Man on PS4 streamed 1.5 MB per frame from the HDD. The difference between 600 MB per frame and 1.5 MB per frame is orders and orders of magnitude! In theory, the texture quality in PS5 games should be darn near movie-like given those numbers for a 30 fps game. Next-gen games are 100% guaranteed to look like this in real time....



That is my expectation as well, reading the tweets of a dev about the SSD's speed loading data as you turn your view, letting every tree have 3D bark with ants marching all over it (or something like that, lol).

Insane detail and texture resolution everywhere, at every turn. Don't let me down, Cerny.
 
One more important thing that must be talked about: if the fan speed is constant, that means that after, say, three years of use you will run into problems (or if you store it in the wrong place), because it collects dust and the cooling gets worse, which means performance gets worse. Or does this constant fan speed just mean that the fan can adjust itself for such cases while sounding the same, spinning faster as time goes on? For example, it might be a bit louder after three years, but maybe you won't even notice.
 
There is nothing efficient about having to overclock and run at max clocks in a game profile.
The PS5 APU will be forced to run above its weight.

Cerny had a budget for the silicon and the size of the PS5's APU. They didn't shoot for a big APU, although obviously AMD could provide that.

Working with the APU they created, they could have clocked it at 3.2 GHz for the CPU and 2 GHz for the GPU part. But then they would have had to account for the worst-case scenario of 100% utilization, even though it only happens 1% of the time, for less than a second.

So, without changing the power and cooling budgets planned for that 3.2 GHz / 2 GHz APU, they clocked it a lot higher than usual. It can stay at that higher clock 99% of the time anyway, because game code cannot saturate the CPU and GPU 100% at the same time. If there is ANY occasion where that happens, Cerny devised a method where the APU downclocks a few percent to compensate.

Now that is a smart and efficient design. Cerny was able to clock the APU higher without having to account for a very rare spike in power usage. The APU punches above its weight 99% of the time without blowing up the budget.

Cerny could have designed a 500 mm² APU if he wanted to, but of course there is a price target for the console. What Cerny has done with the PS5 APU is smart and efficient while remaining on the cheap side.
 
PS5 is using AVFS (Adaptive Voltage and Frequency Scaling), which allows for higher frequencies with lower voltage requirements. It also helps reduce variance between chips, increases lifetime and reliability, and can improve yields.

As for having to downclock during certain high workloads, I think it is there to mitigate voltage droops.


AMD Polaris Whitepaper said:
"Another advantage of AVFS is that it naturally handles changes induced by the workload. For example, when a complex effect such as an explosion or hair shader starts running, it will activate large portions of the GPU that suddenly draw power and cause the voltage to "droop" temporarily until the voltage regulators can respond. Conceptually, these voltage droops in a GPU or processor are similar to brownouts in a power grid (e.g. caused by millions of customers turning on their lights when they get home from work around 6pm).

The power supply monitors detect the voltage droop in 1-2 cycles, and then a clock-stretching circuit temporarily decreases the frequency just enough so that all circuits will work safely during the droop. The clock stretcher responds to voltage droops greater than 2.5% and can reduce the frequency by up to 20%. These droop events are quite rare, and the average clock frequency decreases by less than 1%, with almost no impact on performance. However, the efficiency benefits are quite large. The clock-stretching circuits enable increasing the frequency of Polaris GPUs by up to 140MHz.
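A toy sketch of the clock-stretching behavior described there. The 2.5% threshold and 20% cap come from the quote; the droop-to-reduction mapping is made up for illustration:

```python
# Illustrative clock stretcher: back off the clock only while the
# supply voltage droops more than 2.5% below nominal.
NOMINAL_V = 1.0
BASE_MHZ = 1266  # Polaris-era (RX 480) boost clock, just for scale

def stretched_clock(voltage: float) -> float:
    droop = (NOMINAL_V - voltage) / NOMINAL_V
    if droop <= 0.025:                # small droops are tolerated
        return BASE_MHZ
    reduction = min(droop * 2, 0.20)  # made-up mapping, 20% cap per quote
    return BASE_MHZ * (1 - reduction)

for v in (1.00, 0.98, 0.96, 0.90):
    print(f"{v:.2f} V -> {stretched_clock(v):.0f} MHz")
```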

 
There is more theorycrafting in here than there is supercomputing power on planet Earth, to be brutally honest. If, God help us, things turn out half as spectacular as any of the threads of the last few weeks on these topics, the upcoming consoles will be able to fold hundreds of proteins in milliseconds in 3D and cure cancer, AIDS, and corona in the blink of an eye. And the best thing is, they will deliver their solutions in a whopping 256-bit, 10,000-superchannel sample, with the voices of Stevie Wonder and Tina Turner unified into one transcendent voice of God himself: they will literally whisper the truth in your ear!

And all of that because of the secret sauce and the teraFLOPS and Tempest!
WOW!
 
Mark Cerny:
I suspect you will have to keep posting that for a long time now... No matter how much evidence you post or how logically you word your posts, they will just repeat the same thing... They are not looking for answers or debate; they are just trying to spread FUD, so if you constantly engage, it will be the same "LOOP" with no >END< code....

Just look at some of the VGTech threads: even when you present hard facts, they pretend the facts aren't there, that the much higher FPS in RE3 is irrelevant... They attack the poster because they can't argue against the stats... The concern is more about defending the box at all costs and eschewing the cold hard facts... It's the same with the PS5: it's 10.3 TF, but they still say 9.2 TF... These are not people wanting debate; these are just people out to lowball a console and play a dirty console war... I am a fan of PlayStation, but I will never misrepresent the stats of the Xbox... It's 12.1 TF; I will never say it's 10 TF or any such nonsense. Facts are facts, stats are stats...

Keep in mind Mark Cerny was the producer of both open-world games Spider-Man and Death Stranding. He helped with the tech side of both games. I'd trust he knows how open-world games are made and how they run, especially on a console he designed (considering he designed the PS4, PS4 Pro, and PS5).
I will take Dictator over him... any day... Dictator knows what's best for an open-world game....

Cerny also said this to DF during their interview.....
So why didn't you guys make a thread on the article alone? That video was the most underwhelming thing I've seen from DF in a while; it offered nothing new, really, not after Cerny broke the PS5 down so eloquently... We should be discussing the new details in their own thread, because most people will just watch the video... I knew about the article when I saw the video, but I thought it was just the same content as the video, so I didn't bother...
 
We do know SMT is always enabled on PS5; they never talked about deactivating it. That's FUD.

Even if the CPU is downclocked, in most cases it won't impact performance. Actually, the CPU is going to be downclocked whenever it can afford to without impacting performance (when the CPU is idle and waiting for the next vsync, for instance), so probably most of the time. But performance will in most cases be identical. Cerny has talked about this.

1/10 of a Zen 2 core for I/O on XSX? That's already ~1.5% of the total CPU power dedicated to I/O on XSX. The CPUs on both machines will be nearly identical in performance.

I remain sceptical about SMT being enabled at 3.5 GHz, as it is naive to assume that unless it's directly stated, but I take your point. I do think that, in regards to the CPU, it is pretty much a wash between the two.

I really don't care if the PS5 is slightly less performant than the Series X, as I know it will have great games. My greatest concern is console longevity. It appears that Sony has been very conscious of the GPU's low CU count and has mitigated it by clocking the chip up to an absurd degree. I'm not at all comfortable with my PS5 clocking so high if it affects the longevity of the device.
 
It was about the power before the reveal, understandably. Now certain PS fans are on the defensive, and certain Xbox fans are rubbing it in. Most of us here know that, when it comes down to it, there's not going to be much in it; but, being honest, it was about that TF number. PS fans arguing now know there really isn't going to be that much difference, and Xbox fans here should know it as well. So, Xbox fans, you shouldn't argue about it now, because you're just setting yourselves up for "told you so, ner ner ner ner ner" etc. from certain PS4 people comforting themselves.
As I've already said, they are going to be fairly close, and that should be stating the obvious to most of the forum. I'm more interested now in how quiet they're going to be under constant power-hungry, stressful games.
But it is just too much fun.
 
  • There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz; it isn't the case that the developer has to choose to run one of them slower.
If this point is true, then yes, you are right and I'm wrong. I definitely hope this is the case. However, based on what I've gathered in this thread, this point is simply not true: the CPU and the GPU can't run at their max frequencies at the same time. Several people have said this. What's the truth, then? We'd need the exact words from Cerny's mouth to decide, but I suppose he was vague; hence the confusion. And vague PR talk rarely means good news. Again, I hope I'm wrong.
Logically, it can't be true. If it were, the clocks would just be locked at those speeds. The fact that they're not can only mean that it cannot sustain them. All I hear is a lot of "potentially", "most of the time", and "close to" coming from PR machines.
 
The CPUs are virtually identical on both machines (~3% difference). Considering that on PS5 the CPU won't have to process audio or I/O at all (which won't be the case on XSX, contrary to what they want you to believe), the PS5 CPU may well be more potent in actual games.

And XSX memory bandwidth has constraints. But yeah, XSX has a bit more teraflops of compute power; there is no denying that.

https://news.xbox.com/en-us/2020/03/16/xbox-series-x-glossary/
Project Acoustics – Incubated over a decade by Microsoft Research, Project Acoustics accurately models sound propagation physics in mixed reality and games, employed by many AAA experiences today. It is unique in simulating wave effects like diffraction in complex scene geometries without straining CPU, enabling a much more immersive and lifelike auditory experience. Plug-in support for both the Unity and Unreal game engines empower the sound designer with expressive controls to mold reality. Developers will be able to easily leverage Project Acoustics with Xbox Series X through the addition of a new custom audio hardware block.

--------

MS also wasted money and resources on a new custom audio hardware block for XSX.

From https://gpuopen.com/gaming-product/true-audio-next/
AMD's TrueAudio Next pairs with Radeon Rays (running on GCN CUs).

Each CU of the XSX GPU's 52 delivers 233.6 GFLOPS of RDNA 2 compute. The XSX GPU also has more ray-tracing hardware due to its 52 CU count.

PS5's CU-based audio DSP has 100 GFLOPS.

https://www.eurogamer.net/articles/digitalfoundry-2020-inside-xbox-series-x-full-specs

"Memory performance is asymmetrical - it's not something we could have done with the PC," explains Andrew Goossen "10 gigabytes of physical memory [runs at] 560GB/s. We call this GPU optimal memory. Six gigabytes [runs at] 336GB/s. We call this standard memory. GPU optimal and standard offer identical performance for CPU audio and file IO. The only hardware component that sees a difference in the GPU."
In terms of how the memory is allocated, games get a total of 13.5GB in total, which encompasses all 10GB of GPU optimal memory and 3.5GB of standard memory. This leaves 2.5GB of GDDR6 memory from the slower pool for the operating system and the front-end shell.
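To put those split pools in perspective, here's a simplistic capacity-weighted average of the two rates. Real behavior depends entirely on access patterns, so treat this as illustrative only:

```python
# Blended bandwidth if accesses were spread evenly across all 16 GB.
fast_gb, fast_bw = 10, 560   # GPU-optimal pool: GB, GB/s
slow_gb, slow_bw = 6, 336    # standard pool:    GB, GB/s

blended = (fast_gb * fast_bw + slow_gb * slow_bw) / (fast_gb + slow_gb)
print(f"{blended:.0f} GB/s")  # 476 GB/s under this naive model
```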




Try again.
 
Logically, it can't be true. If it were, the clocks would just be locked at those speeds. The fact that they're not can only mean that it cannot sustain them. All I hear is a lot of "potentially", "most of the time", and "close to" coming from PR machines.

Downclocks plus 36 CUs equals Sony got caught with their pants down. Just my opinion. And all that Cerny fodder sounded like spin. It is what it is; the actual hardware doesn't lie. The PS5 is weaker. That's fine: with around 10 TF, Sony's devs should still pump out great games. I'll get a PS5 for exclusives only; everything else I'll play on the XSX (hopefully some great exclusives there too, with all the studios and talent MS has gathered over the last few years) or PC.
 
Read it carefully again. Project Acoustics is an API (available on Unity since last year); notice the word "leverage" here.

Developers will be able to easily leverage Project Acoustics with Xbox Series X through the addition of a new custom audio hardware block.

"Leverage" means the custom audio hardware (probably some audio decompressor, like on PS4) will assist the Project Acoustics API; it's probably going to decompress the audio before it's processed by the CPU through the API.

AFAIK they never actually detailed what their "custom audio hardware" exactly does.
 
Why do you guys care so much about who has the "stronger" console? You've been arguing about this for weeks. The people who like PlayStation will buy the next one no matter what; the same goes for Xbox. Most of you are grown-ups with jobs who could buy both of them if you really wanted to.
Both consoles are far better than their predecessors, meaning the games you love will play and look even better than they did before. The OS on both consoles is going to be better as well.
Too bad this comment can't be pinned as a public service announcement.
 
Logically, it can't be true. If it were, the clocks would just be locked at those speeds. The fact that they're not can only mean that it cannot sustain them. All I hear is a lot of "potentially", "most of the time", and "close to" coming from PR machines.

Yes, it can. Real-world games will not saturate the GPU and CPU with worst-case workloads. Therefore, except in very, very rare cases (code specifically designed to hit a 100% workload, not a real-world scenario), the APU will operate at those clock speeds. That rare case may never even come, because the PS5 is a closed box and developers may choose a different approach that doesn't saturate the APU in their specific implementations while still giving the same result. (I'm just echoing what Mark Cerny said.) The problem is some people think Cerny is a liar. He's an engineer, not a PR man.

Cerny designed the PS5 APU with a specific price target in mind. He could have chosen a bigger APU at the expense of a higher price tag. But what he has done with the APU actually makes it punch above its weight. It's a smart design.
 
Read it carefully again. Project Acoustics is an API (available on Unity since last year); notice the word "leverage" here.



"Leverage" means the custom audio hardware (probably some audio decompressor, like on PS4) will assist the Project Acoustics API; it's probably going to decompress the audio before it's processed by the CPU through the API.

AFAIK they never actually detailed what their "custom audio hardware" exactly does.
From https://gpuopen.com/gaming-product/true-audio-next/
AMD's TrueAudio Next pairs with Radeon Rays (running on GCN CUs).

Each CU of the XSX GPU's 52 delivers 233.6 GFLOPS of RDNA 2 compute. The XSX GPU also has more ray-tracing hardware due to its 52 CU count.

PS5's CU-based audio DSP has 100 GFLOPS.
 
Sony just recycled one of the dead CUs for sound. You could say the PS5 is a 36.5-CU APU.
That is being cheap... or environmentally friendly.
 