[Digital Foundry] PS5 uncovered

That doesn't mean the CPU will be running at full power. The devs had to downclock the CPU to keep the GPU frequency at 2.23GHz. In retail the variable clocks will be active, but since the devs coded the game to throttle the CPU to keep the GPU maxed out, it won't matter much with variable clocks. They'll only let the CPU go to a higher frequency when not much is being pushed. But since the devs have the CPU throttled to keep the GPU at 2.23GHz, in game the CPU will mostly be downclocked below 3.5GHz as well
Downclocked it to keep the heat across the die at a consistent temperature. It says so in the video.
 
I might be alone but all of these threads have thoroughly killed my hype for next gen. It's not the information, it's the constant bickering and gotchas from people that are literally just guessing and shitting up the board. I'm going to play what I have and remain blissfully ignorant until they release. :messenger_peace:
 
So either multiple devs are lying or Cerny is.

"There's enough power that both CPU and GPU can run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."
Cerny is not really lying. He's playing with words.

The reality is this. I'll try to break it down as simply as possible:

  • The PS5 has a max power budget based on the PS5 cooling capabilities.
  • That power budget needs to be divided between the GPU and the CPU.
  • The max clocks can be reached as long as that power budget is not exceeded.
  • The actual power used by the system depends on the workload of the components.
  • It is rare that both the CPU and GPU are at 100% workload at all times.
  • If the power budget is at risk of being exceeded, the clocks of the hardware that has the lower workload, be it the CPU or the GPU, are lowered in order to not exceed the max power budget.
  • The system cannot handle both the CPU and the GPU having max workload, but it can handle both of them having max clocks.
  • Developers can choose to limit the workload on one to max out the clock speed of the other.
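To put some rough numbers on that last bullet, here's a toy Python sketch of the "shared budget" idea. Every figure in it (the 200W budget, the wattages, the step size) is made up purely for illustration; it's not Sony's actual algorithm, just the shape of what's described above.

```python
# Toy model of a shared SoC power budget. All numbers are invented for
# illustration; this is NOT Sony's actual algorithm.

POWER_BUDGET_W = 200.0                 # hypothetical budget set by the cooler
CPU_MAX_GHZ, GPU_MAX_GHZ = 3.5, 2.23   # the advertised max clocks

def estimated_power(cpu_ghz, gpu_ghz, cpu_load, gpu_load):
    """Crude estimate: power rises sharply with clock and linearly with workload (0.0-1.0)."""
    cpu_w = 50.0 * (cpu_ghz / CPU_MAX_GHZ) ** 3 * cpu_load
    gpu_w = 170.0 * (gpu_ghz / GPU_MAX_GHZ) ** 3 * gpu_load
    return cpu_w + gpu_w

def pick_clocks(cpu_load, gpu_load):
    """Start both at max clocks, then trim the less-loaded side until the budget fits."""
    cpu_ghz, gpu_ghz = CPU_MAX_GHZ, GPU_MAX_GHZ
    while estimated_power(cpu_ghz, gpu_ghz, cpu_load, gpu_load) > POWER_BUDGET_W:
        if cpu_load <= gpu_load:
            cpu_ghz -= 0.05            # CPU backs off, GPU keeps 2.23GHz
        else:
            gpu_ghz -= 0.05            # GPU backs off, CPU keeps 3.5GHz
    return round(cpu_ghz, 2), round(gpu_ghz, 2)

print(pick_clocks(cpu_load=0.6, gpu_load=0.8))  # typical frame -> (3.5, 2.23), both at max
print(pick_clocks(cpu_load=1.0, gpu_load=1.0))  # everything hammered -> CPU drops below 3.5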
 
Yup, and those games the devs made will have the CPU downclocked in retail as well. Otherwise it'd be both at max frequency with no issues then ;)
Aye, the dev kits don't have SmartShift so they have to do it manually, seemingly. As another poster said, it's due to the cross-gen titles being held back by the Jaguar cores, so it's just underutilising the CPU hugely.
All fascinating stuff like.

EDIT: I wonder if only the in-house/first-party studios have access to the SmartShift kits, the ones doing PS5 launch titles. They might be trying to cut down on leaks.
 
Cerny is not really lying. He's playing with words.

  • The system cannot handle both the CPU and the GPU having max workload, but it can handle both of them having max clocks.

From what I understood, instead of drawing additional power from the mains, AMD SmartShift will downclock whichever of the GPU or CPU is not under load, and the saved power will be redirected to the other component.
 
Devs spoke with Rich about devkits, folks. Not the retail version. It's in the video.
Dev kits are generally more powerful than retail consoles. XSX dev kits for example, have 56CUs rather than 52 of the retail one.

Most of the time at max. BOTH
You do not understand. Cerny is using wordplay. They can run at max clocks at the same time. But they cannot run at max workload at the same time.

Let me put it like this. You can lock your CPU at 4GHz in Windows and disable downclocking when idle. If you look in Task Manager, your CPU usage will be around 5% or so when idle despite the clock remaining at 4GHz. The workload is 5%. If you increase the workload, the CPU usage goes to 100% and the heat and power use go up, despite still running at clocks of 4GHz. If you're limited by power supply, the system will have to lower the max clocks to reduce power consumption. In that case, the CPU remains at 100% usage, but will be clocked at 3.5 GHz instead. The max workload achievable cannot be reached due to the power limit.

I kept the GPU out for simplicity's sake.
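The same point in back-of-the-envelope form, if it helps: dynamic power is roughly C·V²·f times how much of the silicon is actually switching, so a locked clock tells you very little about power draw. The constants below are invented; only the shape matters.

```python
# Rough dynamic-power illustration: same clock, wildly different power
# depending on workload. Constants are made up for the example.

C_EFF = 3.0e-9     # hypothetical effective switched capacitance (farads)
VOLTAGE = 1.2      # hypothetical core voltage at the 4GHz operating point

def dynamic_power_w(freq_hz, activity):
    """activity = fraction of the chip actually switching each cycle (0.0-1.0)."""
    return C_EFF * VOLTAGE ** 2 * freq_hz * activity

print(dynamic_power_w(4.0e9, 0.05))  # idle desktop locked at 4GHz: ~0.9W of dynamic power
print(dynamic_power_w(4.0e9, 1.00))  # fully loaded at the same 4GHz: ~17W
```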
 
The PR messaging is all over the place once again, and this version of a deep dive with developer input only stirs up more confusion. This is why you've actually got to showcase things in action and not keep on talking about theoretical scenarios.

They tripped over their own feet and landed flat on their face. They shouldn't open their mouth again until they can release the design, price, and gameplay footage of one of their titles that is coming out soon to calm their user base, because this sort of communication thus far has been abysmal.

I actually can't believe that its this bad.
 
DigitalFoundry said:
The power budget for the overall SoC - there's a set limit there tied to the thermal dissipation capabilities of the cooler.


"Higher clocks"...yeah, right. Those are VARIABLE clocks, which COULD be achieved on certain peak moments, but nowhere near mantained constantly.

I have a MacBook Air that has an Intel i5 CPU that's clocked at 1.6GHz and turbo-boosts up to 2.5GHz. Can I say my CPU is 2.5GHz when in reality it rarely hits that clock speed? Of course not!

Same happens with PS5 variable clocks ;)
Pretty sure people have made the effort to address the difference between thermal throttling and what the PS5's clock strategy is. It's not how you think it is; they're unrelated strategies.
So.... it will thermal throttle. Colour me surprised. /s
 
So either multiple devs are lying or Cerny is.

"There's enough power that both CPU and GPU can run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."


It's neither

"Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.


Optimize using a worst case, and retail units boost past it to near the peak most of the time.
 
You're the one who's not understanding. The PS5 uses AMD SmartShift. Cerny is not inventing anything here. Look at the diagram below.
The keyword is "shared budget": the power used by the CPU has an impact on what power is available to the GPU and vice versa.
Unless you are claiming that power is irrelevant for the CPU or GPU to deliver their maximum performance, your interpretation is flawed.
[AMD SmartShift power-sharing diagram]

I think so many Sony fans are going to be utterly disappointed when the PS5 is out and people start testing it.
 


really wondering what will happen on PS5 in a similar scenario
also, will games' love of high main-thread IPC stop at some point?
 
I might be alone but all of these threads have thoroughly killed my hype for next gen. It's not the information, it's the constant bickering and gotchas from people that are literally just guessing and shitting up the board. I'm going to play what I have and remain blissfully ignorant until they release. :messenger_peace:


I'm still hyped, but these threads have been redundant for a while now. Especially all the derailing. I've honestly been over the specs since the two reveals. Just want to see some games already.

Did they show the console at all??? Can't watch rn
 
That doesn't mean the CPU will be running at full power. The devs had to downclock the CPU to keep the GPU frequency at 2.23GHz. In retail the variable clocks will be active, but since the devs coded the game to throttle the CPU to keep the GPU maxed out, it won't matter much with variable clocks. They'll only let the CPU go to a higher frequency when not much is being pushed. But since the devs have the CPU throttled to keep the GPU at 2.23GHz, in game the CPU will mostly be downclocked below 3.5GHz as well
Yeah, that is invisible in the games you will play.
I'm just saying in that case the CPU is dropping the frequency because it is not being used, like DF stated... actually these devs chose a debug profile where the CPU runs slower... that won't exist in the retail unit.

There will be cases where devs have either the CPU or GPU dropping frequency due to workload.
 
Max clock for both at the same time has nothing to do with thermals. He stated that at GDC.
Everything has to do with thermals. You can't magically ignore heat output. The power limit he talks about is based on the cooling capabilities of the console. The difference is that they are not using temperature sensors to determine the heat output in real-time. They have created preset tables to ensure every workload throttles back predictably, rather than depending on the environment.
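If it helps, here's a tiny sketch of the distinction (the table values are invented, not Sony's): the clock comes from a deterministic table keyed on measured activity, instead of from a temperature reading that changes with the room.

```python
# Sketch of "deterministic, activity-based" clocking vs. classic thermal throttling.
# The table below is invented for illustration only.

GPU_CLOCK_BY_ACTIVITY = [        # (activity ceiling, resulting clock in GHz)
    (0.80, 2.23),
    (0.90, 2.18),
    (1.00, 2.10),
]

def gpu_clock_from_activity(activity):
    """Same workload -> same clock on every console, hot room or cold."""
    for ceiling, ghz in GPU_CLOCK_BY_ACTIVITY:
        if activity <= ceiling:
            return ghz
    return GPU_CLOCK_BY_ACTIVITY[-1][1]

def gpu_clock_from_temperature(temp_c):
    """Classic thermal throttling, for contrast: depends on the environment."""
    return 2.23 if temp_c < 85.0 else 2.10

print(gpu_clock_from_activity(0.75))      # 2.23 regardless of ambient temperature
print(gpu_clock_from_temperature(90.0))   # throttles only because the sensor got hot
```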

But WTF are you using? Oh, yes, FUD
Welcome to my ignore list.
Yes. Just like that. You are not interested in understanding anything. You're only interested in defending the PS5. And I have no time or use for such content. Goodbye.
 
Cerny is not really lying. He's playing with words.

The reality is this. I'll try to break it down as simply as possible:

  • The PS5 has a max power budget based on the PS5 cooling capabilities.
  • That power budget needs to be divided between the GPU and the CPU.
  • The max clocks can be reached as long as that power budget is not exceeded.
  • The actual power used by the system depends on the workload of the components.
  • It is rare that both the CPU and GPU are at 100% workload at all times.
  • If the power budget is at risk of being exceeded, the clocks of the hardware that has the lower workload, be it the CPU or the GPU, are lowered in order to not exceed the max power budget.
  • The system cannot handle both the CPU and the GPU having max workload, but it can handle both of them having max clocks.
  • Developers can choose to limit the workload on one to max out the clock speed of the other.
so basically when the console has to run games that are heavy on both CPU and GPU, like GTA6 for example, the console is not gonna work at its absolute best performance, because the workload is gonna be so high that the system has to shift power from the GPU to the CPU and vice versa?
is this really an optimal solution for devs? it sounds so contrived...
 
Here is the reality:
The PS5 is going to be an outstanding gaming console. One that I will own.
On paper, the XSX is more powerful and will likely have better performance. I will own that one too

Here is where I have a question:
A lot of people are skeptical about Sony's boost. Claiming it is a desperate decision and they won't hit those boosts. Is this true? Maybe, but I'll believe it when I see it. People are skeptical that Sony can keep the system cool to maintain boost.

On that same note, why is everyone taking Microsoft's claim that the XSX system will maintain these massive speeds at face value? I'd say it's just as possible that the XSX could have heating issues.
You're right, there's no guarantee that the XSX won't have heating issues. But then the console will crash or go into another RROD scenario. They are facing the issue upfront with their technical cooling solution and not compromising the performance that the customer paid for in order to mitigate heating issues. It's a choice.
 
On that same note, why is everyone taking Microsoft's claim that the XSX system will maintain these massive speeds at face value? I'd say it's just as possible that the XSX could have heating issues.

Well for one, Microsoft have the Xbox One S and One X to point to as proof they learned their lessons on dispersing heat from a console. The PS4 Pro and base PS4 sound like rocket engines taking off. Microsoft have also let multiple outlets tear down the console and see the cooling system for themselves. That goes a long way toward putting any heating issues, or concerns about heating issues, to rest.
 
nope

"Developers don't need to optimise in any way; if necessary, the frequency will adjust to whatever actions the CPU and GPU are performing,...Right now, it's still difficult to get a grip on boost and the extent to which clocks may vary. "

"Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core."

So, you posted the same stuff on ERA and it got shot down. What happened?
 
Dev kits are generally more powerful than retail consoles. XSX dev kits for example, have 56CUs rather than 52 of the retail one.
That was never the case for PlayStation... except for more RAM and HDD space, the final PS devkits have identical power to the retail machine.
 
The more I hear about the PS5, the more I am interested.

If you are not a console fanboy then it is really easy to be interested in the PS5 as well as the Xbox Series X. It is only the console warrior idiots who are trying to downplay 36cu vs 52 or whatever the hell XBox has. Most of them have zero clue what the ramifications of that even are. PS5 is weaker and will have awesome games. Xbox Series X is more powerful and with the right developers will have the better looking 1st and third party games. But Sony still have the ace up their sleeve with their first party lineup. That is essentially what the entire console war boils down to.
 
so basically when the console has to run games that are heavy on both CPU and GPU, like GTA6 for example, the console is not gonna work at its absolute best performance, because the workload is gonna be so high that the system has to shift power from the GPU to the CPU and vice versa?
is this really an optimal solution for devs? it sounds so contrived...
Well, as already stated by DF, devs will most likely throttle one to maximize the other. The thing is, doing it this way will allow devs to extract more performance than having set clocks. Otherwise we would likely have a PS5 running at say 3.2 GHz CPU and 1.9 GHz GPU flat, for example. But this way, you can boost your CPU or GPU at the times where the other is idle.

Say for example you have to render 50 people on the street with high polygons. That is high on the CPU, but a street itself is not that taxing on the GPU. You lower the GPU power usage and give the CPU its max performance.
If you have a scene with ray tracing but with barely any polygons in it, the CPU will not need to use that much power but the GPU is under a heavy workload. You then shift the power to the GPU.
If you had set clocks, you would have lower performance in either of the above scenarios.

And games are basically like that. There are very few situations where the CPU and the GPU are both at max workload. What the PS5 is doing is very smart and efficient, compared to what its specs would be if they didn't do it this way. It's still inferior to the brute force approach of the XSX however, because the max performance of the PS5 is simply below the standard performance of the XSX. If the XSX were allowed to use SmartShift, the gap would only widen.
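Putting some made-up numbers on those two scenes (nothing official here, just to show why per-scene workload decides whether the max clocks hold): assume the CPU can pull up to ~50W and the GPU up to ~170W when fully loaded, against a shared 200W budget.

```python
# Worked example for the two scenes above. All wattages and loads are invented.

BUDGET_W = 200.0

def fits_budget(cpu_load, gpu_load):
    """True if both chips can hold their max clocks for this mix of workloads."""
    return 50.0 * cpu_load + 170.0 * gpu_load <= BUDGET_W

# Crowded street: CPU flat out, GPU fairly light -> both hold max clocks.
print(fits_budget(cpu_load=1.0, gpu_load=0.85))  # True  (50 + 144.5 = 194.5W)
# Ray-traced vista: GPU flat out, CPU light -> again both hold max clocks.
print(fits_budget(cpu_load=0.5, gpu_load=1.0))   # True  (25 + 170 = 195W)
# Everything flat out at once -> over budget, one side has to drop its clock;
# a fixed-clock design would have had to assume this case all the time.
print(fits_budget(cpu_load=1.0, gpu_load=1.0))   # False (220W)
```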
 
Everything has to do with thermals. You can't magically ignore heat output. The power limit he talks about is based on the cooling capabilities of the console. The difference is that they are not using temperature sensors to determine the heat output in real-time. They have created preset tables to ensure every workload throttles back predictably, rather than depending on the environment.


Welcome to my ignore list.
Yes. Just like that. You are not interested in understanding anything. You're only interested in defending the PS5. And I have no time or use for such content. Goodbye.

Too bad you aren't on the ignore list of all members.

Will copy-paste one post from the ERA thread:

Alex said what is elaborated on in the article - that some developers DF spoke to said they were using locked profiles in order to keep the GPU at 2.23GHz all the time. Cerny makes a slightly different point here - that he expects processing to run most of the time 'at or near' peak clocks when the chip is busy.

If devs want a 100% locked GPU clock even when not busy, they have a debug profile that lets them do that. And the only way to guarantee it would be to use one of those profiles. But it's a debug profile rather than a release profile it seems.

So Alex was right, and Cerny was right, but they're talking in slightly different contexts and with different degrees of precision I think re. how tightly the clock stays pinned to peak.

So, yeah. Both can be at max frequency. Like I've said, Xbone fans are insane.
 
"I think you're asking what happens if there is a piece of code intentionally written so that every transistor (or the maximum number of transistors possible) in the CPU and GPU flip on every cycle. That's a pretty abstract question, games aren't anywhere near that amount of power consumption. In fact, if such a piece of code were to run on existing consoles, the power consumption would be well out of the intended operating range and it's even possible that the console would go into thermal shutdown. PS5 would handle such an unrealistic piece of code more gracefully."


Worst case means /really/ worst case
 
"I think you're asking what happens if there is a piece of code intentionally written so that every transistor (or the maximum number of transistors possible) in the CPU and GPU flip on every cycle. That's a pretty abstract question, games aren't anywhere near that amount of power consumption. In fact, if such a piece of code were to run on existing consoles, the power consumption would be well out of the intended operating range and it's even possible that the console would go into thermal shutdown. PS5 would handle such an unrealistic piece of code more gracefully."


Worst case means /really/ worst case
Shit-your-pants worst-case sillycoding.
 
If you are not a console fanboy then it is really easy to be interested in the PS5 as well as the Xbox Series X. It is only the console warrior idiots who are trying to downplay 36cu vs 52 or whatever the hell XBox has. Most of them have zero clue what the ramifications of that even are. PS5 is weaker and will have awesome games. Xbox Series X is more powerful and with the right developers will have the better looking 1st and third party games. But Sony still have the ace up their sleeve with their first party lineup. That is essentially what the entire console war boils down to.
Yup. It's brand preference.
 
Well, as already stated by DF, devs will most likely throttle one to maximize the other. The thing is, doing it this way will allow devs to extract more performance than having set clocks. Otherwise we would likely have a PS5 running at say 3.2 GHz CPU and 1.9 GHz GPU flat, for example. But this way, you can boost your CPU or GPU at the times where the other is idle.

Say for example you have to render 50 people on the street with high polygons. That is high on the CPU, but a street itself is not that taxing on the GPU. You lower the GPU power usage and give the CPU its max performance.
If you have a scene with ray tracing but with barely any polygons in it, the CPU will not need to use that much power but the GPU is under a heavy workload. You then shift the power to the GPU.
If you had set clocks, you would have lower performance in either of the above scenarios.

And games are basically like that. There are very few situations where the CPU and the GPU are both at max workload. What the PS5 is doing is very smart and efficient, compared to what its specs would be if they didn't do it this way. It's still inferior to the brute force approach of the XSX however, because the max performance of the PS5 is simply below the standard performance of the XSX. If the XSX were allowed to use SmartShift, the gap would only widen.
but what about the scene where all the power from cpu and gpu is needed?
you make the example of a street with 50 people in, let's say, a demanding game like GTA6, but what about if you start a shooting in that street and both CPU and GPU have to be used at their max power? will you see the framerate tanking because the machine can't sustain that scene in the same way the XSX does?!

i mean, maybe for the first 2 years it's not gonna be a problem, but what about future really heavy games that use every bit of power?

it sounds like a limit for devs who want to create more demanding/ambitious games...
 
Dev kits are generally more powerful than retail consoles. XSX dev kits for example, have 56CUs rather than 52 of the retail one.

You have a PS5 devkit? If so, please share what's still missing in comparison with the retail version.
 
so basically when the console has to run games that are heavy on both CPU and GPU, like GTA6 for example, the console is not gonna work at its absolute best performance, because the workload is gonna be so high that the system has to shift power from the GPU to the CPU and vice versa?
is this really an optimal solution for devs? it sounds so contrived...
NO. This has to do with designing the PS5 with a fixed TDP to ensure optimal cooling for all the components inside it. Apparently games will be designed for that fixed TDP, and that fixed TDP ensures the CPU/GPU run at those aforementioned clock speeds delivering peak performance. So if some section of a game is GPU-intensive without any additional load on the CPU, the PS5 (without AMD SmartShift) would need to draw additional power from the mains to sustain that workload, making the fans run like a "jet engine". The implementation of AMD's SmartShift ensures that this tech kicks in, reducing the frequency of the CPU and thereby significantly reducing the power it needs; that saved power is redirected to the GPU instead so it can run optimally.
 
While everyone is arguing over max frequency specs, are we going to ignore that Cerny officially verified that PS5 games are going to be designed cross-generationally around the PS4's Jaguar CPU capabilities?

One of the major points I see in the "war" between PS and XB fans is PS fans stating they are getting a bunch of games specifically built around the new PS5 specs at launch, but it doesn't sound like this will be the case after all...
 