[Digital Foundry] PS5 uncovered

There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz simultaneously while under high utilization most of the time.

It's just hard for me to get past the "potentially" and "most of the time" caveats. I honestly would have felt better if it was just a 3.3GHz CPU and a 2.0GHz GPU, fixed, all the time. No room for inconsistencies or development shortcomings due to poor use of the power profiles.
 
It's just hard for me to get past the "potentially" and "most of the time" caveats. I honestly would have felt better if it was just a 3.3GHz CPU and a 2.0GHz GPU, fixed, all the time. No room for inconsistencies or development shortcomings due to poor use of the power profiles.
Agreed.

And that's what doesn't make sense. The system supposedly works at max speeds of 3.5 and 2.23 no problem, yet then you get these caveats of "up to" or "most of the time" and "need to balance the max power budget", etc.

In other words, a highly intense game cannot stay maxed out at 3.5 and 2.23. It will have to downclock to cool down. Then, when things are less hectic, it can perhaps max out again.

There's a difference between playing Tetris, where the CPU and GPU are barely working and can shift clock cycles if they feel like it, and GTA6 with 100 people on screen while the dev is trying to grind out 3.5 and 2.23 for as long as possible.

But we don't know how long the PS5 can sustain it before downclocking, since Cerny was vague about it.

Maybe it can only last 1 minute? Maybe 1 hour? Maybe 10 hours straight? Nobody knows.
 
It's just hard for me to get past the "potentially" and "most of the time" caveats. I honestly would have felt better if it was just a 3.3GHz CPU and a 2.0GHz GPU, fixed, all the time. No room for inconsistencies or development shortcomings due to poor use of the power profiles.
Why? Why leave potential performance on the table just to satisfy a very niche, vocal minority? Besides, the way they designed the system, it automatically adjusts frequencies even if the dev uses a locked frequency profile.

We are already hearing this from Richard:
Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core.
So a fixed 2230MHz GPU while the CPU jumps around 3.2-3.5GHz on a frame-by-frame or scene-by-scene basis.
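To illustrate what that locked-GPU profile implies (a toy sketch only; the total budget, the wattages, and the clock curve are all invented, since Sony hasn't published them):

```python
# Toy model of a shared power budget with a locked GPU profile.
# All wattage numbers and the clock/power curve are invented for
# illustration; Sony hasn't published the real figures.

TOTAL_BUDGET_W = 200   # hypothetical total SoC power budget
GPU_LOCKED_W = 130     # hypothetical cost of holding the GPU at 2.23GHz

def cpu_clock_ghz(cpu_power_w: float) -> float:
    """Map CPU power to clock speed (fake linear 3.2-3.5GHz curve)."""
    lo_w, hi_w = 50.0, 70.0
    frac = min(max((cpu_power_w - lo_w) / (hi_w - lo_w), 0.0), 1.0)
    return 3.2 + 0.3 * frac

# With the GPU profile locked, the CPU scales within what's left over.
cpu_budget = TOTAL_BUDGET_W - GPU_LOCKED_W
print(f"GPU: 2.23GHz fixed, CPU: {cpu_clock_ghz(cpu_budget):.2f}GHz")
```

That's the shape of what Richard reported: the GPU pinned at 2.23GHz while the CPU floats in the 3.2-3.5GHz range depending on what the frame leaves over.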
 
It's just hard for me to get past the "potentially" and "most of the time" caveats. I honestly would have felt better if it was just a 3.3GHz CPU and a 2.0GHz GPU, fixed, all the time. No room for inconsistencies or development shortcomings due to poor use of the power profiles.
That way you won't have games getting the true max out of the system.
Cerny is trying to make sure you get the max out of the system, and that includes the times when some parts are going unused.
 
Agreed.

And that's what doesn't make sense. The system supposedly works at max speeds of 3.5 and 2.23 no problem, yet then you get these caveats of "up to" or "most of the time" and "need to balance the max power budget", etc.

In other words, a highly intense game cannot stay maxed out at 3.5 and 2.23. It will have to downclock to cool down. Then, when things are less hectic, it can perhaps max out again.

There's a difference between playing Tetris, where the CPU and GPU are barely working and can shift clock cycles if they feel like it, and GTA6 with 100 people on screen while the dev is trying to grind out 3.5 and 2.23 for as long as possible.

But we don't know how long the PS5 can sustain it before downclocking, since Cerny was vague about it.

Maybe it can only last 1 minute? Maybe 1 hour? Maybe 10 hours straight? Nobody knows.
You don't understand, or you don't want to understand.

The 2.23GHz and 3.5GHz are sustainable in most situations, but there are some that need the clocks to drop by a few percent.

So let's say the PS5 could run flawlessly at 3.3GHz and 2.1GHz... no drops at any time.

But Cerny saw he could get more from the system... so he made the console run at 3.5GHz and 2.23GHz most of the time, with small, controlled drops in heavy workloads (which games don't hit that often... most of the time there are idle units in both the GPU and the CPU).

So he is allowing devs to get the hardware's peak performance all the time, instead of just a fixed 3.3GHz and 2.1GHz.
 
That is the opposite of what Cerny said... all machines will have the same level of noise.

I think he was referring more to a consistent heat/noise level across workloads. They still have to guarantee a performance level across all systems; power usage among systems will differ (one chip might only need 120W to hit the max clocks while another needs 180W). The max power envelope will be high enough to accommodate chips with the lowest performance-per-watt (PPW) they are willing to accept. Some systems will use less power and therefore run cooler; that's just the nature of chip fabrication (unless they are planning on throwing away a lot of chips).
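To put rough numbers on that binning point (all wattages invented for illustration; this is just the envelope-sizing logic, not real measurements):

```python
# Illustrative only: made-up per-chip power draws at the same fixed
# clocks, standing in for silicon-lottery variation between units.
chip_power_at_max_clocks_w = [120, 135, 150, 165, 180]

# The PSU and cooler must be sized for the worst bin they accept...
design_envelope_w = max(chip_power_at_max_clocks_w)

# ...while the typical unit draws less and so runs cooler.
typical_w = sum(chip_power_at_max_clocks_w) / len(chip_power_at_max_clocks_w)
print(f"Design for {design_envelope_w}W; typical unit ~{typical_w:.0f}W")
```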
 
You don't understand, or you don't want to understand.

The 2.23GHz and 3.5GHz are sustainable in most situations, but there are some that need the clocks to drop by a few percent.

So let's say the PS5 could run flawlessly at 3.3GHz and 2.1GHz... no drops at any time.

But Cerny saw he could get more from the system... so he made the console run at 3.5GHz and 2.23GHz most of the time, with small, controlled drops in heavy workloads (which games don't hit that often... most of the time there are idle units in both the GPU and the CPU).

So he is allowing devs to get the hardware's peak performance all the time, instead of just a fixed 3.3GHz and 2.1GHz.
I do understand. What you and I said are the same thing.

The PS5 can't sustain 3.5 and 2.23 in tense situations and needs to downclock.
 
It's just hard for me to get past the "potentially" and "most of the time" caveats. I honestly would have felt better if it was just a 3.3GHz CPU and a 2.0GHz GPU, fixed, all the time. No room for inconsistencies or development shortcomings due to poor use of the power profiles.
I think that a poor use of the PS5's "power profiles" would be exactly what you said you'd prefer: the system running at full clocks all the time.
 
I do understand. What you and I said are the same thing.

The PS5 can't sustain 3.5 and 2.23 in tense situations and needs to downclock.
But those locked** frequencies are the die's sweet spot. By having them there, they have full control over both power and heat. That's the reason they were chosen.
Having it balanced like that means power regulation around the die is a piece of piss, and it makes the I/O and system architecture more efficient.*

*as I understand it.
** as in chosen
 
It's just hard for me to get past the "potentially" and "most of the time" caveats. I honestly would have felt better if it was just a 3.3GHz CPU and a 2.0GHz GPU, fixed, all the time. No room for inconsistencies or development shortcomings due to poor use of the power profiles.

You feel that way because this is different from what we are all used to. Plus, we don't have our hands on devkits or retail hardware yet.
 
You feel that way because this is different from what we are all used to. Plus, we don't have our hands on devkits or retail hardware yet.
Actually, that is the way most companies use the term.

Nvidia describes its boost clock as "most of the time", but in fact it stays above the marketed target.
AMD describes its game clock as "most of the time", but in fact it stays below the marketed target.

Companies use "most of the time".
 
What doesn't make sense is anyone saying running at 3.5 and 2.23 is no problem.

If it's no problem, then just leave it at that, instead of downclocking.
But Cerny said it is not a problem, and it runs at those clocks most of the time :pie_thinking:
Why do you want people not to believe what the chief designer of the console said?
 
FINALLY, some concrete information. Maybe now the conversation can become more serio-

-sees thread-

 
That doesn't mean the CPU will be running at full power. The devs had to downclock the CPU to keep the GPU frequency at 2.23. In retail units the variable clocks will be active, but since the devs coded the game with the CPU throttled to keep the GPU maxed out, the variable behavior won't matter much. It'll only let the CPU hit higher frequencies when nothing much is being pushed. But since the devs have the CPU throttled to keep the GPU at 2.23, in-game the CPU will mostly be downclocked below 3.5GHz as well.

Richard is just saying that since most games in the first year will mostly utilize only 8 cores with no multithreading, the downclock makes sense, as they're not really going to push the CPU anyway. But for games that do push the CPU, the GPU will be variable and won't maintain 2.23GHz when workloads are pushed to their limits in graphically heavy scenes.
And Cerny says that the fixed-profile method is for dev optimization, and the actual system itself will adjust accordingly.

They didn't have to downclock the CPU; they're CHOOSING to do so to optimize their game based on how their engine renders. This doesn't mean the CPU will HAVE to run at this set speed at all times, just like the GPU won't have to either.

Lots of missing the forest for the trees in this thread.
 
Do people understand that in heavy games like open-world titles you can create an even heavier scene out of nothing just through player agency?!

I mean, just look at RDR2: the console barely keeps 30 frames in Saint Denis at night, and you can start a giant firefight between NPCs and the police, with fire and explosions and destruction, that tanks the framerate.

This is why people are scared/dubious. Yeah, the PS5 can sustain max clocks in Saint Denis in an RDR3 in a calm situation, but what about when the player decides to cause chaos? (You know, one of the most fun things in open-world games.)
This is how players create a heavy scene by themselves that isn't exactly predictable by the devs (because if it were predictable, you wouldn't have the framerate tanking like in my RDR2 example).

You can predict and limit the chaos in relatively small-scope games like TLOU2, but not in games like AC or Days Gone or GTA with a shitload of stuff on screen. That's when the PS5 is going to be at full workload. Can the Sony guys bet their house that the PS5 will sustain max clocks on these occasions? I think not, and the irony is that these are the scenes where you NEED the best performance from the console.
 
Nobody is saying the ultra-fast SSD will give the PS5 an extra teraflop. Why would anyone say that? Teraflops don't matter anyway. That 18% teraflop advantage will only result in a higher output resolution that you can't notice even in YouTube comparison videos.

People are saying (including developers) that the PS5 SSD will allow unprecedented, unmatched, insane texture detail and more varied art. A developer explained this in detail in one of his tweets.
Show me a developer who has said "the PS5 SSD will allow unprecedented, unmatched, insane texture detail and more varied art" when compared to the XSX's SSD.
We all know that upgrading from HDDs to SSDs will improve that, which is what they are saying. No developer has compared the XSX to the PS5.
 
Show me a developer who has said "the PS5 SSD will allow unprecedented, unmatched, insane texture detail and more varied art" when compared to the XSX's SSD.
We all know that upgrading from HDDs to SSDs will improve that, which is what they are saying. No developer has compared the XSX to the PS5.
There was a tweet from an ND dev with a comparison between Uncharted 1 and TLOU about the improvement in graphics, if I remember correctly.
 
Show me a developer who has said "the PS5 SSD will allow unprecedented, unmatched, insane texture detail and more varied art" when compared to the XSX's SSD.
We all know that upgrading from HDDs to SSDs will improve that, which is what they are saying. No developer has compared the XSX to the PS5.

It will allow more than double the asset streaming on PS5 compared to XSX. Of course that will be used (at least by 1st parties) to make a notable difference...
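For scale (the 5.5GB/s and 2.4GB/s raw figures are the officially quoted ones; the per-frame framing is just my own back-of-envelope):

```python
# Back-of-envelope per-frame streaming budget at 60fps, using the
# publicly quoted *raw* (uncompressed) SSD throughput figures.
# Compression raises both effective numbers in practice.
RAW_GBPS = {"PS5": 5.5, "XSX": 2.4}
FPS = 60

for console, gbps in RAW_GBPS.items():
    mb_per_frame = gbps * 1000 / FPS  # GB/s -> MB per ~16.7ms frame
    print(f"{console}: ~{mb_per_frame:.0f}MB of fresh assets per frame")

print(f"Raw ratio: {RAW_GBPS['PS5'] / RAW_GBPS['XSX']:.2f}x")
```

That works out to roughly 92MB vs 40MB per frame raw, which is where the "more than double" (about 2.3x) claim comes from.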
 
Nobody is saying the ultra-fast SSD will give the PS5 an extra teraflop. Why would anyone say that? Teraflops don't matter anyway. That 18% teraflop advantage will only result in a higher output resolution that you can't notice even in YouTube comparison videos.

People are saying (including developers) that the PS5 SSD will allow unprecedented, unmatched, insane texture detail and more varied art. A developer explained this in detail in one of his tweets.

People have said that the PS5 SSD will add some never-before-seen hidden power advantage. That's absolutely been a narrative pushed in multiple threads. People have said the SSD alone will allow for more polygons on screen, too.

Also, the rest of your point is flawed as well. You can't cheat fillrate with an SSD. You can't cheat CPU and GPU rendering ability with an SSD.

There Is No SSD Secret Sauce.

The XSX SSD can handle all the "detailed textures" the PS5 can. Both will render images to the best of their CPU/GPU limitations.

SSDS DO NOT RENDER IMAGES.
 
At least we have proof now that you cannot use the full frequencies of the CPU and GPU at the same time. I think this is absolutely awful. It would have been so much simpler for developers if they had simply gone with locked frequencies.
 
It will allow more than double the asset streaming on PS5 compared to XSX. Of course that will be used (at least by 1st parties) to make a notable difference...
Show me where a developer has said that it will allow them to stream double the in-game assets that the XSX can't.
That's like saying that because the XSX has 120GB/s more RAM bandwidth than the PS5, the XSX can stream 120GB/s more assets and textures into a game than the PS5.
 
Do people understand that in heavy games like open-world titles you can create an even heavier scene out of nothing just through player agency?!

I mean, just look at RDR2: the console barely keeps 30 frames in Saint Denis at night, and you can start a giant firefight between NPCs and the police, with fire and explosions and destruction, that tanks the framerate.

This is why people are scared/dubious. Yeah, the PS5 can sustain max clocks in Saint Denis in an RDR3 in a calm situation, but what about when the player decides to cause chaos? (You know, one of the most fun things in open-world games.)
This is how players create a heavy scene by themselves that isn't exactly predictable by the devs (because if it were predictable, you wouldn't have the framerate tanking like in my RDR2 example).

You can predict and limit the chaos in relatively small-scope games like TLOU2, but not in games like AC or Days Gone or GTA with a shitload of stuff on screen. That's when the PS5 is going to be at full workload. Can the Sony guys bet their house that the PS5 will sustain max clocks on these occasions? I think not, and the irony is that these are the scenes where you NEED the best performance from the console.
Without wanting to sound fanboyish, the SSD speed will compensate for this somewhat, by giving faster access to the immediate surroundings without having to load up the system. Cerny went over that.
So we have an immediately lighter system load because of this. Pushing on, the 'SmartShift' tech merely means that in frames that are GPU-heavy, CPU power (if underutilized) can be diverted. Ditto the other way around.
We're not talking 'scenes' here, we're talking about where the player is looking in a single frame.*

We don't know how this resolves itself in real-world conditions, though. We don't know everything the PS5 has architecture-wise, plus we haven't seen any real-world RDNA2 performance.

At this moment it's wait and see. But what we do know is that the Xbox is the most powerful machine in the traditional sense. With the PS5 there are a heap of hitherto unknown variables, so it's too early to call it.

So enjoy.

*as always, as I understand it
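A minimal sketch of that per-frame idea, assuming an invented shared budget and a simple proportional rule (AMD hasn't published SmartShift's actual algorithm; it lives in power-management firmware):

```python
# Toy per-frame budget balancer in the spirit of SmartShift as
# described above. The numbers and the scaling rule are invented.

TOTAL_W = 200.0  # hypothetical shared CPU+GPU power budget

def split_budget(cpu_demand_w: float, gpu_demand_w: float):
    """Feed both sides fully if the frame fits in the budget;
    otherwise scale both back proportionally (a simplification)."""
    demand = cpu_demand_w + gpu_demand_w
    if demand <= TOTAL_W:
        return cpu_demand_w, gpu_demand_w
    scale = TOTAL_W / demand
    return cpu_demand_w * scale, gpu_demand_w * scale

# GPU-heavy frame: the underused CPU's headroom is effectively diverted.
print(split_budget(cpu_demand_w=50, gpu_demand_w=140))  # fits as-is
# Both maxed out: a small proportional trim, not a crash to base clocks.
print(split_budget(cpu_demand_w=80, gpu_demand_w=160))  # scaled to 200W
```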
 
Without wanting to sound fanboyish, the SSD speed will compensate for this somewhat, by giving faster access to the immediate surroundings without having to load up the system. Cerny went over that.
So we have an immediately lighter system load because of this. Pushing on, the 'SmartShift' tech merely means that in frames that are GPU-heavy, CPU power (if underutilized) can be diverted. Ditto the other way around.
We're not talking 'scenes' here, we're talking about where the player is looking in a single frame.*

We don't know how this resolves itself in real-world conditions, though. We don't know everything the PS5 has architecture-wise, plus we haven't seen any real-world RDNA2 performance.

At this moment it's wait and see. But what we do know is that the Xbox is the most powerful machine in the traditional sense. With the PS5 there are a heap of hitherto unknown variables, so it's too early to call it.

So enjoy.

*as always, as I understand it
I hope your optimism is well placed. I sound hypercritical because I want the best from the PS5.
 
One thing I can't quite comprehend is why it seems like such a daunting task to cool Sony consoles. What was designed so differently that led to noisy Pros vs. whisper-quiet One Xs? People can dislike the XSX tower design, but it seems to have made cooling a breeze (pun intended).

Cooling seemed to be a point of emphasis in Cerny's PS5 talk, as if they had to really brainstorm to figure out how to cool the PS5. Curious to find out what their solution ended up being.
I think they are fixated on having a slim, compact console like the PS4. The OG Xbox One was huge.
 
The PS5 can't sustain 3.5 and 2.23 in tense situations and needs to downclock.
That's not the case:
GPU will spend most of its time at or near its top frequency in situations where the whole frame is being used productively in PS5 games. The same is true for the CPU, based on examination of situations where it has high utilization throughout the frame, we have concluded that the CPU will spend most of its time at its peak frequency.
With race to idle out of the equation and both CPU and GPU fully used, the boost clock system should still see both components running near to or at peak frequency most of the time.

Tense situations aren't necessarily more power-intensive. For example, as Cerny pointed out, processing dense geometry consumes less power than processing simple geometry, and there are certain CPU instructions (like 256-bit AVX) that are more power-hungry than others.

The situation you are thinking of, simultaneous near-100% GPU CU utilization and near-100% CPU core utilization, is rare in gaming, and if it happens it would be during a brief scene or a few frames. During that brief duration they can, for example, drop GPU clocks to 2171MHz (10TF) and the CPU to 3.4GHz, after which the clocks would jump back up.
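The 10TF figure in that example falls straight out of the standard peak-FP32 formula (CUs x 64 shader ALUs x 2 ops per clock x frequency); a quick check:

```python
# Peak FP32 throughput: CUs * 64 shader ALUs * 2 ops per clock (FMA) * GHz.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"PS5 @ 2.23GHz : {tflops(36, 2.23):.2f} TF")    # ~10.28 TF
print(f"PS5 @ 2.171GHz: {tflops(36, 2.171):.2f} TF")   # ~10.00 TF
print(f"XSX @ 1.825GHz: {tflops(52, 1.825):.2f} TF")   # ~12.15 TF
```

Which is why a roughly 2.7% GPU clock drop only costs about 0.3TF.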
 
Show me where a developer has said that it will allow them to stream double the in-game assets that the XSX can't.
That's like saying that because the XSX has 120GB/s more RAM bandwidth than the PS5, the XSX can stream 120GB/s more assets and textures into a game than the PS5.
The Xbox needs more memory bandwidth because it has more CUs.
Both consoles have similar bandwidth per CU... similar to AMD desktop cards.
 
The Xbox needs more memory bandwidth because it has more CUs.
Both consoles have similar bandwidth per CU... similar to AMD desktop cards.
And both consoles have the same amount of RAM, and both SSDs will be able to load in the assets needed to display any game developed next gen.
It's all good.
 
The Xbox needs more memory bandwidth because it has more CUs.
Both consoles have similar bandwidth per CU... similar to AMD desktop cards.
They have equal bandwidth proportionate to their GPUs' computational performance.
The PS5 has more bandwidth per CU, but since its CUs are clocked higher they need more; that's why bandwidth per CU isn't a good comparison.
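For concreteness, here are the two normalizations being argued about, using the published bandwidth figures and the standard TFLOPS numbers (the arithmetic is mine; note the XSX has a split memory pool, handled in the comment):

```python
# Published bandwidth figures normalized two ways. The XSX number is
# its GPU-optimal 10GB pool (the remaining 6GB runs at 336GB/s).
specs = {
    "PS5": {"bw_gbs": 448, "cus": 36, "tf": 10.28},
    "XSX": {"bw_gbs": 560, "cus": 52, "tf": 12.15},
}

for name, s in specs.items():
    per_cu = s["bw_gbs"] / s["cus"]
    per_tf = s["bw_gbs"] / s["tf"]
    print(f"{name}: {per_cu:.1f} GB/s per CU, {per_tf:.1f} GB/s per TF")
```

Per CU the PS5 comes out ahead (~12.4 vs ~10.8 GB/s), but per teraflop the two are close (~43.6 vs ~46.1 GB/s), which is the point being made here.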
 
They have equal bandwidth proportionate to their GPUs' computational performance.
The PS5 has more bandwidth per CU, but since its CUs are clocked higher they need more; that's why bandwidth per CU isn't a good comparison.




Digital Foundry also tested

  • RX 5700 super overclock and diminishing frame rate gains due to memory bandwidth limitations.
  • RX 5700 vs RX 5700 XT at the same 9.67 TFLOPS level. RDNA is not GCN.
 



Digital Foundry also tested

  • RX 5700 super overclock and diminishing frame rate gains due to memory bandwidth limitations.
  • RX 5700 vs RX 5700 XT at the same 9.67 TFLOPS level. RDNA is not GCN.

While I understand the points being made in the video... the comparison is by nature off. You'd also need to test more than one game.
 
I wish someone from DF could explain this difference too.



You are understanding things 100% correctly! It's not that MS is bad or that Sony screwed up. These two companies just chose to go about building a console in two different ways. They focus on different things. Currently, the PS5 is being done in a way that some can't understand, because the design philosophy is just so different compared to your typical PC and the consoles of the past.
Late reply, but thanks for the clarification.
The times I enjoy this forum most are when I'm interacting with members like you who can break down the technical bits.
I wish we had more conversations like these about both the Series X and the PS5.
Thanks, mate.
 