Actually you are misunderstanding.
It keeps the higher clock rate for how long? And what is the lowest clock rate it can drop to?
There's plenty of information Sony hasn't given out because they prefer you just think about Smartshift and about the PS5's maximum achievable frequency.
I know TimDog is very welcomed here but whatever
Can somebody ELI5 that for me (from Mike Ybarra) because I just can't parse any of it ..
I know exactly what SmartShift is, but I went to AMD to find out, not Cerny. All the stuff you put there is the PR talk Cerny gave when asked to clarify why devs already had to throttle the CPU to guarantee predictable GPU performance. You're free to drink the Kool-Aid, my bro.
Actually you are misunderstanding.
The PS5 GPU is always at 2.23GHz and the CPU at 3.5GHz, except when some workload tries to break the power draw limit, in which case it downclocks to maintain the power draw.
SmartShift is used only when the PS5 CPU is not using all the power available to it and the GPU is already at the limit of its power draw, so SmartShift kicks in and shifts that spare power budget to the GPU.
That way, even when the GPU is at its power limit, it can squeeze a bit more out of it using the additional unused power from the CPU.
Why the focus on power draw and workload? Because some workloads (most of them) won't reach the power draw limit even at max clock, while others will reach it even with a lowered clock (the exceptions)... in simple terms, even at a fixed clock, a workload can use more or less power depending on what it is doing.
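If it helps, here is a minimal sketch of that idea in Python. The cap, the numbers and the linear power model are all made up for illustration; nothing here is Sony's actual algorithm.

```python
# Toy governor: run at max clock unless this workload would break the
# power cap at that clock; if it would, drop just enough to fit the cap.
# All values are illustrative, not real PS5 numbers.

MAX_GHZ = 2.23
POWER_CAP_W = 150.0

def gpu_clock(power_at_max_clock_w: float) -> float:
    """Pick a GPU clock given the power this workload would draw at max clock."""
    if power_at_max_clock_w <= POWER_CAP_W:
        return MAX_GHZ  # most workloads: the cap is never hit, clock stays maxed
    # power-limit-breaking workload: scale the clock down to fit the cap
    # (crude linear model; real power does not scale linearly with frequency)
    return MAX_GHZ * POWER_CAP_W / power_at_max_clock_w

print(gpu_clock(120.0))  # typical workload -> 2.23
print(gpu_clock(165.0))  # heavy workload  -> ~2.03
```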
thnx got it now
MS financial position is untouchable, but for Sony the recession will be precarious.
Based on that link, it seems that the CPU and GPU of the PS5 won't run both at max clock. Even though I know Cerny said they would, that website seems to say otherwise. They share a budget, and since Cerny said what the max was for both, they can't both run at that max, right? Or am I misinterpreting that website?
Quote or link please. They said they had RDNA2 and HW accelerated RT. Still have to see confirmation of whether it's AMD's standard RT implementation or the same one that's in the XSX.
Are we back to RT implementation again? Christ almighty, do you people ever get tired of repeating the same FUD just through different accounts every sodding week, like clock-work.
It really is though. And as a consumer, one should really be mindful of handing over even more power over a market to such a powerful corporation.
MS financial position is untouchable, but for Sony the recession will be precarious.
Wow, that MS arrogance sure is hard to let go of.
There's no way you can't differentiate 720p from 1080p. I love framerate (the reason I'm mainly a PC player), but the difference between 720p, 1080p and 4K is like night and day.
Really resolution is overrated compared to framerate.
I play Nioh 2 in 720p60 at my parent's place (PS4 Base) and in 1080p60 at my place (PS4 Pro with an LG OLED). Both of them are perfectly fine, very beautiful even with that glorious vibrant HDR and artstyle.
Not once, literally not even a single time during my awesome fun with it did I think to myself "jeez that GPU should push some extra dots".
4K with advanced reconstruction is the way to go I think.
Something you heard?
And when one of the two shifts from its default (max) position, what happens to the other that was already also at max by default? Does it go to Super max?
The shared power budget is enough to max the clocks for both the GPU and CPU
That's because they won't both run at max clock. Or it wouldn't be smart shift. People just need to read.
Based on that link, it seems that the CPU and GPU of the PS5 won't run both at max clock. Even though I know Cerny said they would, that website seems to say otherwise. They share a budget, and since Cerny said what the max was for both, they can't both run at that max, right? Or am I misinterpreting that website?
Of course I can differentiate it, but it did not take away from my experience much. I was perfectly fine. I'd probably be more critical if the game itself wasn't that good.
There's no way you can't differentiate 720p from 1080p, I love framerate (the reason I'm mainly a pc player) but the difference from 720p 1080p and 4k it's like night and day
Devs didn't have to throttle the CPU; they have to choose a profile, because the automatic clocking only works on retail machines.
I know exactly what SmartShift is, but I went to AMD to find out, not Cerny. All the stuff you put there is the PR talk Cerny gave when asked to clarify why devs already had to throttle the CPU to guarantee predictable GPU performance. You're free to drink the Kool-Aid, my bro.
Running 100% frequencies =/= 100% usage.
And when one of the two shifts from its default (max) position, what happens to the other that was already also at max by default? Does it go to Super max?
A GPU set to max frequency that isn't doing very much won't use much power. If it's doing a lot of work, then to maintain that frequency it needs more power. Hence why it might need to borrow power budget from the CPU.
And when one of the two shifts from its default (max) position, what happens to the other that was already also at max by default? Does it go to Super max?
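For what it's worth, the textbook CMOS dynamic-power relation says the same thing: at a fixed voltage and frequency, power still scales with how much of the chip is actually switching. A tiny illustration with arbitrary values (not PS5 figures):

```python
# Dynamic power roughly follows P ≈ a · C · V² · f, where `a` is the activity
# factor (how busy the silicon is). C, V and f below are arbitrary values.

def dynamic_power(activity: float, capacitance: float = 50.0,
                  voltage: float = 1.0, freq_ghz: float = 2.23) -> float:
    return activity * capacitance * voltage ** 2 * freq_ghz

print(dynamic_power(0.2))  # lightly loaded GPU at max clock: low power draw
print(dynamic_power(0.9))  # heavily loaded GPU at the same clock: far more power
```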
The website isn't wrong, but the PS5 doesn't use SmartShift to set the clocks.
Based on that link, it seems that the CPU and GPU of the PS5 won't run both at max clock. Even though I know Cerny said they would, that website seems to say otherwise. They share a budget, and since Cerny said what the max was for both, they can't both run at that max, right? Or am I misinterpreting that website?
You're telling me that the fact that processors won't maintain the speed that they're advertised at doesn't matter and that I shouldn't try to understand?
A GPU set to max frequency that isn't doing very much won't use much power. If it's doing a lot of work, then to maintain that frequency it needs more power. Hence why it might need to borrow power budget from the CPU.
It also might have to downclock to keep within the budget; as can the CPU.
People are getting in the weeds trying to understand SmartShift when it really doesn't matter... the CPU/GPU both try to stay at max speed as often as possible, sometimes they can't. Power is also shifted sometimes but honestly that doesn't really matter, it's just how it's accomplished.
Smart shift exists because both can't always run at the max frequency. Sony lists their frequencies as "max" because they won't always run at that frequency.
Source:"If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."
"Can potentially"False.
Mark Cerny:
Source:
PlayStation 5 uncovered: the Mark Cerny tech deep dive (www.eurogamer.net)
You got it man. What do you do when you want to claim a performance that you can't maintain because you didn't aim for it? You start praising volatility and advertise your "max" frequency, not your sustained one.
Smart shift exists because both can't always run at the max frequency.
Sony lists their frequencies as "max" because they won't always run at that frequency.
Or because in most games the CPU isn't using all of its power, so why not transfer that power to be used by the GPU?
Smart shift exists because both can't always run at the max frequency.
Sony lists their frequencies as "max" because they won't always run at that frequency.
The question is how often and how much the chips will throttle for next gen games. Sony isn't willing to answer that right now so we just have to wait till the games come out.
You're telling me that the fact that processors won't maintain the speed that they're advertised at doesn't matter and that I shouldn't try to understand?
It matters to me. I'm a customer, not a fanboy.
Let it go. U said nonsense and got caught. He explains that by "potentially" he means that if the code wants them at max frequency, devs don't have to choose; they will both be at max.
"Can potentially"
That does not say they can always run at the max.
He's choosing those words deliberately.
Smartshift is not about clocks. It's about redistributing power budget from CPU to GPU. If that's not enough, then the system can lower the clocks for CPU, GPU or both.
Smart shift exists because both can't always run at the max frequency.
Sony lists their frequencies as "max" because they won't always run at that frequency.
The question is how often and how much the chips will throttle for next gen games. Sony isn't willing to answer that right now so we just have to wait till the games come out.
Wow you really are misinformed.
I know exactly what SmartShift is, but I went to AMD to find out, not Cerny. All the stuff you put there is the PR talk Cerny gave when asked to clarify why devs already had to throttle the CPU to guarantee predictable GPU performance. You're free to drink the Kool-Aid, my bro.
You got it man. What do you do when you want to claim a performance that you can't maintain because you didn't aim for it? You start praising volatility and advertise your "max" frequency, not your sustained one.
Or because in most games the CPU isn't using all of its power, so why not transfer that power to be used by the GPU?
I will draw an example.
CPU power limit 50W
GPU power limit 150W
Overall power limit 200W.
Now say a game uses only 30W of the CPU power and 150W of the GPU power... what SmartShift does is tell the GPU it can use 20W more, so it can break the 150W limit and reach 170W if it wants, while the CPU stays at 30W.
What about the clocks? In that case the CPU will probably be running at 3.5GHz even at 30W, because it is not being stressed.
The GPU is being stressed, so its clock might otherwise drop, but it received 20W more from the CPU, so it can maintain 2.23GHz.
Clock and power usage are defined by the workload... with most workloads the GPU will maintain 2.23GHz without even reaching the 150W power draw limit... some heavy workloads will need a drop in clock even with the added power from the CPU (170W, for example).
In simple terms, even at the same power draw limit (150W), there are workloads where the GPU could maintain or even exceed 2.23GHz (but it is capped there) and workloads that will need a lower clock.
The worst-case and rare scenario is when both the CPU and GPU are stressed with heavy workloads, so the system will probably hit a limit where both need to drop their frequencies, but that is a situation that rarely happens in games... the opposite is true most of the time: there is unused power in both the CPU and GPU (async compute on AMD GPUs was created to try to use, in parallel, all that unused processing power that game logic can't exploit).
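A quick Python sketch of that arithmetic, using the example's 50W/150W/200W numbers. The function and the demand values are made up for illustration; this is not Sony's actual power-management code.

```python
# Toy version of the example above: a 200W shared budget split into a 50W CPU
# cap and a 150W GPU cap, with spare CPU budget shifted to the GPU.

CPU_CAP_W = 50.0
GPU_CAP_W = 150.0
TOTAL_CAP_W = 200.0

def allocate(cpu_demand_w: float, gpu_demand_w: float):
    """Return (cpu_w, gpu_w) granted for this workload."""
    cpu_w = min(cpu_demand_w, CPU_CAP_W)
    spare_w = CPU_CAP_W - cpu_w  # unused CPU budget that SmartShift can move
    gpu_w = min(gpu_demand_w, GPU_CAP_W + spare_w, TOTAL_CAP_W - cpu_w)
    return cpu_w, gpu_w

# Common case: CPU only needs 30W, so the GPU may break its nominal 150W cap.
print(allocate(30.0, 170.0))  # -> (30.0, 170.0)

# Rare worst case: both are stressed, nothing is spare; both sit at their caps
# and would have to downclock until their demand fits inside the budget.
print(allocate(60.0, 180.0))  # -> (50.0, 150.0)
```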
The hell, lol
Let it go. U said nonsense and got caught. He explains that by "potentially" he means that if the code wants them at max frequency, devs don't have to choose; they will both be at max.
Your explanation of Sony's claims is sound and reasonable. You're just forgetting to say that nothing has been proven. Until we see games running on PS5, they are just questionable claims. And even if the same piece of code always has the same outcome, it does not make the performance predictable. You need to run the code to know what clocks you would get.
No, that's not what I'm saying. I'm saying SMARTSHIFT doesn't matter. It's a small part of the solution and games don't care about how many watts are going to a CPU/GPU, they care what frequency they are set at.
The overall solution is what is important, not SmartShift. You really don't need to know the details of SmartShift to understand the PS5 and what matters to you.
What actually is important, as it affects rendering:
1) Both CPU/GPU can maintain max frequency most of the time according to Cerny
2) Sometimes one or the other has to downclock, this is not ideal, but hopefully limited
3) This downclocking is consistent; as opposed to other variable frequency setups, where the clocks are set based on things like ambient temps... for PS5, if you run a piece of code it will always experience the same CPU/GPU clocks, that is ideal as far as variable frequencies go from a dev perspective
The part where sometimes watts are shifted around.. is what doesn't really matter.. obviously if you are curious, learn it.. but from a gamer's perspective it's not what has an effect on games. It's just how Sony accomplished PART of their variable frequency within a power budget solution.
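Here is a rough sketch of the difference point 3 is describing, with made-up functions and numbers: a workload-driven clock is a pure function of the code being run, whereas a temperature-driven boost can differ between two consoles running the exact same code.

```python
# Hypothetical models only; nothing here is Sony's (or anyone's) real firmware.

MAX_GHZ = 2.23

def workload_driven_clock(activity: float) -> float:
    """Clock chosen purely from the workload's measured activity (deterministic)."""
    power_budget = 1.0  # normalized
    if activity <= power_budget:
        return MAX_GHZ
    return MAX_GHZ * power_budget / activity  # same code -> same downclock, always

def temperature_driven_clock(activity: float, ambient_temp_c: float) -> float:
    """Typical boost behaviour: the same code can clock differently in a hot room."""
    if ambient_temp_c > 30.0 or activity > 1.0:
        return MAX_GHZ * 0.9
    return MAX_GHZ

# Same workload, same clock, regardless of the environment:
print(workload_driven_clock(1.2), workload_driven_clock(1.2))
# Same workload, different clocks depending on ambient temperature:
print(temperature_driven_clock(0.8, 22.0), temperature_driven_clock(0.8, 35.0))
```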
So what I am getting out of this, is that it seems to be a waste of resources to have one component constantly running at full clocks, if the process it's executing doesn't require/need full clocks (hence you get more idles in the draw calls), so why not shift the excess resources somewhere else for more of a gain in said area to lower the draw calls there.
Is this a correct understanding?
They should always be aiming to run at full clocks. Lower the clocks and you reduce performance in game.
Even Cerny did not use the word sustained. Thank you Sir. You are selling the product harder than Cerny.
The max are the sustained clocks
Yeap.
So what I am getting out of this, is that it seems to be a waste of resources to have one component constantly running at full clocks, if the process it's executing doesn't require/need full clocks (hence you get more idles in the draw calls), so why not shift the excess resources somewhere else for more of a gain in said area to lower the draw calls there.
Is this a correct understanding?