Next-Gen PS5 & XSX |OT| Console tEch threaD

Mark Cerny's own words were that the clocks will drop.

I don't know what else to say at this point. You guys are just making up all this stuff to explain how the variable clocks (as advertised by Sony) are not variable.
People are just trying to explain that SmartShift wasn't created FOR variable clocks, and having SmartShift does not mean it's there for variable clocks.
SmartShift is always good because it uses wasted power from the CPU, period.
Cerny also said that clocks can be at max for both GPU and CPU at the same time.
There is nothing else to add or elaborate; on a surface level it's as simple as that.
 
SmartShift is not about clocks. It's about redistributing the power budget from the CPU to the GPU. If that's not enough, then the system can lower the clocks for the CPU, the GPU, or both.
What?

I feel like I'm not speaking the same language as some of you.

You start your post off by saying "it's not about clocks" then finish your post with "the system can lower the clocks..."

What am I missing? All I said was that the clocks would not always run at max frequencies.
 
They should always be aiming to run at full clocks. Lower the clocks and you reduce performance in game.

Nope. If you reduce clocks when less is being used at that moment, nothing happens.

You think everything in a frame is at 100%? No, it is not - there are plenty of opportunities for efficiencies, so why not?


 
People are just trying to explain that SmartShift wasn't created FOR variable clocks, and having SmartShift does not mean it's there for variable clocks.
SmartShift is always good because it uses wasted power from the CPU, period.
Cerny also said that clocks can be at max for both GPU and CPU at the same time.
There is nothing else to add or elaborate; on a surface level it's as simple as that.
But that's how the PS5 is using the technology, right?

The frequencies will be lowered based on power needs.
 
You got it, man. What do you do when you want to claim a level of performance that you can't maintain because you didn't aim for it? You start praising volatility and advertising your "max" frequency, not your sustained one.
TIL people don't know a CPU or GPU can be at max frequency, but if it's not doing any work it won't be drawing much power. A sustained frequency does not mean more performance; it just means the processor is locked at a frequency it can't go above or below.
 
TIL people don't know a CPU or GPU can be at max frequency, but if it's not doing any work it won't be drawing much power. A sustained frequency does not mean more performance; it just means the processor is locked at a frequency it can't go above or below.
Predictable is what it is, and that's something that helps a developer a lot more than sporadic peaks above the mean.
 
It wasn't just theoretical numbers. There was also an actual Minecraft demo that can be compared to the performance of the most powerful PC cards running Minecraft RTX.
I'm talking about Cerny (since that's who you claimed was surprised). Cerny, just like MS, has to see the RT implementation in a game to see how far they can take it. For MS that's two last-gen games, Minecraft & Gears; for Sony it's an undisclosed next-gen game.
It's called transparency. You're refusing to acknowledge it where it is, and you excuse it where it's not.
It's called scheduling, strategy & planning. Just because MS chose to blow their load early doesn't mean Sony has to do the same; different companies have different schedules.
You don't know how those components were designed, what is off the shelf, what is bespoke, and with what amount of customisation. This is willful simple-mindedness.
RT is a core RDNA2 feature, and to suggest consoles will use an alternative solution without a single piece of supporting evidence is willful simple-mindedness on your part.
They talked about one. One that "successfully" implements RT. And that's how they marketed their RT capabilities.
He talked about all possible implementations, from audio to full ray tracing, and gave one example of a game he saw. He opened the presentation by stating they'll show games at a later date.
 
That is not true. There are plenty of scenes drawn on screen where your CPU usage fluctuates because the scene is not as demanding. This is essentially what Ryzen does on the PC now with boost clocks, etc.

Fluctuating usage doesn't mean the clock speed is also fluctuating

Set Windows to run at high performance, and if the CPU can, it'll constantly run at max speeds, because it has no reason not to.
 
Predictable is what it is, and that's something that helps a developer a lot more than sporadic peaks above the mean.
If you bothered to watch the Road to PS5 video and opened your mind with the hope of understanding what is being talked about, you would know there is nothing unpredictable about how Sony implemented their variable frequency strategy. Developers make their game as if they are developing for a console with locked frequencies, because they are guaranteed enough power for

an 8-core, 16-thread 3.5GHz CPU and
a 10.28TF GPU at 2.23GHz

There are no sporadic peaks. The system monitors what a game is doing each frame and adjusts clocks and power by itself; developers do not. They do, however, have the option during development to lock the clocks for profiling purposes, but retail consoles do not have that option. Watch the video and try to understand what is being said, not with the cynical, rose-tinted mentality of a fanboy. This has the potential to change how future consoles are made if the strategy pays off.
 
Fluctuating usage doesn't mean the clock speed is also fluctuating

Set Windows to run at high performance, and if the CPU can, it'll constantly run at max speeds, because it has no reason not to.

I know this, and there is a lot of wasted idle time at various moments as well. Hence SmartShift and other technologies that don't let resources go to waste.

If the CPU, let's say, is only under 30% load, and allocating the extra resources somewhere else at that given moment will not impact its performance due to the low load and high idle time, then that is what SmartShift was born for, no?
 
Re: the console sales:




[embedded sales charts, including figures for the PS4]



Based on this and various other sources, it is fair to estimate that the PS4 sold 2x better than the XBOne.

But it is interesting to see and compare these companies' services:

  • XGP was rumoured to have 9.5M subscribers in May last year, and in Nov. 2019 the MS CEO said the numbers had doubled in Q4 2019, so we can estimate subscribers at ~20M
  • XBL had a couple of months with over 65M Monthly Active Users in Q4 as well
  • PS Now had 1M subscribers in late 2019
  • PS Plus had 37M subscribers in late 2019

I think this is very relevant in regard to what pricing both companies can go with. And xCloud is not yet released :)
Yea. I guess we can safely assume the XSX can sell for $1 to reach as many people as possible and claim super high numbers :messenger_relieved:
 
I know this, and there is a lot of wasted idle time at various moments as well. Hence SmartShift and other technologies that don't let resources go to waste.

If the CPU, let's say, is only under 30% load, and allocating the extra resources somewhere else at that given moment will not impact its performance due to the low load and high idle time, then that is what SmartShift was born for, no?

It will impact performance if you reduce the CPU's power to the point where it can't run at max speeds for its 30% load
 
He said that in addition they run SmartShift. The tech they're using for variable clocks is their own bespoke solution.
Ok so a question about smart shift itself as I'm trying to make sense of it in my head.

I understand that both CPU and GPU won't always need full power.

So say the CPU is not using full power and that power is diverted to the GPU.

What is that power being used for if not to maintain a higher frequency in the GPU?
 
I thought Mark Cerny described it as "SmartShift" in his talk? What did he call it?
I'll repost this, as there appears to be some confusion regarding the underlying technology behind PS5 variable frequency & AMD's generic SmartShift.
PS5
  • The SoC has a fixed power budget (300W); the GPU & CPU each have a power budget within the SoC. Say the reference values are GPU 250W & CPU 50W
  • SmartShift only diverts power to the GPU if there are leftover CPU resources while at 3.5GHz and the GPU is exceeding its power budget. E.g. the CPU is only using 30W, so 20W is diverted to the GPU -> 270W
  • There's a key difference in how SmartShift operates on PS5: it's limited to the scenario in which there are leftover CPU resources, so there won't be any downclocks as a result of using SmartShift. Both CPU & GPU have fixed power budgets; SmartShift can only divert leftover power from the CPU
  • Developers can choose to prioritize the GPU over the CPU by increasing the GPU power budget (250W -> 270W) and decreasing the CPU budget (50W -> 30W). The CPU will run slower on average (3.2GHz) but can go up to 3.5GHz for less power-intensive tasks. The GPU would be able to run at 10.28TF with the rare few-percent frequency drop depending on load (10.28TF <-> 10TF). This is the likely scenario developers described to DF, where they throttled the CPU to ensure a sustained 2.23GHz clock on the GPU. Note that this is not SmartShift
AMD APU SmartShift
  • Power and frequency balancing between CPU & GPU happens all the time, bidirectionally, depending on the type of load
  • Can get throttled by thermals

The wattage numbers are made up just for the explanation
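As a minimal sketch of the above in Python (using the same made-up wattages; smartshift_divert and everything else here is my own invention for illustration, not anything from Sony or AMD):

```python
# Toy model of the fixed-budget behaviour described above. All wattages
# are the made-up reference values from this post, not real PS5 figures.

GPU_BUDGET_W = 250.0  # GPU's fixed share of the 300W SoC budget
CPU_BUDGET_W = 50.0   # CPU's fixed share

def smartshift_divert(cpu_draw_w: float, gpu_demand_w: float) -> float:
    """Return the power the GPU may actually draw this frame.

    As described above, SmartShift on PS5 works one way only: leftover
    CPU power tops up a GPU that is exceeding its own budget. It never
    takes power away from a busy CPU, so it never causes a downclock.
    """
    leftover_cpu_w = max(0.0, CPU_BUDGET_W - cpu_draw_w)
    return min(gpu_demand_w, GPU_BUDGET_W + leftover_cpu_w)

# CPU only drawing 30W -> 20W of headroom diverted, GPU can draw 270W:
print(smartshift_divert(cpu_draw_w=30.0, gpu_demand_w=270.0))  # 270.0
# CPU fully loaded -> no headroom; GPU capped at its own 250W budget,
# and the separate variable-frequency system would downclock it instead:
print(smartshift_divert(cpu_draw_w=50.0, gpu_demand_w=270.0))  # 250.0
```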
 
It really is though. And as a consumer, one should really be mindful of handing over even more power over a market to such a powerful corporation.
But apparently, the desire to "side with the winner" outweighs the ability to think long-term for far too many people.

Don't worry, recessions love technology; it's everything else that takes a kick in the teeth. Sony isn't a car manufacturer or a hotel chain, etc.

The concern trolling is cute though. Xbox crowd has been talking about money in the bank since the day the OG Xbox launched. It's sad really.

Hell, since when did it become cool to root for the rich guy?
 
Ok so a question about smart shift itself as I'm trying to make sense of it in my head.

I understand that both CPU and GPU won't always need full power.

So say the CPU is not using full power and that power is diverted to the GPU.

What is that power being used for if not to maintain a higher frequency in the GPU?

The additional power could let the GPU handle more demanding ray tracing while still maintaining max clocks, for example
 
I thought Mark Cerny described it as "SmartShift" in his talk? What did he call it?
He only mentions it in passing, saying that it helps get extra power, while (as far as I understand) variable clocks in the PS5 are about regulating performance based on power consumption instead of temperature, which should be better because consumption is totally predictable by devs (he calls it "deterministic") while temperature is not; every room has a different temperature.
SmartShift is a solution developed for laptops, where using bigger GPUs has diminishing returns because the temperature gets too high and a big GPU simply can't outperform a smaller one by as much as it should. While "for laptops" sounds bad, it's not as if consoles are much better; they have heating issues too. SmartShift does not directly decrease temperature, but it allows a modest-size GPU to perform better, so putting a big one in a laptop is not required.
I still need to study the details, so I'm really not the one who can teach you anything, but this is the gist of it, more or less.
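A rough way to picture the difference (purely my own illustration; the thresholds and power model here are invented, not Sony's actual algorithm):

```python
# Contrast between temperature-driven boost (typical PC/laptop) and the
# consumption-driven approach described above. All numbers are invented.

def thermal_governor(temp_c: float, max_ghz: float = 2.23) -> float:
    """Clocks depend on die temperature, so two machines running the
    exact same game in different rooms can end up at different speeds."""
    return max_ghz if temp_c < 85.0 else max_ghz * 0.9

def power_model_governor(modeled_power_w: float, budget_w: float,
                         max_ghz: float = 2.23) -> float:
    """Clocks depend only on the power the current workload is modeled
    to need -- identical on every unit, hence 'deterministic'."""
    if modeled_power_w <= budget_w:
        return max_ghz
    # Power rises steeply with frequency (roughly cubed once voltage
    # scaling is included), so a small downclock sheds a lot of power.
    return max_ghz * (budget_w / modeled_power_w) ** (1.0 / 3.0)

print(thermal_governor(90.0))              # varies with the room
print(power_model_governor(280.0, 250.0))  # the same on every unit
```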
 
The additional power could let the GPU handle more demanding ray tracing while still maintaining max clocks, for example
I'd have to see some info on that as Sony seemingly isn't even making that claim.

But my question was about "smart shift" outside of what is happening with the PS5. I guess that's a bit off topic for this thread though.
 
I'll repost this, as there appears to be some confusion regarding the underlying technology behind PS5 variable frequency & AMD's generic SmartShift.
PS5
  • The SoC has a fixed power budget (300W); the GPU & CPU each have a power budget within the SoC. Say the reference values are GPU 250W & CPU 50W
  • SmartShift only diverts power to the GPU if there are leftover CPU resources while at 3.5GHz and the GPU is exceeding its power budget. E.g. the CPU is only using 30W, so 20W is diverted to the GPU -> 270W
  • There's a key difference in how SmartShift operates on PS5: it's limited to the scenario in which there are leftover CPU resources, so there won't be any downclocks as a result of using SmartShift. Both CPU & GPU have fixed power budgets; SmartShift can only divert leftover power from the CPU
  • Developers can choose to prioritize the GPU over the CPU by increasing the GPU power budget (250W -> 270W) and decreasing the CPU budget (50W -> 30W). The CPU will run slower on average (3.2GHz) but can go up to 3.5GHz for less power-intensive tasks. The GPU would be able to run at 10.28TF with the rare few-percent frequency drop depending on load (10.28TF <-> 10TF). This is the likely scenario developers described to DF, where they throttled the CPU to ensure a sustained 2.23GHz clock on the GPU. Note that this is not SmartShift
AMD APU SmartShift
  • Power and frequency balancing between CPU & GPU happens all the time, bidirectionally, depending on the type of load
  • Can get throttled by thermals

The wattage numbers are made up just for the explanation
Yeah, this is basically a better version of my explanation. Here you go.
 
Predictable is what it is, and that's something that helps a developer a lot more than sporadic peaks above the mean.
But devs have control over the power of the CPU and GPU lol.
They are the ones that dictate the workloads.

Like I said, the new Xbox could benefit a lot from that.
 
Ok so a question about smart shift itself as I'm trying to make sense of it in my head.

I understand that both CPU and GPU won't always need full power.

So say the CPU is not using full power and that power is diverted to the GPU.

What is that power being used for if not to maintain a higher frequency in the GPU?
It is to maintain max frequencies, but any PC system would perform better with it, not only the PS5, because variable frequencies aren't a PS5 invention (again, it's a laptop problem). For the PS5 it gets mentioned because it's not the same way of varying clocks.
 
I'd have to see some info on that as Sony seemingly isn't even making that claim.

But my question was about "smart shift" outside of what is happening with the PS5. I guess that's a bit off topic for this thread though.

It's about workload

If SmartShift wasn't a feature, and devs wanted to use more demanding ray tracing, the GPU would need to work harder, and for the GPU to work harder it needs more power. If it can't get that power from the CPU's unused power via SmartShift, then it needs to reduce its own clock speed
 
What?

I feel like I'm not speaking the same language as some of you.

You start your post off by saying "it's not about clocks" then finish your post with "the system can lower the clocks..."

What am I missing? All I said was that the clocks would not always run at max frequencies.
The two systems are independent.
- SmartShift does one thing: it takes power (electricity) from the CPU and gives it to the GPU if the CPU doesn't use all of it.
- The variable clock system does its own thing: it monitors CPU and GPU load and reduces one or both clocks if necessary (and depending on dev priority).
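A quick sketch of that separation (my own toy code, reusing the made-up 250W/50W budgets from the explanation post earlier):

```python
# The two independent mechanisms above, expressed as one per-frame check.

def frame_power_check(cpu_need_w: float, gpu_need_w: float,
                      cpu_budget_w: float = 50.0,
                      gpu_budget_w: float = 250.0) -> list:
    spare_w = max(0.0, cpu_budget_w - cpu_need_w)  # system 1: SmartShift
    actions = []                                   # system 2: clock monitor
    if cpu_need_w > cpu_budget_w:
        actions.append("downclock CPU")
    if gpu_need_w > gpu_budget_w + spare_w:  # SmartShift couldn't cover it
        actions.append("downclock GPU")
    return actions or ["both at max clocks"]

print(frame_power_check(30.0, 265.0))  # ['both at max clocks']
print(frame_power_check(50.0, 265.0))  # ['downclock GPU']
```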
 
Example

The CPU may only need 50% of its power budget when drawing a scene with less action on screen, or fewer individual actors. Conversely, it may need its full budget when you're in a busy city scenario with hundreds or thousands of NPCs being simulated.

Previously, consoles had a max power budget and you could utilize the CPU and GPU up to that budget. Usually you wouldn't use all of it, and this is why some games draw more power than others. The PS5 has a fixed power budget, meaning you will always be drawing the same power and allocating it as needed to your CPU/GPU.

Some PS3 games drew more than 200W, others less. With the PS5, they all draw the same wattage.

Please correct me if I'm mistaken.
 
Well, interesting highlights by MBG, a former Xbox fan, about this Halo co-creator describing current-gen consoles as stone age, yet people keep talking about scalability:

 
Alright, so the PS5 doesn't use SmartShift, because that's a different type of technology. Then why did BGs post the website?
 
It will impact performance if you reduce the CPU's power to the point where it can't run at max speeds for its 30% load

Something isn't making sense. No one said "to the point", but if there is 70% idle, it's a waste of energy and power to have it sitting there while not under full load; that is my understanding of the tech.

I'll repost this, as there appears to be some confusion regarding the underlying technology behind PS5 variable frequency & AMD's generic SmartShift.
PS5
  • The SoC has a fixed power budget (300W); the GPU & CPU each have a power budget within the SoC. Say the reference values are GPU 250W & CPU 50W
  • SmartShift only diverts power to the GPU if there are leftover CPU resources while at 3.5GHz and the GPU is exceeding its power budget. E.g. the CPU is only using 30W, so 20W is diverted to the GPU -> 270W
  • There's a key difference in how SmartShift operates on PS5: it's limited to the scenario in which there are leftover CPU resources, so there won't be any downclocks as a result of using SmartShift. Both CPU & GPU have fixed power budgets; SmartShift can only divert leftover power from the CPU
  • Developers can choose to prioritize the GPU over the CPU by increasing the GPU power budget (250W -> 270W) and decreasing the CPU budget (50W -> 30W). The CPU will run slower on average (3.2GHz) but can go up to 3.5GHz for less power-intensive tasks. The GPU would be able to run at 10.28TF with the rare few-percent frequency drop depending on load (10.28TF <-> 10TF). This is the likely scenario developers described to DF, where they throttled the CPU to ensure a sustained 2.23GHz clock on the GPU. Note that this is not SmartShift
AMD APU SmartShift
  • Power and frequency balancing between CPU & GPU happens all the time, bidirectionally, depending on the type of load
  • Can get throttled by thermals

The wattage numbers are made up just for the explanation

Basically how I have been getting up to speed with it.
 
Both GPU and CPU can run at max speed, yes; Cerny's words.
That doesn't mean they always run at max. SmartShift simply helps prevent downclocks by optimizing power usage.

Thank you. That is what I understood, especially when he talked about fan speeds kicking up on menu screens, and the noise factor tied in with the cooling system, as examples.
 
But it seems SmartShift goes hand in hand with variable clocks, otherwise it wouldn't exist.

The difference with the PS5 is that its power draw is set, while with a PC it would vary.
Yes, I think so.
But I also think SmartShift can be used in systems with no clock problems, because you can sustain higher uncapped clocks with the same power usage; that's the point. Using wasted CPU power is always good.
In the case of laptops or the PS5 it's even more fitting, of course.
 
Something isn't making sense. No one said "to the point", but if there is 70% idle, it's a waste of energy and power to have it sitting there while not under full load; that is my understanding of the tech.
Basically how I have been getting up to speed with it.

Aren't we talking about clock speed here? If it's not under full load then yes, the power is going to be diverted.

Regardless of how much load it's under, though, if it has the power budget it will run at max clocks, because it's always beneficial for the processes to be doing so.
 
Quote or link, please. They said they had RDNA2 and HW-accelerated RT. We still have to see confirmation that it's AMD's standard RT implementation, or the same one that's in the XSX.

AFAIK this is what Sony have officially said about RT (I typed the subtitles myself as this seems to be a recurring topic that doesn't advance because people can't easily quote a video)

Mark Cerny
"Another major new feature of our custom RDNA2 based GPU is ray tracing using the same strategy as AMD's upcoming PC GPUs."
"The CUs contain a new specialized unit called the intersection engine, which can calculate the intersections of rays with boxes and triangles."
"To use the intersection engine, first you build what is called an acceleration structure."
"It is data in RAM that contains all of your geometry."
"There is a specific set of formats you can use, their variations on the same BVH concept."
" Then in your shader program you use a new instruction, that asks the intersection engine to check array against the BVH."
"While the intersection engine is processing the requested ray-triangle ray-box intersections the shaders are free to do other work."
Having said that, the ray tracing instruction is pretty memory intensive, so it's a good mix with logic heavy code."
"There is of course not need to use ray tracing. PS4 graphics engines will run just fine on PlayStation 5."
"But it presents an opportunity for those interested."
"I'm thinking it'll take less than a million rays a second to have a big impact on audio. That should be enough for audio occlusion and some reverb calculations."
"With a bit more of the GPU invested in ray tracing it should be possible to do some very nice global illumination."
"Having said that, adding ray traced shadows and reflections to a traditional graphics engine could easily take hundreds of millions of rays a second."
"And full ray tracing could take billions."
"how far can we go? I'm starting to get quite bullish."
"I've already seen a PlayStation 5 title that's successfully using ray tracing based reflections in complex animated scenes with only modest costs."
"Another set of issues for the GPU involved size and frequency."
"How big do we make the GPU? And what frequency do we run it at?"
"This is a balancing act. The chip has a cost and there's a cost for whatever we use to supply that chip with power, and to cool it."
"In general, I like running the GPU at a higher frequency."
"Let me show you why."
"Here's two possible configurations for a GPU roughly of the level of the Playstation 4 Pro".
36 CU @ 1GHz VS 48 CU @ 0.75GHz
"This is a thought experiment. Don't take these configurations too seriously."
"If you just calculate Teraflops you get the same number, but actually the performance is noticeably different because teraflops is defined as the computational capability of the vector ALU."
"That's just one part of the GPU, there are a lot of other units and those other units all run faster when the GPU frequency is higher. At 33% higher frequency rasterization goes 33% faster."
"Processing the command buffer goes that much faster."
"The L2 and other caches have that much higher bandwidth, and so on."
"About the only downside is that system memory is 33% further away - in terms more cycles."
"But the large number of benefits more than counterbalanced that..."

 
Does anyone have ANY indication at all of when we will see more from Sony? I'm fuckin' thirsty for news. Why can't they just say, "hey, on May 24th we're dropping new info"?
 
I'll repost this, as there appears to be some confusion regarding the underlying technology behind PS5 variable frequency & AMD's generic SmartShift.
PS5
  • The SoC has a fixed power budget (300W); the GPU & CPU each have a power budget within the SoC. Say the reference values are GPU 250W & CPU 50W
  • SmartShift only diverts power to the GPU if there are leftover CPU resources while at 3.5GHz and the GPU is exceeding its power budget. E.g. the CPU is only using 30W, so 20W is diverted to the GPU -> 270W
  • There's a key difference in how SmartShift operates on PS5: it's limited to the scenario in which there are leftover CPU resources, so there won't be any downclocks as a result of using SmartShift. Both CPU & GPU have fixed power budgets; SmartShift can only divert leftover power from the CPU
  • Developers can choose to prioritize the GPU over the CPU by increasing the GPU power budget (250W -> 270W) and decreasing the CPU budget (50W -> 30W). The CPU will run slower on average (3.2GHz) but can go up to 3.5GHz for less power-intensive tasks. The GPU would be able to run at 10.28TF with the rare few-percent frequency drop depending on load (10.28TF <-> 10TF). This is the likely scenario developers described to DF, where they throttled the CPU to ensure a sustained 2.23GHz clock on the GPU. Note that this is not SmartShift
AMD APU SmartShift
  • Power and frequency balancing between CPU & GPU happens all the time, bidirectionally, depending on the type of load
  • Can get throttled by thermals

The wattage numbers are made up just for the explanation
Good explanation, thank you. But we also heard that the latest PS5 dev kit iteration will have automated SmartShift and the devs will no longer need to use performance profiles.
So everything doesn't quite add up.
 
My neighbor has been emotionally and physically unwell recently. Subsequently, she's been too weak and depressed to take out her trash and has thus been leaving it in the hallway. So, I've been taking it outside and to the dumpster whenever I see it. To express her appreciation, she gave me $100.

I used it to buy three games from the PlayStation store: Days Gone, Death Stranding, and God of War. I've sampled Days Gone on my PS4 Pro and am quite impressed with its graphics; the character models are decent, but the environmental textures and general level of detail are incredible, which is intensified by what feels like a smooth frame rate (though I don't know how it'll hold up when I encounter hordes of freakers). How do Sony's first-party studios do it?!

It's ironic that the PS4 Pro is weaker than the Xbox One X (which I also have), but renders Days Gone with visual quality that rivals Gears of War 4 running on Xbox One X (I haven't played Gears of War 5). The difference of 1.87 teraflops between the PS5 and the XSX surely won't be noticeable considering that the difference of 1.8 teraflops between the PS4 Pro and the X1X isn't noticeable; 1.8 teraflops comprise a much greater percentage of the PS4 Pro's and X1X's graphics power than the percentage that 1.87 teraflops comprise of the PS5's and XSX's graphics power.
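Running the numbers with the published figures (PS4 Pro 4.2 TF, Xbox One X 6.0 TF, PS5 10.28 TF, Series X 12.15 TF) bears this out:

```python
# The same ~1.8-1.9 TF gap expressed as a relative advantage.

def gap_percent(weaker_tf: float, stronger_tf: float) -> float:
    return (stronger_tf - weaker_tf) / weaker_tf * 100.0

print(f"X1X over PS4 Pro: {gap_percent(4.2, 6.0):.0f}%")      # ~43%
print(f"XSX over PS5:     {gap_percent(10.28, 12.15):.0f}%")  # ~18%
```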

This has me thinking about how much better games will be able to look on the next-generation consoles in regard to texture resolution and general level of detail. I mean, how much room is there for improvement?

I know that the PS5 and the XSX will be able to implement ray tracing, but apart from that, how much of a graphical leap can we expect?
 
Speaking of prices, a reminder:







The same Ahmad said in the same thread that the BOM and manufacturing costs for a Series X console could come in at $460 to $520.


Scarce components have pushed the manufacturing costs for Sony Corp.'s next PlayStation to around $450 per unit,

 
Isn't that what variable frequency is for?
I'll describe two scenarios to help you understand:
  1. GPU & CPU running at full load, maxing their respective power budgets; downclocks will prevent the GPU from exceeding its power budget
  2. GPU using all of its power budget and the CPU using only 80%; in the event a GPU load needs extra power, SmartShift can divert the unused CPU power to the GPU to prevent downclocking
But we also heard that the latest PS5 dev kit iteration will have automated SmartShift and the devs will no longer need to use performance profiles.
I assume you mean the DF interview? It's automated boost, not automated SmartShift.
 