Rusco Da Vino
Member
For PS5, the GPU's 200MHz boost over 2GHz is shifted from the CPU's power budget.
Do you have an official source for that, or is it just a new way to keep pushing the 9.2TF thing?
For PS5, the GPU's 200MHz boost over 2GHz is shifted from the CPU's power budget.
Jaguar was the best solution for the money around 2013. The alternatives were ARM Cortex-A15, IBM PowerPC A2 (a PPE replacement), and Intel Atom Silvermont.
Microsoft knows that this might be the case after 2-3 years but not at the start of the gen. The new Spider-Man, how different do you think the AI will be? How much will the actual gameplay change compared to running on a Jaguar CPU? It's mostly a visual upgrade, new story, and reusing a lot of assets of the original game.
I'm not sure where you're getting that information, but the world doesn't agree with that. Several devs have already called that BS out. Can't wait for the July event myself to shed more light on that matter and clear the fog.
AMD's SmartShift is not about cooling. A 1000-watt TDP cooling solution wouldn't solve the VRM power delivery budget.
When your next gen game is forever associated with last gen hardware.....
Who gives a fuck about Jaguar in 2020? Read again what I'm saying, we need to throw those Jaguars behind us as far as possible, maybe ask Elon Musk to throw them into outer space.
Those are the main highlights: lossless, as @PaintTinJr and others have explained previously, compared to the lossy BCPack reaching higher numbers with quality compromises.
Oodle Texture RDO can produce very high quality encodings that are nearly visually indistinguishable from non-RDO encodings, but compress much more, simply by being a smart encoding which takes into consideration the rate of the choices.
Oodle Texture RDO can encode to the same quality as the non-RDO encoders at low lambda, and gradually decreases rate as lambda goes up.
Oodle Texture RDO can make BC7 encodings that are much more compressible. For example :
non-RDO BC7 :
Kraken : 1,048,724 -> 990,347 = 7.555 bpb = 1.059 to 1
RDO lambda=40 BC7 :
Kraken : 1,048,724 -> 509,639 = 3.888 bpb = 2.058 to 1
Modern games are using more and more BC7 textures because they provide much higher quality than BC1 (which suffers from chunky artifacts even at max quality). This means lots of game packages don't benefit as much from compression as we'd like. Oodle Texture RDO on BC7 fixes this.
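To make those bpb and ratio figures concrete, here's a quick sanity check of the arithmetic (my own sketch, not part of the quoted post): bpb is compressed bits per original byte, and the ratio is original size over compressed size.

```c
#include <stdio.h>

/* bpb = compressed bits per original byte; ratio = original / compressed. */
static void report(const char *label, double original, double compressed) {
    printf("%s: %.3f bpb = %.3f to 1\n",
           label, compressed * 8.0 / original, original / compressed);
}

int main(void) {
    report("non-RDO BC7 + Kraken",       1048724.0, 990347.0); /* 7.555 bpb, 1.059:1 */
    report("RDO lambda=40 BC7 + Kraken", 1048724.0, 509639.0); /* 3.888 bpb, 2.058:1 */
    return 0;
}
```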
I think it will be much improved: graphics, loading times (or lack thereof), more crowded environments, better animations, etc... It was already such a great game that more of the exact same would be great.
Microsoft knows that this might be the case after 2-3 years but not at the start of the gen. The new Spider-Man, how different do you think the AI will be? How much will the actual gameplay change compared to running on a Jaguar CPU? It's mostly a visual upgrade, new story, and reusing a lot of assets of the original game.
I agree that the CPU can bring amazing changes that won't be possible on current gen. Just not at the start, and Microsoft knows this.
Microsoft knows that this might be the case after 2-3 years but not at the start of the gen. The new Spider-Man, how different do you think the AI will be? How much will the actual gameplay change compared to running on a Jaguar CPU? It's mostly a visual upgrade, new story, and reusing a lot of assets of the original game.
I agree that the CPU can bring amazing changes that won't be possible on current gen. Just not at the start, and Microsoft knows this.
Yes, the Xbox One was known for introducing new mechanics this gen.
Mark Cerny revealed fixed frequency clock speeds for PS5 before enabling the SmartShift feature.
Do you have an official source for that, or is it just a new way to keep pushing the 9.2TF thing?
PS5 programmer has to budget their CPU usage for max GPU clock speed.
Yes, because when you make a videogame you always try to reach 100% of GPU and CPU all the time; who cares if something happens during gameplay, the only important thing is to reach that load on both chips.
PS5 programmer has to budget their CPU usage for max GPU clock speed.
Cross buy worked on PS3 to PS4... both fully networked consoles... wasn't just a Vita thing.
And BC has nothing to do with this conversation.
I was just using it as an example of how the "mechanism" could differ for the PS4 to PS5 cross buy.
WTF is your problem?
Every programmer has to deal with a performance budget, if you're a decent one and not just a student/amateur.
PS5 programmer has to budget their CPU usage for max GPU clock speed.
Clock speed is not a true indication of ALU usage, e.g. heavy scalar integer workloads can yield high clock speeds while 256-bit AVX vector workloads yield lower clock speeds.
And where is that BS coming from? Mark Cerny said that both will be at max most of the time. Got any official sources declaring otherwise?
EDIT:
"If the CPU doesn't use its power budget - for example, if it is capped at 3.5GHz - then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."
PlayStation 5 uncovered: the Mark Cerny tech deep dive (www.eurogamer.net)
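As a rough illustration of what that quote describes, here is a toy model of a shared power budget with hard frequency caps. This is purely my own sketch under assumed numbers (the 200W total and 60W CPU share are made up), not Sony's or AMD's implementation.

```c
#include <stdio.h>

#define TOTAL_POWER_W 200.0   /* assumed shared SoC budget (made up) */
#define CPU_SHARE_W    60.0   /* assumed nominal CPU share (made up) */
#define CPU_MAX_GHZ     3.5
#define GPU_MAX_GHZ     2.23

int main(void) {
    /* Suppose the CPU currently draws less than its nominal share. */
    double cpu_draw_w   = 40.0;
    double unused_cpu_w = CPU_SHARE_W - cpu_draw_w;

    /* SmartShift-style reallocation: the GPU can use the leftovers... */
    double gpu_power_w = (TOTAL_POWER_W - CPU_SHARE_W) + unused_cpu_w;

    /* ...but extra power never raises clocks past their caps; it only
       helps sustain them, as the quote says. */
    printf("GPU power available: %.0f W, GPU clock: %.2f GHz (capped)\n",
           gpu_power_w, GPU_MAX_GHZ);
    printf("CPU clock: up to %.1f GHz within its remaining draw\n",
           CPU_MAX_GHZ);
    return 0;
}
```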
You didn't answer my question... What's new about Spider-Man that has such an influence on the gameplay that it must use the new CPU? Jim Ryan is a marketing guy; he was purely saying this to differentiate them from Microsoft. And he doesn't even mention the CPU, why do you think that is?
Glad I'm not listening to Microsoft then.
"We do believe in generations, and whether it's the DualSense controller, whether it's the 3D audio, whether it's the multiple ways that the SSD can be used... we are thinking that it is time to give the PlayStation community something new, something different, that can really only be enjoyed on PS5." - Jim Ryan, CEO of PlayStation.
You didn't answer my question... What's new about Spider-Man that has such an influence on the gameplay that it must use the new CPU? Jim Ryan is a marketing guy; he was purely saying this to differentiate them from Microsoft. And he doesn't even mention the CPU, why do you think that is?
AMD's Smartshift tech is about sharing electrical energy between the CPU and GPU.
To reduce Smartshift's influence, the OEM designer has to budget for max power with CPU and GPU separately.
For PS5, the GPU's 200MHz boost over 2GHz is shifted from the CPU's power budget.
Mark Cerny has warned against AVX usage.
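On the scalar-versus-AVX point raised earlier: below is a hedged microbenchmark sketch of the two workload types being contrasted. It only generates the load; whether and how much a CPU downclocks under sustained 256-bit AVX depends on the part, and observing it requires an external frequency monitor. Compile with -mavx2 on a CPU that supports AVX2.

```c
#include <immintrin.h>
#include <stdio.h>
#include <time.h>

#define ITERS 200000000LL

int main(void) {
    /* Scalar integer workload: a dependent add chain keeps the ALU busy. */
    clock_t t0 = clock();
    volatile long long acc = 0;
    for (long long i = 0; i < ITERS; i++) acc += i;
    double scalar_s = (double)(clock() - t0) / CLOCKS_PER_SEC;

    /* 256-bit AVX workload: eight packed float adds per iteration. */
    t0 = clock();
    __m256 ones = _mm256_set1_ps(1.0f);
    __m256 sum  = _mm256_setzero_ps();
    for (long long i = 0; i < ITERS; i++) sum = _mm256_add_ps(sum, ones);
    double avx_s = (double)(clock() - t0) / CLOCKS_PER_SEC;

    float out[8];
    _mm256_storeu_ps(out, sum);  /* keep the result live */
    printf("scalar: %.2f s, avx: %.2f s (lane check: %.0f)\n",
           scalar_s, avx_s, out[0]);
    return 0;
}
```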
Then let's wait before saying Microsoft made the wrong choice?
Let's play the game first?
Then let's wait before saying Microsoft made the wrong choice?
R&C is all about the SSD, not the CPU
No need, the madness in Ratchet and Clank can't be done elsewhere. And picking a game that has barely shown any gameplay as the measure?
R&C is all about the SSD, not the CPU
From https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
This is still inaccurate. You've been inaccurate on this for quite a few weeks now...
Yes Smartshift is about the efficient allocation of power. Your first sentence is a win!
No, the developer doesn't deal with separate cpu/gpu power budgets - that's exactly the opposite of what smartshift is doing. The power budget is unified.
Going further the developer doesn't directly control the power allocation at all - that is something the AMD tech does.
And smartshift has no direct influence on clock speeds.
Source?
Cerny said it's capable of running at 3.5GHz and 2.23GHz, or close to those frequencies, "most of the time" within the set power budget. So if the CPU and GPU are running at 3.5GHz and 2.23GHz and something GPU-intensive is happening on screen at that moment, the CPU is barely doing anything and is slightly downclocked. Part of the CPU's power budget is then shifted to the GPU, which is ALREADY running at 2.23GHz; the extra power doesn't increase the frequency further, since it's capped at 2.23GHz. What the power allocation from the CPU to the GPU does is help the GPU maintain that 2.23GHz clock for longer, thanks to the extra power headroom. What Mark Cerny meant by the absolute worst-case scenario is workloads that would cause the GPU and CPU, already running at peak clock speeds, to exceed the set power budget provided to them. THAT's when both the GPU and CPU are slightly downclocked in order to make sure the set power budget isn't exceeded.
To reduce Smartshift's influence, the OEM designer has to budget for max power with CPU and GPU separately.
For PS5, the GPU's 200MHz boost over 2GHz is shifted from the CPU's power budget.
Mark Cerny has warned against AVX usage.
Where's the high NPC count?
Here is what smartshift does so far:
We've missed your FUD around here.
Could you stop talking out of your behind?
PS5 programmer has to budget their CPU usage for max GPU clock speed.
R&C is all about the SSD, not the CPU
Again, clock speed is NOT a true indication of the ALU usage level.
Again, clock speed is NOT a true indication for the ALU usage level.
Could you stop talking out of your behind?
Here, a direct quote from Cerny:
"The CPU and GPU each have a power budget, of course, the GPU power budget is the larger of the two. If the CPU doesn't use its power budget – for example, if it is capped at 3.5GHz – then the unused portion of the budget goes to the GPU. That's what AMD calls SmartShift. There's enough power that both CPU and GPU can potentially run at their limits of 3.5GHz and 2.23GHz, it isn't the case that the developer has to choose to run one of them slower."
Source: https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
Now what do you have to say?
AMD's Smartshift tech is about sharing electrical energy between the CPU and GPU.
To reduce Smartshift's influence, the OEM designer has to budget for max power with CPU and GPU separately.
For PS5, the GPU's 200MHz boost over 2GHz is shifted from the CPU's power budget.
Mark Cerny has warned against AVX usage.
Those are the main highlights: lossless, as @PaintTinJr and others have explained previously, compared to the lossy BCPack reaching higher numbers with quality compromises.
Again, clock speed is NOT a true indication for the ALU usage level.
from https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core
In the same paragraph:
From https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core
It's because the devkits' frequencies are LOCKED. And what did I say in my previous post?
There's likely more to discover about how boost will influence game design. Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. It makes perfect sense as most game engines right now are architected with the low performance Jaguar in mind - even a doubling of throughput (ie 60fps vs 30fps) would hardly tax PS5's Zen 2 cores. However, this doesn't sound like a boost solution, but rather performance profiles similar to what we've seen on Nintendo Switch. "Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.
AMD's SmartShift technology is responsible for allocating power based on the set power budget to certain elements of the APU, not for managing clock speeds of the CPU and GPU. You do realize that the only reason the Series X's CPU can hit 3.6GHz with SMT enabled and 3.8GHz with SMT disabled is because of the fact that they have a low GPU frequency (compared to PS5's) right? Running the CPU at frequencies higher than 3GHz and the GPU at 2GHz while keeping both frequencies locked at the same time with varying power supply levels? That's a thermal nightmare and the console will catch on fire by that point.
Nope. AMD's SmartShift is more than TDP management. SmartShift manages the shared power feed into the CPU and GPU.
So, a higher TDP would enable the CPU and GPU to hit 3.5GHz and 2.23GHz without each component having to borrow power from the other? I guess the current setup is the best that Sony could achieve without causing the system to become too hot for the cooling system they devised; otherwise, the console would be even larger than it is now.
Now I wonder which console will run quieter.
I hope you're not serious (timestamped)
Bo, I like you, but you should start to listen.
Before Oodle Texture, the compression was lossless.
Now, with the usage of Oodle Texture, the compression is not lossless.
You can expect both Oodle Texture and BCPack to be about the same in quality. However, currently we can't say which one compresses better (in terms of size reduction) overall.
We can, however, expect somewhere between 20-60% texture size reduction from original to final in both cases (depending on the texture).
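For readers wondering what the "lambda" in the RDO discussion above actually trades off, here is a toy sketch (my own, not Oodle's actual algorithm): among candidate block encodings, pick the one minimizing distortion + lambda * rate, so a higher lambda buys a smaller compressed size at some quality cost. The candidates and numbers are hypothetical.

```c
#include <stdio.h>

typedef struct { const char *name; double distortion; double rate_bits; } Candidate;

/* Pick the candidate with the lowest Lagrangian cost D + lambda * R. */
static const Candidate *pick(const Candidate *c, int n, double lambda) {
    const Candidate *best = &c[0];
    for (int i = 1; i < n; i++)
        if (c[i].distortion + lambda * c[i].rate_bits
            < best->distortion + lambda * best->rate_bits)
            best = &c[i];
    return best;
}

int main(void) {
    /* Hypothetical encodings for one 4x4 block. */
    Candidate c[] = {
        { "exact",  0.0, 64.0 },  /* perfect quality, incompressible */
        { "reused", 2.5, 12.0 },  /* reuses earlier bytes, cheap after LZ */
        { "flat",   9.0,  4.0 },  /* very cheap, visibly worse */
    };
    for (double lambda = 0.0; lambda <= 1.0; lambda += 0.5)
        printf("lambda=%.1f -> %s\n", lambda, pick(c, 3, lambda)->name);
    return 0;
}
```

At lambda=0 it picks the exact encoding; as lambda rises it shifts to the cheaper, slightly lossier ones, which is the same knob the quoted "RDO lambda=40" figures are turning.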
The rest of the paragraph you quoted specifically explains that these devs are not "budgeting" their CPU usage, but rather chose to optimize with a locked performance profile, because they don't need the extra CPU power:
Again, clock speed is NOT a true indication for the ALU usage level.
from https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core
PS4 and PS4 Pro consoles have had Sony's fixed-frequency regime since 2013.
You seem to not understand that fixed frequency is more prone to throttling than smartshift? Constant frequency at max, drawing more unnecessary heat? Do you want us to go down this route now?
Nope. AMD's SmartShift is more than TDP management. SmartShift manages the shared power feed into the CPU and GPU.
PS4 and PS4 Pro consoles have had Sony's fixed-frequency regime since 2013.
I own a mobile Ryzen APU and I can set CPU usage to 50% via Windows power management in exchange for a higher GPU clock speed. I have a GPU-bias mode profile for my Ryzen APU laptop. This behavior is replicable across multiple mobile Ryzen APUs with the same power design limit.
The rest of the paragraph you quoted specifically explains that these devs are not using smartshift, but rather a locked performance profile, because they don't need the extra CPU power:
"However, this doesn't sound like a boost solution, but rather performance profiles similar to what we've seen on Nintendo Switch. "Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny."
The best thing about this is that devs right now can take all their assets and run them through those compression tools long before their games release, even if that's November. We're all going to thank Oodle for our shorter downloads and (to me more important) less space taken on the precious 800-ish GB drive.
Just look at these compression rates:
127 MB block compressed GPU textures, mix of BC1-7
78 MB with zip/zlib/deflate
70 MB with Oodle Kraken
40 MB with Oodle Texture + Kraken
https://cbloomrants.blogspot.com/2020/06/oodle-texture-slashes-game-sizes.html
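Turning those sizes into savings relative to the 127 MB original (my arithmetic, not from the linked post):

```c
#include <stdio.h>

int main(void) {
    const double original = 127.0;  /* MB, from the list above */
    const char  *name[] = { "zip/zlib/deflate", "Oodle Kraken",
                            "Oodle Texture + Kraken" };
    const double size[] = { 78.0, 70.0, 40.0 };  /* MB, from the list above */

    for (int i = 0; i < 3; i++)  /* percent saved vs. the original */
        printf("%-24s %3.0f MB (%.0f%% smaller)\n",
               name[i], size[i], 100.0 * (1.0 - size[i] / original));
    return 0;
}
```

That works out to roughly 39% smaller with zlib, 45% with Kraken alone, and 69% with Oodle Texture + Kraken.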
When I registered, the account-approval waiting time was around a full month and you couldn't register with a free email (like Gmail). This made FUD spreaders much more careful. Now we're into the full festival of stupid.
I only heard about NeoGAF in January, lurked for a month, then couldn't resist the memes and exciting chat. Signed up 99.99% for this thread. Gonna feel empty when it gets closed.
I'm going to bite and get a month of Game Pass to try that game just because it's set in my neighbourhood. I always find it incredibly immersive to play in real locations.
What do you mean by this?
I disagree with your opinion of The Medium; to me it looks good for a first-wave game and will have some interesting mechanics.
Who gives a fuck about Jaguar in 2020? Read again what I'm saying, we need to throw those Jaguars behind us as far as possible, maybe ask Elon Musk to throw them into outer space.
Microsoft knows that this might be the case after 2-3 years but not at the start of the gen. The new Spider-Man, how different do you think the AI will be? How much will the actual gameplay change compared to running on a Jaguar CPU? It's mostly a visual upgrade, new story, and reusing a lot of assets of the original game.
I've just imagined if any of the platform holders made it mandatory to upgrade. Guess what Rockstar and EA would do? Release the same game with a new name, include 2 hours of content, and cash in. Greed always finds a way.
God, is it that hard to get? It's blowing my mind: 3rd-party pricing is set by 3rd-party publishers, that's it.
Your quote comes from Digital Foundry, pet pixel counters with no knowledge of game development. People who are sponsored by Microsoft. They're as unbiased and objective as you are.
From https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core
Does that mean you need to download the whole thing with all the duplicates on XSX?
Sorry for the mild rant, but I'm sick of all this worry and talk about current-gen and cross-gen games, as if we're excited about them! I, and many others, care about true next gen.
Yeah? I own eyes and a brain with reading comprehension.
I own a mobile Ryzen APU and I can set CPU usage to 50% via Windows power management in exchange for a higher GPU clock speed. I have a GPU-bias mode profile for my Ryzen APU laptop. This behavior is replicable across multiple mobile Ryzen APUs with the same power design limit.