So now you agree with me that developers need to be able to choose what to downclock (or idle, if you prefer): the CPU or the GPU. That's been my main point this entire time.
And while Leadbetter may be incorrect in his understanding, he says he was told this by multiple devs. Are they wrong too?
I think I'm done. Maybe I'm wrong but the evidence doesn't seem to say that.
It can be helpful for developers to use a fixed profile when evaluating the performance of different code, but it is not part of production code run on real PS5s.
It's a feature of the dev-kit.
It's explained in the Eurogamer interview that the video you link to comes from.
On release hardware it's handled by PS5's "model SoC".
You seem to be focusing on one point instead of the context in which it was said, and the other points from the same article that say both the CPU and GPU stay at or near their full clock speeds.
It seems a bit like cherry-picking to force a point, to be blunt about it.
The entire quote and context from the original article the video references:
Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. It makes perfect sense as most game engines right now are architected with the low performance Jaguar in mind - even a doubling of throughput (ie 60fps vs 30fps) would hardly tax PS5's Zen 2 cores. However, this doesn't sound like a boost solution, but rather performance profiles similar to what we've seen on Nintendo Switch. "Regarding locked profiles, we support those on our dev kits, it can be helpful not to have variable clocks when optimising. Released PS5 games always get boosted frequencies so that they can take advantage of the additional power," explains Cerny.
This isn't the first time this has been explained in this thread.
As to my earlier point of synthetic burn-tests exceeding power draw limits, the same interview also has something to say about that:
I wondered whether there were 'worst case scenario' frequencies that developers could work around - an equivalent to the base clocks PC components have. "Developers don't need to optimise in any way; if necessary, the frequency will adjust to whatever actions the CPU and GPU are performing," Mark Cerny counters. "I think you're asking what happens if there is a piece of code intentionally written so that every transistor (or the maximum number of transistors possible) in the CPU and GPU flip on every cycle. That's a pretty abstract question, games aren't anywhere near that amount of power consumption. In fact, if such a piece of code were to run on existing consoles, the power consumption would be well out of the intended operating range and it's even possible that the console would go into thermal shutdown. PS5 would handle such an unrealistic piece of code more gracefully."
Emphasis mine.
This isn't "bad hardware design". It's how fixed-clock systems work when they're designed to be performant with realistic code, not pointless loops.
Exceeding the power-draw limit simply isn't a problem unique to variable clocks, and by targeting maximum power draw and varying clocks a "few percent" to peg it there, you actually make it a lot easier to get close to the limit of the useful work the chip can do.
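As a rough illustration of that approach (my own sketch, not Sony's actual controller; the budget, thresholds and step sizes are all invented numbers), a deterministic, power-model-driven clock governor boils down to something like this:

```c
/* Hedged sketch of power-model-driven frequency scaling: a modelled
 * power figure drives small clock adjustments so the chip sits pegged
 * at its power budget. Illustrative only; every constant is invented. */
double next_clock_ghz(double current_ghz, double modelled_watts,
                      double budget_watts, double max_ghz) {
    double next = current_ghz;
    if (modelled_watts > budget_watts)
        next = current_ghz * 0.98;   /* shed a couple of percent */
    else if (modelled_watts < budget_watts * 0.95)
        next = current_ghz * 1.01;   /* claw clock speed back */
    if (next > max_ghz)
        next = max_ghz;              /* never exceed the rated clock */
    return next;
}
```

Because the input is a model of power rather than a temperature sensor reading, every console makes the same decision for the same workload, which is the point of the "model SoC" idea mentioned above.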
Power-hungry scenes aren't complex action sequences; they're low-triangle, high-frame-rate scenes.
Writing code that tries to consume as many watts as possible (a stress-test benchmark) is trivial.
Writing code that gets the absolute most out of a chip while doing useful work (calculating prime numbers, scientific computing, etc.) is a very complex subject and not at all trivial.
Writing very efficient game code (high utilisation), with lots of memory look-ups and adjustments, with staged pipelines that depend on other work being completed, while trying to parallelise it all, is extremely difficult, and it doesn't come close to the power consumption of the other two tasks, especially the first one, which is the "unrealistic" code Cerny means.
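To make the contrast between the first two scenarios concrete, here's a minimal sketch (my own illustration, not from any SDK or benchmark suite):

```c
#include <stdint.h>

/* Scenario one: a trivial "power virus" style loop. Pure arithmetic
 * with no memory traffic or branches, so execution units stay busy
 * nearly every cycle. Easy to write, maximal power draw. */
double burn(uint64_t iters) {
    double a = 1.000001, c = 0.0;
    uint64_t i;
    for (i = 0; i < iters; i++) {
        c += a * 0.999999;   /* tight multiply-add, nothing else */
        a *= 1.0000001;
    }
    return c;
}

/* Scenario two: useful work, here trial-division prime counting.
 * Data-dependent branches and early exits mean the chip does real
 * computation but draws far less sustained power. */
int count_primes(int limit) {
    int n, d, count = 0;
    for (n = 2; n <= limit; n++) {
        int prime = 1;
        for (d = 2; d * d <= n; d++)
            if (n % d == 0) { prime = 0; break; }
        count += prime;
    }
    return count;
}
```

The first function took thirty seconds to write and hammers the ALUs; the second does something useful but leaves execution units idle while it waits on branches and divides. Real game code sits even further below the power-virus ceiling.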
Fixed clocks on a games console in no way mean it can run all three scenarios without exceeding TDP. If it were clocked low enough to sustain the first scenario, gaming code would run far slower than it otherwise could, and the chip would run very cool.
This is the kind of guesswork on cooling and power supply that gets done when designing an APU with fixed clocks: you have to bake in some kind of margin.