Well, it ain't about hitting 5 WGP or not.
It's about trying to find out why Mark Cerny changed his philosophy on CU count.
Road to PS5
"In general, I like running the GPU at a higher frequency. Let me show you why.
Here are two possible configurations for a GPU roughly at the level of the PlayStation 4 Pro. This is a thought experiment; don't take these configurations too seriously.
If you just calculate teraflops, you get the same number, but actually the performance is noticeably different, because teraflops is defined as the computational capability of the vector ALU.
That's just one part of the GPU; there are a lot of other units, and those other units all run faster when the GPU frequency is higher. At 33% higher frequency, rasterization goes 33% faster. Processing the command buffer goes that much faster, the L2 and other caches have that much higher bandwidth, and so on.
About the only downside is that system memory is 33% further away in terms of cycles, but the large number of benefits more than counterbalance that.
As a friend of mine says, a rising tide lifts all boats.
Also, it's easier to fully use 36 CUs in parallel than it is to fully use 48 CUs. When triangles are small, it's much harder to fill all those CUs with useful work."
That last paragraph is the most important part in this case. Why would Cerny increase the CUs within the Shader Engines even though it's already hard to utilize 48 CUs in parallel?
AMD also never goes over 5 WGP per Shader Array. How Cerny figured it out is what I'm interested in learning.
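To make the teraflops point concrete, here's the arithmetic as a quick sketch. The 36 CU vs. 48 CU split is straight from the talk; the exact clocks are my assumption of what the slide showed (both configurations land at the same ~4.6 TF, roughly PS4 Pro level):

```python
# "Same teraflops, different performance" in numbers.
# TFLOPS only counts the vector ALUs: CUs x 64 lanes x 2 FLOPs/clock (FMA) x clock (GHz).
# The clocks below are my assumption of the slide's values, not confirmed figures.

def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000  # 64 ALU lanes per CU, 2 FLOPs per FMA

print(tflops(36, 1.00))  # 4.608 TF -- fewer CUs, higher clock
print(tflops(48, 0.75))  # 4.608 TF -- more CUs, lower clock
# Identical TFLOPS, but the rasterizers, command processor, and caches in the
# 36 CU config all run 33% faster -- that's Cerny's point.
```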
I don't think his philosophy changed. If anything, my read on his thinking is that he would sooner clock higher than add more CUs. That's not to say he has a disdain for more CUs, just that he has a preference for higher clocks. The 36 CU configuration they hit with the PS4 Pro was already clocked about as high as it could go with the PS5. And even if they are going with more CUs now, that doesn't mean everything said back then no longer stands; it's just that it's necessary now, whereas at the PS5's launch it wasn't.
Think about it: they would have to find a way to clock a 36 CU chip as high as around 4 GHz to get similar performance to a 60 CU chip clocked at ~2.3 GHz.
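Quick back-of-the-envelope check on that, using only the numbers from this post (raw TFLOPS, ignoring the utilization issue Cerny raised):

```python
# CU count x clock is proportional to TFLOPS (the per-CU ALU width cancels out).
# 2.3 GHz is the figure from the post above, not a confirmed Pro clock.
cu_ghz_big = 60 * 2.3           # 138 CU-GHz for the 60 CU chip
needed_clock = cu_ghz_big / 36  # clock a 36 CU chip would need to match it
print(round(needed_clock, 2))   # ~3.83 GHz, i.e. "around 4 GHz" as said above
```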
More evidence that he hasn't just resorted to throwing more CUs at the problem: they're adding dedicated AI units, and they didn't simply go for another doubling of the CU count up to 72, even though that would have been possible.
What kind of performance boost are we talking about with existing games like Helldivers 2, or maybe FFXVI?
My current theory (or rule of thumb) is that any game that runs at 60fps, or has a 60fps mode, on the base PS5 (at 900p-1080p) will have a 60fps mode on the Pro running at 1440p or higher, which will then be reconstructed to 4K using AI. That should either look better than what the base PS5 does in its 30fps mode, or look the same as the base PS5's 30fps mode while running at 60fps (thanks to the AI upscaling).
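To put rough numbers on that theory, here's the pixel math, a minimal sketch assuming the resolutions named above (nothing here is a confirmed spec):

```python
# Rendering at 1440p instead of 1080p is ~1.78x the native pixels, and
# reconstruction to 4K covers the remaining ~2.25x. Pure arithmetic.
def pixels(w: int, h: int) -> int:
    return w * h

base_60fps = pixels(1920, 1080)    # base PS5 60fps mode (upper bound in the theory)
pro_internal = pixels(2560, 1440)  # assumed Pro internal render target
output_4k = pixels(3840, 2160)     # reconstructed output

print(pro_internal / base_60fps)   # ~1.78x more pixels rendered natively
print(output_4k / pro_internal)    # ~2.25x filled in by AI reconstruction
```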