WARNING: Very technical discussion incoming. Skip to the TL;DR if you'd like; if you want to know the methodology behind the numbers there, read through the rest.
------------
Even with the next-gen systems arriving next month, some of us are already anticipating the mid-gen refreshes. Some of us might even already be anticipating the 10th-gen consoles (PlayStation 6, Xbox Series X-Next)...and I'll get to doing something like this for those, too. But for now, and primarily inspired by this thread over on B3D (I'll try cross-posting the PS6/Series X-Next parts there too, to keep in the general discussion over on that site; plus I'm interested in whether any posters there have thoughts to add, counter-points, etc. to whatever I end up posting), I started thinking about what the next-next gen systems could be in terms of specs, design philosophy, business strategy, features, etc. There's been a LOT of research and number-crunching involved, and I realized that in order to best guess at what things could be, we need a better understanding of things as they are right now, and of what the mid-gen refreshes could look like.
So where do we start? Well, we have the PS5 and Series X (and Series S) releasing in a little over a month from now, and while I have some very interesting ideas regarding mid-gen refreshes for the latter two as well, the realization hit that it'd actually be easier to start with PS5. Why? Well, it's true there's still a lot about the system we don't know yet, but we do technically have a roughly equivalent GPU on the market that can serve as a reference: the 5700 XT. Now before anyone starts, no, this isn't a good reference because of any silly "RDNA 1.5" gossip; I think we should all accept that we're dealing with an RDNA2 architecture when it comes to PS5's GPU.
Rather, the reason the 5700 XT is a good reference is that it's literally about the same size as the PS5's GPU: 40 CUs, with all 40 active. The PS5's GPU is also 40 CUs, but 4 are disabled for yields, leaving 36 active. We already know a lot of comparable data for the two chips: clock speeds, IPC, peak TF, etc. We can also infer certain PS5 GPU features from what the 5700 XT (and indeed the Series X) has, such as 64 ROPs and perhaps L0$ and L1$ amounts roughly equivalent to the 5700 XT's.
There are also a few things regarding the PS5 GPU we can roughly infer from guesstimates of other things. For example, we know the Series X APU is about 360 mm^2, and estimates pin the PS5's APU at about 320 mm^2. Estimates from the Series X die diagrams (via Hot Chips) peg its GPU at roughly 45% of the total APU size. We also know that the Series X has 16 more CUs than the PS5, or 44% more. Therefore, if the Series X's GPU is roughly 162 mm^2 (360 mm^2 * 0.45 = 162 mm^2), we can simply do 162 mm^2 * 0.44 = 71.28 mm^2, and 162 mm^2 - 71.28 mm^2 gives us 90.72 mm^2....although this isn't a 100% accurate number. The reason being that we don't necessarily know the size of a given CU on PS5 compared to Series X, and there are parts of the GPU which are not CUs and would not scale with a percentage reduction. For example, 44% fewer CUs doesn't mean you suddenly lose a Primitive Shader unit or display unit! Therefore it may be more accurate to say that while we scaled the CU count by the full 44%, the actual GPU is probably only about 35% smaller. That would give us 162 mm^2 * 0.35 = 56.7 mm^2, and 162 mm^2 - 56.7 mm^2 = 105.3 mm^2, which sounds like a more accurate number.
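Here's a quick sketch of that die-area arithmetic in Python; every input is one of the estimates above (the APU sizes, the 45% GPU share, the 44%/35% scaling factors), not an official figure.

```python
# Die-area guesstimate for the PS5 GPU, using the estimates above.
series_x_apu_mm2 = 360.0                 # estimated Series X APU size
gpu_share = 0.45                         # GPU portion of the APU (Hot Chips estimate)
series_x_gpu_mm2 = series_x_apu_mm2 * gpu_share        # ~162 mm^2

# Naive scaling by the full 44% CU difference:
naive_ps5_gpu_mm2 = series_x_gpu_mm2 * (1 - 0.44)      # ~90.7 mm^2

# Adjusted scaling, assuming only ~35% of the area scales away because
# non-CU blocks (front end, display, etc.) don't shrink with CU count:
adjusted_ps5_gpu_mm2 = series_x_gpu_mm2 * (1 - 0.35)   # ~105.3 mm^2

print(round(naive_ps5_gpu_mm2, 2), round(adjusted_ps5_gpu_mm2, 2))
```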
One of the biggest questions surrounding PS5's GPU, though, is exactly how much power it's actually consuming. This probably won't be a number that's ever readily provided, but we can try to calculate roughly what it is. One thing we should keep in mind immediately is that the PS5 has a 350 watt PSU. We can infer that Sony are at least aiming for a cooling solution as good as the Xbox One X's (most likely better), and if we take a look at that console, its max power consumption was about 170 watts on a 245 watt PSU. That gives a power consumption headroom of 75 watts. We could easily then say that if the PS5's cooling solution aims to be better than that console's, it should need less power consumption headroom, right? Welll.....
If the next-gen consoles were mainframes...
See, both PS5 and Series X are packing much higher densities into smaller areas thanks to massively reduced nodes. And the relationship between power (electricity) and how easily the resulting heat can be handled isn't a simple linear one: cramming the same watts into a much smaller die raises the thermal density, so even with the benefits of a 7nm process and the power consumption savings it brings, that doesn't guarantee your heat generation and heat dissipation requirements scale down with it! So even if the next-gen consoles had "only" a One X's level of cooling in place, it's very likely they would need a bit more than 75 watts of headroom on the PSU. This is important to keep in mind because I'm going to give an estimate for the PS5's system TDP later on.
Now, in order to try figuring out the PS5 GPU's TDP, I looked at the 5700 XT's TDP (courtesy of TechPowerUp), listed at 225 watts. Keep in mind this also includes the 8 GB of GDDR6 RAM, which we shouldn't include going forward. A common GDDR6 module consumes 1 to 1.5 watts. Taking the most lenient route, I did (8 * 1.5) to shave off 12 watts, leaving the GPU itself at 213 watts. Now, to figure a few other things out, I took another big liberty; seeing as how the PS5 only has 36 active CUs, I needed to estimate the wattage per CU on the 5700 XT. (213 watts / 40 CUs) gave me 5.325 watts per CU. This isn't a perfect metric, but it's a "good enough" one and, again, is pretty lenient here. That further shaved the TDP down to 191.7 watts (213 - (5.325 * 4)).
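As a sanity check, here's that board-power breakdown as a small sketch; the 225 watt board power, the lenient 1.5 watts per GDDR6 module, and the flat watts-per-CU split are all assumptions carried over from above.

```python
# Board-power breakdown for the 5700 XT, per the assumptions above.
board_power_w = 225.0                    # TechPowerUp's listed board power
gddr6_w = 8 * 1.5                        # 8 modules at a lenient 1.5 W each = 12 W
gpu_only_w = board_power_w - gddr6_w     # 213 W for the GPU itself

watts_per_cu = gpu_only_w / 40           # ~5.325 W per CU (crude flat split)
adjusted_36cu_w = gpu_only_w - watts_per_cu * 4   # 191.7 W at 36 active CUs
print(watts_per_cu, adjusted_36cu_w)
```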
From there it got a bit trickier; the PS5 is an RDNA2 chip, so it's going to have SOME power consumption reduction over the RDNA1 5700 XT. Going by some logical guesses, we can probably say that the PS5 (and most likely Series X and S) are 7nm DUV Enhanced chips. This means they likely won't enjoy the full power consumption reduction over 7nm that a 7nm EUV chip would, but a happy middle-ground is possible. 7nm EUV brings a 15% power consumption reduction over 7nm, so this would put 7nm DUV Enhanced at a 7.5% reduction. Decent enough. However, this reduction is ONLY clock-for-clock, so if the two chips run at different clocks, you'd have to adjust. And we know the PS5's GPU is clocked much HIGHER than a 5700 XT's, to the point where this 7.5% is likely negated. Thankfully, there is still a sizable power consumption reduction for RDNA2 over RDNA1...though maybe not as high as AMD's claimed 50% PPW increase.
If we look at RDNA1's gains over Vega, we saw a 14% performance increase with a 23% power consumption reduction clock-for-clock. RDNA1 also saw a 25% performance increase over GCN (which Vega belonged to, though as a later-gen GCN). IIRC, AMD have claimed a 50% PPW increase and a 50% IPC gain for RDNA2 over RDNA1; I think at least the 50% IPC claim is slightly exaggerated. The 50% PPW improvement might be realistic, but I imagine this is on 7nm EUV chips, and I don't think the PS5 and Series systems are on that exact process (additionally, we don't know what frequency range, i.e. in-sweetspot or out-of-sweetspot, that 50% PPW improvement is based on). That all said, I clearly cannot see AMD having gone backwards in improvement gains, so I figured I'd be friendly (if slightly reserved) and give RDNA2 a 30% power consumption reduction and a 25% IPC gain over RDNA1, on 7nm DUV Enhanced, clock-for-clock. These percentages will serve as the basis for both Sony's and MS's systems going forward.
The last (and most challenging) thing to figure out here is how the power-to-frequency curve for RDNA2 behaves beyond its sweetspot. Part of the challenge is that we don't actually know where RDNA2's sweetspot range sits. However, I think this is somewhat easy to estimate. We can apply the power consumption reduction of 7nm DUV Enhanced (7.5%) as a frequency uplift to RDNA1's known sweetspot low and high (1.7 GHz, 1.8 GHz), to get:
>1.7 GHz * 1.075 = 1.8275 GHz
>1.8 GHz * 1.075 = 1.935 GHz
These are what I suspect are the new sweetspot points for a 7nm RDNA2 GPU on the DUV Enhanced process. As we can see, the 5700 XT, if it were an RDNA2 GPU, would fall right near the upper end of this new sweetspot, while the PS5, as an RDNA2 GPU, actually exceeds it by some 295 MHz. The last part of this puzzle, then, is figuring out how the power-to-frequency ratio would curve within that 295 MHz range. We know the absolute peak ratio at the top of that range, given to us by Cerny himself, at 5:1. But that doesn't mean it remains 5:1 for the entire curve. In fact, it most likely ramps up steeply over the spread from 1.935 GHz to 2.23 GHz, and that spread is probably more "clustered" or sharper for a 7nm DUV Enhanced part than for a 7nm EUV part. Unfortunately, there's no equivalent graph showing a power-to-frequency spread for the 5700 XT, AFAIK. We can assume that at its peak, the 5700 XT probably had a somewhat harsher power-to-frequency ratio, perhaps 6:1 or even 7:1, and that was with a narrower clock range to push up to.
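A tiny sketch of that sweetspot shift; treating the 7.5% node figure as a frequency uplift at iso-power is my assumption here, not anything AMD has published.

```python
# Assumed sweetspot shift from RDNA1 to RDNA2 on 7nm DUV Enhanced.
rdna1_sweetspot_ghz = (1.7, 1.8)
duv_enhanced_uplift = 1.075              # the 7.5% figure, applied to frequency

rdna2_sweetspot_ghz = tuple(round(f * duv_enhanced_uplift, 4) for f in rdna1_sweetspot_ghz)
print(rdna2_sweetspot_ghz)               # (1.8275, 1.935)

ps5_clock_ghz = 2.23
overshoot_mhz = (ps5_clock_ghz - rdna2_sweetspot_ghz[1]) * 1000
print(round(overshoot_mhz), "MHz beyond the assumed sweetspot")   # ~295
```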
After a hard... some measure of time... of crunching numbers, eh? But we ain't done yet!!
With these other numbers settled on, we can start to figure what the PS5 GPU's power consumption is likely at. Going back to the 191.7 watt number (5700 XT's GPU @ 1.905 GHz with 36 CUs active), we can simply apply our (somewhat conservative, but likely probable) power consumption reduction for RDNA2 (on 7nm DUV Enhanced) over RDNA1, 30%, as a modifier. That gives us 191.7 watts * 0.30 = 57.51 watts, and 191.7 - 57.51 = 134.19 watts, which we can probably round down to 134 watts. However, keep in mind that's clock-for-clock, and the PS5 GPU's clock is not the same as the 5700 XT's; it's much faster. 295 MHz faster, in fact. Dividing 2230 MHz by 295 MHz gives about 7.56, which I'm treating as a 7.6% figure. Now again, we're going to have to take some liberties. Being very lenient and assuming 1:1 power-to-frequency scaling (which, for anything beyond the sweetspot, would not be the case; we'll correct for this in a moment), we can convert that figure into a wattage amount, about 7.56 watts, and add it to the 134.19 watts calculated earlier, giving us 141.75 watts.
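Here's that step as a sketch, reproducing the arithmetic exactly as laid out above; the 30% RDNA2 reduction and the treatment of the 2230/295 ratio as a wattage bump are my assumptions, not measured figures.

```python
# Clock-for-clock RDNA2 reduction plus the frequency adjustment, as above.
adjusted_36cu_w = 191.7                      # 5700 XT scaled to 36 CUs (earlier step)
rdna2_reduction = 0.30                       # assumed RDNA2-over-RDNA1 saving
clock_for_clock_w = adjusted_36cu_w * (1 - rdna2_reduction)   # 134.19 W

freq_ratio = 2230 / 295                      # ~7.56, treated here as a "7.6%" uplift
freq_uplift_w = round(freq_ratio, 2)         # ~7.56 W under the lenient 1:1 assumption
pre_correction_w = clock_for_clock_w + freq_uplift_w          # ~141.75 W
print(round(clock_for_clock_w, 2), round(pre_correction_w, 2))
```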
However, it doesn't actually end there. We still have to account for the performance gain the PS5 GPU has over a 5700 XT when their CU counts are evened out, at their native clocks. A 5700 XT @ 1.905 GHz (Boost clock) with 36 CUs active gives 8.778 TF, while a PS5 with 36 CUs active and a "sustained boost" (Cerny's own words) of 2.23 GHz gives 10.275 TF, a gap of roughly 1.5 TF. Again, we're going to be a bit lenient and assume a 1:1 linear relationship here, this time between clock frequency and floating point operations per second. However, in this case we should really treat the 5700 XT as if it were an RDNA2 chip, which means checking whether its actual Boost clock is beyond the new sweetspot peak. It is not, so we should instead calculate its performance here @ the 1.935 GHz clock, since we're trying to account for the power cost of the clock difference between a 5700 XT inside the sweetspot on 7nm DUV Enhanced and a PS5 outside of it on the same process.
36 CUs * 64 shaders per CU * 2 ops per clock * 1.935 GHz gives 8.91648 TF. Subtract this from PS5's 10.275 TF and we're left with 1.35936 TF; divide PS5's total by that gap and you get about 7.56, which (as with the frequency step) I'm treating as a 7.56% figure. In any case, we now add this 7.56% as if it represented a wattage increase (again assuming a linear ratio between performance and power, which we know isn't the case beyond the sweetspot). 141.75 watts + 7.56 watts = 149.31 watts. From here we simply need one final wattage bump to account for the actual non-linear power-to-frequency scaling beyond the sweetspot, which I'll figure at about 10 watts, 11 watts tops: take the aforementioned 7.6 and 7.56 figures, add them, then multiply by 2/3. I wouldn't expect such a correction to be driven evenly by both figures, but anything approaching their full sum would likely be too aggressive.
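And the tail end of the GPU estimate as a sketch; the 2/3 non-linearity correction is, as said above, a judgement call rather than anything derived from a measured power curve.

```python
# TF comparison at matched CU counts, plus the final non-linearity bump.
shaders = 36 * 64                               # 2304 shaders across 36 CUs
ps5_tf = shaders * 2 * 2.23 / 1000              # ~10.276 TF (2 FP32 ops per clock)
xt_rdna2_tf = shaders * 2 * 1.935 / 1000        # ~8.916 TF at the assumed sweetspot peak

tf_gap = ps5_tf - xt_rdna2_tf                   # ~1.359 TF
gap_factor = ps5_tf / tf_gap                    # ~7.56, treated as "7.56%" above

watts = 141.75 + 7.56                           # 149.31 W
correction_w = (7.6 + 7.56) * (2 / 3)           # ~10.1 W, taken as ~10-11 W
print(round(gap_factor, 2), round(watts + 11))  # ~160 W GPU estimate
```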
In total, adding those 11 watts to the 149 watts, we get a PS5 GPU with a TDP of roughly 160 watts. Looked at in context, it fits in nicely once we do some rough calculations for the other system components and their power costs, to arrive at what the total system TDP might be. Start with the aforementioned 12 watts for the GDDR6 (16 GB across 8 modules), add in about 22 watts for the Zen 2 CPU (both PS5 and Series X are likely based on something close to the 4800U Zen 2 APU, whose CPU is rated at up to 25 watts with a max clock of 4.2 GHz; the PS5's CPU clock is a bit lower at 3.5 GHz, so I shave off 3 watts), 8 watts for the Tempest Engine, 15 watts for the SSD I/O block (it's been stated to be equivalent to 9 Zen 2 cores in processing power, though with some things an actual Zen 2 CPU would have removed, so I shave 10 watts off the 4800U's CPU TDP, assuming that comparison was referencing this particular CPU), and 6 watts for the NAND modules (assuming half a watt for each of the 12 64 GB modules). That sums to roughly 223 watts, which I'll call a 225 watt total system TDP budget.
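The whole tally in one place; every entry is one of the component estimates above, not a spec-sheet figure.

```python
# System-level power budget from the component estimates above.
component_w = {
    "GPU": 160,                       # from the GPU estimate above
    "GDDR6 (8 modules)": 12,
    "Zen 2 CPU @ 3.5 GHz": 22,
    "Tempest Engine": 8,
    "SSD I/O block": 15,
    "NAND (12 modules @ ~0.5 W)": 6,
}
total_w = sum(component_w.values())
print(total_w, "W total, call it ~225 W, against a 350 W PSU")   # 223
```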
SO, a TL;DR:
>PS5 GPU Power Consumption: 160 watts
>PS5 GPU Die Area: ~ 105 mm^2
>PS5 System Power Consumption: 225 watts
>PS5 PSU: 350 watts
Aaand we did it, gang. We're done with the numbuahz!!
Considering the 5700 XT card alone has the same TDP, and that only covers the GPU and its GDDR6, while delivering much lower performance than PS5 (especially when you adjust PS5's performance to a relative RDNA1 equivalent, i.e. roughly 12.84 TF given the assumed 25% IPC gain), this is very impressive. It also leaves a headroom of 125 watts on the PSU, which sounds about right considering what we talked about at the beginning: these systems likely aren't reducing heat generation or dissipation requirements in step with their power consumption reductions.
It is likely that I have undershot the TDP in some places, mainly because we don't know exactly what the power consumption looks like across that 295 MHz range beyond the adjusted sweetspot peak. If so, however, I highly doubt it's undershot by more than 25 watts, and that's on the extreme end, while still leaving 100 watts of headroom on the PSU. Hopefully people can now see why it was necessary to hammer out some speculative calculations on the PS5's power consumption here: it helped us figure out the likely sweetspot range for RDNA2 on the 7nm DUV Enhanced process, see some of the relationships (linear and non-linear) between a few things, and even roughly figure out the size of the GPU portion of the APU. Some of the Series X info we already have was also helpful here, but the info on the 5700 XT was even more so, providing just enough to work this stuff out.
With that out of the way, we can finally move on to the more interesting stuff: the PlayStation 5 Pro, and Series X-2 and Series S refresh. For the latter two, there'll be some (very brief) number work on some aspects of Series X included just to serve as a basis of reference there, but by and large it's extremely easy to figure out things like the die size and GPU portion sizes both because MS have outright provided such numbers (through listings and graphs), and because some of the other stuff can simply be deduced by what we know about Series X, such as how we deduced the PS5's GPU size here from the Series X's GPU size. This also applies to GPU power consumption amounts, which for something like Series X, surprisingly, might actually fall roughly in range with PS5's despite some of their other differences, for reasons I'll touch on when we get to those mid-gen refreshes.
So yeah, I hope I'll have Part 2: PS5 Pro, Series X-2 And Series S-2 Mid-Gen Refreshes up very soon (will also link that one here when it's ready).
Hope you all enjoyed this and weren't put off by all the data crunching. Give it a try yourselves, or just share what ballpark ranges you personally think some aspects of PS5's GPU (power consumption, power-to-frequency scaling, assumed RDNA2 sweetspot range, etc.) fall in. While you're at it, what ideas do you have in mind for the mid-gen PS5, Series X and Series S refreshes? Are you expecting any at all? Do you think they'll replicate the PS4 Pro and One X approach or go in a different direction? Sound off!