Chips aren't born equal. When you cut a wafer into chips, different chips will have varying quality. One chip will have a messed-up CU, another will have all CUs in perfect condition but will draw 200W to run at 1.8GHz, and another will be exactly the same but will draw only 150W. Because of that, GPU makers tier their cards. The 5700 and 5700 XT are the exact same chip, born on the same wafer. AMD tested both and found one to be of higher quality, so it went into the 5700 XT bin, while the other was of lower quality, so it went into the 5700 bin. That's how AMD uses "the whole buffalo": they set a few bars for different SKUs and divide the chips between those SKUs according to their quality.

A 5700 XT has to have 40 out of 40 active CUs and reach 1905MHz while drawing 225W tops, while a 5700 only has to have 36 out of 40 active CUs and reach 1725MHz while drawing 180W tops. If a chip is tested and needs 280W to reach 1905MHz, it goes in the 5700 bin even if all 40 CUs are perfect, because it couldn't hit 1905MHz within 225W. If a chip has a messed-up CU, it goes in the 5700 bin even if it can reach 1905MHz while using just 130W, because it can't offer 40 CUs.
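To make that binning rule concrete, here's a minimal sketch in Python. The thresholds are the ones quoted above; the function itself is purely illustrative and not AMD's actual qualification process, which looks at far more than these three numbers.

```python
def bin_chip(active_cus: int, watts_at_1905mhz: float, watts_at_1725mhz: float) -> str:
    """Toy binning rule using the 5700/5700 XT thresholds from the text.

    Illustrative only -- real binning also considers leakage, the full
    voltage/frequency curve, thermals, defect maps, and so on.
    """
    # 5700 XT bin: all 40 CUs working AND 1905MHz reachable within 225W
    if active_cus == 40 and watts_at_1905mhz <= 225:
        return "5700 XT"
    # 5700 bin: at least 36 good CUs AND 1725MHz reachable within 180W
    if active_cus >= 36 and watts_at_1725mhz <= 180:
        return "5700"
    # Anything worse can't ship as either SKU
    return "scrap"

# The two examples from the paragraph above:
print(bin_chip(40, 280, 170))   # perfect CUs but power-hungry -> "5700"
print(bin_chip(39, 130, 110))   # one bad CU but very efficient -> "5700"
```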
Consoles, on the other hand, don't have the luxury of multiple SKUs. If you want to take the same chip as the 5700/5700 XT and use it in a console, you know your baseline has to be low enough, because you don't have a cheaper SKU to absorb the chips that don't make the cut. You have to turn off some CUs, and you have to settle on clock speed, heat, and power draw targets low enough that the vast majority of chips that come off the line can be used. After all, the more chips you have to throw away, the more each usable APU ends up costing you. Maybe your PS4's GPU has a perfect 20 CUs and could hit 1000MHz easily (2.56TF), but it won't, because all PS4 consoles have to be the same and your neighbor's PS4's GPU can't hit those numbers.
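For reference, that 2.56TF figure falls out of the standard GCN throughput formula (CUs × 64 shaders × 2 FLOPs per clock × clock speed); a quick sketch:

```python
def gcn_tflops(cus: int, clock_mhz: int) -> float:
    """GCN throughput: CUs * 64 shaders * 2 FLOPs/clock * clock speed."""
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

print(gcn_tflops(20, 1000))  # fully-enabled chip at 1000MHz -> 2.56
print(gcn_tflops(18, 800))   # the PS4 as actually shipped   -> ~1.84
```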
Now that we know that, let's look at those power draw/clock graphs and tables and think about what they actually mean. Someone took a 5700 XT, undervolted it, and posted the results in a table or a graph. But what do those numbers really mean? First, they always use the 5700 XT, not the 5700. As we already know, a 5700 XT is the best of the best that came off the assembly line, so obviously it will show the best overclocking and thermal behavior. They are basically taking the best-case scenario and presenting it as if it were the average scenario. That's the first failing of these graphs.
The second failing is that the 5700 XT carries a 225W TBP label. Why is that? Hasn't AMD heard of undervolting? Are they wasting power just for the hell of it? Not really. We've already talked about how not all chips are born equal, and even though the 5700 XT chips are the best chips on every wafer, they still vary in quality. One can hit 1905MHz while using 140W, while another needs 220W to hit the same clock speed. AMD tested them and concluded that the baseline had to be 225W. It doesn't mean the card you've bought actually needs 225W, but some of them do, so the 5700 XT as a product needs the 225W label even if your particular card doesn't.

The whole concept of undervolting rests on winning the silicon lottery: your chip can reach the desired clock at a lower voltage than the lowest-quality 5700 XTs out there. That makes undervolting results totally irrelevant to consoles. Sony can't count on the numbers in those graphs and tables; after all, a lot of 5700 XTs can't sustain them, and probably no 5700 can either. So what will Sony do? Use @AegonSnake's tables and throw 70% of the chips in the bin?
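To see why one undervolted sample tells a console maker nothing about yield, here's a toy Python sketch. Only the 130/140/220/280W figures come from the text above; the rest of the sample list is made up purely for illustration, not real measurement data.

```python
def yield_at_cap(power_needed_w, cap_w):
    """Fraction of chips that can hit the target clock within cap_w watts."""
    return sum(p <= cap_w for p in power_needed_w) / len(power_needed_w)

# Watts each sample chip needs to sustain 1905MHz. The 130/140/220/280W
# entries are from the text; the others are invented for illustration.
samples = [130, 140, 165, 180, 195, 210, 220, 235, 260, 280]

print(yield_at_cap(samples, 225))  # at the spec TBP: 0.7 -> most chips qualify
print(yield_at_cap(samples, 150))  # at an undervolter's number: 0.2 -> most don't
```

The lower the power target you copy from an undervolting chart, the more of the wafer you have to throw away; that cost is exactly what a console maker can't afford.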
That's why these tables and graphs are irrelevant to consoles. There are no lower-tier SKUs to absorb the lower-quality chips, and the graphs are built from high-quality chips whose figures most of the chips on the wafer just can't reach, so they tell console makers nothing.