Tqaulity
How many people here believe that Sony under-delivered on the PS5's graphical capabilities based on the 10.3 TFLOP metric? Do you subscribe to the notion that the PS5's GPU is nothing more than an RX 5700/XT, or that it's only on par with today's midrange GPUs?
This post is about providing real data and educated estimations to dispel the notion that the PS5 GPU is only a "midrange" GPU that is not on par with today's top-tier commercial GPUs. Looking at the TFLOP number in isolation is indeed very misleading, and the truth about the actual performance of the GPU paints a very different picture. I know many of you don't know me, but I can say that I am not just pulling this info from nowhere: I have over 15 years of experience working in gaming and have spent nearly 5 years of my career doing critical analysis of GPU performance. Take it for what it is.
Before I begin, a full disclaimer: this post is not a comparison to or commentary on the Xbox Series X, so no console fanboyism here please. The fact is, the Xbox Series X has a bigger GPU with more theoretical horsepower. Period. Nobody is refuting that, so no Xbox defense force is needed either.
Like many, I too was initially somewhat disappointed when I first heard the PS5 specs, mainly because there was so much information beforehand pointing to more performance being a possibility. We'd all been hearing about the 12-14 TFLOP monster that Sony was building, but honestly, the raw numbers are not what matters. I was more excited about the idea that both consoles would come out really close in power, which benefits gamers by establishing a high baseline where neither machine gets subpar 3rd-party releases. After taking some time to process the specs and information Sony released, as well as doing some in-depth analysis, I am pretty happy with what Sony ended up with from a GPU standpoint.
Let me be clear: the goal of what I'm presenting here is not to define an absolute performance metric for PS5 with a given piece of content. In other words, I am not trying to predict that PS5 can run game X at Y FPS specifically. That is impossible since there are so many variables affecting overall performance that we do not know about: CPU, memory, drivers, other console-specific optimizations, etc. Instead, what I am doing is establishing a realistic baseline expectation for the GPU specifically, by looking at known real-world performance data from comparable hardware.
How am I doing this? Let me break it down:
1. Let's establish a comparison mapping to other known GPUs, based on their architectures and theoretical compute power:
- We know that AMD's RDNA architecture delivers a general ~25% increase in performance per clock compared to GCN -> 1 TFLOP (RDNA) = 1.25 TFLOP (GCN)
- We know that RDNA 2 will be even more efficient than RDNA (i.e. perf per clock and per watt will be better). Now we can guess how much more efficient based on some actual hints from Sony and AMD:
- Mark Cerny himself, during the PS5 tech deep dive, revealed that each CU in the PS5 GPU is roughly 62% larger than a PS4 CU. Thus, the PS5 contains the equivalent of 58 PS4 CUs: 36 CU (PS5) = 58 CU (PS4). Now, 58 CUs running at the PS5's 2.23 GHz frequency => ~16.55 TFLOP (GCN). So what conversion factor gets us from 10.28 TFLOP (RDNA 2) to 16.55 TFLOP (GCN)? It turns out that the additional perf per clock needed to reach that ratio is roughly 17%. So by this data: 1 TFLOP (RDNA 2) = 1.17 TFLOP (RDNA 1)
- AMD has already said that they are pushing to deliver a similar improvement with RDNA 2 over RDNA 1 as we saw from GCN to RDNA 1. They have also confirmed that RDNA 2 will see a 50% improvement in perf/watt over RDNA 1. GCN to RDNA 1 saw a 50% perf/watt and 25% perf/clock increase. A further 25% increase in perf/clock for RDNA 2 sounds pretty ambitious, so I will be more conservative, but we can use it as an upper bound.
- AMD has talked about mirroring their GPU progression with that of their CPUs, specifically about increasing CPU IPC by roughly 15% every 12-18 months. The 10-15% range is also typical of past GPU generational transitions.
- Using the 25% ratio of RDNA to GCN and a 15% ratio of RDNA 2 to RDNA 1, we can calculate the equivalent amount of theoretical performance (i.e TFLOPs) for the PS5 GPU in terms of both RDNA performance and GCN performance:
PS5 TFLOP = 10.28
PS5 TFLOP (RDNA 1) = 12.09 (used to compare against RX 5700 and RX 5700 XT)
PS5 TFLOP (GCN) = 16.13 (used to compare against Radeon VII, PS4)
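As a quick sanity check (my addition, not part of the original post's math), the TFLOP figures above fall straight out of the standard peak shader-throughput formula — CUs x 64 shaders x 2 ops per clock x clock speed. CU counts and clocks are the figures quoted in this post; tiny rounding differences from the post's numbers are expected:

```python
# Peak FP32 throughput for a GCN/RDNA-style GPU:
# TFLOPs = CUs x 64 shaders x 2 ops/clock x clock (GHz) / 1000

def tflops(cus: int, clock_ghz: float) -> float:
    """Theoretical FP32 TFLOPs from CU count and clock speed."""
    return cus * 64 * 2 * clock_ghz / 1000.0

print(f"PS5 (36 CU @ 2.23 GHz):           {tflops(36, 2.23):.2f} TFLOP")
print(f"PS4-equivalent (58 CU @ 2.23 GHz): {tflops(58, 2.23):.2f} TFLOP")
print(f"PS4 (18 CU @ 0.8 GHz):             {tflops(18, 0.8):.2f} TFLOP")
```

The same formula reproduces both the PS5's quoted 10.28 TFLOP and the ~16.55 TFLOP figure for 58 GCN-style CUs at 2.23 GHz.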
2. We can also note that it is actually easier to guesstimate PS5 GPU performance because there is a GPU on the market very similar to it: the RX 5700. Its configuration in terms of CU count, number of shader cores, memory bus size, memory bandwidth, etc. is an exact match for the PS5. At a high level, the PS5 is simply an extremely overclocked RX 5700 in terms of hardware. Typically on PC, overclocking a GPU gives limited returns due to power issues and system design limitations that will not exist in a console. The PS5's typical GPU clock of 2.23 GHz is ~34% higher than the RX 5700's typical clock of 1.670 GHz, so we can extrapolate PS5 performance as roughly 34% higher than that of an RX 5700. However, that raw translation does not account for RDNA 2's additional efficiencies, so if we add the 15% uplift in efficiency, we get a pretty good idea of the PS5's GPU performance. It turns out that this projected value is pretty much identical to the TFLOP conversion factors I computed above.
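The overclocked-RX 5700 estimate in this step boils down to a two-line calculation. Note the 15% RDNA 2 perf/clock uplift is this post's conservative assumption, not an official AMD figure:

```python
# Treat the PS5 GPU as an RX 5700 (same 36-CU configuration) overclocked
# from its typical 1.670 GHz to 2.23 GHz, then layer on an assumed ~15%
# RDNA 2 perf/clock uplift over RDNA 1.

RX_5700_CLOCK_GHZ = 1.670   # typical RX 5700 game clock used in this post
PS5_CLOCK_GHZ = 2.23        # PS5 typical GPU clock
RDNA2_UPLIFT = 1.15         # assumed perf/clock gain over RDNA 1 (a guess)

clock_ratio = PS5_CLOCK_GHZ / RX_5700_CLOCK_GHZ   # ~1.34x from clocks alone
projected = clock_ratio * RDNA2_UPLIFT            # ~1.54x overall

print(f"Clock ratio vs RX 5700:     {clock_ratio:.2f}x")
print(f"Projected PS5 vs RX 5700:   {projected:.2f}x")
```

The resulting ~1.53-1.54x is consistent with the ~153% RX 5700 comparison computed in the next step.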
3. Now that we have a quantitative comparison point, we can calculate a projected PS5 performance target based on theoretical performance from comparable GPUs. For example, the RX 5700 XT = 9.7 TFLOP (RDNA 1) and PS5 = 12.09 TFLOP (RDNA 1), which puts the projected PS5 performance at ~25% higher than an RX 5700 XT. Using these calculations for other reference GPUs, we get the following:
PS5 vs Xbox Series X = -15% (PS5 is 15% slower)
PS5 vs RX 5700 = 153% (PS5 is 53% faster)
PS5 vs RX 5700 XT = 125% (PS5 is 25% faster)
PS5 vs Radeon VII = 120 % (PS5 is 20% faster)
PS5 vs PS4= 8.76x (PS5 is nearly 9x faster)
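For reference, these ratios can be reproduced from commonly quoted theoretical TFLOP values for each card. The exact reference clocks in the comments are my assumption; picking a slightly different boost/game clock shifts the ratios by a point or two:

```python
# Relative-performance ratios from theoretical TFLOPs, pairing each
# reference GPU with the PS5 figure expressed in the same architecture.

PS5_RDNA1 = 12.09   # PS5 TFLOP in RDNA 1 terms (from step 1)
PS5_GCN = 16.13     # PS5 TFLOP in GCN terms (from step 1)

references = {
    "RX 5700 (RDNA 1)":    (PS5_RDNA1, 7.95),   # 36 CU @ ~1.725 GHz boost
    "RX 5700 XT (RDNA 1)": (PS5_RDNA1, 9.75),   # 40 CU @ ~1.905 GHz boost
    "Radeon VII (GCN)":    (PS5_GCN,  13.44),   # 60 CU @ ~1.75 GHz boost
    "PS4 (GCN)":           (PS5_GCN,   1.84),   # 18 CU @ 0.8 GHz
}

for name, (ps5, ref) in references.items():
    print(f"PS5 vs {name}: {ps5 / ref:.2f}x")
```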
4. Finally, now that we have a performance factor for some common GPUs across various AMD architectures, we can see where projected PS5 performance ranks against the fastest cards on the market, including Nvidia cards. I've looked at several industry aggregate sites such as Eurogamer, TechPowerUp, and GPUCheck (numerous games tested), as well as a couple of high-profile games such as DOOM Eternal, Call of Duty: Modern Warfare, and Red Dead Redemption 2, to see where PS5 performance will fall. I've done this analysis across numerous performance metrics, resolutions, and the different GPU references defined above to check that the data was consistent. The goal was to identify which GPU currently on the market comes closest to projected PS5 performance. I've highlighted the 4K rows since 4K is the target resolution for the PS5. The summary table shows which GPUs came closest to the projected PS5 performance at different resolutions. The raw results are below:
**Note:** Game performance was captured from TechPowerUp benchmark analysis using max settings at all resolutions.

Key Takeaways:
- The general takeaway is that in most cases at higher resolutions, projected PS5 performance is actually slightly higher than that of the RTX 2080 Super.
- Note that the 1080p values are a bit misleading since some games are CPU-bound at that resolution. Most GPUs thus exhibit lower performance there, which is why the RTX 2080 Ti was the closest match at 1080p.
- These numbers do not take into account other factors that can improve PS5 GPU performance even further, such as GPU-specific optimizations, console-specific optimizations, a lower-level driver compared to PC, I/O throughput improvements in the PS5, the memory subsystem, etc.
- This analysis is just a rough estimate and, again, is not to be taken literally in terms of actual in-game performance; there are still a ton of variables and unknown factors. But it does account for known information to give a good relative performance baseline and set expectations for how much performance the PS5 GPU may possess. The answer is that it is definitely not "just an RX 5700 XT" and will likely have more performance than an RTX 2070 Super.
- My analysis went well beyond these websites, game titles, and reference GPUs. I presented the highlights, but the overall takeaway from the additional data is the same: performance is most in line with an RTX 2080 Super.
Everyone should be excited, but please stop spreading FUD about PS5 performance.