One thing that I still don't get is that Ryzen is supposedly better at 1440p and 4K gaming because the GPU becomes the bottleneck, but how does that work exactly? Is there no point where the CPU becomes the bottleneck again? I'm just a little confused about how that works.
It's not that Ryzen performs better at higher resolutions; it's that the gap narrows because faster CPUs are held back more by the GPU bottleneck.
Say we run a test at 720p where the GPU load stays low at all times.
CPU A can run the game at 100 FPS
CPU B can run the game at 130 FPS
This shows us that CPU B is 30% faster than CPU A in this game.
Now we run the test at 1080p, which increases the GPU load.
CPU A still runs the game at 100 FPS, but GPU load is high - though still below 99%.
CPU B now runs the game at 115 FPS, because the GPU hits 99% load on that system and becomes the limit.
CPU B is still 30% faster than CPU A, but using the results from this test it only appears to be 15% faster.
Run the test at 4K instead of 1080p now, which maxes out the GPU on both systems.
CPU A runs the game at 50 FPS
CPU B also runs the game at 50 FPS
CPU B is still 30% faster, but the GPU bottleneck is preventing you from seeing the difference between them, since it prevents the CPU from being fully utilized.
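A rough way to picture it: the framerate you actually see is whichever limit is lower, the CPU's or the GPU's. Here's a minimal sketch with made-up numbers that match the example above (not real benchmark data):

```python
# Toy model of a CPU/GPU bottleneck: the framerate you observe is
# capped by whichever side is slower. All numbers are invented to
# match the example above, not real benchmark results.

def observed_fps(cpu_limit_fps, gpu_limit_fps):
    """The slower of the two limits dictates the final framerate."""
    return min(cpu_limit_fps, gpu_limit_fps)

cpu_a = 100  # FPS CPU A can prepare frames at (roughly resolution-independent)
cpu_b = 130  # FPS CPU B can prepare frames at (30% faster)

# Hypothetical GPU limits at each resolution (same GPU in both systems)
gpu_limit = {"720p": 500, "1080p": 115, "4K": 50}

for res, gpu in gpu_limit.items():
    a = observed_fps(cpu_a, gpu)
    b = observed_fps(cpu_b, gpu)
    print(f"{res}: CPU A {a} FPS, CPU B {b} FPS, apparent gap {(b - a) / a * 100:.0f}%")

# 720p:  CPU A 100 FPS, CPU B 130 FPS, apparent gap 30%
# 1080p: CPU A 100 FPS, CPU B 115 FPS, apparent gap 15%
# 4K:    CPU A 50 FPS,  CPU B 50 FPS,  apparent gap 0%
```

The underlying CPU gap never changes; only how much of it the GPU lets you see.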
Now you might say that these results don't matter, because you don't care about gaming at 130 FPS anyway, but most people seem to keep their CPUs for ~5 years now while upgrading their GPU every 2-3 years, which can remove a GPU bottleneck that existed when the CPU was bought.
In newer or more demanding games, the difference might not be 100 FPS vs 130 FPS, it might be 50 FPS vs 65 FPS instead.
If your goal is to stay above 60 FPS at all times, CPU B is the only one which can achieve that.
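To put the same made-up numbers in frame-time terms (again, purely illustrative figures, not benchmarks): holding 60 FPS means every frame has to come in under about 16.7 ms, and only the faster CPU manages that here.

```python
# Same 30% CPU gap, applied to a more demanding (hypothetical) game.
# The FPS figures are invented purely for illustration.

frame_budget_ms = 1000 / 60  # ~16.7 ms per frame to hold 60 FPS

for name, fps in {"CPU A": 50, "CPU B": 65}.items():
    frame_time_ms = 1000 / fps
    verdict = "holds 60 FPS" if frame_time_ms <= frame_budget_ms else "drops below 60 FPS"
    print(f"{name}: {fps} FPS ({frame_time_ms:.1f} ms/frame) -> {verdict}")

# CPU A: 50 FPS (20.0 ms/frame) -> drops below 60 FPS
# CPU B: 65 FPS (15.4 ms/frame) -> holds 60 FPS
```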
Part of the problem is that many sites seem to test the same set of games as each other, and many of those games are older, less demanding titles.
I've not seen one site run a test with Dishonored 2 to see how the CPUs compare there.
I really want to know if an i7-7700K, R7-1700, or i7-6900K/i7-6950X can even keep that game above a minimum of 60 FPS at all times.
It can be such a CPU-intensive game that I'm not convinced any of them will be able to achieve that in certain locations.
And that's the other thing: where the test is set up within a game matters too.
Some areas of games are considerably more demanding than others.
If everyone just tests the first area of a game, that may not give a realistic view of performance.
I've seen tests in games where minimum framerates were a good 30-40% higher than I measured using the same CPU, because they didn't actually play through the game and select the most demanding area from it to test.
How does this work? How is the Intel chip getting better GPU usage? Is this because the CPU is feeding the GPU some kind of data faster?
Bingo. In games, one of the CPU's most important roles is preparing data for the GPU.
If the CPU can't do this fast enough, it limits the GPU performance.
Minimum framerate/percentile tests are where this becomes most noticeable.
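For what it's worth, those minimum/percentile figures are normally derived from per-frame frame times rather than average FPS. A simplified sketch of how a "1% low" number can be computed (my own rough version, not any particular site's exact method):

```python
import statistics

def one_percent_low_fps(frame_times_ms):
    """Rough 1% low: average FPS of the slowest 1% of frames.

    frame_times_ms is a list of per-frame render times in milliseconds,
    e.g. logged during a benchmark run. Simplified for illustration;
    each review site has its own exact method.
    """
    worst = sorted(frame_times_ms, reverse=True)
    count = max(1, len(worst) // 100)          # slowest 1% of frames
    return 1000 / statistics.mean(worst[:count])

# Example: mostly ~10 ms frames (100 FPS) with a few 30 ms stutters.
frames = [10.0] * 990 + [30.0] * 10
print(f"Average FPS: {1000 / statistics.mean(frames):.0f}")   # ~98
print(f"1% low FPS:  {one_percent_low_fps(frames):.0f}")      # ~33
```

A handful of bad frames barely dents the average but shows up clearly in the 1% low, which is exactly why a CPU bottleneck tends to surface there first.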