No, I'm using an i7-5820K, which probably isn't much better, but I wanted to clarify that.
That said, because I game at 4K, a modern CPU wouldn't benefit me much in gaming: 4K gaming is bound by the GPU rather than the CPU. My RTX 3080 is fully saturated (98–100% usage) when gaming at 4K, which means my i7-5820K is supplying it with data and instructions fast enough.
For example, below are two videos of "Control" running at 4K with maximum settings via DLSS Quality Mode (rendering at 1440p). The PC in the first video has an i9-9900K and an RTX 3080; the PC in the second video is mine. In both videos, the frame rate sits in the 50s most of the time and occasionally drops into the very high 40s. The only notable difference is that in the first video, the player ventures through areas that are mostly very dark and low in detail, which makes them easy to render and pushes the frame rate to 60 frames per second or slightly above. In brightly lit environments, however, the frame rates are similar in both videos.