My first graphics card (a TNT2 3D accelerator) already ran pretty much every game at 1024x768, though only in 16-bit color. When I bought a GeForce 3 in 2002, my fps in Quake 3 at 1600x1200x32 went from 7 to 80, and I was absolutely blown away. That was just three years of GPU progress; can you imagine something like that today? But very soon newer games (PS2 ports) started pushing my GeForce 3 harder and harder, so I went back to playing at 1024x768x32. I built my next PC in 2007 (Q6600, 8800 Ultra) and started playing at 1680x1050x32. The vast majority of games ran at well over 60fps, some even at 1440p.
Ray tracing reminds me of shaders. Back then, shaders were as demanding as RT is today, but people still wanted to play games with better graphics. PC gamers have changed a lot since then: if a game isn't running at native 4K and a real 120fps, they aren't willing to use features that elevate graphics fidelity to the next level, like ray tracing.
I bought a fairly good PC in June 2025 to play RT games, and I'm very happy with the experience, even though some NeoGAFers try to tell me that 120fps at 4K DLSS Quality + FGx2 (in the most demanding RT games) is not good enough. I wonder what they would tell console gamers who play at 30-60fps at even lower resolutions? Sometimes I think console gamers are happier than the PCMR crowd, because the typical console gamer can just focus on enjoying the game (even at 30fps), while PC gamers are always looking for things to complain about. For example, you, rofif, played Black Myth: Wukong on the PS5 Pro at 40fps (if I remember correctly). Try telling the PCMR guys that it was indeed playable and they won't believe you; in fact, they may even laugh at you, just to feel better.