This is going to sound a bit odd, but I'm a little disappointed that the only videogame benefits of having a new super-de-duper PC are image quality (IQ) and framerate... and super-high IQ makes some games look real rough, since the geometry stands out a lot more.
I'm going to second that. Subjective diminishing returns.
I started playing PC games on a 2002 HP Pavilion with a 2.6GHz Celeron, 512MB of RAM and a GeForce MX 440 AGP (64MB) in 2010. I did my best to optimize it for Minecraft. I tried and failed a million times to run Half-Life 2 and BioShock. I hooked the composite output to an old CRT TV and played SNES games with a 360 pad. I played tons of stuff from FreeIndieGam.es.
In late 2013, I upgraded to a 3GHz dual-core and a 1GB discrete card. It was incredible. It ran almost everything (Just Cause 2, XCOM: EU, Dark Souls, Team Fortress 2) at 1080p and 30FPS - an almost-literal breath of fresh air after years in musty darkness.
It's all downhill from your first discrete card. In early 2015, I got an SSD and made a marginal CPU upgrade - just enough to pair with a GeForce GTX 960 2GB. I went from 1080p/~30FPS to 50-60FPS. I ran into a CPU bottleneck early on, and did a complete rebuild later that year.
Compared to that old HP, my current PC has four times the CPU core count at nearly double the clock speed, thirty-two times the memory, one hundred twenty-eight times the GPU memory, and (hilariously) ~60% more storage. The primary benefit in games: image quality.
So, yeah. Eventually, the pursuit of speed and capability outruns reason. Especially now that you can build a fantastic, effectively-no-compromise system for $600.