But that's the thing, I can get a stable 60 if I turn off enough graphics options.
Okay, so here you have one section of gameplay. Each frame is listed with how long it took to render, in milliseconds.
Over on the left, we have the low settings, and on the right, we have high settings.
In both instances, you have the game totally chugging around the same few frames; let's say an explosion happens that causes this. Despite both getting around the same poor performance in that situation, we end up with two drastically different averages.
For the Low Settings, if you average the time it took to render each frame, you get 20.8ms, which is about 48 frames per second.
For the High Settings, the average is 25.2ms, which is about 39 frames per second.
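To make the math concrete, here's a rough sketch of that conversion in Python. The frame-time lists are numbers I made up so the averages land on the 20.8ms and 25.2ms figures above; they're not the actual capture data.

```python
# Illustrative frame times in milliseconds. These are invented values chosen
# so the averages come out to 20.8ms / 25.2ms; not the real capture.
low_settings_ms  = [14.0, 14.5, 15.0, 14.8, 15.2, 14.9, 22.0, 56.0]   # avg 20.8
high_settings_ms = [18.0, 18.5, 19.0, 18.8, 19.2, 19.1, 28.0, 61.0]   # avg 25.2

def average_fps(frame_times_ms):
    """Average the per-frame render times, then convert to frames per second."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    return avg_ms, 1000.0 / avg_ms

for label, times in (("Low", low_settings_ms), ("High", high_settings_ms)):
    avg_ms, fps = average_fps(times)
    print(f"{label}: average frame time {avg_ms:.1f}ms -> {fps:.1f} fps")
```

Both lists contain the same kind of 50ms+ stutter, but the faster "normal" frames on low settings drag the average down, which is the whole trick.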
So in the case of turning down graphics settings, you will see an increase in "frames per second", but that's because frames per second is a really, really inaccurate way to measure performance when you're talking about something as insanely precise as our ability to perceive the illusion of motion.
Despite one being "clearly better", that 300-400ms of gameplay is going to feel like shit in either case, because of those absolutely nasty frames where the time to render goes well above 50ms. Now imagine if it were data from a full 1000ms. Now imagine data from 60 seconds, a total of 60,000ms.
The processor is making the game chug, but how much it's chugging is being hidden because those frames are being averaged out with a bunch of other frames that came out fine over the course of a full second.
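Here's a sketch of that hiding effect over a full second. The numbers are invented purely for illustration: a hypothetical second of gameplay with a handful of 70ms stutters buried in otherwise fast frames.

```python
# A hypothetical second of gameplay: 56 fast frames plus 4 ugly 70ms stutters.
# Invented numbers, purely to show how the average buries the spikes.
frame_times_ms = [12.5] * 56 + [70.0] * 4           # ~980ms of rendering total

avg_ms = sum(frame_times_ms) / len(frame_times_ms)   # ~16.3ms
print(f"average fps: {1000.0 / avg_ms:.0f}")         # ~61 fps, looks great

worst = max(frame_times_ms)
spikes = sum(1 for t in frame_times_ms if t > 50.0)
print(f"worst frame: {worst:.0f}ms, frames over 50ms: {spikes}")
# The average says everything is fine; the four 70ms frames say otherwise.
```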
So when you say, "I'm getting a stable 60fps", that's not
quite accurate. The game could very well be spitting out frames lower than 16.7ms (60fps), but then you're still having frames that are going well above that. They average out to be above or right around 60fps at even a minimum, but that isn't the whole story, as you can see above.
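If you actually wanted to claim a stable 60, the thing to check isn't the average; it's whether any frame blows past the 16.7ms budget. A minimal sketch of that check, reusing the same invented frame times from the example above:

```python
# "Stable 60" should mean no frame exceeds the 16.7ms budget, not that the
# average happens to work out to 60+. Same invented frame times as before.
BUDGET_MS = 1000.0 / 60.0                        # ~16.7ms per frame at 60fps
frame_times_ms = [12.5] * 56 + [70.0] * 4

missed = [t for t in frame_times_ms if t > BUDGET_MS]
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

print(f"average fps: {avg_fps:.0f}")             # ~61 -- sounds like "stable 60"
print(f"frames over budget: {len(missed)}")      # 4 -- so it isn't actually stable
```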