Guys, I don't know if this is the right place or not, but I'm confused about CPU/GPU utilization.
As I understand it, if you're getting suboptimal frame rates and your GPU utilization is at 99%, then your GPU can't keep up. If the CPU is at 99% utilization and GPU utilization is far lower, then the CPU can't feed the video card data fast enough in that game, and you have a CPU bottleneck.
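For reference, I'm looking at per-core numbers rather than the overall CPU percentage, since one maxed-out thread can bottleneck a game while total usage looks low. Something like this Python sketch is the kind of sampling I mean (psutil/pynvml and the 0.5 s interval are just illustrative choices, not what my monitoring overlay actually does):

```python
# Minimal CPU/GPU utilization logger -- a sketch, assuming the psutil and
# pynvml packages are installed (pip install psutil nvidia-ml-py).
# The 0.5 s polling interval is arbitrary.
import psutil
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first NVIDIA GPU

try:
    while True:
        # Per-logical-core busy percentages over the sampling window.
        cores = psutil.cpu_percent(interval=0.5, percpu=True)
        # GPU core load as reported by NVML (same source nvidia-smi reads).
        gpu = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
        print(f"busiest core: {max(cores):5.1f}%  all: {cores}  GPU: {gpu}%")
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```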
What the hell is this then:
Neither my CPU nor GPU is maxed out, yet I'm only at 48 FPS in that shot. Highest CPU core is at 83.9%. GPU at 69%.
That is Witcher 2 BTW, at the inn in Vergen.
I'm on a Phenom II X4 965 @ 3.8 GHz, 8GB RAM, and an EVGA Nvidia GTX 960 4GB with the latest drivers. Running in a borderless fullscreen window with vsync off, Ultra settings (no Ubersampling).