Bavarian_Sloth
Member
As far as I know, both methods mentioned above always alternate a frame rendered natively by the GPU with an "artificially" generated one.
At 90 FPS that would mean a 50/50 split: 45 rendered and 45 generated frames per second.
In a number of games, however, I get for example 58 FPS native. After enabling FG, the frame rate only rises to 90 FPS rather than the roughly 116 a straight 1:1 doubling would suggest.
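To put numbers on it, here's a quick back-of-the-envelope sketch (purely my own assumptions about how a strict 1:1 alternation would behave, nothing official from NVIDIA):

```python
# Back-of-the-envelope math for 2x frame generation (my own assumptions, nothing official).

def expected_with_fg(native_fps: float) -> float:
    """Naive expectation: every natively rendered frame gets one generated frame -> 2x."""
    return 2 * native_fps

def implied_native_after_fg(observed_fps: float) -> float:
    """If the output strictly alternates rendered and generated frames,
    half of the observed frames must still be rendered natively."""
    return observed_fps / 2

native_before_fg = 58    # FPS measured with FG off
observed_with_fg = 90    # FPS measured with FG on

print(expected_with_fg(native_before_fg))        # 116  -> what a pure 1:1 doubling would give
print(implied_native_after_fg(observed_with_fg)) # 45.0 -> native rate implied by the 90 FPS I actually see
```

If the output really is a strict alternation, the 90 FPS I see would imply the native rate drops from 58 to 45 once FG is enabled, which is exactly the part I don't understand.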
So I'm wondering what data the game or the NVIDIA software uses to decide at what intervals the artificial intermediate frames are inserted.
Any NeoGAF engineers around?
Thanks