
Frame Generation/FSR 3 - Technical question

Bavarian_Sloth

Gold Member
As far as I understand, both methods mentioned above always alternate a frame natively rendered by the GPU with an "artificially" generated one.
At 90 FPS this would mean an even 50/50 split: 45 rendered and 45 generated frames per second.

Now in a number of games I get, for example, 58 FPS "native", but after enabling FG the frame rate increases to "only" 90 FPS rather than the 116 that doubling would suggest.
I'm wondering what data the game or the NVIDIA software uses to decide at what intervals the artificial intermediate frames are inserted.
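
To make the arithmetic concrete, here is a minimal sketch of the 1:1 alternation as I understand it (a naive model; the function name and numbers are mine, just for illustration):

```python
# Rough model of 1:1 frame generation: every natively rendered frame is
# followed by one interpolated frame, so output FPS = 2 x rendered FPS.
def fg_output_fps(rendered_fps: float) -> float:
    return 2.0 * rendered_fps

print(fg_output_fps(45))  # 90.0 -> 45 rendered + 45 generated per second
print(fg_output_fps(58))  # 116.0 -> yet I only see 90, hence my question
```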

NeoGAF engineers?

Thanks
 

winjer

Gold Member
Bavarian_Sloth said:

With Frame Generation, frames are alternated: one real, one generated, then one real again, and so on.
A good way to set up a game is to cap the frame rate at half of your monitor's refresh rate, so that FG's generated frames fill in the other half.
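
As a quick illustration of that cap (a toy sketch assuming strict 1:1 alternation; the refresh rates are just examples):

```python
# Cap the game at half the display's refresh rate so that FG's generated
# frames fill the remaining slots and output matches the display exactly.
def fg_frame_cap(refresh_hz: int) -> int:
    return refresh_hz // 2

for hz in (120, 144, 240):
    cap = fg_frame_cap(hz)
    print(f"{hz} Hz display -> cap at {cap} FPS -> FG outputs {cap * 2} FPS")
```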
 

Bojji

Member
Bavarian_Sloth said:

I think you need to add in the cost of frame generation itself. In theory, a game running at 58 FPS natively has to drop to 45 FPS of real frames because the FG calculations cost that much GPU time; 45 real plus 45 generated gives your 90. That's why a full 100% FPS increase is only possible in CPU-limited scenarios, where the GPU has spare headroom to pay for FG. A back-of-the-envelope estimate of that cost is sketched below.

FSR3 seems to have a lower cost and generates more "FPS". So why did Nvidia need that exclusive hardware in Ada Lovelace for DLSS3? LOL, such bullshit.
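
Here's a back-of-the-envelope way to see it, using the OP's numbers (a naive sketch that attributes the entire drop in real frame rate to FG overhead):

```python
# Naive estimate: with 1:1 alternation, real FPS after FG = output FPS / 2.
# The difference in frame time vs. native is attributed to FG overhead.
def estimate_fg_cost_ms(native_fps: float, output_fps: float) -> float:
    real_fps = output_fps / 2.0      # real frames behind the FG output
    native_ms = 1000.0 / native_fps  # frame time without FG
    real_ms = 1000.0 / real_fps      # frame time of real frames with FG
    return real_ms - native_ms       # extra time per frame spent on FG

# The OP's case: 58 FPS native -> 90 FPS with FG enabled
print(f"{estimate_fg_cost_ms(58, 90):.1f} ms")  # ~5.0 ms of FG cost per frame
```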
 

West Texas CEO

GAF's Nicest Lunch Thief and Nosiest Dildo Archeologist
(reaction GIF)
 

mhirano

Member
Bojji said:
Nvidia's dedicated hardware allows a cleaner image with far fewer artifacts.
In the same way, Lossless Scaling's frame generation shows even more artifacts than FSR3, because the app accesses the pipeline much later than the driver-level approaches do.
 