I admit it, I was wrong about framegen

I put together a PC two years ago and did a lot of research beforehand. So many people flat out said don't use ray tracing, it's not worth the performance hit. I ended up with a 4070 Ti, and I'm able to run Cyberpunk at 1440p at around 140 fps with all the settings completely maxed out, with DLSS and frame gen.

I won't be buying a 50 series, but 4x frame gen sounds like an awesome feature for people with OLED monitors.
 
Eh, sounds like what you want isn't frame gen but something like this.
Tried it, looks good but it's too picky about monitors. My 240Hz monitor is not fully compatible because... 8-bit something, I don't remember. So even though I do get the clear CRT-like motion, I also get some weird stripes.

Disappointed Kevin Sorbo GIF
 
I was convinced by it after testing it in the Forspoken demo, but then Doom TDA hit with FSR FG and I went back to not believing again... Hell, even AFMF looks WAY better; they fucked something up in the implementation.
 
Tried it, looks good but it's too picky about monitors. My 240Hz monitor is not fully compatible because... 8-bit something, I don't remember. So even though I do get the clear CRT-like motion, I also get some weird stripes.

Yeah, but in the near future probably every monitor will be able to use it fine, with even higher Hz/HDR/color depth/whatever trickling down as standard even to low-spec/budget monitors.
 
2x is fine as long as it's used to get higher framerates when you already have well above 80. However, it's now used as a crutch to boost "performance" in poorly optimized UE5 trash, and 3x and 4x introduce severe artifacting as well as noticeable delay. Besides, realistically no game should ever require that much frame gen.
 
Only tried it in Cyberpunk 2077 path tracing since I got the 5070 Ti, and it's incredible, yes.

I'm thinking of upgrading my monitor to truly get the 3x & 4x benefit, so I would need a 240 Hz monitor at the least.
 
I think it needs 2 more major driver iterations to prove haterz wrong big time

I tried MFG in Portal RTX with my wireless high-sensitivity Razer mouse; I can notice the added latency

Wired > 2000hz wireless > mfg

I also noticed the macroblocking around the gun, but that's minor enough to overlook
 
Only tried it in Cyberpunk 2077 path tracing since I got the 5070 Ti, and it's incredible, yes.

I'm thinking of upgrading my monitor to truly get the 3x & 4x benefit, so I would need a 240 Hz monitor at the least.
I just played through Doom TDA at 200+ fps maxed out using 2x and it was godlike.
 
I've been saying this for a year now.

When done right (Spider-Man 2 is a great example), it is absolutely a game changer. You do need a high refresh rate monitor to take full advantage, though.

This tech should already be in every modern game on console; it's absurd it hasn't been implemented.
 
Motion resolution is extremely underrated. I have had about 9+ plasmas; now I have downsized, unfortunately. I use an LCD with good strobing (TCL mini-LEDs with interpolation and BFI look better in motion than any OLED with BFI; TCL supports 120Hz BFI on their mini-LEDs, looks mint, it shits on my old Kuro and VT60). But I made a cool little discovery when I picked up a VR headset a few days back.

Lately I have been using an Oculus Quest 2 VR headset with Virtual Desktop to run normal games. It's so goddamn good, almost CRT levels of motion clarity. You just have to match at least 90Hz or 120Hz for it to work.

But the UFO test at Blur Busters is crystal clear at 960 pixels per second with the Quest 2 headset.
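For anyone curious why a sample-and-hold monitor smears at that scroll speed while a low-persistence headset doesn't, here's a rough back-of-the-envelope sketch (the 0.3 ms pulse width is an assumed, illustrative figure, not an actual Quest 2 spec):

```python
# Perceived sample-and-hold blur: blur (px) ≈ scroll speed (px/s) × persistence (s).
def motion_blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    return speed_px_per_s * persistence_ms / 1000

speed = 960.0  # Blur Busters UFO test scroll speed, px/s

print(motion_blur_px(speed, 1000 / 120))  # full-persistence 120Hz hold: ~8 px of smear
print(motion_blur_px(speed, 0.3))         # assumed 0.3 ms low-persistence pulse: ~0.3 px
```

That ~8 px versus sub-pixel difference is the whole reason strobing, BFI and low-persistence VR displays look so much clearer in motion at the same refresh rate.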

So now I am totally convinced this is the way to go for me. I just ordered a 5070 Ti today to upgrade my rig; can't wait to explore more VR and its motion clarity, and to try out frame gen with a proper Nvidia card.
 
I used to be firmly on the "fake frames" side of the aisle, but I've recently changed my mind, with some caveats.

Frame gen still won't do anything about input latency. Its decline with increasing framerate is the main reason why higher framerates feel so good. A 30fps game multi frame gen'd up to 120+fps is still going to respond like a 30fps game.

In more basic terms you could say that if the game felt like shit already, it's still going to feel like shit with frame gen.
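That base-versus-displayed distinction is easy to put in numbers; a minimal sketch (the 30fps base is just an illustrative figure):

```python
# Sketch: multi frame gen multiplies displayed frames, but input is still
# sampled once per *rendered* frame, so responsiveness tracks the base rate.
def framegen(base_fps: float, factor: int) -> tuple[float, float]:
    displayed_fps = base_fps * factor
    input_interval_ms = 1000 / base_fps  # unchanged by generated frames
    return displayed_fps, input_interval_ms

for factor in (1, 2, 4):
    fps, ms = framegen(30, factor)
    print(f"{factor}x: {fps:.0f} fps shown, ~{ms:.1f} ms between input samples")
```

At 4x the counter reads 120 fps, but the game still only reacts to you every ~33 ms, exactly as at native 30.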

However, it does prompt a bit of thinking about what framerates feel "good enough". For many people nowadays, 40fps appears to be the floor. It's deemed a golden number for good reasons, as despite being only 10fps more than 30fps, its frametime sits between 30 and 60fps:

[Image: frametime chart comparing 30, 40 and 60 fps]


To me it seems like 40fps is the perfect compromise. In terms of latency, it's low enough to not feel bad for the majority of modern gamers like 30fps does. The only problem remaining is that it can still look juddery.
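The frametime arithmetic behind that "between 30 and 60" claim, as a quick check:

```python
# Frametime (ms per frame) is what you actually perceive, and it isn't
# linear in fps: 40 fps sits exactly midway between 30 and 60 in frametime.
def frametime_ms(fps: float) -> float:
    return 1000 / fps

print(frametime_ms(30))  # ~33.3 ms
print(frametime_ms(40))  # 25.0 ms
print(frametime_ms(60))  # ~16.7 ms
print((frametime_ms(30) + frametime_ms(60)) / 2)  # 25.0 ms, the midpoint
```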

Even if frame gen does nothing for latency, it works very well at improving motion smoothness. Combining that smoothness with the "good enough" 40fps works really well at tricking your brain into thinking the framerate is higher than it is, because the game has now passed a sort of threshold of acceptability. Nobody is going to turn around and say it's unplayable like I often hear people say about 30fps and nobody is going to complain about judder on an OLED TV.

This post seems like it's extolling 40fps more than frame gen, but I really think the tech only works with framerates above that number. In games where I relied on FG heavily, my brain stopped being tricked when the input latency got too high. There's a weird disconnect between the game starting to feel like soup and seeing a 60fps read-out on screen (if it's 2x FG, it's likely operating from a 30fps base at that point).

The other thing about 40fps + frame gen is that it makes things like path tracing completely viable on mid-range hardware. You don't need a 5090 to get path tracing in F1 25 or Indiana Jones, and it isn't going to be 19fps.

As long as an appropriate DLSS preset is chosen to hit around 40fps as a base, my 4070 Super performs more than good enough for path tracing:

[Image: path tracing performance screenshot on a 4070 Super]
 
Playing Doom right now and getting a 'free' 240fps. At the current quality of the DLSS components I cannot see any artifacts (granted this is a pretty dark game).
 
Tried it, looks good but it's too picky about monitors. My 240Hz monitor is not fully compatible because... 8-bit something, I don't remember. So even though I do get the clear CRT-like motion, I also get some weird stripes.

Did you get 3-4 wandering beams slowly moving up across the picture? If that's the case... It's a feature! I just learned you can turn it off in Shader Parameters, set Scan Direction to Zero. Combined Geom-Deluxe & BFI and it looks great! The fast scrolling text in Dynamite Headdy is readable.

OLED would be better of course, but I'll take it.
 
Did you get 3-4 wandering beams slowly moving up across the picture? If that's the case... It's a feature!
It was more like one or two stripes that completely changed the colors, meaning it looked too wrong to be something intended. I also remember posting about it, and the author said my monitor does this because it's 8-bit. It's been a while now though, and I have given up, but maybe I could try an updated version of this.
 
Tried it on two games, the input lag was crazy. I don't get it.
I did have it capped at 60fps though, maybe that's it.
 
Tried it on two games, the input lag was crazy. I don't get it.
I did have it capped at 60fps though, maybe that's it.

Frame gen should only be used if you have a base FPS of 50 or more. It's for boosting 50+ fps up to 100+/150+/200+ fps, not for boosting 30-40 fps up to 60fps.


Latency is extremely minimal when you use it the right way.
 
Frame gen should only be used if you have a base FPS of 50 or more. It's for boosting 50+ fps up to 100+/150+/200+ fps, not for boosting 30-40 fps up to 60fps.


Latency is extremely minimal when you use it the right way.
I was hitting 60fps without it, but turned it on and it just felt shit. Maybe it'll feel better uncapped.
Edit: It's not terrible with unlocked fps. Tried with Oblivion.
 
I have only tried it on one game while playing Rune Factory: Guardians of Azuma on the Steam Deck docked to a TV. I can turn frame gen on and leave mesh quality at high. Or I can turn frame gen off and lower mesh quality. I prefer to leave it on because the pop in bothers me a lot more than anything I have noticed with frame gen on. I am not a big proponent of frame generation but I am no longer a hater either. Like most things, it probably has its uses if not overdone or used for the wrong reasons.
 
It's fiercely underrated. It's fantastic in most games I've used it in, like Black Myth, Wuchang and Shadows of course, all third person. While it doesn't give you lower input delay, it definitely makes the game look smoother, which has value in itself.
 
Frame gen and motion smoothing are another of Nvidia's gifts to gamers.

As long as you can hit a >60fps base, turn that shit on. The extra visual smoothness of your mouse cursor or in-game camera is very noticeable.
 
I was hitting 60fps without it, but turned it on and it just felt shit. Maybe it'll feel better uncapped.
Edit: It's not terrible with unlocked fps. Tried with Oblivion.
It's generally the opposite, in fact. For FSR FG especially, although with DLSS it can also be the case: the added input lag will be lower if you cap the FPS below full GPU utilization. It's what Reflex does automatically and dynamically for you.
It's also highly implementation specific. Some games run well with FG even at a 40 fps baseline (these tend to be path traced games), while some require 60+ for the added latency to be less noticeable.
Generally though, I've yet to see a game where not using DLSS FG to get from 60-70 to 120-140 would be better for game feel than using it. FSR FG is way more finicky, and with that you may well be better off not using it more often than not.
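The cap-below-saturation point can be illustrated with a toy render-queue model (the queue depths here are assumed purely for illustration; real pipelines vary by game and driver):

```python
# Toy model: when the GPU is saturated, submitted frames queue up, and each
# queued frame adds roughly one extra frametime before your input hits the screen.
def latency_ms(fps: float, queued_frames: int) -> float:
    return (1 + queued_frames) * 1000 / fps

print(latency_ms(60, 2))  # uncapped, GPU-bound with 2 queued frames: 50.0 ms
print(latency_ms(57, 0))  # capped just below saturation, empty queue: ~17.5 ms
```

Losing a few fps to the cap buys back far more latency than it costs, which is why an fps cap (or Reflex doing it for you) can make FG feel dramatically better.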
 
It's cool to mess about with for emulation and games stuck at 60, but honestly anything below 60 sucks with frame gen; I'd even go as far as to say anything under 80-90fps sucks with it too. I am sensitive to input lag, though. Plus the artifacting isn't great.

I hardly use it these days. Properly frame-paced 30fps with decent motion blur is much better than interpolated frames to 60 or higher, imo.
 