I admit it, I was wrong about framegen

marjo

Member
Ever since it was introduced, I always thought frame gen was a pointless feature. You needed a relatively high base frame rate to even turn it on, and when you did, it lowered your responsiveness and could introduce artifacts, all in exchange for additional 'smoothness'. That seemed like a bad trade, given the relatively high base frame rate needed to begin with.

Well, I was wrong. Let me preface this by stating that I'm no Nvidia fanboy. I think they're a shitty company. Their deceptive marketing practices, and their attempts to strong-arm reviewers into treating generated frames as equivalent to real frames, are obviously bullshit. They're not the same, and should never be treated as such.

Still, framegen, even multi-framegen, has legitimate uses and can improve the overall experience. What changed my mind was the realization that improved motion resolution, not some vague notion of 'smoothness', is the main benefit. As most of you know, on modern sample-and-hold displays, objects in motion look like shit at low refresh rates.

To get the benefit, you need both a display capable of a high refresh rate and a high frame rate to feed it. The first is easily handled by getting the right display; framegen makes the second easier to reach.
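The motion-resolution point can be put in rough numbers. On a sample-and-hold display, when your eye tracks a moving object, each static frame gets smeared across roughly (on-screen speed × frame time) pixels. A minimal sketch of that persistence model (the pan speed is my own illustrative number, not a measurement):

```python
# Simplified sample-and-hold persistence model: while the eye tracks a
# moving object, each held frame is smeared across the retina by
# roughly (on-screen speed * frame time) pixels.
def blur_width_px(speed_px_per_s: float, fps: float) -> float:
    return speed_px_per_s / fps

# A pan that crosses a 1920 px wide screen in one second:
for fps in (60, 90, 240):
    print(f"{fps:>3} fps: ~{blur_width_px(1920, fps):.0f} px of smear")
```

Quadrupling the presented frame rate cuts that smear to a quarter, which is why the jump from 60 to 240 is so visible in motion even though the rendered detail is unchanged.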

In my testing, I've found that a base frame rate of 90ish fps with 4x framegen enabled lets me hit 240fps with virtually no visible additional artifacts. With my 5070 Ti and 4K display, this is often possible if I use DLSS super resolution in performance mode. What I've found is that I get about the same latency playing at DLSS quality with no framegen as at DLSS performance with 4x framegen. The latter has a slightly softer image, but the increase in motion clarity more than makes up for it.
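A toy model of why those two configurations can end up with similar latency: interpolation-based framegen has to hold back one rendered frame before it can generate the in-between ones, adding roughly one base frame time, while a faster base render shrinks every frame time. The frame rates below are illustrative assumptions, not measurements:

```python
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

def approx_latency_ms(base_fps: float, framegen: bool) -> float:
    # Interpolation-based framegen buffers one real frame before output,
    # adding ~one base frame time of delay on top of the render itself.
    t = frame_time_ms(base_fps)
    return t + (t if framegen else 0.0)

# Suppose DLSS Quality renders 60 fps and DLSS Performance renders
# 120 fps in the same scene (illustrative numbers):
print(approx_latency_ms(60, framegen=False))   # ~16.7 ms
print(approx_latency_ms(120, framegen=True))   # ~8.3 + ~8.3 = ~16.7 ms
```

In this sketch the faster base render buys back the frame of buffering, while the 4x output rate delivers the motion-clarity win.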

To summarize, frame gen is actually useful if you have a high refresh display and are able/willing to adjust settings to get a high base frame rate.
 
If you're sensitive to sample-and-hold motion blur, you can improve things even further by getting a monitor with backlight strobing.
I've tried BFI, but felt the compromises (flicker, lower brightness, no VRR) were not worth it. I feel like the compromises with framegen are, at least for now, significantly lower.
 

Pathetic LCDs my CRT remains supreme
 
I have to say, I dislike framegen but I ended up turning it on for Doom and when the game is running at 120hz+, it looks amazing on my 4K/240hz screen.

I always turn it off when I'm only getting around 60 or 70, since I can't stand the artifacts at that level. At higher framerates I'm only getting extra "smoothness", and it looks great to me.
 
I've tried BFI, but felt the compromises (flicker, lower brightness, no VRR) were not worth it. I feel like the compromises with framegen are, at least for now, significantly lower.

It depends on what monitor you were using. Quality varies a lot between brands. There's also a difference between BFI, which is mostly an OLED solution, and strobing, which is an LCD solution. The newer strobing monitors are bright enough, but you do miss out on the benefits of OLED. Flicker can be an issue at 60fps, but if you're running 120fps+, it's not so bad. VRR with strobing still isn't possible on any monitor yet, but the G-Sync Pulsar-enabled monitors coming out in Q3 will let you do both at the same time.
 
While the added input lag sucks and is something you need to get used to, it allowed me to play Cyberpunk 2077 at 4K with path tracing, which is an experience I've yet to find anywhere else since. So I'm grateful for it.
 
I played all the way through Avowed with framegen enabled. RTX 4070 with DLSS and framegen, and it did not have any weird artifacts, lag, or issues.
BUT you have to play using a controller. Mouse and keyboard will get you much different (worse) results.
 
I find the latency is a bit hit and miss but I can essentially tweak my system to mitigate the effects and the overall better option is to enable FG.
 
Yeah it is effective. It can be difficult to get that base framerate though. With my 5070 Ti at 1440p ultrawide, I'm not getting 90 frames in Alan Wake 2 unless I enable DLSS. Add framegen and that's two layers of potential artifacting/softening/latency. It does feel like a compromise, but it isn't the hoodwink people are making it out to be.
 
The added latency was always overblown.
Has DF ever done a latency test?

I always use it in the living room when I can't get a stable or high enough framerate. I have no G-Sync/VRR there. In some games I can see some UI flicker from the generated frames, but for the most part it does the job extremely well.
 
The problem is that it's useless when you need it the most and it works best when you don't really want it. Don't get me wrong, I've found situations where it works ok, but very often it creates more issues than it solves.
 
I really like G-Sync too on my new monitor. Too long have I lived without it.
Also I have a 165hz monitor which works great.
 
The problem with framegen has always been people using it at lower frame rates, giving themselves crap input latency, in addition to games having borked implementations.

If you can get to the mid-70s fps without it, then you can turn it on for better motion clarity, and input latency will feel close enough to a 60fps experience.

Once I get a 240hz display sometime in the future, I'll probably care more about it.
 
Yeah it is effective. It can be difficult to get that base framerate though. With my 5070 Ti at 1440p ultrawide, I'm not getting 90 frames in Alan Wake 2 unless I enable DLSS. Add framegen and that's two layers of potential artifacting/softening/latency. It does feel like a compromise, but it isn't the hoodwink people are making it out to be.

I find that DLSS performance + 4x frame gen gives me overall better image quality (in motion) than DLSS quality with no frame gen. The caveat is that this is on a 4k 240 Hz oled monitor.
 
I think Frame Gen highly depends on the Screen you are using.

When I use frame Gen on my 27" LG 1440p Gaming monitor, it looks like ass

When I put frame Gen on my 65" Samsung Oled- It works fantastic and looks great.
 
It depends on the base framerate. Sure, at 90fps base it won't be noticeable to many, but at 30/40fps base (like on consoles) it's terrible.
Yeah, pretty much. I don't recommend using it as a crutch to get to 60, but if you're at 80 and want 120, use it.

It's also straight-up not usable in fighters or competitive shooters, but those run on coffee machines anyway.
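That rule of thumb (use framegen to go from an already comfortable base rate up to your display's ceiling, never as a crutch to reach 60) can be sketched as a tiny helper. Everything here, including the 80 fps floor and the cap at the refresh rate, is just my reading of the advice in this thread, not any vendor's guidance:

```python
def pick_fg_multiplier(base_fps: float, refresh_hz: float,
                       min_base: float = 80, max_mult: int = 4) -> int:
    """Return a framegen multiplier, or 1 (off) if the base rate is too low."""
    if base_fps < min_base:
        return 1  # a weak base just amplifies artifacts and latency
    # Largest multiplier that still fits under the display's refresh rate:
    return max(1, min(max_mult, int(refresh_hz // base_fps)))

print(pick_fg_multiplier(80, 240))   # 3x: 80 fps -> 240 fps
print(pick_fg_multiplier(120, 240))  # 2x: 120 fps -> 240 fps
print(pick_fg_multiplier(40, 240))   # 1 (off): base too low
```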
 
It's also straight-up not usable in fighters or competitive shooters, but those run on coffee machines anyway.
Yeah, a lot of people in the competitive shooter scene play at 1080p and the highest fps they can get at that resolution. Using framegen would be like tying a hand behind your back.

I think, like with most things in tech, there's an expectation that it be perfect from first implementation and if it's not, it's automatically seen as bad forever. Something like this and any of the other ML based methods of improving image quality and performance will always improve with time. We just need to be patient.
 
It depends on the base framerate. Sure, at 90fps base it won't be noticeable to many, but at 30/40fps base (like on consoles) it's terrible.
Absolutely. I don't look at framegen as a way to get improved performance. In some ways framegen is the opposite of DLSS super resolution.

With super resolution, you're trading in image quality to get better performance.

With framegen, you're trading performance (in terms of latency), to get better image quality (in terms of motion resolution).

I find using both in tandem gives the best results (assuming you have a high refresh screen and can get high enough performance without framegen).
 
Has anyone tried using AFMF 2.1 or Smooth Motion in fighting games, then forcing Reflex on with Special K? I wonder if that would mitigate input lag enough to enjoy 120fps fighting games; I'm surprised there's no coverage of this. I tried LSFG myself, and the fluidity of the animation at 120fps is unbelievable, especially in Guilty Gear Strive; its 24fps animation becomes super smooth and the input lag is not too bad, but I'm a casual player. I'd like to see some real numbers from input lag measurements.
 
It's obviously not black and white. Having one frame generated per legit frame will usually look pretty much the same, but generating ten per real frame, not so much. The problem is that they now advertise the amount of fake frames. Next gen they'll talk about 20x more frames and how the 6070 is 5x better than the 5090. But fuck that.
 
There is no way that you don't get artifacts on dynamic light sources no matter how high your base frame rate is.
This is unfortunately true.
When Oculus rolled out their version of this in late 2016, it worked great - actually amplifying the framerate and lowering latency (the benefit of frame-gen there being tied to ultra-low-latency sensor inputs - it was extrapolating frames instead of interpolating) - BUT it completely fell apart if you had dynamic lighting in play. In the game I worked on, night scenes would have you use a flashlight that just looked bizarrely broken with generated frames.
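The interpolation-vs-extrapolation distinction is easy to show in one dimension. A hedged toy sketch (not Oculus's actual algorithm): interpolation must wait for the next real frame, so it adds delay but stays correct, while extrapolation predicts forward for free and guesses wrong the moment motion changes:

```python
def interpolate(prev_pos: float, next_pos: float) -> float:
    # Needs the *next* real frame, so the pipeline must delay output.
    return (prev_pos + next_pos) / 2

def extrapolate(prev_pos: float, velocity: float, dt: float = 0.5) -> float:
    # Predicts forward from past motion; no waiting, but no safety net.
    return prev_pos + velocity * dt

# Steady motion (position 10 -> 20 over a frame): both agree on the midpoint.
assert interpolate(10, 20) == extrapolate(10, velocity=10) == 15
# Motion reverses (next real position is 5, like a flickering light's edge):
print(interpolate(10, 5))            # 7.5  -- right, but cost a frame of delay
print(extrapolate(10, velocity=10))  # 15.0 -- overshoots in the old direction
```

Rapidly changing content such as a swinging flashlight is exactly the case where the forward prediction breaks down.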
 
Seems like a good enough route if you really want to use your high refresh monitor and don't have $1000 for a gpu.

For my part, getting visual frames when the game logic hasn't actually responded is... pointless. Fine if you think otherwise, but the two go hand-in-hand for me.
 
Eh, sounds like what you want isn't frame gen but something like this.
 
Bought a key for Hogwarts cheap on CDKeys, I have other games to play, but I played the tutorial bit to see how it runs.
1440/120 Ultra settings, RT and 2x DLSS framegen. The image is immaculate at 240fps with incredible flow and spits in the face of anyone whinging about native.

Then a cutscene video plays...

ew-disgust.gif
 