
Frame Gen will get updated on 40 and 50 series with a new AI model that is faster and uses less VRAM (~10% performance increase on 40 series)

LectureMaster

Gold Member


 
I decided to try DLSS FG on Ghost of Tsushima.

90FPS without FG feels smoother than 120FPS with FG.

Using the NVIDIA LDAT, it appears FG adds 10 msec more input lag than normal rasterisation. Furthermore, I noticed an increase in motion artefacts.

The tech sucks if you are very sensitive to input lag and care about the best possible image clarity. I hope this is not the future of GPUs.
 
Last edited:
LectureMaster said:
I decided to try DLSS FG on Ghost of Tsushima.

90FPS without FG feels smoother than 120FPS with FG.

Using the NVIDIA LDAT, it appears FG adds 10 msec more input lag than normal rasterisation. Furthermore, I noticed an increase in motion artefacts.

The tech sucks if you are very sensitive to input lag and care about the best possible image clarity. I hope this is not the future of GPUs.

If GoT doesn't have Reflex then that's the reason why it feels less smooth. Most games should have it on by default when FG is enabled. It also makes no sense that your game would feel less smooth with FG on when you can already reach 90 FPS without it. The added lag is only really noticeable if your non-FG framerate is very low, not 90.
 

Gaiff

SBI’s Resident Gaslighter
LectureMaster said:
I decided to try DLSS FG on Ghost of Tsushima.

90FPS without FG feels smoother than 120FPS with FG.

Using the NVIDIA LDAT, it appears FG adds 10 msec more input lag than normal rasterisation. Furthermore, I noticed an increase in motion artefacts.

The tech sucks if you are very sensitive to input lag and care about the best possible image clarity. I hope this is not the future of GPUs.
You notice 10ms in a game like GOT. Really?
 
If GoT doesn't have Reflex then that's the reason why it feels less smooth. Most games should have it on by default when FG is enabled. It also makes no sense that your game would feel less smooth with FG on when you can already reach 90 FPS without it. The added lag is only really noticeable if your non-FG framerate is very low, not 90.
This is with Reflex On + Boost.

Without FG and Reflex off, the latency is still 10-13 msec lower than with the above combination.

Gaiff said:
You notice 10ms in a game like GOT. Really?
Yes. There is a delay in movement when panning the camera with frame gen. It simply does not feel as smooth as with FG disabled.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
LectureMaster said:
This is with Reflex On + Boost.

Without FG and Reflex off, the latency is still 10-13 msec lower than with the above combination.


LectureMaster said:
Yes. There is a delay in movement when panning the camera with frame gen. It simply does not feel as smooth as with FG disabled.
It has to be more than 10ms. Nobody would ever notice such a short amount of time in a game of that speed.
 
You must have superhuman senses to notice a difference of milliseconds between those two.
Subjectively, playing the game without FG feels smoother and more responsive. Of course, I also recognise bias may play a role here. It would be better if this were a blinded test and I did not know which run had FG on and which had it off.

There's no arguing that FG adds latency, though.
 
Last edited:

Xcell Miguel

Gold Member
LectureMaster said:
I decided to retest. Added latency seems to be approx 12 msec.

4K - no DLSS
NhhVBZw.jpeg

4K DLSS Frame Gen + Reflex Boost
FAo29hC.jpeg
Is your monitor/TV limited to 120 Hz?
If so, it may be normal that you feel an input delay. The 116 FPS in your FG screenshot looks like a VSync limit with VRR.

You already had 90 FPS without FG. If you enable FG but are limited to your screen's refresh rate, the game has to render fewer real frames so that generated and rendered frames can alternate up to the max refresh rate, causing a higher delay between real frames, and thus more input delay.
Try without VSync, or raise some details or the resolution to get a better ratio between FG and no FG.

90 real frames will have less input delay than 120 with FG, but then compare 60-70 real frames to 120 with FG.
When you already have 80+ fps you should only use FG if it won't reach your monitor's refresh limit, or you'll get a bit more input delay caused by fewer rendered frames, as the game has to alternate them to avoid frametime spikes.
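Rough back-of-the-envelope of that cap effect, as a minimal Python sketch. The assumptions are mine, not anything NVIDIA documents: 2x frame generation (one generated frame per real frame) and a hard clamp of total output to the refresh rate.

```python
# Illustrative sketch of the refresh-cap effect described above.
# Assumes 2x frame generation and that VSync/VRR clamps the total
# output rate to the display's refresh rate. Numbers are illustrative.

def real_fps_with_fg(uncapped_real_fps: float, refresh_hz: float) -> float:
    """Real (rendered) fps once 2x FG output is clamped to the refresh rate."""
    fg_output = uncapped_real_fps * 2           # 2x FG doubles the output rate
    capped_output = min(fg_output, refresh_hz)  # display cap clamps total output
    return capped_output / 2                    # half the output frames are real

for base in (60, 70, 90):
    real = real_fps_with_fg(base, 120)
    print(f"{base} fps base -> {real:.0f} real fps under a 120 Hz cap "
          f"({1000 / base:.1f} ms -> {1000 / real:.1f} ms between real frames)")
```

On these assumptions, a 90 fps base rate gets pushed down to 60 real fps once 2x FG hits the 120 Hz cap, which is exactly the extra real-frame delay described above.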
 
Xcell Miguel said:
Is your monitor/TV limited to 120 Hz?
If so, it may be normal that you feel an input delay. The 116 FPS in your FG screenshot looks like a VSync limit with VRR.

You already had 90 FPS without FG. If you enable FG but are limited to your screen's refresh rate, the game has to render fewer real frames so that generated and rendered frames can alternate up to the max refresh rate, causing a higher delay between real frames, and thus more input delay.
Try without VSync, or raise some details or the resolution to get a better ratio between FG and no FG.

90 real frames will have less input delay than 120 with FG, but then compare 60-70 real frames to 120 with FG.
When you already have 80+ fps you should only use FG if it won't reach your monitor's refresh limit, or you'll get a bit more input delay caused by fewer rendered frames, as the game has to alternate them to avoid frametime spikes.
Yes - 120Hz G-Sync display.
 

Utherellus

Member
One game that certainly needs it is Indiana Jones. Even at 1440p native with DLAA and full path tracing my 4090 can't hit 100fps.
My thoughts exactly. That game is seriously VRAM hungry. I've never ever been worried about VRAM in games with my 4070; it was just never full.

In Indiana Jones, VRAM bottlenecked me even at 1080p. Bizarre experience.
 

TrebleShot

Member
Am I losing touch here? I have the money and the ability to get a 5090, but so far the only legit reason seems to be based around the physical size of the card.

Also, do we expect a notable uptick in perf on things like the new LG UW (5K)?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
TrebleShot said:
Am I losing touch here? I have the money and the ability to get a 5090, but so far the only legit reason seems to be based around the physical size of the card.

Also, do we expect a notable uptick in perf on things like the new LG UW (5K)?

It's the usual gen-on-gen uptick, ~30%.

vgLEZ1s.jpeg

Assuming you already have a 4080+ class card, you likely don't need to upgrade.
Follow the usual logic of skipping a generation so you can "feel" the upgrade.
Gen on gen is rarely worth it outside of bragging rights.
If you are building fresh, then obviously go for what your wallet allows.
 
Last edited:

TrebleShot

Member
Black_Stride said:
It's the usual gen-on-gen uptick, ~30%.

vgLEZ1s.jpeg

Assuming you already have a 4080+ class card, you likely don't need to upgrade.
Follow the usual logic of skipping a generation so you can "feel" the upgrade.
Gen on gen is rarely worth it outside of bragging rights.
If you are building fresh, then obviously go for what your wallet allows.
Yeah, I'm on a 4090 but was considering it for the new screens. I usually like maxing out fps on my displays. For instance, with a 120Hz LG C2 as a monitor, 4K 120Hz is possible in most games with some upscaling.

The new monitors are 165Hz 5K, so I imagine it will be difficult to do that in new games, but maybe with frame gen?
 

Thebonehead

Gold Member
LectureMaster said:
I decided to retest. Added latency seems to be approx 12 msec.

Mouse and keyboard or controller?

I remember the controller felt gloopy compared to M+KB the last time I tried it. It definitely felt laggy via controller.

A bit like the reverse of TLOU1 on PC when it came out. Controller was ultra smooth but M+KB was a stuttery mess.
 
Thebonehead said:
Mouse and keyboard or controller?

I remember the controller felt gloopy compared to M+KB the last time I tried it. It definitely felt laggy via controller.

A bit like the reverse of TLOU1 on PC when it came out. Controller was ultra smooth but M+KB was a stuttery mess.
Controller. I agree; the M/KB combo feels much more responsive than a controller overall. Too bad I hate using M/KB for games.
 
Last edited:
Not even remotely close to the same thing. ADDING input lag versus going without is a worsening of what came before, whereas moving to texture-based geometry mapping improved image quality while costing significantly less performance than matching that quality with pure polygon-count increases.
 

DirtInUrEye

Member
I'd still rather just lock to 60 real frames than ever choose a pretend 120 with extra input lag. I just don't get the appeal.

People say, "well FG is really nice if you're coming from higher base fps". Like 80 or 90? Both of those feel much more pleasing natively than double phoney digits of it.

I say this as someone with a 4000 series card.
 
Last edited:

analog_future

Resident Crybaby
How much faster is it compared to this [embedded link]?

Safe to say it is significantly faster, with better image quality, and much lower latency.
 

TintoConCasera

I bought a sex doll, but I keep it inflated 100% of the time and use it like a regular wife
Now look at them console-bros, that's the way you do it
You play at 4K/60fps on the TV
That ain't workin', that's the way you do it
Performance for nothing and your frames for free

Now that ain't workin', that's the way you do it
Lemme tell ya, them fat guys ain't dumb
Maybe get an extra ms on your little finger
Maybe get an artifact on your hair

We got to install our own disc drives, vertical stands deliveries
We got to pay to play online, we got to wait these pro patches to come

See the little pcbro with the steam deck and the rgb lights
Yeah, buddy, that's his own chair
That little pcbro got his own 5090
That little pcbro, he's a millionaire

etc etc
 

proandrad

Member
LectureMaster said:
I decided to try DLSS FG on Ghost of Tsushima.

90FPS without FG feels smoother than 120FPS with FG.

Using the NVIDIA LDAT, it appears FG adds 10 msec more input lag than normal rasterisation. Furthermore, I noticed an increase in motion artefacts.

The tech sucks if you are very sensitive to input lag and care about the best possible image clarity. I hope this is not the future of GPUs.
Are you also seeing frame judder? I would assume you would if you are capping your frames. The generated frames can't be evenly distributed between the real frames any time your base framerate fluctuates. I think you have to uncap your framerate to see the ideal results. It also could be dropping your base frame rate to 60 and doubling it to match the 120 cap you have, which could explain the increased input lag.
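A toy pacing sketch of that judder point, under my own simplifying assumption that 2x FG presents the generated frame at the midpoint between two real frames (no actual FG implementation is modelled):

```python
# Toy frame-pacing sketch: with 2x FG, the generated frame is ideally shown
# halfway between two real frames, so any fluctuation in real frametimes
# becomes uneven frame-to-frame gaps (judder). Purely illustrative.

real_frametimes_ms = [11.1, 11.1, 16.7, 11.1, 11.1]  # one spike in the base rate

t = 0.0
present_times = []
for ft in real_frametimes_ms:
    present_times.append(t + ft / 2)  # generated frame at the interval midpoint
    present_times.append(t + ft)      # real frame at the end of the interval
    t += ft

gaps = [round(b - a, 2) for a, b in zip(present_times, present_times[1:])]
print("frame-to-frame gaps (ms):", gaps)
# Steady frametimes give even gaps; the single 16.7 ms spike widens two
# consecutive gaps, which reads as visible judder on screen.
```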
 
Last edited:

T-800

Neo Member
The latency was crazy when Darktide patched in some kind of frame gen and enabled it by default.
 

buenoblue

Member
If GoT doesn't have Reflex then that's the reason why it feels less smooth. Most games should have it on by default when FG is enabled. It also makes no sense that your game would feel less smooth with FG on when you can already reach 90 FPS without it. The added lag is only really noticeable if your non-FG framerate is very low, not 90.
Your real FPS is exactly half of what current frame gen gives you. So if you lock your fps to a 120Hz monitor, your real frame rate, and therefore your input lag, will be that of 60fps. That's why 90fps feels so much better.

Frame gen only really makes sense to me if you're struggling to hit 60fps or you have a really high refresh display. There is a cost to frame gen, and you don't get double the FPS you get without it.

For instance, I'm playing Hogwarts Legacy on a 4070 Ti with DLSS on 4K Quality, maxed out with no ray tracing. Without frame gen I get 65 to 70 FPS, but it can drop into the 50s. This doesn't seem that smooth even with G-Sync.

If I use frame gen I get an average of 105 FPS with drops to the 90s. This seems much, much smoother. But obviously my input lag is akin to playing at 45-53fps. Input lag doesn't feel that bad playing a single-player game with a controller, but it doesn't feel like 100fps lag at all.

This is on a 120Hz screen btw.
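A quick sanity check of those Hogwarts Legacy numbers, assuming (my assumption, not a measurement) that 2x FG means real fps ≈ displayed fps / 2, and ignoring FG's own pipeline overhead:

```python
# Sanity check of the numbers above. Assumes 2x FG (real fps = displayed / 2)
# and ignores FG's own pipeline overhead, so measured input lag would be a
# bit higher still. Illustrative only.

cases = {
    "no FG, average": 67.5,     # 65-70 real fps without frame gen
    "no FG, drops":   55.0,     # dips into the 50s
    "2x FG, average": 105 / 2,  # 105 fps displayed -> ~52.5 real fps
    "2x FG, drops":   90 / 2,   # 90 fps displayed -> 45 real fps
}

for label, real_fps in cases.items():
    print(f"{label:>15}: {real_fps:5.1f} real fps, ~{1000 / real_fps:.1f} ms per real frame")
```

Which lines up with the "akin to playing at 45-53fps" point: the displayed 105 fps looks smooth, but each real frame still arrives only every ~19 ms.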
 

llien

Banned
Haha I wanted to try this but Samsung game motion plus is disabled @120hz 😂😭
Where is your "challenge accepted" attitude, citizen?

You need a game that the card runs at below 30fps.
Then use this:

[embedded link]

in combination with DLSS FG.

And then let Samsung generate frames on top (this has been available on TVs for ages, by the way).

Results should be epic.
 

Zathalus

Member
Haha I wanted to try this but Samsung game motion plus is disabled @120hz 😂😭
It's only semi-useful for 30-60 fps interpolation. Don't expect miracles though: latency makes a large jump and you do pick up on some graphical issues. Just as you shouldn't expect your TV's built-in upscaling to match DLSS/XeSS/PSSR, you shouldn't expect its interpolation to replace frame generation.
 

buenoblue

Member
Where is your "challenge accepted" attitude, citizen?

Need a game that card runs at below 30fps.
Then use this:


in combination of DLSS FG.

And than let Samsung generate frames on top. (this was available on TVs for ages, by the way)

Results should be epic.
So for a laugh I ran Elden Ring at 30fps, then used Samsung Game Motion Plus, then added DLSS frame gen on top... then ran Lossless Scaling 20x on top of that 😂
This is what I got:
yAVpfRf.gif
 