I decided to try DLSS FG on Ghost of Tsushima.
90FPS without FG feels smoother than 120FPS with FG.
Using the NVIDIA LAT, it appears FG adds ~10 msec more input lag than normal rasterisation. Furthermore, I noticed an increase in motion artefacts.
The tech sucks if you are very sensitive to input lag and care about the best possible image clarity. I hope this is not the future of GPUs.
For FG to reduce lag it needs to go "into the future", predicting frames.

You notice 10ms in a game like GOT. Really?
This is with Reflex On + Boost.

If GoT doesn't have Reflex then that's the reason why it feels less smooth. Most games should have it on by default when FG is enabled. It also makes no sense why your game would feel less smooth with FG on when you can reach 90 FPS by default without it. It's only incredibly noticeable if your non-FG framerate is very low, not 90.
Yes. There is a delay in movement when panning the camera with framegen. It simply does not feel as smooth as FG disabled.
It has to be more than 10ms. Nobody would ever notice such a short amount of time in a game of that speed.
Without FG and Reflex off, the latency is still lower by 10-13 msec than the above combination.
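A gap in that range is roughly what interpolation-based frame generation implies: to show an in-between frame, the newest real frame has to be held back for about one real-frame interval. A back-of-the-envelope sketch (my own toy model, not NVIDIA's actual pipeline):

```python
def interp_holdback_ms(real_fps: float) -> float:
    """Rough latency floor added by interpolation-based frame gen:
    the newest real frame is delayed about one real-frame interval
    so the interpolated frame can be shown first.
    Toy model only; real pipelines add or hide other terms."""
    return 1000.0 / real_fps

# With 2x FG on a 120 Hz cap, the real framerate is 60 fps:
print(round(interp_holdback_ms(60), 1))  # 16.7 (ms held back)
```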
I decided to retest. Added latency seems to be approx 12 msec.
4K - no DLSS
4K DLSS Framegen + Reflex Boost
On the one hand, it’s 50% more latency. On the other hand, you’re at 37ms, which is very, very low. GOT also isn’t especially fast.
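For what it's worth, the "50% more" figure and the ~12 ms retest number are consistent with each other (the base latency here is inferred from those two numbers, not read off the screenshots):

```python
total_ms = 37               # latency with FG + Reflex Boost, per the post above
added_ms = 12               # added latency from the retest
base_ms = total_ms - added_ms
print(added_ms / base_ms)   # 0.48 -> roughly "50% more latency"
```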
Subjectively, playing the game without FG feels smoother and more responsive. Of course I also recognise bias may play a role here. It would be better if this were a blinded test and I was not aware of which had FG on and off.

You must have superhuman senses to notice a difference of milliseconds between those two.
Is your monitor/TV limited to 120 Hz?
Yes - 120Hz G-Sync display.
If so, it may be normal that you feel an input delay. The 116 FPS in your FG screenshot looks like a VSync limitation with VRR.
You already had 90 FPS without FG. If you enable FG while limited to your screen's refresh rate, the game has to render fewer frames so that FG frames and rendered frames can alternate up to the max refresh rate, causing a higher delay between real frames, and thus more input delay.
Try without VSync, or raise some details or the resolution to get a better ratio between FG and no FG.
90 real frames will have less input delay than 120 with FG; instead, compare 60-70 real frames to 120 with FG.
When you already have 80+ fps, you should only use FG if it won't reach your monitor's refresh limit, or you'll get a bit more input delay caused by fewer rendered frames, as it has to alternate to avoid frametime spikes.
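The pacing described above can be sketched with a bit of arithmetic. A minimal toy model (my own simplification of the described behaviour, not the driver's actual logic):

```python
def real_fps_with_fg(native_fps: float, fg_factor: int, cap_hz: float) -> float:
    """With interpolation FG, displayed fps = real fps * fg_factor.
    If that would exceed the display cap, the game is paced down so
    real and generated frames can alternate evenly under the cap."""
    displayed = min(native_fps * fg_factor, cap_hz)
    return displayed / fg_factor

# 90 fps native + 2x FG on a 120 Hz display: only 60 real fps,
# so the real-frame interval grows from ~11.1 ms to ~16.7 ms.
print(real_fps_with_fg(90, 2, 120))  # 60.0
```

By the same arithmetic, 55 fps native doubles to 110 under a 120 Hz cap with no pacing penalty, which is why FG makes more sense from a lower base framerate.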
One game that certainly needs it is Indiana Jones. Even at 1440p native with DLAA and full path tracing my 4090 can't hit 100fps.

My thoughts exactly. That game is seriously VRAM hungry. I've never ever been worried about VRAM in games with my 4070; it was just never full.
Am I losing touch here? I have the money and ability to get a 5090, but so far the only legitimate reason seems to be based around the physical size of the card.
Also, do we expect a notable uptick in performance on things like the new LG UW (5K)?
Yeah, I am on a 4090, but was considering it for the new screens. I usually like maxing out fps on my displays. For instance, with a 120Hz LG C2 as a monitor, 4K 120Hz is possible in most games with some upscaling.

It's the usual gen-on-gen uptick, ~30%.
Assuming you already have a 4080+ class card, you likely don't need to upgrade.
Follow the usual logic of skipping a generation so you can "feel" the upgrade.
Gen-on-gen is rarely worth it outside of bragging rights.
If you are building fresh, then obviously go for what your wallet allows.
Mouse and keyboard or controller?

Controller. I agree; an M/KB combo feels much more responsive than controllers overall. Too bad I hate using an M/KB for games.
I can remember controller felt gloopy compared to M+KB last time I tried it. It definitely felt laggy via controller.
A bit like the reverse of TLOU1 on PC when it came out: controller was ultra smooth but M+KB was a stuttery mess.
Not even remotely close to the same thing. ADDING input lag versus going without it is a worsening of what came before, whereas moving to texture-based geometry mapping improves image quality while costing significantly less performance than pure polygon-count boosts would to match that quality.
Yeah... it's the same song and dance every time a new tech is introduced. Some people have absolutely no vision. Supposedly MLFG is the same as the interpolation that TVs did back in the day... it's so dumb it beggars belief.
How much faster is it compared to this?

New Lossless Scaling 2.9 update introduces 3x Frame Interpolation
https://store.steampowered.com/news/app/993090/view/4145080305033108761
"Introducing the X3 frame generation mode. LSFG 2.1 can now generate two intermediate frames, effectively tripling the framerate. X3 has increased GPU load by approximately 1.7 times compared to X2 mode. At the same time..."
Are you also seeing frame judder? I would assume you would if you are capping your frames: the generated frames wouldn't be able to be evenly distributed between the real frames any time your base framerate fluctuates. I think you have to uncap your framerate to see the ideal results. It could also be dropping your base frame rate to 60 and doubling it to match the 120 cap you have; that could explain the increased input lag.
Your real FPS is exactly half of what current frame gen gives you. So if you lock your fps to a 120Hz monitor, your real frame rate, and your input lag, will be that of 60fps/Hz. That's why 90fps feels so much better.
Can you combine amazing NV frame gen with Samsung's?
Asking for a friend.
Haha, I wanted to try this but Samsung Game Motion Plus is disabled @120Hz.

Where is your "challenge accepted" attitude, citizen?
It's only semi-useful for 30-60 fps interpolation. Don't expect miracles though: latency makes a large jump and you do pick up on some graphical issues. Just as you shouldn't expect your TV's built-in upscaling to match DLSS/XeSS/PSSR, you shouldn't expect it to replace frame generation.
So for a laugh I ran Elden Ring at 30fps, then used Samsung Game Motion Plus, then added DLSS frame gen on top... then ran Lossless Scaling 20x on top of that.
Need a game that the card runs at below 30fps.
Then use this in combination with DLSS FG:

New Lossless Scaling 2.9 update introduces 3x Frame Interpolation
https://store.steampowered.com/news/app/993090/view/4145080305033108761

And then let Samsung generate frames on top (this has been available on TVs for ages, by the way).
Results should be epic.