First RTX 5090 Benchmark: FG vs No FG

MikeM

First benchmark of the 5090 using CP2077:



Going from DLSS only to DLSS + 4x FG at max settings, including path tracing, pushes latency from roughly 33ms to roughly 48ms. Not bad…

Tested at 4K:
No DLSS: ~28fps, ~65ms latency
DLSS Quality: ~66fps, ~33ms latency
DLSS frame gen 3x: ~168fps, ~46ms latency
DLSS frame gen 4x: ~209fps, ~48ms latency
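
Quick back-of-the-envelope math on those figures (a rough sketch using the approximate numbers above, nothing official):

```python
# Rough scaling math on the quoted CP2077 4K figures (approximate values from the video).
results = {
    "No DLSS":      (28, 65),
    "DLSS Quality": (66, 33),
    "FG 3x":        (168, 46),
    "FG 4x":        (209, 48),
}

base_fps, base_lat = results["DLSS Quality"]
for name, (fps, lat) in results.items():
    print(f"{name:13s} {fps:4d} fps  {1000 / fps:5.1f} ms/frame  "
          f"{fps / base_fps:4.2f}x vs DLSS Quality  {lat - base_lat:+d} ms latency")
```

So the nominal 3x and 4x modes land at roughly 2.5x and 3.2x over the DLSS Quality baseline, for an extra ~13-15ms of latency.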

Nvidia did work on frame-time consistency with FG. No graphs provided, though.
 
 
The thing that caught my ear was him mentioning Nvidia's claim that at 4K, DLSS 4.0 Performance is more like current DLSS Balanced in terms of quality. Amazing stuff if true!
 
The thing that caught my ear was him mentioning Nvidia saying that at 4K, DLSS Performance is more like DLSS Balanced in terms of quality. Amazing shit!
So, better image reconstruction even from lower resolutions? Very nice, should be a nice push for the most demanding games.
 
[Chart: path tracing performance at 3840x2160]


Granted Phantom Liberty is more taxing than the base game, so my guess is the 4090 averages something closer to 23-25fps in the area that the guy was showing off.

Looks like we're consistently seeing ~30-40% gains from the 4090 to the 5090 in pure hardware.
 
Look, I agree that no Nvidia card should cost $2K. No excuse, end of story.

But "native" is so unnecessary right now. If they achieve higher FPS with AI, DLSS, or FG without compromising the image quality I'm all for it. I've seen some bad examples of FG and DLSS. But I've seen so much good examples as well. When I play with these good implementations I really enjoy the game.

That's the most important thing: enjoying the game with well-executed technologies.
 
[Chart: path tracing performance at 3840x2160]


Granted Phantom Liberty is more taxing than the base game, so my guess is the 4090 averages something closer to 23-25fps in the area that the guy was showing off.
My 4070 is all the way down in the single digits. I have to put CP77 in 1080p mode to make it playable with Path Tracing and DLSS, NO frame gen.
 
Raster performance is kind of what I expected and the uplift from 4090 looks about right I guess.

Although I'm sure some will hate it, gaming is clearly going in the direction of using AI to help smooth out frame rates and provide as close to native image quality as possible with upscaling. I'm fine with it, to be honest; frames are frames, and I couldn't give a shit if they're 'fake frames'.

If this allows developers to focus more of their time on creating interesting games, it's worth it to me.

I'm looking forward to seeing more on the 5080 and 5070 Ti, since those cards are likely what the majority of people will be looking to upgrade to.
 
Raster performance is kind of what I expected and the uplift from 4090 looks about right I guess.

Although I'm sure some will hate it, gaming is clearly going in the direction of using AI to help smooth out frame rates and provide as close to native image quality as possible with upscaling. I'm fine with it, to be honest; frames are frames, and I couldn't give a shit if they're 'fake frames'.

If this allows developers to focus more of their time on creating interesting games, it's worth it to me.

I'm looking forward to seeing more on the 5080 and 5070 Ti, since those cards are likely what the majority of people will be looking to upgrade to.
I'm curious to know whether fewer TOPS means more latency versus DLSS only. I know it will mean fewer frames, but I want to know if a TOPS deficit also means a higher latency penalty.
 
For anyone who uses FrameView a lot: how do you get the statistics against a grey/black background like in this video? Mine is always transparent, so in some games it's hard to see.
 
I genuinely don't understand how they can pull off the latency at 100 fps when the frame gen is starting at 28 fps. Fake inputs? I just don't get it. But the current frame gen has worked just fine IMO.

 
Look, I agree that no Nvidia card should cost $2K. No excuse, end of story.

But "native" is so unnecessary right now. If they achieve higher FPS with AI, DLSS, or FG without compromising the image quality I'm all for it. I've seen some bad examples of FG and DLSS. But I've seen so much good examples as well. When I play with these good implementations I really enjoy the game.

That's the most important thing: enjoying the game with well-executed technologies.

I don't have any issues with DLSS, and frame doubling is sometimes a bit wonky on the 40 series, but the new transformer model seems much improved and looks like the way forward. Native numbers are still important, though, as a baseline for image quality and for comparing performance against prior cards. 4K at full path tracing near 30fps is impressive on its own compared to where we were just a few years ago. Just noting an observation.
 
That's pretty decent, although I'd like to see how well that scales down the 50 series stack. Also, after watching most of the video with path tracing on, I was a bit shocked at how bad it looks in comparison when he turned RT off.
 
I genuinely don't understand how they can pull off the latency at 100 fps when the frame gen is starting at 28 fps. Fake inputs? I just don't get it. But the current frame gen has worked just fine IMO.



It's not starting FG from that 28fps. First, DLSS SR brings it to around ~60fps; THEN frame gen can work correctly.

I didn't watch the video but NV always included DLSS SR in their comparisons vs. "DLSS off".
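
A crude way to see why the latency tracks the rendered frame rate rather than the displayed one: generated frames never sample new input, and FG holds one extra rendered frame to interpolate toward. This is just a toy model with an assumed fixed overhead picked to roughly match the quoted numbers, not Nvidia's actual pipeline accounting:

```python
# Toy latency model: latency ~= fixed pipeline overhead + N rendered-frame times.
# The 18 ms overhead is an assumption chosen to roughly match the quoted figures.
def rough_latency_ms(rendered_fps, fg_on, overhead_ms=18.0):
    frame_time = 1000.0 / rendered_fps
    frames_held = 2.0 if fg_on else 1.0   # FG buffers one extra rendered frame
    return overhead_ms + frames_held * frame_time

print(rough_latency_ms(66, fg_on=False))  # ~33 ms: DLSS Quality only
print(rough_latency_ms(66, fg_on=True))   # ~48 ms: FG stacked on the same ~66 rendered fps
print(rough_latency_ms(28, fg_on=True))   # ~89 ms: what FG from native 28 fps would feel like
```

The displayed frame rate (168 or 209fps) never enters into it; only the rendered rate and the buffering do.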
 
Looks like I'm buying a 5090... I think.

I just don't feel like spending that much, lol. I still feel like if I can get a used 4090 for half that price or a bit more, then it's worth it? But the AI thing... I'm 100% sure the 5090 will be compatible with whatever the 6090 offers in terms of AI, since it has the needed hardware, while the 4090 will struggle.

Sigh...
 
I genuinely don't understand how they can pull off the latency at 100 fps when the frame gen is starting at 28 fps. Fake inputs? I just don't get it. But the current frame gen has worked just fine IMO.


My guess is that these FG benchmarks include the other AI scaling options in DLSS 4. If that's the case, there could be lower latency due to parts of the image being downscaled to lower resolutions.
 
This shouldn't be surprising.

My guess is that these FG benchmarks include the other AI scaling options in DLSS 4. If that's the case, there could be lower latency due to parts of the image being downscaled to lower resolutions.

It would necessarily have to use a lower internal raster resolution in the buffer compared to native, otherwise there would be no improvement in frame rate or frame time. That, coupled with the inserted synthesized frames, is what produces the increased frame counts.
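
To put rough numbers on "lower internal raster": assuming the usual DLSS Quality scale of about 0.67x per axis at 4K (so a 2560x1440 internal render) and that Nx frame gen rasterizes only one of every N displayed frames, the share of displayed pixels that actually come from the rasterizer gets small fast. A sketch under those assumptions, not measured data:

```python
# Rough pixel accounting (assumes DLSS Quality renders internally at 2560x1440 for a 4K output,
# and that Nx frame gen rasterizes 1 of every N displayed frames).
native_pixels = 3840 * 2160
internal_pixels = 2560 * 1440

per_frame_share = internal_pixels / native_pixels   # ~0.44 of native pixels per rendered frame
for fg_factor in (1, 3, 4):
    share = per_frame_share / fg_factor
    print(f"FG {fg_factor}x: ~{share:.0%} of displayed 4K pixels are rasterized")
```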
 
Those raw performance figures are really underwhelming. I didn't realize PL was that demanding of a game.

More than anything, this is discouraging because frame gen and all of this other garbage that degrades image quality is going to become the standard. It's going to get worse before it gets better again.
 
Full path tracing and 0 DLSS at native 4K....
This is an important point. Not all games use full path tracing at 4K, but also no other card can get close to that number. So yes, the card is worth the money unless another card can beat it, and significantly so. People don't realize how taxing running native 4K is; throw path tracing on top of that and you are never going to hit 30fps, let alone 60fps, on any card. Meanwhile, without path tracing, the game runs at ultra settings at 66fps, which is rather significant given the quality of the assets. Ray tracing in general has diminishing returns and is probably best used with DLSS. This is all very reasonable.

To achieve similar results, consoles use medium-to-high assets, reduce post-processing effects, rely on upscaling, and never truly hit native 4K. The same goes for lower-spec video cards.
 
I genuinely don't understand how they can pull off the latency at 100 fps when the frame gen is starting at 28 fps. Fake inputs? I just don't get it. But the current frame gen has worked just fine IMO.

They aren't. It's using DLSS Quality as the starting point, which isn't 28fps. Native is 28fps.
 
Tested at 4K:
No DLSS: ~28fps, ~65ms latency
DLSS Quality: ~66fps, ~33ms latency
DLSS frame gen 3x: ~168fps, ~46ms latency
DLSS frame gen 4x: ~209fps, ~48ms latency
This is what I wanted to see. This kicks the living balls off Lossless Scaling. Granted, Lossless Scaling is only 10 dollars.
 
I still feel like if I can get a used 4090 for half that price or a bit more, then it's worth it?
I might just go 4090 too.

I was friggin' jazzed for the massive 512-bit bus, but everything else seems fairly lackluster.
FG seems dumb... looks like 200fps but feels like 40fps... ehhh, I'd probably rather just have 60.
I really want a massive jump in RT performance.
 