Black Myth: Wukong PC benchmark tool available on Steam

DLSS SR plus Reflex has the lowest input lag; if you add FG on top of it, input latency increases. There is no way around it.



It's not. You can use DLSS SR and FSR3 frame generation in tandem in a few Sony games, and the quality of the generated frames is almost the same as DLSS FG (even DF confirmed it). The biggest difference is that DLSS FG uses MASSIVE amounts of VRAM, which kills 8 and 12GB cards on the spot, while FSR3 is very light.

This game for some reason (💵) locks Ampere and Turing users out of combining FSR 3.1 frame generation with DLSS.
The quality of the generated frames looks equally good to me in both DLSS FG and FSR3 FG. However, the smoothness is not the same. FSR3 can almost double the framerate (I think FSR FG is even faster than DLSS FG), but I saw motion judder when I played games with it, and aiming wasn't as responsive during fast motion. I wasn't sold on frame generation when I still had my old GPU and was using FSR FG, but DLSS FG has changed my mind. FSR FG tries to mimic what DLSS FG does, but it's clearly not on the same level.

But yes, there's a price to pay if you want to play with DLSS FG, as VRAM usage increases A LOT. This technology is supposed to help lower/mid-range RTX 40 GPUs the most, but it's not really usable on 8GB VRAM GPUs. In some games not even 12GB is enough (I saw 14GB of VRAM usage in Ratchet & Clank with FG on).
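
If you want to sanity-check this on your own card instead of eyeballing an overlay, here's a rough sketch (assuming the NVML Python bindings, installed with pip install nvidia-ml-py) that just polls total VRAM usage once a second while you toggle FG in the menu:

```python
# Rough sketch: poll GPU-wide VRAM usage once a second while toggling FG in-game.
# Assumes the NVML Python bindings (pip install nvidia-ml-py); purely illustrative.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GB", flush=True)
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```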
 
DLSS Quality + Reflex + Boost will give me lower input lag than DLSS Quality + FG + Reflex + Boost.
Show me.

This feels like moving the goalpost. At first we talked about native. Then native + boost. Then DLSS quality was lower res. And now this. It's better than native, which was the discussion at first.
 
Show me.

This feels like moving the goalpost. At first we talked about native. Then native + boost. Then DLSS quality was lower res. And now this. It's better than native, which was the discussion at first.
Show what?

DLSS + FG will always have more lag than the same DLSS setting alone.
 
If FG is in a game, so is Reflex + Boost, so you always have the best of both worlds. In theory you are right, but the tech is built in a way that your configuration (FG alone) never happens.



You can see in my screenshot that it uses 8.9GB, for example, so 12GB cards are fine. And if you do reach the memory limit, just lower a few settings. Not to mention that very few people buy a 4060 to play at 4K...

I thought this was obvious:

4060 (8GB): 1080p
4070 (12GB): 1080p/2K
4080/4090 (16GB+): 2K/4K

I had a 4070 and a 4070 Ti. Alan Wake and Cyberpunk were going over their VRAM with frame gen and path tracing; Cyberpunk was doing that even at 1440p. And I was using DLSS Performance at 4K, so the base resolution was 1080p.

UE5 games have much lower VRAM requirements in general, so no wonder lower-end cards will be fine in BMW even with frame gen.
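
For context on the "DLSS Performance at 4K means a 1080p base" point: the presets render at a fixed fraction of the output resolution (roughly 67% per axis for Quality, 58% for Balanced, 50% for Performance, 33% for Ultra Performance; the exact ratios can vary slightly per game). A quick illustrative sketch:

```python
# Approximate internal render resolution for the standard DLSS presets at 4K.
# Scale factors are the commonly cited per-axis ratios; exact values can vary
# slightly per game, so treat the output as an approximation.
PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w, out_h, scale):
    return round(out_w * scale), round(out_h * scale)

for name, scale in PRESETS.items():
    w, h = internal_resolution(3840, 2160, scale)
    print(f"{name:17} -> {w}x{h}")

# Performance -> 1920x1080, which is why 4K + DLSS Performance
# effectively means a 1080p base resolution.
```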

DLSS Quality + Reflex + Boost will give me lower input lag than DLSS Quality + FG + Reflex + Boost.

Yep. The funny thing is that in some new games there is no Reflex option at all and it only turns itself on when frame generation is activated. You can't have the lowest input lag possible, because Nvidia/the developers won't let you, to make FG look better.
 
I had a 4070 and a 4070 Ti. Alan Wake and Cyberpunk were going over their VRAM with frame gen and path tracing; Cyberpunk was doing that even at 1440p. And I was using DLSS Performance at 4K, so the base resolution was 1080p.

UE5 games have much lower VRAM requirements in general, so no wonder lower-end cards will be fine in BMW even with frame gen.
Wait.. Are you telling me that the most taxing games with PT turned on are... taxing to run?



Show what?

DLSS + FG will always have more lag than the same DLSS setting alone.
Blablabla. As I said, moving the goalpost. You were wrong and it's OK, FG is amazing. We, who actually play games, don't care about 0.00002ms added when we see the benefit of it.

And I want you to show me some bench so you can see how little the difference is. Totally negligible.
 
Wait.. Are you telling me that the most taxing games with PT turned on are... taxing to run?




Blablabla. As I said, moving the goalpost. You were wrong and it's OK, FG is amazing. We, who actually play games, don't care about 0.00002ms added when we see the benefit of it.

And I want you to show me some bench so you can see how little the difference is. Totally negligible.
I am not moving any goalpost. Since the beginning I have been saying that FG always adds lag. If you don't care or don't notice it, good for you. And it's far from only 0.00002ms. It's 10+ms.

 
I am not moving any goalpost. Since the beginning I have been saying that FG always adds lag. If you don't care or don't notice it, good for you.
Oh yeah, and you totally don't hate FG for whatever reason. I said what I had to say, you don't want to show any bench so I leave it here.
 
Blablabla. As I said, moving the goalpost. You were wrong and it's OK, FG is amazing. We, who actually play games, don't care about 0.00002ms added when we see the benefit of it.

And I want you to show me some bench so you can see how little the difference is. Totally negligible.
This has been tested by several tech analysts, including DF where the results ranged from pretty much negligible to "ouch" on a game by game and settings basis ..... pretending that never happened is borderline trolling or simply stupid.
FG adds input lag, like any other addition to a pipeline.
 
pretending that never happened is borderline trolling or simply stupid
As I said earlier:
Some games have a bit higher latency but it's because Reflex + boost was very new
I'll also add the case of a bad implementation, which unfortunately can always happen. But pretending it's bad/unplayable or that it shouldn't be used is borderline trolling or simply stupid too.

It's a great tech, and we should welcome such innovation instead of getting triggered by 0.000002+ ms.
 
Oh yeah, and you totally don't hate FG for whatever reason. I said what I had to say, you don't want to show any bench so I leave it here.
I don't hate it. And I don't show? Are you blind, or did you not see the edit?
 
As I said earlier:

I'll also add the case of a bad implementation, which unfortunately can always happen. But pretending it's bad/unplayable or that it shouldn't be used is borderline trolling or simply stupid too.

It's a great tech, and we should welcome such innovation instead of getting triggered by 0.000002+ ms.
Again with the imaginary numbers... we're talking 10-100ms..... depending on the game, implementation and settings. FG is far from "just turn it on and be happy". Why are you trying to pretend that there are no downsides?
 
Wait.. Are you telling me that the most taxing games with PT turned on are... taxing to run?


You can add a bunch of Sony games to that list; FG is fucked in many situations even without PT due to high VRAM consumption:



GoT is actually super light on VRAM compared to other Sony games.

I don't even know what you are trying to prove here. I was talking about my experience with Nvidia FG and the fact that even on a higher-end card (4070 Ti) it caused some games to run out of VRAM. FSR 3.1 seems to do the same thing with far fewer resources and can be combined with DLSS SR.
 
I don't hate it. And I don't show? Are you blind, or did you not see the edit?
I hadn't seen it, obviously. And this shows precisely what I'm talking about: it's negligible, and when using DLSS it's even better. And seriously, WHO in their right mind would use FG but not DLSS? The tech is made to work together.

Again with the imaginary numbers... we're talking 10-100ms..... depending on the game, implementation and settings. FG is far from "just turn it on and be happy". Why are you trying to pretend that there are no downsides?
As I said, some bad implementations and early versions. Like GoT recently. I never said it was perfect; it can happen, no need to downplay the technology for this.
Not to mention that 90% of games are in the 5-10ms range. Watch the video rodrigolfp linked. Even Spider-Man, a Sony game (isn't it, Bojji Bojji?), has less than 7ms added between DLSS on/FG on and DLSS off/FG on, so stop pretending it's unplayable.

But "OMAGAD FG FAKE FRAMES BAD" I suppose.

You can add a bunch of Sony games to that list; FG is fucked in many situations even without PT due to high VRAM consumption:



GoT is actually super light on VRAM compared to other Sony games.

I don't even know what you are trying to prove here. I was talking about my experience with Nvidia FG and the fact that even on a higher-end card (4070 Ti) it caused some games to run out of VRAM. FSR 3.1 seems to do the same thing with far fewer resources and can be combined with DLSS SR.

Most (if not all) of them have been fixed. I'm not trying to prove anything, I just hate people acting like FG is the worst thing ever and pretending it only has downsides. And FSR is shit, whether it's visual quality or responsiveness.
 
This has been tested by several tech analysts, including DF where the results ranged from pretty much negligible to "ouch" on a game by game and settings basis ..... pretending that never happened is borderline trolling or simply stupid.
FG adds input lag, like any other addition to a pipeline.
Digital Foundry tested this feature a long time ago. Nvidia drivers could have improved input lag since then.

The difference in input lag between DLSS FG on and off is probably still there, but it's not big enough to affect my aiming. The increased smoothness, however, improves my aiming because my eyes can track objects much more easily. For example, in Alan Wake 2 I found it much easier to aim with DLSS FG on.

Real fps will always be better, but if my GPU can only deliver 45-60fps instead of well over 60fps, I will always go for 70-100fps with DLSS FG.
 
so stop pretending it's unplayable.
Where did I do that?
Stop making up conversations in your head.
Not to mention that 90% of games are in the 5-10ms range.
Statistics from your bum.

These are NVIDIA LDAT measurements from CP2077, where that vid you're referring to (with no disclosure as to how they measured at all) measured <5ms....
End-to-end latency (LDAT), Zotac AMP Extreme Airo GeForce RTX 4080 Super:
- Fullscreen and windowed (with and without Nvidia Reflex): ca. 39 ms
- DLSS 3.0 (Frame Generation, w/o upsampling): ca. 61 ms
- LSFG with Nvidia Reflex: ca. 90 ms
- LSFG w/o Nvidia Reflex: ca. 103 ms
Cyberpunk 2077 native UHD/4K, preset very high, w/o upsampling, ca. 62 fps w/o FG; LDAT benchmark with ca. 120 runs each.
The article is from May 2024, so don't expect current iterations of drivers/frameworks to be much different. So unless you're gonna pretend that CP2077 also suffers from a case of "bad implementation" despite being THE showcase for anything new NVIDIA does on that front.......you may now have to reconsider the numbers in your head.....
It's great tech, but stop pretending it doesn't come at a cost that can be felt.
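
And to spell out the arithmetic on those numbers (nothing here is a new measurement, just the figures quoted above):

```python
# Back-of-the-envelope on the LDAT figures above (CP2077, RTX 4080 Super).
# These are the article's numbers, not new measurements.
baseline_ms = 39.0         # fullscreen/windowed, with or without Reflex
dlss_fg_ms = 61.0          # DLSS 3.0 frame generation, no upsampling
lsfg_reflex_ms = 90.0      # LSFG with Reflex
lsfg_no_reflex_ms = 103.0  # LSFG without Reflex

print(f"DLSS FG penalty: {dlss_fg_ms - baseline_ms:.0f} ms "
      f"({dlss_fg_ms / baseline_ms:.2f}x baseline)")      # 22 ms, 1.56x
print(f"LSFG penalty (Reflex on):  {lsfg_reflex_ms - baseline_ms:.0f} ms")
print(f"LSFG penalty (Reflex off): {lsfg_no_reflex_ms - baseline_ms:.0f} ms")
```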

But "OMAGAD FG FAKE FRAMES BAD" I suppose.
Are you 5?
 
Most (if not all) of them have been fixed. I'm not trying to prove anything, I just hate people acting like FG is the worst thing ever and pretending it only has downsides. And FSR is shit, whether it's visual quality or responsiveness.

I like frame generation; I've played many games with it on, and with Reflex input lag is not an issue in the majority of games.

But FSR 3.1 frame generation is not bad at all at this point, and it lets you use XeSS, DLSS, TSR, FSR and even native res with TAA as input. Some games even allow Reflex with that.

The biggest downside of DLSS FG for me was the massive VRAM increase; other than that I don't have many problems, just some visual artifacts here and there.
 
DLSS FG doesn't reduce input lag compared to real frames, but at least it doesn't increase it the way motion interpolation does on TVs (my TV adds over 100ms of input lag when I turn motion interpolation on), and that's what makes it so great.

Games running at around 40fps are still quite responsive (especially on a gamepad). The biggest problem for me is the lack of motion smoothness, because as soon as the frame rate drops below 60fps I see motion judder that ruins the experience. DLSS FG completely eliminates this, and therefore makes the gaming experience better (compared to the framerate without FG).

I've played games with real 70-100fps and fake (FG generated frames) 70-100fps. The difference in my gaming experience was small. If I can only get 70-80fps with FG on, I know the game will be a joy to play.
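
One way to frame the "real 70-100fps vs FG 70-100fps" comparison: with 2x frame generation the display rate doubles, but the game still samples input once per rendered frame, so the input cadence follows the lower rendered rate. A purely illustrative sketch (it ignores FG overhead and Reflex):

```python
# Simplified model: with 2x frame generation the display rate ~doubles, but the
# game still samples input once per *rendered* frame. Ignores FG overhead/Reflex.
for rendered_fps in (40, 45, 50):
    displayed_fps = rendered_fps * 2
    print(f"rendered {rendered_fps} fps -> displayed ~{displayed_fps} fps, "
          f"new frame every {1000 / displayed_fps:.1f} ms, "
          f"input sampled every {1000 / rendered_fps:.1f} ms")
```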

FG adds ~12ms to a game with Reflex, but that's still 60% less latency than the same game without Reflex. And let's not even bring consoles, with their atrocious latency, into this; they're not even comparable to PC + Reflex + FG. So no, I don't think hitting on FG constantly because of latency is warranted. It's been debated to hell and back.
 
Where did I do that?
Stop making up conversations in your head.
My bad indeed, it was meant to read like this but I messed it up somehow: unplayable/bad OR useless, since I was lumping in another member too; it's a generalisation.

Statistics from your bum.

These are actual NVIDIA LDAT measurements from CP2077, where that vid you're referring to measured <10ms....
Congrats, you found one of the games that lands in the worst 10%. Look at every other game, minus Alan Wake 2 which is also in that 10% (fixed since then though), and it ranges between 5 and 15ms maximum, which is totally undetectable even at 300Hz/fps.

Not to mention that you also need to judge the games that were bad after they've been patched, not just at 1.0.

Are you 5?
I'll try to do better, genuinely. I just think it's sad to reduce this great tech to such a ridiculous number. For once, Nvidia is doing something good and pushing the industry forward, and I'm just really happy to get 144Hz/fps in every game that uses this tech, with PT and all the bells and whistles. It just looks SO GOOD, and when I turn FG off in some games like AW II it starts to be unplayable for me because the framerate fluctuates way too much.

I like frame generation; I've played many games with it on, and with Reflex input lag is not an issue in the majority of games.

But FSR 3.1 frame generation is not bad at all at this point, and it lets you use XeSS, DLSS, TSR, FSR and even native res with TAA as input. Some games even allow Reflex with that.

The biggest downside of DLSS FG for me was the massive VRAM increase; other than that I don't have many problems, just some visual artifacts here and there.
I confess I only tried FSR on Avatar and it was really horrendous. It was so bad I never bothered to try it again, so maybe I should give it another go.

Where I totally agree is on the versatility of the software. The fact that, like you said, we can tweak everything any way we want is amazing and totally in line with the open PC mentality, which is a big part of why I'll never switch to consoles. As for the VRAM problem, it's true unfortunately, but with some adjustments in the settings it should be fine. And we should also hope some studios will learn to properly optimize their games (looking at you, Cities: Skylines 2, Jedi: Survivor and Wo Long)... sometimes you have to wonder if the devs are even bothering to try.

It is almost 2x the lag in that test
I think we posted at the exact same time haha. Anyway, I responded to this earlier (1.0 versions + some bad implementations, which always happen).

No one. Who do that?
Exactly!

FG adds ~12ms to a game with Reflex, but that's still 60% less latency than the same game without Reflex. And let's not even bring consoles, with their atrocious latency, into this; they're not even comparable to PC + Reflex + FG. So no, I don't think hitting on FG constantly because of latency is warranted. It's been debated to hell and back.


You just summed up what I'm badly trying to say.
 
Anyway, here's my contribution to the thread:
[benchmark screenshot]


Coincidentally, the FPS is about the same with High settings on my laptop, which has a 12800HX, 32GB of DDR5-4800 and a 3070 Ti 8GB at 150W.
 
This game looks bloody stunning. Just hope it plays well. I suck at Souls-Like games but I really want to check this out. The whole setting just oozes cool and atmosphere.
 
Congrats, you found one of the games that lands in the worst 10%.
The 22ms from CP2077 measured with LDAT in May 2024 (far from 1.0.....) is a simple truth, and suddenly CP2077... THE NVIDIA tech showcase, beloved and helped by Nvidia themselves, is among the worst? Where are your LDAT numbers? Because so far the only source for your claims has been a YT vid from a no-name channel which directly contradicts the results from professionals with specialized measurement equipment.....

You could have just said that you simply don't care about real-world numbers from the get-go..... This is no longer a technical discussion at this point.
 
The 22ms from CP2077 measured with LDAT in May 2024 (far from 1.0.....) is a simple truth, and suddenly CP2077... THE NVIDIA tech showcase, beloved and helped by Nvidia themselves, is among the worst? Where are your LDAT numbers? Because so far the only source for your claims has been a YT vid from a no-name channel which directly contradicts the results from professionals with specialized measurement equipment.....

You could have just said that you simply don't care about real-world numbers from the get-go..... This is no longer a technical discussion at this point.
Sure, I know the CP case is paradoxical, but to be fair we had to wait forever for a date, and I wouldn't be surprised if it was hard to implement in the RED Engine. That was a marketing deal first and foremost, not a technical one, meaning they probably took a big bag of money without first checking whether it would be too hard to do. We know CP2077 was heavily updated.

And I do care about numbers, but I maintain that in 90% of cases with Reflex + Boost the difference is totally negligible.
 
Anybody with an RTX 3080 10GB tested it? How was it? I've got a 7800X3D CPU so I'm good there.

3080 Ti here, but the game is not using a lot of VRAM, so 10GB isn't the problem.

Simple answer: you won't be able to play with RT, but high settings at 4K with DLSS Performance will work quite OK (over 60fps average). You can add frame generation to that while using TSR instead of DLSS.
 
Sure, I know the CP case is paradoxical, but to be fair we had to wait forever for a date, and I wouldn't be surprised if it was hard to implement in the RED Engine. That was a marketing deal first and foremost, not a technical one, meaning they probably took a big bag of money without first checking whether it would be too hard to do. We know CP2077 was heavily updated.

And I do care about numbers, but I maintain that in 90% of cases with Reflex + Boost the difference is totally negligible.
And I'm maintaining that you are still pulling statistics/numbers out of thin air. I'll repeat myself: where are your measurements coming from? Because as I see it, you are simply speaking from a personal-preference standpoint without any actual technical facts as backup. 20-something ms might be negligible in a game like Baldur's Gate 3, but it is very much an issue in something like an arena shooter.
You are simply generalising too much...
 
Due to Lumen I don't see a need to turn on Full Raytracing.

Agree. Lumen at the higher settings is really good. It does take a solid chunk of performance, but nowhere near the penalty of this RT.

Looking at a side-by-side, the shadows are noticeably better with RT, but I would lose the Pepsi challenge on the rest of the image.
 
And I'm maintaining that you are still pulling statistics/numbers out of thin air. I'll repeat myself: where are your measurements coming from? Because as I see it, you are simply speaking from a personal-preference standpoint without any actual technical facts as backup. 20-something ms might be negligible in a game like Baldur's Gate 3, but it is very much an issue in something like an arena shooter.
You are simply generalising too much...
We are not talking about 20ms, but 5 to 15. I'm starting work so I can't take pictures, but just watch the games other than CP2077 or AW II in the video above and you will see the results are what I said.

And that video is 8 months old, so I'm sure I could easily find new games with great results too. We played games at native latency without you or anyone else complaining, and now we have better latency and yet you are here saying it's noticeable. Gaming must have been hard the last 25 years with that horrendous latency... right?
 
Where did I do that?
Stop making up conversations in your head.

Statistics from your bum.

These are NVIDIA LDAT measurements from CP2077, where that vid you're referring to (with no disclosure as to how they measured at all) measured <5ms....
End-to-end latency (LDAT), Zotac AMP Extreme Airo GeForce RTX 4080 Super:
- Fullscreen and windowed (with and without Nvidia Reflex): ca. 39 ms
- DLSS 3.0 (Frame Generation, w/o upsampling): ca. 61 ms
- LSFG with Nvidia Reflex: ca. 90 ms
- LSFG w/o Nvidia Reflex: ca. 103 ms
Cyberpunk 2077 native UHD/4K, preset very high, w/o upsampling, ca. 62 fps w/o FG; LDAT benchmark with ca. 120 runs each.
The article is from May 2024, so don't expect current iterations of drivers/frameworks to be much different. So unless you're gonna pretend that CP2077 also suffers from a case of "bad implementation" despite being THE showcase for anything new NVIDIA does on that front.......you may now have to reconsider the numbers in your head.....
It's great tech, but stop pretending it doesn't come at a cost that can be felt.


Are you 5?
The last one is with a 62fps cap? I can't open the article to find out, but any fps cap will increase input lag drastically when DLSS FG is turned on.

This is a DLSS Frame Generation input lag test in Cyberpunk with a high-speed camera. The game is more responsive with DLSS FG than with Reflex off.



Input lag in this game is much higher on consoles (around 100ms average latency) and people are still happy with the experience. With DLSS FG on PC you get around 40-60ms depending on the game. I'm sensitive to input lag, but DLSS FG still offers an enjoyable experience (especially on gamepad), and because it increases smoothness my eyes can track moving objects a lot more easily compared to the base / real framerate without FG. This technology enhances my gaming experience and I see no reason to play without it in SP games, especially in the most demanding PT / RT games where the GPU needs some help.
 
I'm not even gonna bother with the tool, the final game is what matters.

But it seems like RT is off the table for 4K60 with a 4080...
 
[benchmark screenshot]

This is my best result so far. Considering I don't really mind frame generation, I would have been satisfied with these results... before I got a 165Hz screen...
 
I'm not even gonna bother with the tool, the final game is what matters.

But it seems like RT is off the table for 4K60 with a 4080...
Not with maxed out settings. However, you can get around 60fps (even without FG) if you are willing to use DLSS upscaling and adjust some settings.

4K DLSS Performance, Medium RT, High settings, no FG
[benchmark screenshot]

The same settings but with DLSS Balanced
[benchmark screenshot]

DLSS Quality
[benchmark screenshot]
 
Not with maxed out settings. However, you can get around 60fps (even without FG) if you are willing to use DLSS upscaling and adjust some settings.

4K DLSS Performance, Medium RT, High settings, no FG
[benchmark screenshot]

The same settings but with DLSS Balanced
[benchmark screenshot]

DLSS Quality
[benchmark screenshot]
Yeah sure, I have no problem using DLSS, but I never go below DLSS Quality, and I don't really care for RT, so I'm gonna cut that and some shadow settings and a few other minor things to get what I need.
 
Guys, you are not going to believe this.
It runs like shit on my mobile 1050 Ti lol (that said, it still gets around 55fps at low with 50% scaling at 1080p).
 
Yeah sure, I have no problem using DLSS, but I never go below DLSS Quality, and I don't really care for RT, so I'm gonna cut that and some shadow settings and a few other minor things to get what I need.
The RTX 4080 can achieve 84fps at 4K DLSS Quality without RT.


[benchmark screenshot]



I will probably play at 4K DLSSQ with RT and FG on because water and shadows look a lot better and at 78fps the game should still be a joy to play.


[benchmark screenshot]
 
Some more comparisons here. RT reflections can display particle effects, which is probably why the fire in the water scene has reflections. The benchmark is probably not a great comparison scene, and it's probably not going to be one of the heaviest parts of the game either.


 
I remember when people were saying RT shadows were a waste of GPU resources. I'm glad this game is able to effectively highlight the importance of accurate shadow rendering.
 
Out of curiosity I tested the benchmark with the UEVR Injector to see how (badly) it runs. With medium settings and high textures, at 3648x2016 with DLSS Performance + ASW, I could reach a somewhat "stable" 60fps.

The Injector was set to Synchronized Sequential because Native Stereo looks completely wrong.


Man, the day we get this kind of graphics on standalone hardware... In 20 years, maybe.
 
Yeah, so, you pretty much need frame generation to get a smooth frame rate. Otherwise, it's dropped frames and frame-time spikes galore.

[frametime graph]

[frametime graph]
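
If anyone wants to quantify those spikes instead of eyeballing the graphs, here's a rough sketch that summarises a PresentMon-style frametime log (assuming a CSV with a MsBetweenPresents column, which is what PresentMon/CapFrameX exports use; adjust the column name for other tools):

```python
# Rough sketch: summarise frametime spikes from a PresentMon-style CSV log.
# Assumes a "MsBetweenPresents" column (PresentMon/CapFrameX exports);
# adjust the column name for other capture tools.
import csv
import statistics
import sys

def summarise(path):
    with open(path, newline="") as f:
        frametimes = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

    frametimes.sort()
    avg_ms = statistics.mean(frametimes)
    # "1% low" fps: average of the worst 1% of frametimes, converted to fps.
    worst_1pct = frametimes[-max(1, len(frametimes) // 100):]
    low_1pct_fps = 1000.0 / statistics.mean(worst_1pct)

    print(f"frames: {len(frametimes)}")
    print(f"average: {1000.0 / avg_ms:.1f} fps ({avg_ms:.2f} ms)")
    print(f"1% low:  {low_1pct_fps:.1f} fps")
    print(f"worst frametime spike: {frametimes[-1]:.2f} ms")

if __name__ == "__main__":
    summarise(sys.argv[1])
```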
 