Black Myth: Wukong PC benchmark tool available on Steam

Okay, the super resolution slider is definitely deceptive. It turns out it does NOT adjust the rendering resolution based on the slider percentage; it still uses the Performance/Balanced/Quality/DLAA modes, i.e. as long as your slider is within the 62-88 range, DLSS is always in Quality mode with a 1440p rendering resolution.
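If that observation holds, the slider is effectively a bucketed mode selector rather than a continuous scale. A minimal Python sketch of the mapping (the 62-88 Quality bucket is the one reported above; the other bucket edges are guesses, and the per-mode scale factors are the standard DLSS values, treated here as assumptions):

```python
# Sketch: the "super resolution" slider apparently snaps to DLSS modes
# instead of scaling continuously.  Bucket edges outside 62-88 are
# guesses for illustration; 62-88 -> Quality is the observed behavior.
from fractions import Fraction

DLSS_SCALE = {                      # standard per-axis render-scale factors
    "Performance": Fraction(1, 2),
    "Balanced":    Fraction(58, 100),
    "Quality":     Fraction(2, 3),
    "DLAA":        Fraction(1, 1),
}

def slider_to_mode(slider: int) -> str:
    if slider < 50:
        return "Performance"
    if slider < 62:
        return "Balanced"
    if slider <= 88:
        return "Quality"
    return "DLAA"

def render_height(output_height: int, slider: int) -> int:
    """Vertical rendering resolution implied by the slider bucket."""
    return round(output_height * DLSS_SCALE[slider_to_mode(slider)])
```

Assuming a 4K (2160p) output, any slider value from 62 through 88 lands on Quality and a 1440p internal resolution, which is consistent with the behavior described above.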

The 40 series has such a huge RT performance advantage over the 30 series because the RT pipeline uses OMM (Opacity Micromaps) for any-hit detection.


Also, since even low RT is path tracing, I found it not worth taking the performance hit to run very high RT. DLSS Quality + Medium RT will be my go-to setting.
iVYhpch.jpeg
 
I ran the benchmark at native 8k maximum settings on my 3090.
I was getting like 3 seconds per frame instead of frames per second lol.
I guess I was struggling with VRAM, because it took like 4 minutes to load after I clicked run, and then the results showed my VRAM maxed out.

WwdJRUu.jpeg
8k, wtf.

Look, if you've got so much money you don't know what to do with it, I have a donation link.
 

I thought this was decent enough; there's no need for high RT unless you want the best of the best or to annoy yourself with crap framerates.


*EDIT* I just preordered as a FU to DEI/ESG :)
 
mmmm
Exactly 60fps average @ 1440p with everything set to High, RT on Medium and DLSS Quality, on an RTX 3080.
I think I could live with that...
However, I kinda doubt that benchmark result will hold up in battle situations.
 
Cinematic, RT Very High, DLSS Quality+FG, 1440p

MdVMOPg.png


Very High, RT Very High, DLSS Quality+FG, 1440p

m8CN5AF.png


Cinematic, RT Off, DLSS Quality+FG, 1440p

XGl9dh0.png


Very High, RT Off, DLSS Quality+FG, 1440p

u99wRd3.png


How did I do? :messenger_grinning_sweat:
 
1440p high with FSR @ 75% with no frame gen and no RT:

Screenshot-2024-08-14-194854.png


That's probably the best-case scenario for me; I'd expect it to drop a bit during any kind of action. RAM usage super low, a win for the 8GB crew.
 
Yeah this game will never see the light of day on Series S. And if by chance it does, it will be a fugly sight.


There are one or two users already running this on the Steam Deck; if they can get the overall memory squared away, the GPU in the XSS should be fine. It'll just be soft.
 
Am I the only one who feels like installing this Chinese spyware is like banging someone with AIDS without a condom? Pretty nasty example, but that's how I see it... Definitely skipping this no matter how good it is; there are like a bazillion more games out there
 
It should have 6GB in total, and IIRC the benchmark said I was using less than 5, so I'm not sure it's that, although it's possible, since that's not a huge amount anyway by today's standards.


yes
If the game sees that you only have 6GB of VRAM, it will allocate a little less, but your GPU can still be VRAM limited (you will see stuttering and sometimes textures not loading correctly). I have seen this many times on my old GPUs.
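A minimal sketch of that failure mode (the 90% budget factor is my own assumption for illustration; engines differ, and this game's actual heuristic is unknown):

```python
# Sketch: a game typically budgets assets against a fraction of the
# reported VRAM, so "usage" can read below 6GB while the card is still
# VRAM-limited (late streaming -> stutter, textures not loading right).
# The 0.9 headroom factor is an assumption, not this game's real value.

def texture_budget_mb(vram_mb: int, headroom: float = 0.9) -> int:
    """VRAM the engine is willing to allocate for assets."""
    return int(vram_mb * headroom)

def looks_vram_limited(working_set_mb: int, vram_mb: int) -> bool:
    """True when the wanted working set exceeds the engine's budget."""
    return working_set_mb > texture_budget_mb(vram_mb)
```

So on a 6GB card a reported usage just under the budget can still mean the card is the bottleneck: the engine simply refused to allocate more.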

The 40 series has such a huge RT performance advantage over the 30 series because the RT pipeline uses OMM (Opacity Micromaps) for any-hit detection.
According to NVIDIA, the OMM engine in the RTX 40 series enables much faster ray tracing of alpha-tested textures, which are often used for foliage, particles and fences. There is a lot of foliage and vegetation in this game, which may explain the performance difference between the RTX 30 and RTX 40 series.
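As a toy illustration of why that matters: without OMM, every candidate ray hit on an alpha-tested triangle invokes an any-hit shader just to sample the alpha texture; with OMM, micro-triangles pre-classified as fully opaque or fully transparent are resolved in hardware, and only the "unknown" ones fall back to the shader. A hedged Python sketch (the opacity mix below is made up purely for illustration):

```python
# Toy model of why Opacity Micromaps (OMM) help alpha-tested geometry
# like foliage.  Without OMM every candidate hit runs an any-hit
# shader; with OMM only "unknown" micro-triangles do.

OPAQUE, TRANSPARENT, UNKNOWN = "opaque", "transparent", "unknown"

def any_hit_invocations(hits, use_omm: bool) -> int:
    """Count any-hit shader invocations for a list of candidate hits.

    `hits` holds per-micro-triangle opacity states.  Without OMM the
    hardware cannot classify them, so every hit costs a shader call.
    """
    if not use_omm:
        return len(hits)
    # With OMM, opaque hits are accepted and transparent hits rejected
    # in hardware; only the "unknown" ones fall back to the shader.
    return sum(1 for h in hits if h == UNKNOWN)

# Made-up mix for a foliage-heavy scene: mostly clearly opaque or
# clearly transparent texels, few ambiguous edge texels.
foliage = [OPAQUE] * 60 + [TRANSPARENT] * 35 + [UNKNOWN] * 5
```

Under this (invented) mix, OMM cuts shader invocations from 100 to 5, which is the kind of saving that would show up hardest in vegetation-dense scenes.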
 
Am I the only one who feels like installing this Chinese spyware is like banging someone with AIDS without a condom? Pretty nasty example, but that's how I see it... Definitely skipping this no matter how good it is; there are like a bazillion more games out there


Your government is more likely to fuck with you than the Chinese are. So unless you tell me that you don't use Google or any social network/app, you are already infected with something far worse.

Enough of this bullshit.
 
It will be interesting to see if releasing the benchmark is a potent marketing tool for them. We might see more of it if it can be done spoiler free.
 
It's not about shitting on anything. "Fake" frames lie about the performance. Simple as that. Plus FSR and TSR also have frame gen.
If my performance is at least 70-80fps with DLSS FG, I don't care if these are fake frames, because the game will still be a joy to play and I will still have a great experience. I wasn't sold on this technology when I tried FSR3 and Lossless Scaling FG (judder, noticeable lag) on my old GTX 1080, but DLSS 3 FG fixed all these issues and made me believe in this technology.

I have 50fps at 4K DLSS Performance without FG, with full RT. The game should still be playable even without DLSS FG.

4080S-4K.jpg


With FG however I will get much better experience for sure.

4-K-DLSS-P.jpg


At lower resolutions like 1440p I don't even need FG to get around 60fps.

1440p-DLSS-Q.jpg
 
Oof this game is heavy, and that vignetting effect is way too pronounced imo.

Anyway 2560x1440 DLAA with 100% super resolution/maxed out:

R6yH2pA.jpeg


Visually, maybe. In gameplay, no, as it adds input lag.

Fortunately this is not really true; in fact, with Nvidia Reflex + Boost it's even lower than native:

3.png


You can find other examples on YouTube if you want. Some games have a bit higher latency, but that's because Reflex + Boost was very new. Now, with recent games, it's perfectly fine.
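For intuition on why Reflex can beat plain native: most of the win comes from draining the CPU-to-GPU render queue, so end-to-end latency drops from several frames' worth to roughly one, and upscaling then shrinks the frame time itself. A back-of-envelope model (queue depths and frame rates below are illustrative, not measurements from this game):

```python
# Toy end-to-end latency model: one frame to render plus whatever is
# queued ahead of it.  Reflex ~empties the queue; DLSS raises the fps.
# All numbers are illustrative only.

def latency_ms(fps: float, queued_frames: int) -> float:
    frame_ms = 1000.0 / fps
    # the new frame renders behind everything already in the queue
    return frame_ms * (1 + queued_frames)

# GPU-bound native with 2 frames queued vs native + Reflex (queue drained)
native      = latency_ms(50, queued_frames=2)   # 60.0 ms
with_reflex = latency_ms(50, queued_frames=0)   # 20.0 ms
# upscaling raises fps on top of that, shrinking the remaining frame time
dlss_reflex = latency_ms(80, queued_frames=0)   # 12.5 ms
```

That ordering (native worst, native + Reflex much better, DLSS + Reflex best) is the same shape as the latency chart being discussed here.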
 
I was going insane. I had a bug where it would tell me Frame Generation wasn't supported on my GPU if I turned it off and tried to turn it back on, and when I reset settings with it enabled, it ran without it while a visual glitch said it was on. Finally fixed it this afternoon.
6OIkiwI.jpeg
 
Visually, maybe. In gameplay, no, as it adds input lag.
DLSS FG doesn't reduce input lag compared to real frames, but at least it doesn't increase it the way motion interpolation does on TVs (my TV adds over 100ms of input lag when I turn motion interpolation on), and that's what makes it so great.

Games running at around 40fps are still quite responsive (especially on a gamepad). The biggest problem for me is the lack of motion smoothness, because as soon as the frame rate drops below 60fps I see motion judder that ruins the experience. DLSS FG completely eliminates this, and therefore makes the gaming experience better (compared to the framerate without FG).

I've played games with real 70-100fps and fake (FG-generated) 70-100fps. The difference in my gaming experience was small. If I can only get 70-80fps with FG on, I know the game will be a joy to play.
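The trade described here can be put into rough numbers: with interpolation-based FG the presented frame rate roughly doubles, while input is still sampled on real frames and the generated frame has to wait for the next real one, so latency grows by about one real frame time. A sketch (the clean 2x and one-frame figures are simplifications, not measurements):

```python
# Rough model of interpolation-based frame generation: presented fps
# doubles, but latency grows by roughly one *real* frame time because
# a generated frame is shown while the next real frame is held back.

def presented_fps(real_fps: float) -> float:
    return 2 * real_fps          # one generated frame per real frame

def added_latency_ms(real_fps: float) -> float:
    return 1000.0 / real_fps     # ~one real frame of extra delay

# e.g. 40 real fps -> 80 presented fps at ~25 ms extra latency:
# much smoother motion, responsiveness close to the 40fps baseline.
```

This is why FG reads as a smoothness tool rather than a performance tool: the judder below 60fps disappears, while the feel stays near the underlying frame rate.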
 
Oof this game is heavy, and that vignetting effect is way too pronounced imo.

Anyway 2560x1440 DLAA with 100% super resolution/maxed out:

R6yH2pA.jpeg




Fortunately this is not really true, in fact with Nvidia Reflex + boost it's even lower than native:

3.png


You can find other examples on Youtube if you want. Some games have a bit higher latency but it's because Reflex + boost was very new. Now with recent games it's perfectly fine.
That native number is not with Reflex on. You'll obviously get better lag with native + Reflex.
 
Oof this game is heavy, and that vignetting effect is way too pronounced imo.

Anyway 2560x1440 DLAA with 100% super resolution/maxed out:

R6yH2pA.jpeg




Fortunately this is not really true, in fact with Nvidia Reflex + boost it's even lower than native:

3.png


You can find other examples on Youtube if you want. Some games have a bit higher latency but it's because Reflex + boost was very new. Now with recent games it's perfectly fine.

DLSS SR plus Reflex has the lowest input lag; if you add FG to it, input latency increases. There is no way around it.

DLSS frame gen is levels above FSR frame gen or that software-based one you can buy on Steam.

It's not. You can use DLSS SR and FSR3 frame generation in tandem in a few Sony games, and the quality of the fake frames is almost the same as DLSS FG (even DF confirmed it). The biggest difference is that DLSS FG uses MASSIVE amounts of VRAM, which kills 8GB and 12GB cards on the spot, while FSR3 is very light.

This game, for some reason (💵), locks Ampere and Turing users out of combining FSR 3.1 frame generation with DLSS.
 
That native number is not with Reflex on. You'll obviously get better lag with native + Reflex.
Well, in fact you can see that native + Reflex produces 63.5ms. When you activate DLSS you get even lower latency, the best result being with DLSS Performance.

Even DLSS quality + FG is better than native + FG. This tech is amazing.
 
DLSS SR plus Reflex has the lowest input lag; if you add FG to it, input latency increases. There is no way around it.



It's not. You can use DLSS SR and FSR3 frame generation in tandem in a few Sony games, and the quality of the fake frames is almost the same as DLSS FG (even DF confirmed it). The biggest difference is that DLSS FG uses MASSIVE amounts of VRAM, which kills 8GB and 12GB cards on the spot, while FSR3 is very light.

This game, for some reason (💵), locks Ampere and Turing users out of combining FSR 3.1 frame generation with DLSS.

I've got 24gb of VRAM dude.
 
Well, in fact you can see that native + Reflex produces 63.5ms. When you activate DLSS you get even lower latency, the best result being with DLSS Performance.

Even DLSS quality + FG is better than native + FG. This tech is amazing.
Yeah, my bad. Native + Reflex off is in last place. But you get almost the same lag because DLSS Quality runs at a lower resolution. FG alone always adds lag at the same resolution.
 
Yeah, my bad. Native + Reflex off is in last place. But you get almost the same lag because DLSS Quality runs at a lower resolution. FG alone always adds lag at the same resolution.
If FG is in a game, so is Reflex + Boost, so you always have the best of both worlds. In theory you are right, but the tech is built in a way that your configuration (FG alone) never happens.

YOU have.

The majority of Ada cards don't have that, and Nvidia is still marketing DLSS 3 frame gen to those customers; something that is unusable on cards with less than 12GB (and even has problems at 12GB in some games).

You can see in my screenshot that it uses 8.9GB, for example, so 12GB cards are fine. And if you do reach the memory limit, just lower a few settings. Not to mention that very few people buy a 4060 to play at 4K...

I thought this was obvious:

4060 (8GB): 1080p
4070 (12GB): 1080p/2K
4080/4090 (16GB+): 2K/4K
 
If FG is in a game, so is Reflex + Boost, so you always have the best of both worlds. In theory you are right, but the tech is built in a way that your configuration (FG alone) never happens.
DLSS Quality + Reflex + Boost will give me lower lag than DLSS Quality + FG + Reflex + Boost.
 