Black Myth: Wukong PC benchmark tool available on Steam

I ran the benchmark at native 8K with maximum settings on my 3090.
I was getting like 3 seconds per frame instead of frames per second lol.
I guess I was struggling with VRAM, because it took about 4 minutes to load after I clicked run, and the results show my VRAM maxed out.

WwdJRUu.jpeg
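As a rough illustration of why native 8K eats VRAM so quickly: a single full-resolution RGBA16F render target at 7680x4320 is already around a quarter of a gigabyte, and a deferred/UE5-style pipeline keeps many full-res buffers alive at once, before any textures, Nanite streaming, or RT acceleration structures are counted. The sketch below uses hypothetical buffer formats and counts (the game's actual layout isn't public), so treat the numbers as back-of-the-envelope only.

```python
# Rough, illustrative estimate of render-target memory at native 8K.
# The buffer list is hypothetical; the real game's layout is not public.
WIDTH, HEIGHT = 7680, 4320
pixels = WIDTH * HEIGHT  # ~33.2 million pixels

# (name, bytes per pixel) -- made-up examples of common formats
targets = [
    ("gbuffer RGBA16F", 8),
    ("gbuffer RGBA8 x2", 8),
    ("depth/stencil D32S8", 5),
    ("HDR scene color RGBA16F", 8),
    ("TAA/DLSS history RGBA16F", 8),
    ("motion vectors RG16F", 4),
]

total = sum(pixels * bpp for _, bpp in targets)
for name, bpp in targets:
    print(f"{name:28s} {pixels * bpp / 2**20:7.0f} MiB")
print(f"{'total (render targets only)':28s} {total / 2**30:7.2f} GiB")
```

Even this toy list lands above a gigabyte in render targets alone, so a maxed-out 24GB card at 8K with everything cranked isn't surprising.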
 
I wonder if RT set to Low just adds RT shadows, while Medium adds RT reflections and Very High the full hardware GI.

If RT shadows are the biggest win, I'd rather just have them on.

So with my 3080 10GB, I got a 17fps difference between RT Medium and RT Very High:

XWMtntT.png

9rdaXxn.png
 
I ran the benchmark at native 8K with maximum settings on my 3090.
I was getting like 3 seconds per frame instead of frames per second lol.
I guess I was struggling with VRAM, because it took about 4 minutes to load after I clicked run, and the results show my VRAM maxed out.

WwdJRUu.jpeg
You are a sadist to your graphics card.
 
Genuinely impressed with SD performance.
5qO2YKF.jpeg

And it still looked great! Shadow draw distance was pretty short though. The game uses Nanite, right? The lack of pop-in is a tremendous game changer for grounding the scene.
 
I only got a quick test in before bed: 7800X3D, 4070 Ti, 3440x1440, default settings (Very High, no RT), and got 93FPS, good enough for me. The image quality was great, I didn't see any pixelation/artifacts.
 
I ran the benchmark at native 8K with maximum settings on my 3090.
I was getting like 3 seconds per frame instead of frames per second lol.
I guess I was struggling with VRAM, because it took about 4 minutes to load after I clicked run, and the results show my VRAM maxed out.

WwdJRUu.jpeg
Why would you do this?

OxKrkUY.gif
 
Finding some actually playable settings for my 3090:

eqsnI8P.jpeg


ZpgJEcB.jpeg


WtZAhjP.jpeg


Running at native 4K, even with ray tracing completely turned off, was well below 30fps, so DLSS is a must for my old card.
VRR on my monitor goes down to 40Hz, so keeping my minimum above 40fps should be the way to go (if I were going to buy it, that is).
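If you want to sanity-check that VRR reasoning with numbers, a minimal sketch along these lines works: take a frametime capture (the list below is made up; a real one would come from whatever overlay or capture tool you use), compute the average and a crude 1% low, and check the low against the 40Hz floor mentioned above.

```python
# Minimal sketch: check whether a frametime capture stays above a VRR floor.
# frame_times_ms is a hypothetical capture, not real benchmark data.
frame_times_ms = [18.2, 19.0, 17.5, 21.3, 24.8, 19.9, 18.4, 26.1, 20.0, 19.2]
vrr_floor_fps = 40  # lowest refresh the monitor's VRR range covers

fps = sorted(1000.0 / t for t in frame_times_ms)            # per-frame fps, worst first
avg_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
one_percent_low = fps[max(0, int(len(fps) * 0.01))]         # crude 1% low: worst ~1% frame

print(f"average: {avg_fps:.1f} fps, 1% low: {one_percent_low:.1f} fps")
print("inside VRR range" if one_percent_low >= vrr_floor_fps else "drops below VRR floor")
```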
 
Ok, I'm not a PC gamer (anymore), but there's barely anything happening in the benchmark. No destruction physics, combat, alpha effects, particles, fast cuts in camera placement… it's a camera flowing around a river. Is this your typical PC benchmark? How is this stretching hardware at any level? How would the results be representative of your frame rate during gameplay?
Stumbled across this, Nanite might be taking its toll. But it's not uncommon for a benchmark scene to have no combat, Cyberpunk 2077 for example.

 
I'm surprised at how well it runs. 3090 FE, i9-9900K, I can put everything on the highest setting except shadows on High at 2K resolution, path tracing off and DLSS on Quality, and get a solid 60fps. I'm excited for next week. Hopefully this game clears up the GOTY race.
 
Too lazy to screen cap, so here's a shitty cell phone image. Edit: NVM, forgot GAF never lets me upload photos on here. Idk why that's even a feature since it's busted.

Tried a lot of different things on my 3080, like going to 1440p, certain things on High, RT Low or off, etc.

4K, High on all settings, textures on Cinematic, and DLSS right on the cutoff between Balanced and Quality (61?).

I'm still going to wait and see what the word is on the PS5 version's performance, whether it beats this or is on par.
 

7900 XTX VS RTX 4090 | R7 7800X3D | i9 14900KS



Seems like something is wrong on the AMD side with RT on; shadows are bugged or something, because there is a massive quality difference in RT between AMD and Nvidia.
 
Starting to think that even a 5090 will not be enough to adequately run upcoming next-gen games on my DUHD UW...
Holy shit, seeing a 4090 struggle like this in "normal" 4K without FG...
I'll test this on my 3080, but somehow I think I won't like the results, as I'm kind of an "if I have to compromise, I'd rather not play it at all" person.
 
Starting to think that even a 5090 will not be enough to adequately run upcoming next-gen games on my DUHD UW...
Holy shit, seeing a 4090 struggle like this in "normal" 4K without FG...
I'll test this on my 3080, but somehow I think I won't like the results, as I'm kind of an "if I have to compromise, I'd rather not play it at all" person.

Expect Stalker 2, which is also using Lumen and Nanite by default, to tax GPUs even harder than this.
 
q7SgckN.jpeg


HppvpX8.jpeg


100FPS at native 3440x1440 with everything maxed, or 127FPS with DLSS Quality. Not bad.

Edit - Apparently it turned RT off. Let me try that again.

63FPS native and 102 with DLSS, with max RT.
P1eb69G.jpeg
0HxxLbz.jpeg
 
Starting to think that even a 5090 will not be enough to adequately run upcoming next-gen games on my DUHD UW...
Holy shit, seeing a 4090 struggle like this in "normal" 4K without FG...
I'll test this on my 3080, but somehow I think I won't like the results, as I'm kind of an "if I have to compromise, I'd rather not play it at all" person.

And this is just a trail cam with no action….

I fear for the performance of the game come Monday; I think this is where the sweet spot I found in the benchmark will have to be dialed down in settings to be actually playable. I fear the stutter…
 
Nope, in normal RT there usually is very little difference. With path tracing Ada shows bigger differences, but not to this extent:

AW2, 4070 is 14% better

UajJmC5.jpeg


CP2077, 4070 is 20% better

ficmyB2.jpeg


BMW (this time in 1080p), 4070 is 42% better

zwAKIuR.jpeg


And this is normal RT average between those GPUs from TPU:

relative-performance-rt-1920-1080.png

Fair point

Could something be amiss though? Alan Wake 2 and Cyberpunk being on internal engines with good engineers behind the VFX already makes me more optimistic than Unreal 5 plug-ins. We don't have any complex path tracing + Nanite games until this one, right? Also, I don't know about Alan Wake 2, but Cyberpunk is low on dynamic shaders and more computationally heavy, perfect for inline ray tracing, which means DXR 1.1 and is also why AMD cards performed well on it. SER on Ada might have alleviated a bit of the workload compared to Ampere, but not by that much. Foliage and forests are basically dynamic shader galore; what if that's where SER comes in at its best?

I just don't believe in jumping to an "engineers made it this way" tin-foil-hat conspiracy conclusion with so little information on why the game performs better on Ada. There are too many unknowns.
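To make the SER speculation above slightly more concrete: the point of Shader Execution Reordering is that ray hits arrive in an incoherent order (leaf, bark, rock, leaf, water...), which makes neighbouring threads run different shading paths, and reordering the work so hits with the same material are shaded together restores coherence. The sketch below is only a toy CPU-side illustration of that sorting idea with made-up material names; it is not how Ada's hardware or the DXR API actually expose SER.

```python
# Toy illustration of the coherence idea behind Shader Execution Reordering:
# group ray hits by material before "shading" them, so each batch runs the
# same code path. Purely conceptual -- real SER happens inside the GPU.
from collections import defaultdict
from random import choice, seed

seed(0)
MATERIALS = ["leaf", "bark", "rock", "water", "moss"]

# Hypothetical hit records: (ray_id, material) in arrival (incoherent) order.
hits = [(i, choice(MATERIALS)) for i in range(32)]

# Arrival order: neighbouring hits constantly switch material (divergence).
switches_unsorted = sum(1 for a, b in zip(hits, hits[1:]) if a[1] != b[1])

# "Reordered" execution: bucket hits by material, then shade bucket by bucket.
buckets = defaultdict(list)
for ray_id, mat in hits:
    buckets[mat].append(ray_id)
reordered = [(ray_id, mat) for mat in buckets for ray_id in buckets[mat]]
switches_sorted = sum(1 for a, b in zip(reordered, reordered[1:]) if a[1] != b[1])

print(f"material switches, arrival order:    {switches_unsorted}")
print(f"material switches, after reordering: {switches_sorted}")  # == len(buckets) - 1
```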
 
3080 boys where we at?
Tried several different things.

RT on max is pretty much impossible.

RT on Low can work at 1440p DLSS Performance (720p internal resolution) and I can hit 60fps at least.

RT off at 4K DLSS Performance sits around 50fps, but turning settings down from Very High to High gets you that locked 60fps.
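For reference, those DLSS modes map to internal resolutions via the commonly cited per-axis scale factors (Quality ~0.667, Balanced ~0.58, Performance 0.5, Ultra Performance ~0.333); the game's own slider may not use exactly these values, so take this as a sketch:

```python
# Internal render resolution for each DLSS mode, using the commonly cited
# per-axis scale factors. A specific game may deviate slightly from these.
SCALES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.333}

def internal_res(width, height, mode):
    s = SCALES[mode]
    return round(width * s), round(height * s)

for out in [(2560, 1440), (3840, 2160)]:
    for mode in SCALES:
        w, h = internal_res(*out, mode)
        print(f"{out[0]}x{out[1]} {mode:17s} -> {w}x{h}")
```

Which is where the 2560x1440 Performance = 1280x720 internal figure above comes from.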
 
Getting an average of 40 but some extreme slowdowns (9fps) for some reason... anyone more knowledgeable know what it could be? Maybe a slow HDD that slows everything down when the game has to load new textures? Is that a thing? XD
 
Tried several different things.

RT on max is pretty much impossible.

RT on Low can work at 1440p DLSS Performance (720p internal resolution) and I can hit 60fps at least.

RT off at 4K DLSS Performance sits around 50fps, but turning settings down from Very High to High gets you that locked 60fps.
Yep same kind of results I had.

I'm going to wait and see how the PS5 performance mode fares, based on firsthand impressions on here.

Have a feeling that'll be the better path.
 
Wtf this thing is horribly optimised.
Has to run at half rendering res to get high fps.

Will wait for a deep sale or when they optimise it some more.

Can only imagine what it will look like on PS5.
Seems like UE5 performance more than anything else, honestly. GPUs just aren't powerful enough, and they're too expensive.
 
Seems like UE5 performance more than anything else, honestly. GPUs just aren't powerful enough, and they're too expensive.

Yup, flabbergasted anyone is surprised by UE5 as of now. It's a hog.

Why would you not play most games at max settings if you have the latest and best GPU, etc.?

So OverHeat and a few others

Arrested Development Tobias GIF


Not me :messenger_loudly_crying:, that 3080 Ti was nice when it was one of the top 2 cards though, for a short moment.
 
Getting an average of 40 but some extreme slowdowns (9fps) for some reason... anyone more knowledgeable know what it could be? Maybe a slow HDD that slows everything down when the game has to load new textures? Is that a thing? XD
Are you playing on an HDD?
 
Getting an average of 40 but some extreme slowdowns (9fps) for some reason... anyone more knowledgeable know what it could be? Maybe a slow HDD that slows everything down when the game has to load new textures? Is that a thing? XD
Are you running out of VRAM?
This used to happen to me playing FF15 maxed out on my 3GB 780 Ti back in the day.
 
Sooo... Anyone tried the infamous 4060???
I'm currently away, and I hope I can achieve 2K DLSS (playing on a C1 OLED), medium settings, and some sort of RT... Around 40fps is fine for me...
 
Starting to think that even a 5090 will not be enough to adequately run upcoming next-gen games on my DUHD UW...
Holy shit, seeing a 4090 struggle like this in "normal" 4K without FG...
I'll test this on my 3080, but somehow I think I won't like the results, as I'm kind of an "if I have to compromise, I'd rather not play it at all" person.

Will be interesting to see if we get another 2x performance increase with the 5090 in RT titles. I hope we do.
 
The use of Frame Gen should be banned from all benchmark screenshots. The real FPS is like half of what is displayed.
True fps will always be better, but DLSS FG can transform the experience if your GPU cannot deliver a high refresh rate without help.

For example, I'm playing Alan Wake 2 right now. This game is extremely demanding with path tracing and it also has a very blurry image, so I play it at 4K DLSS Performance downscaled (DLDSR) to 1440p to get a reasonably sharp image. At these settings I get around 45-60fps in the forest levels (over 60fps in the NYC level). The game is still playable on my VRR monitor, but I can clearly see when performance drops below 60fps (I start seeing judder) and it can be annoying.

Normally I would cap the frame rate at 45fps to play a game like this, but now I don't need to. Thanks to DLSS frame generation the game always feels smooth, and input lag isn't really that noticeable (especially on a gamepad). I tried playing the game at a real 80-100fps (FG and PT turned off) and my gameplay experience wasn't really that different, so I went back to playing with max PT settings and FG because the game looks better.

Thanks to DLSS FG I stopped looking at the fps counter and worrying if my performance dips below 60fps. I also plan to play Black Myth with FG on.
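On the earlier point that FG roughly doubles the displayed number: with frame generation inserting one generated frame per rendered frame, the rendered frame rate (the cadence your inputs and game logic actually run at) is about half of what the counter shows. A rough sketch, ignoring FG's own overhead:

```python
# Rough relationship between FG-displayed fps and the underlying rendered fps,
# assuming one generated frame per rendered frame and ignoring FG overhead.
def rendered_from_displayed(displayed_fps):
    rendered_fps = displayed_fps / 2          # every other frame is generated
    frametime_ms = 1000.0 / rendered_fps      # cadence the game logic/input runs at
    return rendered_fps, frametime_ms

for shown in (60, 90, 120):
    fps, ft = rendered_from_displayed(shown)
    print(f"displayed {shown:3d} fps -> rendered ~{fps:.0f} fps (~{ft:.1f} ms per real frame)")
```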
 
Processor Intel(R) Core(TM) i9-9900K CPU @ 3.60 GHz

Installed RAM 32.0 GB

Nvidia EVGA 3080

Dell 24" GSYNC 165 hz 2560x1440 monitor


0jIi8v1.jpeg
hDLv9c4.jpeg
 