
Black Myth being hard to run is a myth. PC urban legends about hardware requirements?

Gamer79

Predicts the worst decade for Sony starting 2022
I keep seeing on every forum how this game is breaking rigs and you need a thousand-dollar card to run it, so I did some testing on my own. I don't know if PC enthusiasts love to exaggerate, but needing a thousand-dollar card to run games like this is complete bullshit. My card was a sub-$500 card and I will let the results speak for themselves. Sure, these results are using DLSS and frame generation, but to be honest, I cannot tell the difference between it and native. I have seen this trend with many games like Cyberpunk, where there are these crazy legends about how much hardware you need to max these games out. My rig cost less than $1200 total (self-built, with 3 TB of NVMe storage) and there isn't a game I can't run at very high settings at 60 fps, including Alan Wake 2. I don't know where this bullshit comes from, but it scares people away with the perception of cost. Many people like myself, before I educated myself, listened to this bullshit and never wanted to build a computer because I thought I might need a mortgage to do so.

Oh, but I am talking out of my ass, right? There is no way that my shitty $500 4070 is running this game at all decently according to the usual legend, right? Taken from my 65-inch OLED, and I can attest it looks gorgeous!

[screenshot: Io5pHXd.jpeg]
 

SlimySnake

Flashless at the Golden Globes
It’s not bad at all. Especially compared to other games this year.

Path tracing is optimized for the 4000 series cards. I can’t run it without hitching.

But other than that, it runs like a dream at high settings at 60 fps, with DLSS 4K quality to performance depending on the area. Looks insane too. Lumen GI is excellent.
 

Gamer79

Predicts the worst decade for Sony starting 2022
You used frame generation, this post effectively has no point.

Whether it "looks the same" or not, you're still getting input latency equivalent to the REAL framerate, so if your PC is struggling to get 30-40fps, that's your actual input latency, which is absolutely abysmal to feel.
Again, I hear that and it's bullshit. I play the games and have been a native console player most of my life. I don't feel the input latency, and again, in a blind test I would not be able to tell the difference. Only if I were playing competitive multiplayer would I guess it would make any difference.
 

buenoblue

Member
The truth is most PC games run much worse on release day; after numerous patches they run much better. Obviously the buzz and reviews are from the release version that runs worse. So yeah, if you wait 6 months to a year you will have a better experience.
 
The main issue I had, and a lot of people had, is the stuttering issue, and that isn't related to hardware in this case. Unreal Engine 5 has made many games using it suffer from stuttering, and it can be bad enough to ruin your experience.

Unreal Engine 5 is a bit of a hog on its own to run. I think Black Myth on consoles even turned off certain features to try and help it run better, but in most cases, it still had stuttering issues.

This game is great looking even with lowered settings though. This isn't a game for PC vs console wars either. Again, it's the game engine that is making things difficult.
 

Danny Dudekisser

I paid good money for this Dynex!
I like how frame gen and the myriad upscaling techs out there have helped make people even less informed about how games should perform.

Those results are not good - if that's what you're getting with all of those band-aids applied, it's not encouraging.
 

FingerBang

Member
I keep seeing on every forum how this game is breaking rigs and you need a thousand-dollar card to run it, so I did some testing on my own. [...]
Frame gen on? You're going to love the 5070. It will run this game at 160fps for you.
 

rofif

Can’t Git Gud
Every GAF PC glazer makes it seem like they can only play games at 4K ultra and console peasants are a joke at 1800p.

But most GAFers seem to play at 1440p and lie about performance,
internal 1080p with frame gen.
But I am the stupid peasant who plays a 40fps mode on PS5 Pro.
Granted, he gets higher graphical settings.
 
I keep seeing on every forum how this game is breaking rigs and you need a thousand-dollar card to run it, so I did some testing on my own. [...]
Or you can turn off ray tracing and have improved frametimes, 1% lows and input lag.

It barely makes a difference to the visuals in this game.
 

GymWolf

Member
I have a powerful PC, but other than some rare stuttering when you see something new, the game runs like a dream on my end. Vastly inferior-looking games run much worse, so criticizing arguably the best-looking game of the year for not running perfectly on every hardware is absurd.

You can't have state-of-the-art graphics running like your average indie.

Also, the RT is barely noticeable most of the time; disable it and you get a lot of frames in return.
 

GHG

Member
I keep seeing on every forum how this game is breaking rigs and you need a thousand-dollar card to run it, so I did some testing on my own. [...]

You're using frame generation and you're running it at 1440p on a 65-inch 4K TV.

Sorry, but that will look like ass.
 

MMaRsu

Member
Isn't Wu Kong just unoptimized?

I own the game but barely played it. Lots of small paths where you can walk for 5 seconds and hit an invisible wall. What a technical marvel.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Max settings?
Stupid PC gamers setting games to max settings and then wondering why they aren't getting 4K120fps.
They then turn around and call the game unoptimized when in truth we should be happy whenever a forward-looking game comes out... you shouldn't be able to hit 4K120 max settings with an RTX 2060S.

Isn't Wu Kong just unoptimized?

No
 

PeteBull

Member
Oh, but I am talking out of my ass, right? [...]
Your "shitty" 4070 is stronger than a PS5 Pro, bro; of course it will run the game well. And of course, just like you showed, you have to lower the settings (the cinematic preset is extremely demanding, that's like ultra beyond ultra; many games even hide settings like that in an .ini file) and run the game at 1440p with DLSS and frame gen on ;).
I checked on Steam a few months ago and you are easily in the top 10% of Steam users in terms of GPU performance, btw (even on raw performance alone, i.e. including high-end AMD cards, which obviously lag behind at RT :p).
You can see exactly where the 4070 sits in the GPU hierarchy: at about 50% of the current (still, for 2 more weeks ;p) best-in-slot GPU, the RTX 4090 (which is $2500+ for even the cheapest models now), so it's still quite good ;)
 

yamaci17

Member
Isn't Wu Kong just unoptimized?

I own the game but barely played it. Lots of small paths where you can walk for 5 seconds and hit an invisible wall. What a technical marvel.
People just obsess over max or full ray tracing settings or try to find flaws in things all the time.

I have two friends, one with a 3070 and the other with a 6700 XT. They played this game from start to finish, and I never heard either of them having any problems whatsoever (and funnily enough, the 3070 user plays at 1440p and the 6700 XT user plays at 1080p).
 
You're using frame generation and you're running it at 1440p on a 65-inch 4K TV.

Sorry, but that will look like ass.
1440p on a 4K monitor should be a crime.

He should disable ray tracing (the most overrated feature to grace gaming - yes, I said it), play at 4K balanced or performance, and enjoy superior image quality and performance.
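For reference, the internal resolutions being argued about here follow directly from DLSS's per-axis render-scale factors. A quick sketch, using the commonly cited factors (approximations, not official Nvidia numbers):

```python
# Commonly cited DLSS per-axis render-scale factors (approximate).
DLSS_SCALES = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution DLSS upscales from, for a given output."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "quality"))      # (2560, 1440): "4K quality" renders at 1440p
print(internal_res(3840, 2160, "performance"))  # (1920, 1080): "4K performance" renders at 1080p
```

So "4K quality" really is 1440p internally, which is what both sides of this argument are describing.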
 

Stuart360

Member
Pretty sure people are talking about path tracing at native 4K.
Even my 1080 Ti, when I had it, ran that benchmark thing at 1080p/60 max settings minus RT.
 

rofif

Can’t Git Gud
Ehh??? I think you have no idea what you're talking about



Why would I want the effect at 0:17 ????

I like the effect. It salvages 40 and even 60fps.
It's weaker at higher fps, just to make the motion smoother.
Besides, I was describing per-object motion blur, but I like camera motion blur too.
If you enable motion blur at 240fps in Doom, for example, it looks the same as off because there are enough frames.
I don't know if it works like this in Uncharted.
But in Wukong at 40fps, it's excellent.
 

Gaiff

SBI’s Resident Gaslighter
Max settings?
Stupid PC gamers setting games to max settings and then wondering why they aren't getting 4K120fps.
They then turn around and call the game unoptimized when in truth we should be happy whenever a forward-looking game comes out... you shouldn't be able to hit 4K120 max settings with an RTX 2060S.
It's not max settings. The cinematic preset is max settings.

As for Gamer79, I recommend disabling frame generation and Full Ray Tracing and cranking DLSS up to above 90%. You would be playing 1440p with DLAA. You can probably get a native 60fps or close to it. If you cannot, bump the lighting down to High and you should be there.
 

yamaci17

Member
So Wu Kong doesn't do camera motion blur?

I turned that shit off as fast as possible, so I didn't know.
Most of the time developers just bundle all kinds of motion blur into one setting (something that Alex from Digital Foundry hates), so you either get both per-object blur and camera blur or none.
 

Larxia

Member
You're using DLSS and frame generation. Try running it at native and see how it performs.
DLSS should be something optional to boost performance; having it be mandatory is a modern problem with badly optimized games.
Frame generation shouldn't even be considered; this should just die, and I find it sad if people get used to this and accept it as a mandatory setting.
Also, it's weird to call a 4070 shitty.
 

Nvzman

Member
Again, I hear that and it's bullshit. I play the games and have been a native console player most of my life. I don't feel the input latency, and again, in a blind test I would not be able to tell the difference. Only if I were playing competitive multiplayer would I guess it would make any difference.
It's not "bullshit"; it's exactly how frame generation works: it does NOT improve input latency with the fake frames added.

This is like looking at a 240p CRT, saying it looks like a 4K TV to your eyes, and then putting your fingers in your ears and shouting "I CAN'T TELL THE DIFFERENCE, IT'S BULLSHIT". Your terrible ability to tell the difference doesn't make it not true. That's why people are complaining about the optimization. I personally cannot stand frame generation; in nearly every game I play, I can immediately tell when it's on from the amount of smeary crap in the aliasing, and the inputs feel off. Anyone who argues that the difference is negligible, especially when they're getting 70fps WITH IT ON (which means the real framerate must be terrible), shouldn't even be posting like they know what they're talking about on a forum, because they clearly don't.
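The arithmetic behind this is simple enough to sketch. A rough model (the 40fps base rate is just an example, and real pipelines add more latency on top of this):

```python
# Frame generation raises the displayed framerate, but input is only sampled
# on real (rendered) frames, so responsiveness tracks the base rate.
def frame_time_ms(fps: float) -> float:
    """Milliseconds between consecutive frames at a given framerate."""
    return 1000.0 / fps

base_fps = 40                  # what the GPU actually renders
displayed_fps = 2 * base_fps   # what 2x frame gen puts on screen

print(f"{displayed_fps} fps shown -> {frame_time_ms(displayed_fps):.1f} ms between displayed frames")
print(f"input still sampled at {base_fps} fps -> {frame_time_ms(base_fps):.1f} ms per real frame")
```

The display shows a frame every 12.5 ms, but your inputs still only land once per 25 ms real frame; that gap is what people mean by "it looks smooth but feels like 40fps".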
 

proandrad

Member
You used frame generation, this post effectively has no point.

Whether it "looks the same" or not, you're still getting input latency equivalent to the REAL framerate, so if your PC is struggling to get 30-40fps, that's your actual input latency, which is absolutely abysmal to feel.
The framerate does affect input lag, but that doesn't necessarily mean the response time will feel bad. It's highly game-dependent. You can lower latency elsewhere, and options like Nvidia Reflex can effectively cancel out the ms difference between 30 and 60fps.
 

Nvzman

Member
The framerate does affect input lag, but that doesn't necessarily mean the response time will feel bad. It's highly game-dependent. You can lower latency elsewhere, and options like Nvidia Reflex can effectively cancel out the ms difference between 30 and 60fps.
It's true you can take mitigating measures to make it feel less crap, but I'm sorry, there is no way to cancel it out. That's objectively false, and whether it even "feels" any better is subjective. Also, Nvidia Reflex with FG off is going to feel significantly more responsive than with FG on; it doesn't reduce the latency induced by the fake frames.
 
I have a 4070 Ti Super and I run the game without ray tracing, no frame gen... with DLSS at 50% and custom graphics settings to achieve 90-105 fps.

I still think that the game is too demanding for what it is.
 

KyoZz

Tag, you're it.
internal 1080p with frame gen.
But I am the stupid peasant who plays a 40fps mode on PS5 Pro.
Granted, he gets higher graphical settings.
Because on console you get even WORSE LATENCY, WORSE GRAPHICS, AND WORSE RESOLUTION than on PC, whether it is with FG or not.

We keep telling you that, but somehow you don't want to listen. So please, PLEASE, enjoy your console and stop coming into PC threads to talk about things you don't understand. Just enjoy what you have! Stop trying to convince other people that your opinion is the only one that makes sense.

Just enjoy your console!! And I know it's a lost cause because you always want to be right, but still, I want to try to explain it to you one last time. Please, Rofif: show that you can be smart!
 

poppabk

Cheeks Spread for Digital Only Future
Ehh??? I think you have no idea what you're talking about



Why would I want the effect at 0:17 ????

Both look crappy in different ways. Not sure what the framerate is here, but without motion blur everything looks juddery; at least the motion blur makes things look somewhat smooth, albeit with the intensity a little too high.
 

proandrad

Member
It's true you can take mitigating measures to make it feel less crap, but I'm sorry, there is no way to cancel it out. That's objectively false, and whether it even "feels" any better is subjective. Also, Nvidia Reflex with FG off is going to feel significantly more responsive than with FG on; it doesn't reduce the latency induced by the fake frames.
Again, it really depends on the game. It can definitely more than cancel out the ms difference. 30 to 60fps is a difference of 16.6ms, and Reflex can drop your total system latency by far more than that. Now, you could use Reflex without anything else, but that's a different argument.
 

rofif

Can’t Git Gud
Because on console you get even WORSE LATENCY, WORSE GRAPHICS, AND WORSE RESOLUTION than on PC, whether it is with FG or not.

We keep telling you that, but somehow you don't want to listen. So please, PLEASE, enjoy your console and stop coming into PC threads to talk about things you don't understand. Just enjoy what you have! Stop trying to convince other people that your opinion is the only one that makes sense.

Just enjoy your console!! And I know it's a lost cause because you always want to be right, but still, I want to try to explain it to you one last time. Please, Rofif: show that you can be smart!
No.
I've been playing on PC for the past 30 years.
Consoles have been so good for years now. Such a carefree, good experience. I got tired of the bs for the exact reasons you outline. Caring about all the latency, the technical stuff and everything became more important than just playing the game.
Consoles freed me from this. No more worrying about anything and being an Nvidia shill.

Of course I know the PC experience is faster and better and everything, but I don't care.
Maybe one day you will get enough years of video game experience to realise what's important. Console is a vacation from PC gaming.
Now that I have the means to get any PC I want, I could go back to high-end PC gaming. But why? To care about stuff I don't have to care about right now?

Please respect my choices. I play on console for all the reasons that are not monetary. And consoles nowadays are great, now more than ever; it's not a laggy experience. But I was fine even with Xbox 360 20fps games... at least for the exclusives back then.
 

MikeM

Member
Higher numbers at the cost of latency and potential artifacts. Glad you enjoy it, but I'm too sensitive to latency to play that way.

Not everyone will enjoy a frame gen experience.
 

MMaRsu

Member
Both look crappy in different ways. Not sure what the framerate is here, but without motion blur everything looks juddery; at least the motion blur makes things look somewhat smooth, albeit with the intensity a little too high.

Well, I'm pretty sure this is on consoles. Of course it will look better in motion on PC.
 