
[DF] Monster Hunter Wilds - we can't recommend the PC version





Full article:
That's a surprise, as our initial impressions of the game on PC were fairly positive. The game runs a long shader compilation step on first launch, taking around six minutes on a Ryzen 7 9800X3D and more than 13 minutes on a Ryzen 5 3600. The graphical settings menu is also nicely designed, with fine-grained options, a VRAM meter, component-specific performance implications and preview images for particular settings.

However, there's some weirdness too. The game prompts you with a pop-up to enable frame generation before it even starts, even on low-end machines where frame generation doesn't really make sense. (Neither AMD nor Nvidia recommends using frame-gen from a low base frame-rate - 30fps, for example.) If you decline this offer, the game makes sure to tell you that you can turn it on in the settings menu, which, to a reviewer, paints a poor picture of the game's expected performance.
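It's easy to put rough numbers on why that base frame-rate caveat exists. A back-of-the-envelope sketch (my own illustration, not DF's analysis; interpolation overhead and Reflex are ignored for simplicity):

```python
def with_frame_gen(base_fps: float) -> tuple[float, float]:
    """Approximate what 2x frame generation changes at a given base frame-rate.

    The displayed frame-rate doubles, but input is still sampled once per
    *base* frame, so responsiveness tracks the base frame time.
    """
    displayed_fps = base_fps * 2
    base_frame_time_ms = 1000.0 / base_fps
    return displayed_fps, base_frame_time_ms

# From a 30fps base you see ~60fps, but input latency still sits around a
# 33ms frame time - which is why AMD and Nvidia both advise reaching a
# higher base frame-rate before enabling frame-gen.
shown, latency_ms = with_frame_gen(30.0)
```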

After all the pop-ups, like any casual user I used the game's default detected settings and set DLSS to balanced mode at 1440p on the RTX 4060. Loading up the game presented more issues, with PS3-level texture quality in many shots and many textures that looked like they had loaded incorrectly - one character's white coat was rendered multi-colour by mosaic artefacts due to poorly configured texture compression.

After getting character control following the intro cutscenes, each turn of the camera was greeted by noticeable stutters. This sort of stutter is often triggered by streaming textures into VRAM, but the VRAM meter in the settings was showing plenty of headroom - making it far from a helpful indicator for a casual user. Turning the texture setting down from the default high to medium solved the camera-turn stutter on the RTX 4060 8GB, but I saw similar texture quality problems on the RTX 4070 12GB too. It's also worth noting that users of higher-end GPUs can download "highest" quality textures as separate DLC, but this wasn't available during our testing before release.

Back on the RTX 4060 with medium textures in place, despite lacking the huge stutters we had before, the results still aren't great. The actual texture quality is now reminiscent of games from the early 2000s, and the frame-rate was still dipping noticeably with a hit to frame health when standing in place and turning the camera. If we turn the camera extremely gradually, the sinusoidal rhythm we experienced previously becomes smoother as frame health improves. This is a very odd performance characteristic in what is essentially an empty desert with tiled sand textures, and not something we expect to see in any game, let alone a triple-A release in the year of our lord 2025.
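The "frame health" language maps onto ordinary frame-time statistics. As an illustration of the kind of analysis tools like CapFrameX automate (a minimal sketch with made-up numbers, not DF's captures):

```python
def frame_stats(frame_times_ms):
    """Average fps, 1% low fps, and a count of stutter spikes.

    A spike is counted when a frame takes more than twice the median
    frame time - a crude stand-in for the hitches felt when turning
    the camera.
    """
    times = sorted(frame_times_ms)
    avg_fps = 1000.0 / (sum(times) / len(times))
    # "1% low": average fps over the worst 1% of frames (at least one frame).
    worst = times[-max(1, len(times) // 100):]
    low_fps = 1000.0 / (sum(worst) / len(worst))
    median = times[len(times) // 2]
    stutters = sum(1 for t in frame_times_ms if t > 2 * median)
    return avg_fps, low_fps, stutters

# A steady 16.7ms cadence with a periodic 50ms spike: average fps looks
# fine, but the 1% lows and the stutter count expose the hitching.
avg, low, n = frame_stats([16.7] * 99 + [50.0])
```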

I've seen similar behaviour before with games that exhibited occlusion culling issues on slower CPUs, but this game is running on a Ryzen 7 9800X3D. Otherwise, I've only seen this sort of issue when a game is doing something noticeably suboptimal when streaming assets onto the graphics card, and this is true even when dropping the texture resolution to its lowest setting. This could be a problem for the RTX 4060 in particular, but the 4070 shows similar issues, albeit with higher base frame-rates.

This is where I have to start theorising. I can't be 100 percent sure, but if I had to guess, I would say that this behaviour is due to DirectStorage GPU decompression. The game's files do include DirectStorage DLLs, so it's possible that the game is streaming (and potentially decompressing) textures on the GPU in an overly aggressive manner when turning the camera, despite the VRAM headroom shown by RTSS and CapFrameX, leading to the frame-rate impact that's felt most strongly on compute-challenged cards like the RTX 4060.
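The DLL observation is easy to reproduce at home. A small sketch (the game path in the comment is hypothetical; dstorage.dll and dstoragecore.dll are the DirectStorage runtime's redistributable file names) that checks whether a game ships the runtime:

```python
from pathlib import Path

# Redistributable DLL names shipped by games that bundle DirectStorage.
DSTORAGE_DLLS = {"dstorage.dll", "dstoragecore.dll"}

def uses_directstorage(game_dir: str) -> bool:
    """True if any DirectStorage runtime DLL sits in the game directory tree."""
    names = {p.name.lower() for p in Path(game_dir).rglob("*.dll")}
    return bool(DSTORAGE_DLLS & names)

# e.g. uses_directstorage(r"C:\Games\MonsterHunterWilds") on an installed
# copy. A hit only proves the runtime is shipped, not that GPU
# decompression is actually active - that part remains a theory.
```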

This is all theoretical, of course, but the measured behaviour is enough for us to simply not recommend this game to anyone with a graphics card with 8GB of VRAM or less. Choosing between high textures that cause stutters while looking sub-par and medium textures that look even worse is not a choice we can particularly recommend. I also saw similar issues on the RTX 2070 Super, RX 5700 and Arc A770, with Intel's GPU in particular running extremely poorly, at around 20fps with textures failing to load despite similar settings.

For higher-end GPUs I would also advise caution. Playing through the introduction on the RTX 4070 I saw consistent stuttering even on the high texture setting, so it would be logical to assume this could get even worse with the higher-res texture pack installed - and that's backed up by results from the benchmark tool. Based on my measurements in the base game, I believe the stutters are not related to running out of VRAM, but could in fact be related to constant streaming and perhaps real-time decompression.

To sum up, lower-end graphics cards with lower VRAM allocations should avoid Monster Hunter Wilds until these issues are rectified. Higher-end hardware can brute-force the game's shortcomings to some degree, but it's still hard to recommend such a flawed technical outing - or even derive optimised settings. For now then, we'll leave things there, but I hope we see some much-needed improvements before too long.
 

Fbh

Member
This is people saying they are fine with terribly optimized games, and okay with a future where games NEED to use upscaling and frame generation to even be "playable". I find this sad, to be honest. AAA optimization has been so terrible lately, and it's only going to get worse...

MH has basically achieved Pokemon status. Capcom has given fans a horribly optimized game on every platform but fans simply don't care, so they have literally zero incentives to improve the tech.

I guess congrats to Capcom. There aren't a lot of devs besides them and Game freak that can get away with it.
 

Kacho

Gold Member
This is people saying they are fine with terribly optimized games, and okay with a future where games NEED to use upscaling and frame generation to even be "playable". I find this sad, to be honest. AAA optimization has been so terrible lately, and it's only going to get worse...
tbf Monster Hunter is the exception not the norm. Kinda similar to Cyberpunk, which was also busted but posted insane CCUs.
 
Capcom have to either ditch RE Engine for open world games, or fix the engine and update it significantly so it works for open world games. Dragon's Dogma 2 and Monster Hunter Wilds both ran like crap at launch. Gotta do better.
 

Larxia

Member
MH has basically achieved Pokemon status. Capcom has given fans a horribly optimized game on every platform but fans simply don't care, so they have literally zero incentives to improve the tech.

I guess congrats to Capcom. There aren't a lot of devs besides them and Game freak that can get away with it.
The Pokemon comparison is a good example, yes. I actually said that to a friend who pre-ordered the game, right after he tried the benchmark and it ran terribly on his computer... lol
By doing this, people have sentenced the franchise to low effort forever. No reason for them to do better if it sells like crazy no matter what.
 
Last edited:

poppabk

Cheeks Spread for Digital Only Future
How much VRAM does the 4070 have? Because I've already played for ~12 hours and had zero stutters on my 7800 XT, which is supposed to be slower than a 4070.
Yeah, it seems dumb that they theorize this is a VRAM issue - backed up by medium textures solving the problem on the 4060 - but then don't test a higher-VRAM card.
 
His theory is hilarious when the apparent fix, at the moment, is to simply disable Reflex and FG, as they are broken in the engine and cause massive stuttering.

Also, people have found that the in-game frame-rate cap causes stuttering too (assuming you can hit the cap), and the solution there is to disable it and use Vsync/Gsync instead.
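That cap behaviour would be consistent with a sleep-based limiter. A toy simulation (my own sketch, assuming ~1ms timer granularity - not anything measured from the game) of why a naive in-game cap jitters where a vsync wait doesn't:

```python
import random

def naive_cap_frame_times(target_ms: float, timer_granularity_ms: float,
                          frames: int, seed: int = 0) -> list[float]:
    """Simulate a limiter that sleeps in coarse timer increments.

    Each frame overshoots the target by up to one timer quantum, so frame
    times jitter around the cap instead of landing on it; a Vsync/Gsync
    wait aligns to the display's refresh instead and avoids this drift.
    """
    rng = random.Random(seed)
    return [target_ms + rng.uniform(0.0, timer_granularity_ms)
            for _ in range(frames)]

# A 16.7ms (60fps) cap with ~1ms granularity never lands on 16.7ms
# exactly: every frame runs a little long, by a different amount each time.
times = naive_cap_frame_times(16.7, 1.0, 1000)
```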
 

Rambone

Member
I'm finding DLSS just blows in this game. For Nvidia cards, just switch to AMD FSR 3, native, frame doubling in the game settings. Getting 60-90fps now with my 3080 Ti, everything on high (textures on highest with the texture pack) and a reasonable ultrawide resolution of 3440x1440. Runs butter smooth and no longer looks like shit.
 
Last edited:
I'm finding DLSS just blows in this game. For Nvidia cards, just switch to AMD FSR 3, native, frame doubling in the game settings. Getting 60-90fps now with my 3080 Ti, everything on high (textures on highest with the texture pack) and a reasonable ultrawide resolution of 3440x1440. Runs butter smooth and no longer looks like shit.
Or just force DLSS 4 with multiple different possible methods (including the Nvidia App after the latest driver update) and get amazing IQ compared to DLSS 3. I have no fucking idea why anyone would choose FSR 3 over even DLSS 3, much less DLSS 4 - that sounds like choosing to eat cat food instead of Wagyu steak.
 

Rambone

Member
Or just force DLSS 4 with multiple different possible methods (including the Nvidia App after the latest driver update) and get amazing IQ compared to DLSS 3. I have no fucking idea why anyone would choose FSR 3 over even DLSS 3, much less DLSS 4 - that sounds like choosing to eat cat food instead of Wagyu steak.
I'll give it a shot but it was all about performance.
 

xenosys

Member
tbf Monster Hunter is the exception not the norm. Kinda similar to Cyberpunk, which was also busted but posted insane CCUs.

CDPR's share price took a massive hit back then - one they still haven't fully recovered from, even with 4 years of damage control, patches, great DLC and a rabid online community that has tried to gaslight the rest of the industry into thinking their game wasn't a technical mess for almost 3 years.

Wrong place, wrong time is all it'll take for Capcom to go the same way.
 
Last edited:

Agent_4Seven

Tears of Nintendo
So, what the fuck was that all about? Am I blind, or does the game run perfectly on my rig at a locked 30 FPS, both maxed out (incl RT) at native 1440p DLAA and at 4K DLSS Quality (I'm using the latest DLSS version)? Literally ZERO stutters during camera movement, both fast and slow; same goes for traversal and combat. So WTF? Why hasn't Alex tested the locked 30 FPS option via Riva? Also, was it without the day one patch and the game ready driver NVIDIA released yesterday? I just don't get it. My rig for reference: 8700K @ 4.8GHz all cores, 32GB 3600 RAM (4x8), Z390 Aorus Master, RTX 3080 Ti 12GB VRAM, Windows 11 23H2, DirectStorage enabled, and I'm running the game from a high-end NVMe drive.

As for visuals: I'm sure modders will fix this in no time (or in the not-too-distant future), but the game has exactly the same visual flaws as DD2 had - a complete lack of RT local shadows, shadows from the sun, proper AO to blend objects with the environment and proper material interaction with lighting - and the textures are very bad (SF6 also has this problem). I mean, RE Engine was sure as fuck not built for open world games; it's so obvious that only the blind can't see it. It works exceptionally well for more linear games like RE and such, and I'm sure the new Onimusha game will be great performance-wise. So I just don't fuckin' get it - why is Capcom using RE Engine for open world games instead of making a completely new engine just for them? Maybe based on RE Engine, but completely reworked to work a lot better and not require such insane hardware to run at anything above 30FPS with lots of image reconstruction and other shit ruining the visual quality of the game in the end (not to mention the gameplay experience with FG)?



Anyway, the game itself is great and the story is already way better at the beginning than anything in DD2 story-wise. It's my first MH game, so clearly I need to learn a lot before I can call myself even remotely competent at playing it right :messenger_relieved:
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
So, what the fuck was that all about? Am I blind or the game runs perfectly on my rig with locked 30 FPS in both maxed out (incl RT) native 1440p DLAA and 4K DLSS Quality? [...]
Who plays at 30fps on a 3080 Ti?
 

Agent_4Seven

Tears of Nintendo
how tf u can use those at same time ?
Obviously not at the same time - maybe my wording was wrong. Anyway, I tested both native 1440p and 4K DLSS Quality.

Who plays at 30fps on a 3080 Ti?
Try running this game at anything above 30FPS on a rig comparable to mine, at max settings with native 1440p DLAA or 4K DLSS Quality, and you'll get why I play like that. Spoiler: cuz it's perfectly stable, and anything beyond that is unplayable.

 
Last edited:
So, what the fuck was that all about? Am I blind or the game runs perfectly on my rig with locked 30 FPS in both maxed out (incl RT) native 1440p DLAA and 4K DLSS Quality? [...]
Limiting yourself to 30 fps seems like an exercise in self-flagellation to me

I have a 5800X3D and a 3090, and I'm letting it run uncapped with Vsync/Gsync. At 4K with DLSS 4 Performance using Preset K (the Transformer model), it hardly looks any different from native, except that sometimes hair and grass have a bit of graininess, which is inherent to how TAA-based upscalers work. DLSS 4 is absolutely amazing; if you're playing at native, you're not accomplishing anything besides lowering your frame-rate for no good reason. My slightly altered Ultra settings (no Motion Blur, no Vignette) are easily getting me a solid 50-60fps even in the Base Camp, and I'm getting plenty smooth gameplay with no stuttering issues at all with the High Resolution Texture Pack enabled.

DF are looking more and more stupid here. The game doesn't look great at times, but it runs just fine for me and, judging by the at least 1.3 million PC players all too busy having fun to look at frame time graphs, for the vast majority of other people too. I guess "works on my machine" is the best I can do for the complainers and DF.
 
Last edited:

Agent_4Seven

Tears of Nintendo
Limiting yourself to 30 fps seems like an exercise in self-flagellation to me
My CPU and platform are 5 years old; there's only so much I can do with what I have. Also, unlike 60FPS purists, I'm totally okay with a perfectly stable 30FPS experience with the best visuals possible, and I use a controller on PC for 95% of games cuz I hate digital WASD movement with a passion.

I have a 5800X3D and a 3090 and I'm letting it run uncapped with Vsync/Gsync and at 4K with DLSS 4 Performance using Preset K (Transformer mode) it hardly looks any different from native except that sometimes hair and grass have a bit of graininess which is inherent in how TAA-based upscalers work. DLSS 4 is absolutely amazing, if you're trying to play on native you're not accomplishing anything besides lowering your framerate for no good reason. My slightly altered Ultra settings (no Motion Blur, no Vignette) are easily getting me a solid 50-60 fps even in the Base Camp,
I'm playing at 4K DLSS Quality with vignette and motion blur turned off, so there's that. I tested native 1440p DLAA uncapped and it's not a good experience - barely hitting 40 FPS at the camp, and the frametimes are trash. Native 4K DLAA is unplayable at 20+ FPS. A perfectly stable 30 FPS at 4K DLSS Quality is the only option for me.

and I'm getting plenty smooth gameplay with no stuttering issues at all with the High Resolution Texture Pack enabled
Not an option for the 3080 Ti, cuz it has far less VRAM than the 16GB they recommend at minimum. And... 70GB? WTF? Modders will do a much better job at some point, just like they did for DD2, so fuck that 70GB Capcom shit 24/7.

DF are looking more and more stupid here, the game doesn't look great at times but it runs just fine for me and the vast majority of other people judging from how there's been at least 1.3 million PC players and they are all too busy having fun to look at frame time graphs. I guess "works on my machine" is the best I can do for the complainers and DF
Right? I just don't get it. Where are all those camera stutters, too? I literally haven't had any at all, and the cutscenes are perfect as well.

SpecialK has done some testing and found the issue is the DRM.
This game uses Denuvo and might also use Capcom's own DRM on top, and it's spamming the CPU with checks in every thread it creates.



And some people wonder why gamers hate crap like Denuvo...

No shit. Looks like we have another RE8 here.
 
Last edited:

ebevan91

Member
His theory is hilarious when the apparent fix, at the moment, is to simply disable Reflex and FG, as they are broken in the engine and cause massive stuttering.

Also, people have found that the in-game frame-rate cap causes stuttering too (assuming you can hit the cap), and the solution there is to disable it and use Vsync/Gsync instead.

I was excited to try this (disabling Reflex) but it was already disabled in my game :(
 

Scrawnton

Member
Downloading the hi-res texture pack and playing the game on my 4070S at 4K, ultra settings, no RT, DLSS Balanced and frame gen on gave this game quite the facelift. It's gorgeous. The hi-res texture update is mandatory for PC.
 
Downloading the hi res texture pack and playing the game on my 4070s at 4k, ultra settings, no RTX, DLSS balanced, and frame gen on gave this game quite the facelift. It's gorgeous. The hi res texture update is mandatory for pc.
Those cutscenes though. If those post-process effects were running in real time, it would be a damn good looker.
 

Garibaldi

Member
This is people saying they are fine with terribly optimized games, and okay with a future where games NEED to use upscaling and frame generation to even be "playable". I find this sad, to be honest. AAA optimization has been so terrible lately, and it's only going to get worse...
Totally agree. I had it preordered on CDKeys but ultimately cancelled, as performance was too variable on my 3090 for the price they were asking. My mate said it was largely unplayable on his 3080, and we usually play together.

Will wait to see if they fix it up. Plenty more games to be playing in the meantime.
 
Last edited:

rofif

Can’t Git Gud
All the bitching mirrors what we heard with Dragon's Dogma 2... and again, my buddy with a 4080 is really happy - the game runs great.
You guys are happy, game runs great. I am happy, benchmark runs great.

So wtf? PC gamers forgot how to PC game.
It used to be absolutely normal that a 2-year-old PC was straight up trash. I got that 3dfx card in 1998? Well, good luck with it in 2000. Max Payne would not even boot.
2-3 years was the usual rotation.
Now people have their 5+ year old parts and they're moaning to the heavens about the game running badly?

Yeah, the benchmark ran like trash on my PC... but my 3700X was from 2019 and my 3080 from 2020. TF, I'm in no place to complain.
And guess what? I upgraded to a 5700X3D and the benchmark runs GREAT. Textures one level down, shadows one level down, and the framepacing line is SMOOOOOOTH and flat.

You are a fucking PC gamer. Know your hardware. Know your limitations, and know your settings and diagnostics. The settings are there for a reason, not just to set everything to ultra and complain that your 6-year-old PC stutters.
Ffs, do I have to say that?

And another thing is the whining about bad graphics. It was the same with Dragon's Dogma 2, and I think it looked incredible. Very tangible lighting, felt grounded. I loved how that game looked, and all I saw was complaining?
Can people not recognize what really good graphics look like anymore? Does it need 100 neon lights on the screen or an Instagram filter to "look" good?
Amateur, nouveau, fake PC gamers everywhere. Shameless.

That said, I will be playing on PS5 Pro because it is my preference. If I didn't have a PS5, the PC version would have seemed great going by the benchmark.
 

Buggy Loop

Member
MH has basically achieved Pokemon status. Capcom has given fans a horribly optimized game on every platform but fans simply don't care, so they have literally zero incentives to improve the tech.

I guess congrats to Capcom. There aren't a lot of devs besides them and Game freak that can get away with it.

Nailed it. Couldn’t have been a better comparison.
 

Sentenza

Member
Here we go again with DF and Eurogamer pretending that the issues they spot in a PC version magically don't apply, in even worse forms, to its console counterparts.
 
Last edited:

Bojji

Member
SpecialK has done some testing and found the issue is the DRM.
This game uses Denuvo and might also use Capcom's own DRM on top, and it's spamming the CPU with checks in every thread it creates.



And some people wonder why gamers hate crap like Denuvo...


That's classic Capcom:



So Denuvo isn't enough? Those morons will never learn.
 

Agent_4Seven

Tears of Nintendo
To add my 2 cents: I'm not even a fan of MH games and hadn't played a single one before Wilds came out, so don't "Capcom fan" me :messenger_beaming:

Should the game be playable at 60FPS given just how bad it looks in the respects I mentioned? Absolutely! Will Capcom fix everything, just like they "did" for DD2? Of course not. But the game is fully playable at 30FPS all maxed out, if your hardware can do it and you can stand 30FPS, or even 40-45.

I'm not saying this is OK, cuz it fuckin' isn't, and I'd love to play the game at a perfectly stable 60 without visual compromises, but that's just not gonna happen for anyone unless the DRM is removed completely and a ton of optimization patches are released (which is not gonna happen; see DD2).
 