Metal Gear Solid V: The Phantom Pain PC performance thread

Älg;178507088 said:
If I remember correctly, I also get a lot of stuttering when using in-game v-sync. Try unlocking your framerate and forcing v-sync through NVCP; that might help.

You can read how to unlock the framerate here; it's super simple:

http://pcgamingwiki.com/wiki/Metal_Gear_Solid_V:_The_Phantom_Pain#High_frame_rate
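In short, the wiki fix boils down to a one-line config change. If you'd rather script it, here's a minimal Python sketch; note that the file location and the exact key/value strings are written from memory of that wiki page, so double-check them there before running anything:

Code:
# Flips MGSV's framerate cap from "Auto" (re-locks to 30) to "Variable".
# The path and the key/value strings below are from memory of the
# PCGamingWiki page -- verify them there first. Writes a backup first.
from pathlib import Path

# Hypothetical location; substitute your own Steam folder and user id.
cfg = Path(r"C:\Steam\userdata\<your-id>\287700\local\TPP_GRAPHICS_CONFIG")

text = cfg.read_text()
cfg.with_suffix(".bak").write_text(text)  # keep a backup

cfg.write_text(text.replace('"framerate_control" : "Auto"',
                            '"framerate_control" : "Variable"'))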


OK, did it and it seems to work. Only when Snake does the dive (pressing X) does it drop a couple of frames. But man, this game is making my card run hot, hovering around 75 degrees. Maybe one of my case fans is out.
 
Mhhh, thinking about it, it should be possible to unlock the framerate in-menu with some modding, huh? I think I'll look into this; it would be nice to be able to change graphics settings without losing the unlocked framerate.
 
Mhhh, thinking about it, it should be possible to unlock the framerate in-menu with some modding, huh? I think I'll look into this; it would be nice to be able to change graphics settings without losing the unlocked framerate.

This would be more than fantastic. Indeed, it would be like how people in this very thread have made the "low" cvars turn on custom extra-extra-high cvars. Couldn't the 30 fps lock be made to enable the variable framerate instead of 30?
 
First-time post; been lurking for a while. Hope you can help me, as I'm really enjoying PP, but this is killing my buzz.

3770k @ 4.2
EVGA 970 SC 4GB
16GB RAM
1440p DSR
HBAO+ Forced in NVCP, SSAO off in-game
All Settings Max except: High Screen Filtering + High Effects + Clouds Off

During the day it's gravy everywhere I've been, aside from known places/issues (the dragged-under-the-bed part of the cutscene, the gas mask reflection in the prologue, and the strong dust effect): locked 60 fps with no drops, showing 70-90% GPU usage in MSI Afterburner over, say, an hour's play.

Then at night, at Mother Base on the R&D platform, when I look towards the centre of the platform the GPU usage goes to 95-100% and the in-game fps drops to what looks like 30 fps. (I take it there's no way to get triple buffering without borderless fullscreen? Which gives me weird stutter issues and is always on top; annoying!) Any ideas? I'm reluctant to drop the lighting to High, but I could go to 2351x1323 if needed.

Do you guys think this is just a weird game optimization issue/bug?
 
OK, did it and it seems to work. Only when Snake does the dive (pressing X) does it drop a couple of frames. But man, this game is making my card run hot, hovering around 75 degrees. Maybe one of my case fans is out.

Do you have Effects on Extra High? That setting makes alpha effects (such as the dust when you dive) ridiculously performance-intensive. It'll easily drop you a couple of frames when diving, crawling, or anything of the sort.

75 degrees also isn't all that hot, unless of course you usually run a lot cooler than that.
 
First-time post; been lurking for a while. Hope you can help me, as I'm really enjoying PP, but this is killing my buzz.

3770k @ 4.2
EVGA 970 SC 4GB
16GB RAM
1440p DSR
HBAO+ Forced in NVCP, SSAO off in-game
All Settings Max except: High Screen Filtering + High Effects + Clouds Off

During the day it's gravy everywhere I've been, aside from known places/issues (the dragged-under-the-bed part of the cutscene, the gas mask reflection in the prologue, and the strong dust effect): locked 60 fps with no drops, showing 70-90% GPU usage in MSI Afterburner over, say, an hour's play.

Then at night, at Mother Base on the R&D platform, when I look towards the centre of the platform the GPU usage goes to 95-100% and the in-game fps drops to what looks like 30 fps. (I take it there's no way to get triple buffering without borderless fullscreen? Which gives me weird stutter issues and is always on top; annoying!) Any ideas? I'm reluctant to drop the lighting to High, but I could go to 2351x1323 if needed.

Do you guys think this is just a weird game optimization issue/bug?

Try lowering shadows. I'm guessing that's the problem at Mother Base, since there are a lot of light sources and shadows cast.
 
Try lowering shadows. I'm guessing that's the problem at Mother Base, since there are a lot of light sources and shadows cast.

It's a combo of both the lighting setting and the shadow setting. I would argue the "lighting" setting is more intensive than the shadow one, but the lighting one gives the game a massive advantage on PC. Try tweaking shadows as stated above.
 
Älg;178530371 said:
I'm gonna ask this again because it's super annoying:

Does anyone else have this texture issue?

http://i.imgur.com/5WAMC8G.png

http://i.imgur.com/HhlreMn.png

As I said earlier, that abrupt line points to a lower-resolution texture mip being loaded (that's just how games work!). That's a game setting that could probably be modded. But just to make sure it's not something else: do you have AF forced in your NVCP?
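For the curious: each mip level halves the texture's resolution, which is why the transition reads as a hard seam rather than a gradual blur. A quick illustration (the 2048x2048 base size is just an example):

Code:
# Each mip level halves the resolution, so even a one-level drop is a 4x cut
# in texel count -- hence the abrupt-looking line. 2048x2048 is an example.
w, h = 2048, 2048
for level in range(4):
    print(f"mip {level}: {w >> level}x{h >> level} "
          f"({(w >> level) * (h >> level):,} texels)")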
Maybe I'm blind, but I can't see any texture issue here at all :S

Look at how the texture at the side of the road suddenly becomes blurry and lower-resolution.
 
As I said earlier, that abrupt line points to a lower-resolution texture mip being loaded (that's just how games work!). That's a game setting that could probably be modded. But just to make sure it's not something else: do you have AF forced in your NVCP?

Yeah, I have AF forced in NVCP. I tried turning it off as well, but it made no difference, apart from worsening the AF, of course.

It feels like a strange issue, though. I haven't seen anyone else complain about it, but I feel as if it's too obvious an issue for people to miss if they actually had it.

Aha, I was looking in the wrong place!

I should probably have been clearer about what the issue was, haha. I guess I'm a bit annoyed right now.
 
Mhhh, thinking about it, it should be possible to unlock the framerate in-menu with some modding, huh? I think I'll look into this; it would be nice to be able to change graphics settings without losing the unlocked framerate.

This would be more than fantastic. Indeed, it would be like how people in this very thread have made the "low" cvars turn on custom extra-extra-high cvars. Couldn't the 30 fps lock be made to enable the variable framerate instead of 30?

I looked into it a bit, and it seems like you might be able to do it by editing the executable with a hex editor.

If you open the exe in a hex editor, you can find strings for variable, locked 30, etc. Just gotta figure out how to switch 'em around.
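If those strings pan out, the swap itself could be scripted rather than done by hand. A rough sketch of the idea, always patching a copy of the exe; the two tokens below are placeholders for whatever you actually find in the hex editor, not the real strings:

Code:
# Patches a COPY of the exe, swapping one embedded setting string for another.
# The tokens are placeholders -- an in-place patch must keep the file size
# identical, hence the null padding.
from pathlib import Path

data = bytearray(Path("mgsvtpp.exe").read_bytes())

old = b"locked_30"      # placeholder: the locked-30 token you find
new = b"variable\x00"   # placeholder: null-padded to the same 9 bytes

assert len(old) == len(new), "replacement must match the original length"

offset = data.find(old)
if offset == -1:
    raise SystemExit("pattern not found; wrong version or wrong string")

data[offset:offset + len(old)] = new
Path("mgsvtpp_patched.exe").write_bytes(data)  # never overwrite the original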
 
Factory-OCed 980 Ti @ 1392 MHz / i7 4770K @ 4.2 GHz here.

Is it just me, or does the Extra High "Effects" setting seem to decrease performance exponentially at resolutions above 1080p?

Yes, I've seen others here comment on the fact that the alpha transparencies appear to be self-shadowed on Extra High, and the GeForce tweak guide mentions that without the "Effects" setting, a 980 Ti could easily run this at 4K/60 FPS, but shouldn't the "Effects" setting scale linearly with the resolution?

At 1080p, I have zero issues with Extra High "Effects"; Quiet can run in front of me all day, and it only brings my GPU usage from the average 45-50% up to 60-70%. However, as soon as I DSR to 1440p, I can barely maintain 40 FPS in the same situation, with 99% GPU usage.

Seeing as 1440p has roughly 80% more pixels than 1080p, I'd expect roughly double the cost, not three times or more. It seems like certain shadow and effects settings were intended solely for a 1080p output, and raising the resolution above 1080p also raises the internal resolution of those effects (separately from the base resolution).
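(For reference, the pixel arithmetic behind that 80% figure:)

Code:
# 1440p vs. 1080p pixel counts:
px_1080 = 1920 * 1080   # 2,073,600
px_1440 = 2560 * 1440   # 3,686,400
print(px_1440 / px_1080)  # ~1.78, i.e. ~78% more pixels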

After getting to the forest area behind the mansion Code Talker was kept in, and only getting around 45 FPS at 1440 DSR, I went back to 1080p and haven't had a drop since.

Again, I'm not expecting NO performance decrease at higher resolutions, just a more linear one.
 
Factory-OCed 980 Ti @ 1392 MHz / i7 4770K @ 4.2 GHz here.

Is it just me, or does the Extra High "Effects" setting seem to decrease performance exponentially at resolutions above 1080p?
It does scale completely normally, but check what it's doing to GPU usage when an effect spawns, even @ 1080p. You'll be at 40% GPU utilization in a scene, then an effect will suddenly spawn close to the camera, and if you look at a graph you can see the GPU utilization peak up to 90% for one or two frames... then it goes back down to normal. It will still be 60 fps, though, since you're not cresting 99%. At higher resolutions your baseline GPU utilization is higher, so when that peak occurs for those one or two frames, your framerate nosedives.

Is that a good way to design a particle system, even on the highest settings? No, not really. The peaks and valleys shouldn't be so disparate, IMO.
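To put made-up but plausible numbers on it (these are illustrations, not measurements): the same relative spike that fits inside the 60 fps frame budget at a 40% baseline load blows straight through it at 80%.

Code:
# Illustrative only -- why an identical effect spike is harmless at low
# baseline GPU load but tanks the framerate at high load.
BUDGET_MS = 1000 / 60   # 16.7 ms frame budget at 60 fps
SPIKE = 2.25            # hypothetical: effect multiplies frame cost ~2.25x

for baseline in (0.40, 0.80):
    frame_ms = baseline * BUDGET_MS * SPIKE
    verdict = "still under budget" if frame_ms <= BUDGET_MS else "blows the budget"
    print(f"baseline {baseline:.0%}: spike frame ~{frame_ms:.1f} ms -> {verdict}")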
 
It does scale completely normally, but check what it's doing to GPU usage when an effect spawns, even @ 1080p. You'll be at 40% GPU utilization in a scene, then an effect will suddenly spawn close to the camera, and if you look at a graph you can see the GPU utilization peak up to 90% for one or two frames... then it goes back down to normal. It will still be 60 fps, though, since you're not cresting 99%. At higher resolutions your baseline GPU utilization is higher, so when that peak occurs for those one or two frames, your framerate nosedives.

Is that a good way to design a particle system, even on the highest settings? No, not really. The peaks and valleys shouldn't be so disparate, IMO.

Ah, yeah, I've been monitoring it in real time via the Afterburner overlay, although for most games I also check the frametime graphs after a long gameplay session to tweak settings further if needed.

Still, Extra High "Effects" does seem plain wasteful, and worst of all, short of a slow-mo video comparison between High and Extra High, it's hard to see the difference. If it weren't for the noticeably better rain effects tied to the Extra High "Effects" setting, I would have dropped it down to High and gone back to 1440 DSR.

It's too bad we can't isolate the higher-quality rain and create a hybrid "High" Effects setting via the lua files.

Oh well, first world problems and all that ;)
 
Ah, yeah, I've been monitoring it in real time via the Afterburner overlay, although for most games I also check the frametime graphs after a long gameplay session to tweak settings further if needed.

Still, Extra High "Effects" does seem plain wasteful, and worst of all, short of a slow-mo video comparison between High and Extra High, it's hard to see the difference. If it weren't for the noticeably better rain effects tied to the Extra High "Effects" setting, I would have dropped it down to High and gone back to 1440 DSR.

It's too bad we can't isolate the higher-quality rain and create a hybrid "High" Effects setting via the lua files.

Oh well, first world problems and all that ;)

I agree that rain looks considerably better on Extra High, but I still chose to drop Effects to High so that I could afford a more consistent 1440p with my OCed Titan X. 1080p is much too blurry by comparison, and the blur is immediately apparent to me. In fact, I've recently adopted 1440p as my default minimum resolution, and have been lowering a few settings in various games to compensate. :-)
 
I agree that rain looks considerably better on Extra High, but I still chose to drop Effects to High so that I could afford a more consistent 1440p with my OCed Titan X. 1080p is much too blurry by comparison, and the blur is immediately apparent to me. In fact, I've recently adopted 1440p as my default minimum resolution, and have been lowering a few settings in various games to compensate. :-)

Depends on what your display's native res is. Mine's 1080p, which, for this game, is actually sharper than 1440 DSR, thanks to its "wonderful" Gaussian scaling filter (what I wouldn't give for Nvidia to implement an optional Lanczos, or even bilinear, filter for DSR). Of course, at 1080p I do lose some of the sub-pixel detail afforded by 1440p, as well as the reduced alpha-transparency "grid" effect visible in hair, etc. Too bad this game doesn't have a better/temporal AA method.

I'd create a custom 1440 res to bypass the DSR blur, but ever since 353.62, custom res over HDMI has been broken for me. Nvidia isn't expected to fix it until the end of this month.
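In the meantime, anyone who wants to eyeball what bilinear vs. Lanczos downscaling actually does to a shot can run the comparison offline with Pillow; a minimal sketch (the capture filename is hypothetical):

Code:
# Downscale a 1440p capture to 1080p with two different filters and compare.
# Purely illustrative -- the game's/driver's scaler is its own thing.
from PIL import Image

src = Image.open("capture_2560x1440.png")  # hypothetical screenshot
src.resize((1920, 1080), Image.BILINEAR).save("down_bilinear.png")
src.resize((1920, 1080), Image.LANCZOS).save("down_lanczos.png")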
 

This patch seems to have changed the way the game handles downsampling under certain circumstances (for the worse). Previously, the scaling was akin to bilinear resampling (for some reason you had to alt-tab out and back in for it to kick in) and it actually provided a non-shit image when downsampling from a non-multiple of the native screen res, but this is absent in the latest patch. Both of these are 1440 > 1080, simulated as accurately as I could:

Ver. 1.01 vs. Ver. 1.02 (beta 1.006) [two pairs of comparison screenshots]
 
Depends on what your display's native res is. Mine's 1080p, which, for this game, is actually sharper than 1440 DSR, thanks to its "wonderful" Gaussian scaling filter (what I wouldn't give for Nvidia to implement an optional Lanczos, or even bilinear, filter for DSR). Of course, at 1080p I do lose some of the sub-pixel detail afforded by 1440p, as well as the reduced alpha-transparency "grid" effect visible in hair, etc. Too bad this game doesn't have a better/temporal AA method.

I'd create a custom 1440 res to bypass the DSR blur, but ever since 353.62, custom res over HDMI has been broken for me. Nvidia isn't expected to fix it until the end of this month.
Yeah, it's unfortunate that there isn't any real AA implemented. I simply can't afford to go beyond 1440p with my 970, and since that's my native screen resolution, it doesn't even offer any "downsampling AA". Oh well.
 
I'd create a custom 1440 res to bypass the DSR blur, but ever since 353.62, custom res over HDMI has been broken for me. Nvidia isn't expected to fix it until the end of this month.

Yes, I noticed that too. Custom resolutions simply refuse to work if the resolution isn't listed in your display's EDID. I wasn't sure it was broken, though; I just assumed they intended to no longer allow driver downsampling through HDMI. In any case, I'm glad to hear they're going to fix it eventually.
 
This patch seems to have changed the way the game handles downsampling under certain circumstances (for the worse). Previously, the scaling was akin to bilinear resampling (for some reason you had to alt-tab out and back in for it to kick in) and it actually provided a non-shit image when downsampling from a non-multiple of the native screen res, but this is absent in the latest patch. Both of these are 1440 > 1080, simulated as accurately as I could:

Ver. 1.01 vs. Ver. 1.02 (beta 1.006) [two pairs of comparison screenshots]

I can 100% confirm this. Before, you could trick the game into thinking a DSR resolution was a normal downsampling res, and it applied bilinear filtering after a single alt-tab, as long as DSR smoothness was set to 0%. Now, with the bug fixed, resolutions such as 1440 DSR on a 1080p monitor appear as they do in your second screenshots. I'm afraid the only option now is to increase smoothness to 25% or higher for non-4K DSR resolutions on a 1080p display.

It's one of the reasons I recently reverted to my native 1080p in-game. Nvidia's custom res fix can't come soon enough. Man, I wish there were a DX11 GeDoSaTo available.
 
Yeah, it's unfortunate that there isn't any real AA implemented. I simply can't afford to go beyond 1440p with my 970, and since that's my native screen resolution, it doesn't even offer any "downsampling AA". Oh well.

That's why I haven't upgraded to a higher resolution display, even with my 980 Ti. I'm a 1:1 image ratio nut; I must have the option to run (at minimum) my screen's native res on more demanding games, and with almost every mainstream game targeting 1080p/60 FPS, a native 1080p screen is currently more flexible.

I'll probably take the 4K display plunge once it's more established (there aren't even 4K Blu-rays yet, and broadcast TV is still stuck at 1080i maximum, sigh), and when single GPUs are more 4K/60 FPS viable.
 
Is there any way to turn off the temporary blur that occurs when you zoom in, without turning off all the other post-processing effects?
 
Anyone else get pixelated blood?

[screenshot: t05b.png]
 
No, but 87%? Holy shit, dude. How long did you play this? I'm at 42% and I've played for 55-ish hours.
Then again, I haven't played many of the main missions. Maybe those add a lot of progression.
 
No, but 87%? Holy shit, dude. How long did you play this? I'm at 42% and I've played for 55-ish hours.
Then again, I haven't played many of the main missions. Maybe those add a lot of progression.

The optional objectives add up quite a bit, as do Side Ops.

So you don't get blood that looks like that? It's probably a glitch, then. I also noticed it during mission 43 (the quarantine one).
 
No, but 87%? Holy shit, dude. How long did you play this? I'm at 42% and I've played for 55-ish hours.
Then again, I haven't played many of the main missions. Maybe those add a lot of progression.

Heh, I'm at 48 hours and only 20%. I started playing Mad Max and had about 18 hours in it before I started TPP about three days later; I'm at 29 hours in that now. That's 77 hours total in not even two weeks... I'm not proud of this.

And I'm about to put more hours in.
 
1080p, solid 60 fps, High settings on an i5 4670K @ 3.8 + 750 Ti. What a joy; this makes waiting for the Pascal GPUs easy.

You are aware that upon release, the Pascal GPUs will almost certainly require a new type of motherboard, which, in turn, will require DDR4 RAM and a Skylake (or later) CPU, right?

So, basically, an entire PC upgrade (for the majority of current mid-to-high-end PC owners) is going to be required this time around, as I understand it. It's one of the reasons I splurged on a 980 Ti: it was that, or hold off and buy an entirely new rig later next year.

I am eager to see what performance gains that new architecture brings, though.
 
You are aware that upon release, the Pascal GPUs will almost certainly require a new type of motherboard, which, in turn, will require DDR4 RAM and a Skylake (or later) CPU, right?

So, basically, an entire PC upgrade (for the majority of current mid-to-high-end PC owners) is going to be required this time around, as I understand it. It's one of the reasons I splurged on a 980 Ti: it was that, or hold off and buy an entirely new rig later next year.

I am eager to see what performance gains that new architecture brings, though.
But will it really? I've never read about this in relation to Pascal GPUs, and I can't think of any reason why they would require a completely new motherboard setup. Then again, I don't know much about all the connectors and lanes, etc.
 
So if I force HBAO+ on through the Nvidia Control Panel, can I disable all in-game AO, or should I leave it at Extra High or something?
 
A couple of cutscenes significantly lower the framerate; I get something like 10-15 FPS. One was during a helicopter ride, the other while visiting Huey. It's pretty baffling, because the rest of the game runs at a fluid 60. Has anyone else encountered this?

GPU: GTX 670 with 2GB VRAM
CPU: i5 3750 @ 4.2 GHz
RAM: 8GB
OS: Win 10
 
This is almost definitely not going to be true.

That would be strange indeed (good luck to Nvidia), but there is this tidbit:
http://www.anandtech.com/show/7900/nvidia-updates-gpu-roadmap-unveils-pascal-architecture-for-2016

So NVLink will be ditching the slot in favor of what NVIDIA is labeling a mezzanine connector, the type of connector typically used to sandwich multiple PCBs together (think GTX 295). We haven’t seen the connector yet, but it goes without saying that this requires a major change in motherboard designs for the boards that will support NVLink.
 
Seems to be related to HBAO.

Yup, I've read so.
The weird thing is that the scene is just a dialogue inside the helicopter, and the whole game took a performance hit, since the menu was also running at that atrocious framerate.

A really, really odd bottleneck. Luckily it was very short.
 