GTA V PC Performance Thread

So, theoretically, instead of having 20 different song files in the folder, you could mix them into 1 bigger audio file and the game would perform better because it's basically just 1 file?

You know, honestly I'm not sure. The only thing that's been verified so far is that having a large amount of music will cause the game to go into a loading loop on startup, and FPS drops while switching to Self Radio.
 
After doing a bit of searching, there are stuttering threads posted everywhere. It seems this issue is fairly widespread.

I did come across one crazy suggestion: enable long distance shadows and other advanced graphics options to reduce the amount of assets that need to be streamed in. Obviously this will hurt FPS, but less stuff will need to be drawn when moving fast.

My stuttering typically occurs when I'm driving fast or rotating the camera significantly, so maybe there is some truth to this as assets are being drawn in these situations. On foot looking forward the game is perfectly smooth.

Have yet to try it though.
 
Depending on the rig I'd take AA off entirely; at 4K, AA is kinda overkill imo. But then again I'm on a single card, I'm sure the SLI/Crossfire crowd would say differently.

Yeah, I'm running a single 290, here are my specs and settings if you want to know.

At 4K you either don't use AA at all or you use FXAA. At that resolution FXAA's downsides (mainly the blurring) mostly disappear, and it gets you a slightly cleaner image at relatively low cost.

Thanks for the advice. I'll stick with FXAA for now.
 
After doing a bit of searching, there are stuttering threads posted everywhere. It seems this issue is fairly widespread.

I did come across one crazy suggestion: enable long distance shadows and other advanced graphics options to reduce the amount of assets that need to be streamed in. Obviously this will hurt FPS, but less stuff will need to be drawn when moving fast.

My stuttering typically occurs when I'm driving fast or rotating the camera significantly, so maybe there is some truth to this as assets are being drawn in these situations. On foot looking forward the game is perfectly smooth.

Have yet to try it though.

You're not using borderless windowed, right?
 
So I've never played anything in 4K before. Is downsampling really that much of a game-changer? I'm sure I could probably pull off a locked 30, but is it worth the effort?
 
Just bought a 4K screen and, fuck a duck, it's unfathomable.
With a single GTX 980 I can stay at 50-59 fps without AA and a few settings just below max. With 2x AA on it looks just about perfect, but the framerate tanks to around 30-40 fps.

If I do crack and get another GTX 980, I will have put down around $1,650 for this game (3x GTA + 4K screen + GFX card).

Worth it, though.
 
E: I had a similar reproducible crash in the same location yesterday, Garuda. It worked after restarting my computer twice; maybe that helps.

Thanks for the help, but it still crashed on that spot...

Has anybody here crashed after detonating the sticky bomb on the Lost MC's trailers? It crashed right after I entered Trevor's car.
 
Anyone got any ideas about my issue:

I can run at a locked 60fps at max settings and 4K. If I whack up the distance scaling, though, my framerate drops significantly to about 45fps. But my GPU and CPU usage don't massively increase, so I can't tell what the bottleneck is.

Any idea what's happening?

It's probably down to under-utilization of your CPU. I'd bet it's a DirectX limitation.
 
Fullscreen, adaptive vsync, 60 fps limit.

My only refresh option in the game's settings is 59Hz, but my fps does show as 60 in-game, so I imagine (hope) that's not an issue.

The 59hz thing could be the issue. I've seen posts from people saying that fixing that fixed the stuttering. The other big culprit has been windowed borderless which is why I asked.
 
I'd go 4gb. This game eats vram and you'll only be doing yourself a favor for future games as well.

I'm sitting at ~3.3GB at 1680x1050 with all other settings maxed aside from AA (FXAA) and reflection MSAA (2x). That said, given there are people with 2GB GPUs running the game at settings that blow past the reported VRAM usage, I assume it's a worst-case figure rather than what the game will actually use most of the time. But since you're buying a new GPU, you don't want to go with 2GB, as you'd find yourself in this very situation again in the not-too-distant future. I had to ditch 2x 2GB 670s for a 4GB 980 just for AssCreed Unity to be playable at all.
4gb it is then.
Can't put this off any longer.
 
So, what does shader quality do? Does it affect bump mapping, that sort of thing? I'm not really seeing much of a difference between High and Very High, though that might be because the game needs to reset, making it hard to compare.
 
The 59hz thing could be the issue. I've seen posts from people saying that fixing that fixed the stuttering. The other big culprit has been windowed borderless which is why I asked.

Is there a fix for getting 60 hz to show? I've only seen people suggest limiting fps in Riva to 60 (which I've done).
 
I'm trying to see it, but I don't think a Youtube vid is good enough to demonstrate what you're trying to show.

Anyways, I've gone back to indecisive settings hell again. I turned off MSAA and am now running like 1523p or something with FXAA. The drops in performance with MSAA (even at 2x) whenever I got around grass were just getting too annoying. I've had to turn down a couple of other settings, but I'm pretty happy with the framerate now. The game is a bit more shimmery than before, though my screenshots turn out nicer now.

Except I've gone and taken some screenshots at 4k and it looks *so* good. I'm always a performance>graphics person, but jesus, I *could* run this at 4k/30fps if I wanted to. This game feels like it's meant to be played at 4k honestly. Even at 1800p or so, the aliasing was still a bit bothersome, but at 4k, it feels like the game just magically cleans up and looks gorgeous, both in motion and in screens. Fuck.

Download the file, don't stream it on Google.
 
I've got these weird moments where the game suddenly drops in fps, down to around 30. GPU and CPU utilization also drop heavily at those moments. Disabling and re-enabling vsync fixes it for a while.

Edit: Anyone found out how to remove CA yet?
 
Last night I overclocked my 7850's core clock all the way to 1050MHz (the max Afterburner would allow), while keeping the factory 1250MHz memory clock. Now I'm truly getting a constant 60fps. I even disabled the limiter and set the game to exceed my VRAM by around 30MB; the difference is mind-blowing.

I had already OC'ed to 1GHz, but for some reason those last 50MHz changed everything.

OC'ed and exceeding VRAM, everything smooth and stable. Afterburner showed I was using up to 1900MB, so I guess I still have some headroom.
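For what it's worth, the headroom math there is trivial. A quick sketch (this assumes the 2GB model of the 7850, not the 1GB variant; the variable names are mine):

```python
# Hypothetical quick check: VRAM headroom on a 2 GB HD 7850
# when Afterburner reports a peak of ~1900 MB in use.
vram_total_mb = 2048   # assumes the 2 GB 7850 (assumption, not stated above)
vram_peak_mb = 1900    # peak usage reported by Afterburner
headroom_mb = vram_total_mb - vram_peak_mb
print(headroom_mb)  # -> 148
```

So roughly 150MB of wiggle room before the card starts spilling over, under those assumptions.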
 
Played for about 40 minutes today. I moved MSAA to 8x and enabled FXAA. I guess this could count as "completely maxing" out the game. AA is at its highest here.

2560x1440, max settings, advanced graphic options maxed, 8x MSAA, FXAA ON, w/ GPU OC:


FulL89r.png


---

For comparison with my previous settings:

2560x1440, max settings, advanced graphic options maxed, 2x MSAA, TXAA ON, w/ GPU OC:

LpM3Jsh.png


4k, 4x MSAA, TXAA, max, stock GPU clocks
xXkhImz.png


Same settings with OC

oJevawC.png
 
Ugh so annoying. I left Prime95 running when I got in from work and it passed the 2-hour mark without any errors, yet I booted the game up just now and I barely got 20 minutes before a WHEA_UNCORRECTABLE_ERROR bugcheck.
 
If you're releasing a 60GB game, you'd better make sure your download client isn't complete shit; so tired of this. I'm re-downloading some files to try and fix the random crashes/audio glitches. If I had the Steam version I would just download the whole game again, but I'm not gonna try that with the Rockstar client and its blazing 150KB/s speeds.
 
*Poor implementation of anti-aliasing settings (probably one of the worst implementations of FXAA I have seen, which is unacceptable given that the other forms of AA tank performance on even the best of cards.)
Why do you believe that the FXAA implementation is bad?
 
3570K 4.4Ghz/970/16GB

So I've noticed that if I reboot my PC and play from a fresh startup, I can pull 65-80 FPS in the city and 55+ in the countryside with no deviation.

However, if I play after reading this thread for a while or doing some general browsing, that average FPS seems to drop by around 10, and it sometimes introduces a temporary stutter at larger intersections. This occurs even after closing Chrome/Spotify/etc., so it is in effect the same as a fresh startup. The VRAM usage is the same in both cases, around 220MB before playing the game and around 3200MB in-game, and CPU usage is still 100% across all 4 cores in both instances.

I cannot work it out at all. But hey, if it means I just have to reboot to play problem-free, it's not that much of an issue. Anyway, thought people might want to try playing immediately after a reboot and see if their performance improves a little.
 
So, in the countryside I get 62-75fps at 1080p. I only had to put Grass Detail on Very High instead of Ultra, and it still looked decent. Modest rig: i7 930, GTX 970, 6GB DDR3, Spinpoint F3 HDD with the same Win 7 install from around 2009/10.



Same settings as before except Grass is VH and res is 1080p
http://i.imgur.com/Ne5hMzK.jpg
http://i.imgur.com/4InfAFD.jpg
http://i.imgur.com/RdjqmqU.jpg
http://i.imgur.com/9JHxY6G.jpg

As a side note, I think Nvidia will find some performance like they did for AC Unity; it seems the utilization isn't high enough at times.

Have some of you guys tried unparking your CPU cores? Maybe you already tried it when Watch Dogs was doing the rounds?
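For anyone who hasn't tried it: core parking can be relaxed without third-party tools via `powercfg`. A sketch of the usual approach, written from memory as a Windows command-prompt config fragment; double-check the `CPMINCORES` alias with `powercfg /aliases` on your own system before running it:

```shell
:: Unhide the "Processor performance core parking min cores" setting,
:: set it to 100% (i.e. no cores parked), then re-apply the active plan.
powercfg -attributes SUB_PROCESSOR CPMINCORES -ATTRIB_HIDE
powercfg -setacvalueindex SCHEME_CURRENT SUB_PROCESSOR CPMINCORES 100
powercfg -setactive SCHEME_CURRENT
```

This only touches the AC (plugged-in) profile; whether it actually helps frame pacing in GTA V is exactly the open question here.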
 
Because in other games I've played, the jaggies are less apparent with their implementations. I personally feel Rockstar should have offered SMAA as an alternative.

FXAA has a lot of wiggle room in the implementation of its algorithm. Different versions/releases of FXAA have been tweaked over time, most of them to reduce the blurring of the initial releases. It used to blur a lot when it first came out, but it has been refined since.

I think individual devs are also free to implement FXAA how they see fit. Rockstar seems to have opted for the "lowest" form of it, so that there's minimal blurring. It doesn't do quite as good a job this way, but at least it doesn't really blur the image.

I'm a fan of FXAA and I basically choose it over everything except SMAA. If it's implemented right, it's not bad at all.
 
Does forced FXAA from Nvidia Inspector work? I always felt that worked pretty well. It just doesn't get captured in screenshots.

That's kind of a bad scene for a comparison.

I guess it could be worse. Could be a first person shot of a random wall for 30 seconds....

But yeah, kinda bad. Almost no detail at all in that cutscene.
 
Played for about 40 minutes today. I moved MSAA to 8x and enabled FXAA. I guess this could count as "completely maxing" out the game. AA is at its highest here.

2560x1440, max settings, advanced graphic options maxed, 8x MSAA, FXAA ON, w/ GPU OC:


FulL89r.png

I'm guessing the ultra grass is to blame for the 37 FPS drop? The foliage isn't well optimized in this game at all. "Very high" sort of alleviates the insane drops, but it still is quite unpredictable.
 
Ugh so annoying. I left Prime95 running when I got in from work and it passed the 2-hour mark without any errors, yet I booted the game up just now and I barely got 20 minutes before a WHEA_UNCORRECTABLE_ERROR bugcheck.

How did you run it? The best way is:
1. Click Blend.
2. Click Custom.
3. Enter an amount close to the available RAM on your system; 5000 works well for an 8GB PC.
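That last step is just "total RAM minus what Windows and background apps need." A quick sketch of the rule of thumb (the helper name and the ~3GB OS reserve are my own assumptions, not anything from Prime95 itself):

```python
def prime95_blend_mb(total_ram_mb, os_reserve_mb=3000):
    """Rough memory size for a Prime95 custom blend run: total RAM
    minus an allowance for the OS and background apps. The 3000 MB
    reserve is a guess; adjust it for your own system."""
    return max(total_ram_mb - os_reserve_mb, 1024)

# An 8 GB (8192 MB) machine lands close to the 5000 suggested above.
print(prime95_blend_mb(8192))  # -> 5192
```

The point of sizing it this way is that the blend run actually stresses the memory controller and RAM, not just the CPU cores, which is what catches the kind of instability described below.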
 
Somehow GTA has revealed some deep-rooted instability in my system. Assuming at first that my OC was crap (4.2GHz 4670K), I increased my vcore from 1.260 to 1.265. Where you'd think a little more voltage might help, my PC now wouldn't load into the game and hard locked before I spawned. I tried launching another game; my PC hard locked again. The same happened a third time with a different game. So naturally I undid the vcore change, but this had no effect; my PC was still hard locking. I had no choice but to reset my BIOS. Luckily that worked.

So yeah, I guess it's just taken a very CPU-intensive (or RAM-intensive, I guess) game to show the cracks. Back to square one with my OC then, but only after some stock-clock gaming, because I need to be able to bet my left bollock that it was the OC causing the issues.

EDIT: Game still runs pretty smooth at stock.
 
That's the assumption I made when I saw how Rockstar had implemented it here; it seems like a watered-down form of FXAA. I normally hate the blurring it introduces, but when there's no other AA option that doesn't kill the framerate, I'll take it.

It's just that I honestly see only a very small benefit from using it here, which is a shame, because even on a 980 any other form of AA is a no-go unless you want to sacrifice framerate just to get rid of jaggies.

Is there no way to inject SMAA currently?

Having played a chunk of time with the in-game FXAA and without it (no other AA, for performance reasons), I prefer it on. It addresses quite a bit of the jaggies. Sure, some spots look a little weird (faces sometimes have an unusual edge to them), but I prefer it on.
 
I'm guessing the ultra grass is to blame for the 37 FPS drop? The foliage isn't well optimized in this game at all. "Very high" sort of alleviates the insane drops, but it still is quite unpredictable.

Where are you seeing a 37 fps drop? It's about a 21 fps drop going from 2x MSAA + TXAA to 8x MSAA + FXAA.

Edit: are you referring to 96fps vs 60fps? That's because the latter is at 4K w/ AA.
 
I bought the game and downloaded it, but now the install is taking forever. I have pretty fast internet, and downloading the game was quicker than installing it.
 
As soon as I cross the yellow line on the map, it's like I hit a wall and the framerate goes to shit, from that point out into the countryside.
Edit: The line is just an approximation, not the exact spot.

WjOJjdo.jpg
 
Anyone got any ideas about my issue:

I can run at a locked 60fps at max settings and 4K. If I whack up the distance scaling, though, my framerate drops significantly to about 45fps. But my GPU and CPU usage don't massively increase, so I can't tell what the bottleneck is.

I'm running SLI Titan X's and they're probably at about 70% usage each. The only way I can max out the usage is if I downsample from 5K or greater, and it's a great framerate until I turn on distance scaling.

Any idea what's happening?

4790k
8GB RAM.
Dude, what are you doing with only 8GB of RAM on a 4K machine? Those textures have to go somewhere before they hit VRAM.
Super curious about what the differences between 8 and 16 gigs of RAM are.
I really think this is the issue here. I've seen other people in this thread with a 780ti and OC'd 4670k having stuttering issues on 8gb ram. I'm on 24gb ram and I'm having no stutter. There has to be some weird memory utilization going on.
 
I really think this is the issue here. I've seen other people in this thread with a 780ti and OC'd 4670k having stuttering issues on 8gb ram. I'm on 24gb ram and I'm having no stutter. There has to be some weird memory utilization going on.

That's what I'm starting to think. I'm tempted to upgrade to 16GB - if I bought my old sticks it'd be another $60 which isn't bad.
 
Turn down the Grass Quality a bit - it's a killer. That's why the Xbone version's grass was one tick lower than the PS4's.

Post FX set to the lowest, no AA apart from FXAA, long shadows off, and now grass too? :/ I think I'll wait for a patch instead. I really can't justify turning down this much stuff on a GTX 980. It seems like a different team did the countryside; it's horrible, and even with a G-Sync monitor it goes under 30 like half the time.
 