Far Cry 3 PC performance thread

FINALLY!!! I got it set up to where it's running at a decent pace. It's not perfect, but it's playable. Hopefully there's a patch for this sometime soon. I thought it was my graphics card, but when you put it on the "low" preset and everything is still stuttering like crazy, there's something wrong with the game itself.
 
Just a little update. Instead, I changed id="ultrahigh" to id="low", since that's what my PostFX is set to, and now the settings change fine while the file is read-only. So thanks for that, I wouldn't have thought to do that without your suggestion.

Is it just me, or is this game quite jaggy even with FXAA on? I've set it to 8 previously to test it out and it's still quite jaggy around the gun etc. SMAA set to ultra is the same deal.

Turn off DepthDownSampling in the .xml.
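For anyone hunting for it: in my GamerProfile.xml that's the DepthDownsample attribute inside the <Post> block. The surrounding attribute values below are from my own file, so yours may differ:

```xml
<Post>
    <!-- DepthDownsample="0" turns off the reduced-resolution distant rendering -->
    <quality GameDepthOfField="1" DepthDownsample="0" CinematicDepthOfField="1"
             MotionBlur="1" SSAO="1" FXAALevel="0" CloudShadows="1"
             SSAOMaxDistance="100" id="medium" />
</Post>
```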
 
Was really enjoying the game until I got to the 'skin 2 boars' mission.

Now my pc just freezes to a black screen every 5 minutes.
I've got an i7 930 at 3.5GHz and an Asus 7950.

No other games have ever given me issues (I guess I've been lucky till now)
but this is driving me insane...

I tried underclocking my GPU by 100MHz and I think this might have fixed it, but it wasn't overheating and there was plenty of voltage. I don't get it at all...

It should work on stock if you ask me.
 
Turn off DepthDownSampling in the .xml.

Hm, I've tried this quite a few times and it gives me quite an fps drop. :/ I also don't notice any difference in image quality, but it's hard to keep track of a lot of graphical stuff in this game. Would you be able to show a picture comparison? And do you have any idea why I'm getting an fps decrease? :(

Thanks!
 
Everything on low settings at 1080p and I can't get past 25 frames per second with my GT 650M. There are many benchmarks that say it can reach 48fps on medium at 1080p.

Does it have a period of running well and then a period of running badly? If so, your laptop could be overheating and throttling the CPU and/or GPU. If that's the case, check temps with a program like Real Temp; 90C should be when the CPU starts throttling, if I'm not wrong.

To fix it you'd have to cool your laptop; the cheapest and best way is to open it up and get the dust out of the heatsink(s).
 
So.... could I push for maximum quality settings with this setup? Also, I don't need 60 fps. 45 is fine too.

i5 2500k 3.3ghz
8 GB Ram
ATI 6970 2 GB
res: 1920 x 1080

If not, how far could I go?
 
What if I set PostFX to false? Then what do I change the id=.. to?

Don't set it to false, it's a broken setting that the game doesn't know what to do with. It looks like it defaults to medium if you set it to false.

If you want to turn PostFX OFF then just use the low setting.

Edit: You might also have to use some config tweaks to turn off some Depth of Field but I'm not quite sure.
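If anyone wants to try those config tweaks, the depth-of-field flags in my GamerProfile.xml's <Post> block look like the below. I haven't verified that the game respects both flags, so treat this as a guess:

```xml
<Post>
    <!-- untested: zeroing both DoF flags should disable depth of field entirely -->
    <quality GameDepthOfField="0" DepthDownsample="1" CinematicDepthOfField="0"
             MotionBlur="1" SSAO="1" FXAALevel="0" CloudShadows="1"
             SSAOMaxDistance="100" id="low" />
</Post>
```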
 
Nope. Can't do it. Even with just 2x AA it doesn't run at a solid 60. I used the very high preset, HBAO and 2x AA and it's running at a fairly consistent 60 FPS.

What res are you running?

At 1920x1080 I have everything maxed with no AA and can hit a consistent 60, and that's with a 7950... that 690 should be destroying me.
 
The game feels really jerky even when running at ~40 fps for me, so much so that I thought I was mostly in the ~20 fps range. After an initial test where I saw it fluctuate between those two ranges, I turned the fps display off. But turning it on again now that I've finished the game, I see that in many areas it's mostly at ~40 fps, not ~20 (which happens too, but only in certain areas and sections), yet as I said it feels really choppy. Is there an explanation for this? I play on multiple platforms, so I often play ~30 fps games too, but they rarely feel as bad as this game does at ~40. I only get a feel this bad in games my PC genuinely can't handle, like The Witcher 2 and BF3. I've turned off the frames-rendered-ahead setting, since it made it look like I was getting lag in a single-player game during firefights, so it's not that. The game only really felt smooth around 60 fps, which of course happens rarely on my PC.

Also, when I choose to play widescreen letterboxed from the in-game options (my current monitor is 4:3; the 16:10 one broke down), the sniper rifle scopes are invisible (maybe just the most expensive scope, I didn't test the others, but the scopes, not sights, for the assault rifles and such didn't have the problem). It's fine with normal widescreen.
 
Hm, I've tried this quite a few times and it gives me quite an fps drop. :/ I also don't notice any difference in image quality, but it's hard to keep track of a lot of graphical stuff in this game. Would you be able to show a picture comparison? And do you have any idea why I'm getting an fps decrease? :(

Thanks!

When it's on (postfx on medium or lower I think), it reduces the resolution of distant objects, so you get tons of aliasing but better performance. If you turn it off it'll look a lot sharper but obviously you'll lose quite a bit of fps.

The game feels really jerky even when running at ~40 fps for me, so much so that I thought I was mostly in the ~20 fps range. After an initial test where I saw it fluctuate between those two ranges, I turned the fps display off. But turning it on again now that I've finished the game, I see that in many areas it's mostly at ~40 fps, not ~20 (which happens too, but only in certain areas and sections), yet as I said it feels really choppy. Is there an explanation for this? I play on multiple platforms, so I often play ~30 fps games too, but they rarely feel as bad as this game does at ~40. I only get a feel this bad in games my PC genuinely can't handle, like The Witcher 2 and BF3. I've turned off the frames-rendered-ahead setting, since it made it look like I was getting lag in a single-player game during firefights, so it's not that. The game only really felt smooth around 60 fps, which of course happens rarely on my PC.

Also, when I choose to play widescreen letterboxed from the in-game options (my current monitor is 4:3; the 16:10 one broke down), the sniper rifle scopes are invisible (maybe just the most expensive scope, I didn't test the others, but the scopes, not sights, for the assault rifles and such didn't have the problem). It's fine with normal widescreen.

I really can't stand anything under 50 fps in this game. Don't know if it's different than other games in that regard but it definitely doesn't feel good. I mostly keep a locked 60 with occasional dips to 50-55.
 
As poetic as it may sound, having the game still crash on me every 5 minutes (max) is no fun. At all.

I looked up what caused the problem and it was nvwgf2um.dll

After reverting to the latest official driver, the crashes still happened. I re-upgraded to the latest beta drivers and the crashes were still there (I've killed that lion 17 times by now).

Switched to DirectX9 and played for 3 hours without a crash. A nice side effect: 60 fps all the way AND the stuttering is finally gone as well. It's actually smooth now.

I am now able to really enjoy the game, killed a lot of tigers again ... it's a good thing :)
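For anyone else wanting to try this: you don't need to touch your DirectX install. In GamerProfile.xml it's the UseD3D11 flag on the <RenderProfile> line; edit it and restart the game. Fragment below is trimmed from my own file, so keep your other attributes as they are:

```xml
<!-- trimmed fragment: UseD3D11="0" forces the DX9 renderer -->
<RenderProfile UseD3D11="0" D3D11MultithreadedRendering="0" Quality="custom" Fullscreen="1">
</RenderProfile>
```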
 
So.... could I push for maximum quality settings with this setup? Also, I don't need 60 fps. 45 is fine too.

i5 2500k 3.3ghz
8 GB Ram
ATI 6970 2 GB
res: 1920 x 1080

If not, how far could I go?

I have pretty much the same setup but with a 6950 instead, everything stock and the game runs fine in ultra settings with no AA.
 
I honestly underestimated how similar this game would look to Far Cry 2. That's really what people should be comparing it to, not Crysis. It's basically FC2 with some upgraded lighting effects.

Also, I think I've managed to get into a comfortable performance state with this:

<GamerProfile>
<SoundProfile MusicEnabled="1" MasterVolume="100" MicEnabled="1" IncomingVoiceEnabled="1" Language="English" />
<RenderProfile MSAALevel="0" AlphaToCoverage="2" SSAOLevel="6" SDSM="0" ResolutionX="1440" ResolutionY="900" Quality="custom" QualityEditor="editor_ps3" Fullscreen="1" Borderless="0" UseD3D11="0" D3D11MultithreadedRendering="0" WidescreenLetterbox="0" UseWidescreenFOV="1" FOVScaleFactor="0.9975" EnableSubResolution="0" SubResolutionX="960" SubResolutionY="540" VSync="1" RefreshRate="0" DisableMip0Loading="0" GPUMaxBufferedFrames="1" ShowFPS="1" Brightness="1" Contrast="1" GammaRamp="1" AllowAsynchShaderLoading="1">
<CustomQuality>
<quality ResolutionX="1440" ResolutionY="900" EnvironmentQuality="high" AntiPortalQuality="default" PortalQuality="medium" PostFxQuality="medium" TextureQuality="high" TextureResolutionQuality="high" WaterQuality="very high" DepthPassQuality="high" VegetationQuality="high" TerrainQuality="medium" GeometryQuality="high" AmbientQuality="medium" DeferredAmbientQuality="medium" ShadowQuality="medium" EditorQuality="" Hdr="1" HdrFP32="0" ReflectionHdr="1" EnableVertexBinding="1" id="custom" />
</CustomQuality>
</RenderProfile>
<NetworkProfile VoiceChatEnabled="1" CustomMapMaxUploadRateInBitsOnline="10240000" OnlineEnginePort="9000" OnlineServicePort="9001" FileTransferHostPort="9002" FileTransferClientPort="9003" LanHostBroadcastPort="9004" LanClientBroadcastPort="9005" ScanFreePorts="1" ScanPortRange="1000" ScanPortStart="9000" SessionProvider="" MaxUploadInbpsOnline="10240000">
<Accounts />
</NetworkProfile>
<GameProfile />
<ProfileSpecificGameProfile Sensitivity="1" Invert_x="0" Invert_y="0" DefaultFlickFireDirection_y="0" UseMouseSmooth="1" Smoothness="0.2" Smoothness_Ironsight="0.2" HelpCrosshair="1" Gamepad_vibration="1" UseRoadSignHilight="1" UseSubtitles="0" TaggingEnabled="1" UseAmbx="0" UseGamePad="0" GamepadAnswered="0" Autosave="1" Machete="0" IronsightToggleMode="0">
<FireConfig QualitySetting="VeryHigh" />
</ProfileSpecificGameProfile>
<RealTreeProfile Quality="VeryHigh" />
<EngineProfile>
<PhysicConfig QualitySetting="VeryHigh" />
<QcConfig GatherFPS="1" GatherAICnt="1" GatherDialogs="0" IsQcTester="0" />
<InputConfig />
<ZoneConfig />
</EngineProfile>
<UplayProfile LockString="4E2plnr7uJdUqoPOlTVPlY0AkoV+wk3trNYzBuuCv8U=" />
</GamerProfile>

On these specs:
Intel Core 2 Quad Q6600 2.4GHz
3GB DDR2 RAM
HD6850 1GB
Windows Vista 32bit
1440 x 900 monitor

I haven't been able to display the framerate yet but it definitely seems stable enough.
 
Same here. Runs great.

Guess I'm a little more conservative. I've got the same setup but at 4.6ghz and 2 6970s at 900/1375 clocks. I don't like the game to drop below 60fps EVER. So I've got PostFX on Low and Water on medium to keep it running like butter.

It definitely felt like choppy shit on Ultra. Especially if you looked over areas with a large draw distance.
 
Game was running solid 60fps most of the time. Then I got to (midgame spoilers)
the second island
and started getting a lot of drops into the 40s. Not sure why this is happening; I haven't changed anything at all. Kind of frustrating after having it run perfectly smoothly up to this point.
 
Posted this in the other FC3 thread but here goes.

Has anyone had their saves not available for loading in game? I see the files, two of em, but I only get "new game" in the story section of the menu. I really cba to start all over again...
 
Weird, but I was having terrible performance with a custom setting, and decided to see how badly it performed with everything maxed and then downgrade from there.

To my surprise, that made it smoother! I dunno if it was the motion blur or something, but it's no longer the semi-slideshow I was getting before.

My PC:

I5 2500K
GTX 670
8GB Ram

Everything stock. I tried to OC the CPU once but for some reason the PC didn't start (I got lucky that it booted after a lot of tries, and I undid the OC), so I've been scared to try again since then.
 
Now I understand what people are talking about when they say the performance is great one day and not so good the next, I wonder if it just depends on section of the island you're on.

I've been used to 50-60fps with my current settings but today I saw a lot of 45fps moments.
 
Would you be able to expound upon this a bit? I'm not sure I'm following. Especially the last bit.

While playing without widescreen letterbox (16x9 visually), it would bug me that the mini-map was a slight oval rather than a perfect circle.

For years, PC games were made on monitors that were generally 16x10, whereas console games played on HDTVs that were 16x9. Recent PC games have offered the 16x9 option because that's what the consoles target.

When I switched to widescreen letterbox in Far Cry 3, I noticed the resolution didn't change, but now I have black bars on top and bottom (not cutting my view, but adding more to the sides), and perfectly geometrical HUD elements, like the mini-map.

This completely changed performance for me (but my brother's machine wouldn't even accept playing in widescreen letterbox). With my 550 Ti, I can play with ultra settings and AA all the way up (did more testing last night), Vsync, and other things I don't remember the names of. Frame rate is at 30, but I can deal with that since it's steady and looks gorgeous.

This could all be coincidence, but I would like confirmation on my findings.
 
Now I understand what people are talking about when they say the performance is great one day and not so good the next, I wonder if it just depends on section of the island you're on.

I've been used to 50-60fps with my current settings but today I saw a lot of 45fps moments.
I was getting it in the same areas. It seems like quite the technical achievement to code a game that behaves so randomly session to session on the same hardware without any driver or settings changes!
 
I was getting it in the same areas. It seems like quite the technical achievement to code a game that behaves so randomly session to session on the same hardware without any driver or settings changes!

I've had my in-game resolution setting reset from one resolution to another. The visual difference might have been hard to perceive, for example between 3200x1800 and 2560x1440 downsampled to my monitor's resolution, but I can assure you the performance difference was anything but. So yeah, sometimes the game just resets/changes my graphics settings without me realizing.
 
I have pretty much the same setup but with a 6950 instead, everything stock and the game runs fine in ultra settings with no AA.

Just saying that it runs fine doesn't really help anyone though. You might be just fine with 30-40fps, but most people like to keep it over 60 and no 6950 or 6970 will do that on ultra at 1920x1080.

I've got an i5 2500k@4.6GHz, a 6950 with unlocked shaders that's slightly overclocked, and 8GB RAM. And these are the settings I have to use to maintain 60fps almost all the time.
Code:
<RenderProfile MSAALevel="0" AlphaToCoverage="2" SSAOLevel="4" SDSM="0" ResolutionX="1920" ResolutionY="1080" Quality="custom" QualityEditor="editor_ps3" Fullscreen="0" Borderless="1" UseD3D11="0" D3D11MultithreadedRendering="0" WidescreenLetterbox="0" UseWidescreenFOV="1" FOVScaleFactor="1.365" EnableSubResolution="0" SubResolutionX="960" SubResolutionY="540" VSync="0" RefreshRate="0" DisableMip0Loading="0" GPUMaxBufferedFrames="1" ShowFPS="0" Brightness="1" Contrast="1" GammaRamp="1" AllowAsynchShaderLoading="1">
		<CustomQuality>
			<quality ResolutionX="1280" ResolutionY="720" EnvironmentQuality="high" AntiPortalQuality="default" PortalQuality="medium" PostFxQuality="medium" TextureQuality="high" TextureResolutionQuality="high" WaterQuality="veryhigh" DepthPassQuality="high" VegetationQuality="veryhigh" TerrainQuality="high" GeometryQuality="high" AmbientQuality="high" DeferredAmbientQuality="high" ShadowQuality="high" EditorQuality="" Hdr="1" HdrFP32="0" ReflectionHdr="1" EnableVertexBinding="1" id="custom" />
		</CustomQuality>
			<Post>
			<quality GameDepthOfField="1" DepthDownsample="1" CinematicDepthOfField="1" MotionBlur="1" SSAO="1" FXAALevel="0" CloudShadows="1" SSAOMaxDistance="100" id="medium" />
		</Post>
	</RenderProfile>
 
I thought I saw someone say that if you're playing in DX11 there is automatically FXAA applied? So is FXAA applied by default in DX11 unless you turn on MSAA? Might try turning off MSAA if that's the case just to gain a few fps.
 
Finally found the game's graphical sweet spot for me. Surprisingly challenging to find it though, and involved some slight overclocking of my hardware (i7 3770 + 660 ti oc).

However, what I just can't figure out is why the game kicks me back to the desktop 10-30 seconds after first loading up a save, EVERY TIME (even with Uplay safe mode). If I then close the process via Task Manager and relaunch the game, it works fine...
 
Runs amazing. Really impressed. 1680x1050 on Ultra DX11 4XMSAA, SSAO and all effects maxed it's at least a solid 30fps in action and 45+ normally. Surprised with my CPU.

Core 2 Duo 3.2GHz (Wolfdale)
4GB Ram (but on 32bit Win 7)
7850 2GB
 
So what exactly do you lose running DX9? Looks like MSAA and SSAO are disabled.

Putting post processing on anything above high makes the game stutter like crazy on my GTX570.
 
This game is pretty much unplayable for me.

No matter if I have it maxed out or on lowest settings, PostFX on high or low, vsync on (1 or 2 frames) or off...

I get game-breaking microstutter.

I'm running an i5 3570k
crossfire 6950s
newest catalyst beta driver with the newest CAP.

:(
 
This game is pretty much unplayable for me.

No matter if i have it maxed out or on lowest settings / post fx on high or low, vsync on, 1 2 frames or off...

I get game-breaking microstutter.

I'm running an i5 3570k
crossfire 6950s
newest catalyst beta driver with the newest CAP.

:(

Have you tried restarting the game after setting PostFX to low (in the menu or in the config)? That's what removed my stuttering; the game doesn't tell you that it needs a restart.
 
This game is pretty much unplayable for me.

No matter if i have it maxed out or on lowest settings / post fx on high or low, vsync on, 1 2 frames or off...

I get game-breaking microstutter.

I'm running an i5 3570k
crossfire 6950s
newest catalyst beta driver with the newest CAP.

:(

Yup, setting PostFX to low might work. Also play around with the GPU frame buffering setting.
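For reference, the frame buffering setting lives in GamerProfile.xml as GPUMaxBufferedFrames on the <RenderProfile> line. Fragment trimmed from my own config, so leave your other attributes alone; try values of 1, 2, or 3 and see which feels best:

```xml
<!-- trimmed fragment: lower GPUMaxBufferedFrames tends to reduce input lag/stutter -->
<RenderProfile GPUMaxBufferedFrames="1" VSync="0" ShowFPS="1">
</RenderProfile>
```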
 
This game is pretty much unplayable for me.

No matter if i have it maxed out or on lowest settings / post fx on high or low, vsync on, 1 2 frames or off...

I get game-breaking microstutter.

I'm running an i5 3570k
crossfire 6950s
newest catalyst beta driver with the newest CAP.

:(

While investigating SLI vs Crossfire, I read somewhere that disabling V-Sync might help with the microstutter.
Also, installing an FPS cap tool to limit the FPS seems to work sometimes.
 
Is there any way to increase the grass LOD beyond the highest setting in the menu?
 
So, I've got the actual gameplay running flawlessly at 45-60 FPS on these specs:

AMD Phenom II X4 965 3.4GHz
8GB of RAM
and an ATI Radeon HD6770 1GB

and at these settings with vsync and 8x AA forced through my video card settings:

[settings screenshot]

However, there's a pretty annoying issue that arises after playing for more than 5 minutes. The menus (crafting, inventory, map, main menu, etc.) start running like molasses, and mouse movement becomes so slow that it takes upwards of 30 seconds to move to a menu item, select it, and have it actually open.

Anyone else having this problem and/or know how to fix it?
 
This game is pretty much unplayable for me.

No matter if i have it maxed out or on lowest settings / post fx on high or low, vsync on, 1 2 frames or off...

I get game-breaking microstutter.

I'm running an i5 3570k
crossfire 6950s
newest catalyst beta driver with the newest CAP.

:(
I'm on an i5 3570K and 7870. Setting the GPU buffer to "1" seems to have helped a bit with that awful "microstutter."

Also, is everyone using DX9 for this game now? If so, stupid question, but what's the easiest way to downgrade to DX9? Do I have to do a total uninstall/reinstall of DX? Or is there a way to easily switch between the two?
 
Can anyone give me a summary of the most performance-lowering settings that don't offer any discernible increase in image quality? I'm playing on DX11.
 
Can anyone give me a summary of the most performance-lowering settings that don't offer any discernible increase in image quality? I'm playing on DX11.

While it doesn't show performance differences, here's a handy video someone linked earlier showing the difference between settings.

http://www.youtube.com/watch?v=AwLyNm6YZ1A

I believe PostFX can give a dramatic performance increase on lower settings. Beyond that, I'm afraid I'm not much help.
 
[...]a summary of the most performance-lowering settings that don't offer any discernible increase in image quality?

Um, what makes you so sure such a setting exists in the first place? It's not like developers sit around thinking of new ways to drain performance without increasing image quality. Anyway, if you need more FPS, the most dramatic changes you can make are to make sure MSAA is disabled, keep ambient occlusion at any setting other than "HDAO", keep your "PostFX" quality setting at "medium" or lower, and keep your "shadows" quality setting at "high" or lower. These changes will all have a discernible impact on image quality, however.
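If it helps, those suggestions map onto GamerProfile.xml roughly like this. Trimmed fragment, and the values are just the ceilings I described above, not a guaranteed sweet spot; keep the rest of your attributes as they are:

```xml
<!-- trimmed fragment: MSAA off, PostFX capped at medium, shadows capped at high -->
<RenderProfile MSAALevel="0" Quality="custom">
    <CustomQuality>
        <quality PostFxQuality="medium" ShadowQuality="high" id="custom" />
    </CustomQuality>
</RenderProfile>
```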
 
Keep in mind that even in DX9 with everything on low the game might still perform pretty badly (stutter). This game is weird like that.
 