
Tomb Raider PC Performance Thread

Even with the latest Nvidia driver and patch I got crashes after only a few minutes. After doing the Hitman trick, I played the game for an hour without any problems.

The game works without crashes for me with the latest Nvidia drivers and the latest game patch. I had the crash problem earlier and had to use the Hitman trick, but with the latest Nvidia drivers that's no longer necessary.
 
Are the Nvidia crashes card specific?

I haven't experienced any crashes with a GTX 480, even before TR got its first patch. I've never had tessellation disabled either.
 
Motion blur, lens flares, and ambient occlusion.

Thanks. I thought it must do a lot, since it has a similar performance impact to TressFX for me.

[settings screenshots]


Those settings get me:

[benchmark screenshot]


Enabling post-processing then drops it to:

[benchmark screenshot]


The settings I actually play on are the same as those above, except I enable post-processing and disable TressFX, since it causes her hair to flash at this resolution. That gets me:

[benchmark screenshot]


I've got the SMAA injector enabled as well. This is on 670 SLI with the 314.21 drivers. It looks very nice indeed in 3D Surround, although I've only just started, so I may have to drop the settings later.
 
Started the game yesterday; my GTX 480 handles it really well.

Everything on max/ultra/whatever it's called, minus tessellation and TressFX (AA set to FXAA), and I get an average of 60 FPS according to the in-game benchmark.

Not bad.

I've yet to try it in 3D. It should run flawlessly given the resolution drop.
 
So I beat the game about a week ago, but decided that since the new drivers were out, and a patch too, I'd try benchmarking the game. Ran it with the same settings I used when I played (Ultra w/ no TressFX) and scored a 114.5 FPS average (which is nice since I have a 120 Hz monitor). Decided to see what effect TressFX would have, so I just changed that and benchmarked it again. 4.3 FPS.

Seriously, what the actual fuck? Is TressFX that poorly optimized for my 2 x GTX 580s that it literally shits a brick and goes down ~100 FPS? Anyone know what the hell is up with that? It seems that most AMD users only get a ~20-30 FPS hit, but I got a 96% decrease in my performance. Went from being supremely playable to a slideshow I couldn't escape from.
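For what it's worth, the ~96% figure checks out against the benchmark numbers quoted above (a quick sanity check, using only the values from the post):

```python
# Average FPS from the post: Ultra without TressFX vs. with TressFX enabled.
fps_without = 114.5
fps_with = 4.3

# Relative performance drop caused by enabling TressFX.
drop = (fps_without - fps_with) / fps_without
print(f"{drop:.1%}")  # prints 96.2%
```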
 
Well, I restarted from scratch today, trying to get the missing cheevos. The game crashed climbing the ladder going up to Roth (while speaking to him on the radio); it never crashed this early before.

Tomb Raider: v1.00.722.3
Nvidia: 314.07

Same settings that I always run, BUT with Ultra Shadows, so I guess that could have been the cause of the crash.
 
So I beat the game about a week ago, but decided that since the new drivers were out, and a patch too, I'd try benchmarking the game. Ran it with the same settings I used when I played (Ultra w/ no TressFX) and scored a 114.5 FPS average (which is nice since I have a 120 Hz monitor). Decided to see what effect TressFX would have, so I just changed that and benchmarked it again. 4.3 FPS.

Seriously, what the actual fuck? Is TressFX that poorly optimized for my 2 x GTX 580s that it literally shits a brick and goes down ~100 FPS? Anyone know what the hell is up with that? It seems that most AMD users only get a ~20-30 FPS hit, but I got a 96% decrease in my performance. Went from being supremely playable to a slideshow I couldn't escape from.

Is SLI turned on? Is Physics set to auto-select?
 
So I beat the game about a week ago, but decided that since the new drivers were out, and a patch too, I'd try benchmarking the game. Ran it with the same settings I used when I played (Ultra w/ no TressFX) and scored a 114.5 FPS average (which is nice since I have a 120 Hz monitor). Decided to see what effect TressFX would have, so I just changed that and benchmarked it again. 4.3 FPS.

Seriously, what the actual fuck? Is TressFX that poorly optimized for my 2 x GTX 580s that it literally shits a brick and goes down ~100 FPS? Anyone know what the hell is up with that? It seems that most AMD users only get a ~20-30 FPS hit, but I got a 96% decrease in my performance. Went from being supremely playable to a slideshow I couldn't escape from.

Something is not right. This person is getting 56 FPS w/TressFX and 80 FPS w/o @ 1080p with GTX 580 SLI.

http://www.youtube.com/watch?v=FU52--3T4F4
 
If anyone has the new Nvidia 314.21 drivers, could they post the SLI flags for DX9 and DX11 if they're available? I'm running older cards (GTX 260s) and have settled on a driver that's perfect for my system, but I'd like to see if there are new SLI profiles that may remove the occasional flicker. Currently using Far Cry 3 profiles and getting almost full GPU scaling, especially under DX9, even in Shantytown.
 
I tried playing this just now and everything was in French (I'm American)... what the what? I changed the language to English, but that apparently only changed the menus, since the spoken dialogue was still in French.

I restarted Steam to see if that was the issue, and now I'm downloading a 1 GB update.

Weird.
 
This game looks like a pre-rendered CG movie at 2560x1600, fully maxed with 4xSSAA. I love it. :D

Screenshots I took while playing: http://min.us/mlG5zKXFzgUZc
Also, VRAM usage peaked at 3202 MB (it usually hovers between 2.7 and 2.9 GB). Tomb Raider shows the highest VRAM usage I've seen from an unmodded game.
 
If anyone has the new Nvidia 314.21 drivers, could they post the SLI flags for DX9 and DX11 if they're available? I'm running older cards (GTX 260s) and have settled on a driver that's perfect for my system, but I'd like to see if there are new SLI profiles that may remove the occasional flicker. Currently using Far Cry 3 profiles and getting almost full GPU scaling, especially under DX9, even in Shantytown.

It's been almost 24 hours since your post, so you've probably found them already, but:

DX10x: 0x084000F5 (Tomb Raider (2013))
DX9: 0x03400005 (Tomb Raider (2013), Dead Rising 2, WRC 2 Fia World Rally 2011, Lord of the Rings: War in the North, Serious Sam III, Dead Rising 2: Director's Cut, ArcheAge)
 
Man, I want to play with TressFX on; it looks good when it's not flipping out and you have a stable 60 FPS. But as soon as you get a close-up or an in-game cinematic, the FPS is all over the place and the hair bugs out.
 
It's been almost 24 hours since your post, so you've probably found them already, but:

DX10x: 0x084000F5 (Tomb Raider (2013))
DX9: 0x03400005 (Tomb Raider (2013), Dead Rising 2, WRC 2 Fia World Rally 2011, Lord of the Rings: War in the North, Serious Sam III, Dead Rising 2: Director's Cut, ArcheAge)

Hot damn, thanks for those, JaseC. I looked around for a bit but couldn't find anyone who had posted the new SLI profiles. I'll give these a whirl later tonight when I get home.

Edit: Managed to give the profiles a try and they did make an improvement. I then installed the 314.21 beta drivers and man, those made quite an improvement. Using the older drivers, DX9 scaling was almost perfect but DX10 was terrible. With the new betas, scaling is pretty much perfect on both. Plus no flickering and no slowdown, even in Shantytown (LOD on Normal, TressFX and tessellation off, FXAA, everything else maxed @ 1080p).

Unfortunately, the new drivers mean I had to give up nHancer for Nvidia Inspector :-(

BTW, I find that I get better SLI scaling by turning off vsync in game and forcing it via Nvidia Inspector.
 
Any idea why I get a better framerate with 4xSSAA @ 1080p than when downsampling from 2880x1620?

First of all, simple mathematics.
4x OGSSAA = [2x2] OGSSAA.
4x OGSSAA at 1080p means rendering at 1920x2 = 3840 (horizontal) and 1080x2 = 2160 (vertical) internally. That is a higher internal rendering resolution than the [1.5x1.5] OGSSAA of your 2880x1620 downsampling.

Secondly, driver-forced downsampling involves another brute-force scaling stage with zero optimization from the software (the game). 4x OGSSAA supported by the game itself should perform better, since it does what it's supposed to do in a more "optimized" way.
The latest build of Project CARS is another good example: it has an in-game downsampling (OGSSAA) option which performs considerably better than manual downsampling through the NV driver.
After all, driver-forced downsampling is the last (emergency) resort when no other AA options are feasible.

I don't see a reason for you to use 2880x1620 downsampling with in-game 2x OGSSAA on your 1920x1080 monitor when the game already has built-in OGSSAA support. Just use in-game 4x OGSSAA at 1920x1080 *without* downsampling. You actually get better image quality this way, because 2880x1620 downsampling involves [1.5x1.5] driver OGSSAA plus extra interpolation, which results in a "softer" (read: blurred) image compared to straight-up in-game 4x OGSSAA without extra interpolation. (Also, for optimal interpolation results, you want the scaling factors to be whole numbers such as 2x, 4x and/or 8x.)

That being said, although it's great to see another AAA title with out-of-the-box OGSSAA support, I'm not really a fan of OGSSAA (though it's indeed better than nothing).
Why? Because the SSAA AMD seems to be using in their Gaming Evolved titles such as Sleeping Dogs and now Tomb Raider is OGSSAA - the most inefficient form of SSAA there is. SGSSAA looks better at the same or higher performance. With the in-game SSAA in, for example, Sniper Elite V2, Sleeping Dogs and Tomb Raider, I can still see quite a bit of shimmering. SGSSAA would be a much better solution.
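The pixel counts behind this explanation are easy to verify. A minimal sketch (the 1920x1080 display and the scaling factors are taken from the post above; combining 2880x1620 downsampling with in-game 2x ([2x1]) OGSSAA is assumed, as described):

```python
# Pixels rendered internally per frame for a w x h target
# with [sx x sy] ordered-grid supersampling (OGSSAA).
def internal_pixels(w, h, sx, sy):
    return (w * sx) * (h * sy)

native = internal_pixels(1920, 1080, 1, 1)             # 1920x1080
in_game_4x = internal_pixels(1920, 1080, 2, 2)         # 3840x2160
downsample_15 = internal_pixels(1920, 1080, 1.5, 1.5)  # 2880x1620
# 2880x1620 downsampling combined with in-game 2x ([2x1]) OGSSAA:
combined = internal_pixels(2880, 1620, 2, 1)           # 5760x1620

print(in_game_4x / native)     # 4.0  (plain in-game 4x OGSSAA)
print(downsample_15 / native)  # 2.25 (downsampling alone)
print(combined / native)       # 4.5  (more work than plain 4x OGSSAA)
```

Which is why downsampling plus in-game 2x ends up slower than straight in-game 4x, despite the lower downsampling resolution.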
 
First of all, simple mathematics.
4x OGSSAA = [2x2] OGSSAA.
4x OGSSAA at 1080p means rendering at 1920x2 = 3840 (horizontal) and 1080x2 = 2160 (vertical) internally. That is a higher internal rendering resolution than the [1.5x1.5] OGSSAA of your 2880x1620 downsampling.

Secondly, driver-forced downsampling involves another brute-force scaling stage with zero optimization from the software (the game). 4x OGSSAA supported by the game itself should perform better, since it does what it's supposed to do in a more "optimized" way.
The latest build of Project CARS is another good example: it has an in-game downsampling (OGSSAA) option which performs considerably better than manual downsampling through the NV driver.
After all, driver-forced downsampling is the last (emergency) resort when no other AA options are feasible.

I don't see a reason for you to use 2880x1620 downsampling with in-game 2x OGSSAA on your 1920x1080 monitor when the game already has built-in OGSSAA support. Just use in-game 4x OGSSAA at 1920x1080 *without* downsampling. You actually get better image quality this way, because 2880x1620 downsampling involves [1.5x1.5] driver OGSSAA plus extra interpolation, which results in a "softer" (read: blurred) image compared to straight-up in-game 4x OGSSAA without extra interpolation. (Also, for optimal interpolation results, you want the scaling factors to be whole numbers such as 2x, 4x and/or 8x.)

That being said, although it's great to see another AAA title with out-of-the-box OGSSAA support, I'm not really a fan of OGSSAA (though it's indeed better than nothing).
Why? Because the SSAA AMD seems to be using in their Gaming Evolved titles such as Sleeping Dogs and now Tomb Raider is OGSSAA - the most inefficient form of SSAA there is. SGSSAA looks better at the same or higher performance. With the in-game SSAA in, for example, Sniper Elite V2, Sleeping Dogs and Tomb Raider, I can still see quite a bit of shimmering. SGSSAA would be a much better solution.

The math was the reason I was asking; 4xSSAA performing worse than 1.5x SSAA didn't make sense, but thanks for explaining why. What is 2xSSAA running at? I assume it's 2x1, so 3840x1080 or 1920x2160, or something along those lines?
 
The math was the reason I was asking; 4xSSAA performing worse than 1.5x SSAA didn't make sense, but thanks for explaining why. What is 2xSSAA running at? I assume it's 2x1, so 3840x1080 or 1920x2160, or something along those lines?
Yes, spot on. 2xSSAA depends on the implementation; it's either [2x1] or [1x2].
 
Came back to this game with the patch and new drivers and... still crashing. Bitterly disappointed at this point.

Try the Hitman: Absolution trick; it works perfectly for me. Even with the latest patch and drivers I got two successive hard crashes within minutes; after a simple fix in Nvidia Inspector I am now playing the game without a single crash. Fourteen hours and counting.
 
Booted it up today to have my weekly check in to see how it's running. TressFX hair is still flying all over the place, running at about 20FPS. Yeah.
 
Is the Nvidia beta driver stable? It doesn't cause any problems elsewhere? I have never really used their beta drivers, so I'm not familiar with how safe they are to use, etc.
 
The game in 3D is incredible. And it runs very smoothly too: everything maxed @ 720p/120 Hz on a GTX 480, FXAA, no TressFX, no tessellation gets me an average of 52 FPS.
 
Here is a cool trick if you've got an Nvidia card and TressFX ruins your performance (1-2 FPS on close-ups): start the game with all settings on low via the launcher, then in-game turn TressFX on first (and apply), then set the rest of the settings to what you want them to be. Voila, normal performance. Pretty ridiculous; hopefully they fix this shit.
 
Here is a cool trick if you've got an Nvidia card and TressFX ruins your performance (1-2 FPS on close-ups): start the game with all settings on low via the launcher, then in-game turn TressFX on first (and apply), then set the rest of the settings to what you want them to be. Voila, normal performance. Pretty ridiculous; hopefully they fix this shit.

Yeah, I discovered this early on as well. I thought it was a bug but wow. HOLY SHIT.

"NVIDIA's FANBOYS GONNA HATE AMD"
 
Here is a cool trick if you've got an Nvidia card and TressFX ruins your performance (1-2 FPS on close-ups): start the game with all settings on low via the launcher, then in-game turn TressFX on first (and apply), then set the rest of the settings to what you want them to be. Voila, normal performance. Pretty ridiculous; hopefully they fix this shit.
Ahaha that's crazy, I'll try it tomorrow.

TressFX is not worth the hassle though; it's inconsistent and doesn't really add much. It's good from far away, but during close-ups it tends to freak out xD
 
Here is a cool trick if you've got an Nvidia card and TressFX ruins your performance (1-2 FPS on close-ups): start the game with all settings on low via the launcher, then in-game turn TressFX on first (and apply), then set the rest of the settings to what you want them to be. Voila, normal performance. Pretty ridiculous; hopefully they fix this shit.

Hmm, I'm going to have to try that later. Will this have to be done every time you boot up the game?
 
I just read up on the registry hack to force Tomb Raider to use its DX 9 renderer.

Code:
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Crystal Dynamics\Tomb Raider\Graphics]
"RenderAPI"=dword:00000009

The performance increase was so substantial that I thought I was mistaken. I gained 15-20 FPS over the same settings in DX10, which is the difference between okay and really smooth performance for my aging rig. Reverting to DX9 allowed me to increase certain settings like Reflections to High and turn on the SSAO and LOD options without sacrificing too much FPS.

I thought that since my old card (ATI Radeon HD 4670) didn't support DX11, the game was already using DX9, but I was wrong: DX10 is apparently used instead, which only greys out certain options like Tessellation, TressFX, Advanced Shadows, SSAO and the DOF options. Also, the SSAA options are only offered by the DX11 launcher in this game; DX9/DX10 only give you FXAA.

Apparently the launcher has 3 different sets of options based on the DX support and OS you're running.

If you're using Windows 7 and a DX10 card like mine, you'll see a different launcher compared to Windows 7 with a DX11 card, or Windows XP with any card (XP has no DX10/11 support). DX9 is only enabled by default on Windows XP, or if you're somehow using a pure DX9 card with Windows 7, like a Radeon X1800/X1900 series or a GeForce 7800/7900 series (highly unlikely).

The differences are shown below:

DirectX 9 launcher in Windows 7, enabled using the registry hack:

[launcher screenshot]

Notice that the High Precision and Tessellation options are missing, not just greyed out. The Shadow Resolution option is also missing.

DirectX 10 launcher in Windows 7 (enabled by default if your card only supports DX10, e.g. pre-Radeon HD 5xxx and pre-GeForce 4xx cards):

[launcher screenshot]

And the DX11 launcher would have all the options available.

If you're on a borderline-obsolete rig and just wanna squeeze more performance out of your heaping pile, DX9 is a good choice.
 
It won't stay that way for long, though... A new patch will be released on Monday which will improve performance on AMD cards by up to 25%! http://www.rage3d.com/board/showthread.php?p=1337201183

Vendor-specific performance boosts, lol :P
An increase in performance would be nice ("up to" 25%, no doubt, not a flat 25%+). I have no idea why they only refer to AMD, unless they're using some GCN-specific optimizations (though if they did, that would only affect the 7000 series, wouldn't it?), or what this is actually based on; otherwise any optimization should work on Nvidia as well.

Maybe because this is an AMD Gaming Evolved title, they just don't mention other brands.
 
Came back to this game with the patch and new drivers and... still crashing. Bitterly disappointed at this point.
I just finished the game today after about 18 hours (100% on it, too) and I crashed about 4 times total. Most of it was early in the game, too. Once I was about 8 hours in, it never crashed again for me. I didn't do the Hitman trick either...

GTX 680 on the latest beta drivers with everything enabled except TressFX and High Precision. Level of Detail on High and SSAO on Normal and I never dropped below 60 in the whole game.
 
-_________________________________-
Fucking wow, that was a nice $50 wasted...

I did that registry fix thing, and at least the options menu doesn't crash, but I still get nothing but this...

[screenshot missing]

;_____;
Help, please.
 