
Tomb Raider PC Performance Thread

Has anyone nailed down what is actually happening in Shanty Town? I've dropped the game to its lowest settings and still get frame drops like crazy in that part of the game.

Just lower Level of Detail to Normal. The game becomes hugely CPU-limited in large areas with a lot of objects drawn. Lower LOD will use lower-quality or fewer assets in the distance, but the difference isn't particularly noticeable.
 
As these other posters have pointed out, in the previous version you never actually had SSAA on unless you explicitly _enabled_ it in the game, even if the options screen showed it as enabled. So you had to toggle it off in the game first, then back on. That was a pretty bad bug, which is why it was fixed.

Ah, so that's why my fps is so much lower now with the patch.

I actually had FXAA enabled instead of SSAA before the patch lol.

Still, SSAA should not be such a performance killer.
 
Just started playing.

Res: 1920x1080
Texture Quality and Level of Detail: Ultra. Texture Filter: 8x. AA: off (using SMAA).
Post Processing, High Precision: on. Tessellation: off. Everything else Normal.

Game looks pretty good so far.

 
I'm still getting crashes after the patch :-(

Running on the Hitman Absolution profile with all settings maxed out except FXAA @ 1080p. My setup is SLI'd 670s and a 3570k @ 4.5GHz. GPUs are running stock settings. The game runs at 60fps most of the time with TressFX, Post Processing, and Tessellation. Any other SLI users having problems?
 
Just started playing.

Res: 1920x1080
Texture Quality and Level of Detail: Ultra. Texture Filter: 8x. AA: off (using SMAA).
Post Processing, High Precision: on. Tessellation: off. Everything else Normal.

Game looks pretty good so far.

Why would you set texture filtering to 8x? Just do 16x. The performance hit is negligible. I don't think I've ever set a PC game to less than 16x.
 
Runs at a consistent 30fps on my X51 (i7 version with GTX 660) at 1080p. Everything is on Ultra; TressFX is off, along with tessellation and high precision. The benchmark says the frame rate never drops under 29 and never exceeds 39, but averages 30. I was expecting it to run better, but it looks good and I can't really complain about it running at the same rate as most games.

I downloaded the latest patch on Steam today and it still crashed on me a few times... :(
 
Can anyone with a high-end CPU comment on what kind of fps they're getting in Shanty Town? I'm just wondering how big a difference the CPU makes in that part of the game.
 
AF is one of those things that you should set to 16x in your AMD/Nvidia control panel and then forget about.

Well, with AF at 8x, min/max fps is 60/60, whereas at 16x min/max fps is 56/60 (this is with Tessellation enabled using the Hitman Absolution profile). Anyway, I'll stick with 8x.

I'm still getting crashes after the patch :-(

Running on the Hitman Absolution profile with all settings maxed out except FXAA @ 1080p. My setup is SLI'd 670s and a 3570k @ 4.5GHz. GPUs are running stock settings. The game runs at 60fps most of the time with TressFX, Post Processing, and Tessellation. Any other SLI users having problems?
I have a similar setup to yours, but CPU @ 4GHz. I'm going to play some more to see if I experience any crashes.
 
My PC Settings:

ATI Radeon HD 5700 Series

Windows 7 Professional 64-bit (6.1, Build 7601) Service Pack 1

4096MB RAM

Intel(R) Core(TM)2 Duo CPU E8400 @ 3.00GHz x2 ~3000MHz

Game Settings:

1920x1080

V-sync off (Triple Buffering in RadeonPro)

Texture Quality: Ultra

Texture Filter: Anisotropic 1x (16x in RadeonPro)

AA: Off

Shadows: Normal

Shadow Resolution: Normal

Level of Detail: Normal

Post Processing: Off

Tessellation: Off

High Precision: On

Hair Quality: Normal

Reflections: Normal

Depth of Field: Normal

SSAO: Off

Only played an hour so far, but performance has been smooth, nearly reaching 60 fps.

Any suggestions on things I could increase without any dips, or things I should decrease for an increased framerate?
 
My PC Settings:

ATI Radeon HD 5700 Series

Windows 7 Professional 64-bit (6.1, Build 7601) Service Pack 1

4096MB RAM

Intel(R) Core(TM)2 Duo CPU E8400 @ 3.00GHz x2 ~3000MHz

Game Settings:

1920x1080

V-sync off (Triple Buffering in RadeonPro)

Texture Quality: Ultra

Texture Filter: Anisotropic 1x (16x in RadeonPro)

AA: Off

Shadows: Normal

Shadow Resolution: Normal

Level of Detail: Normal

Post Processing: Off

Tessellation: Off

High Precision: On

Hair Quality: Normal

Reflections: Normal

Depth of Field: Normal

SSAO: Off

Only played an hour so far, but performance has been smooth, nearly reaching 60 fps.

Any suggestions on things I could increase without any dips, or things I should decrease for an increased framerate?

Since you're only rocking a Core 2 (like I am, but at 2.8 GHz), I'd advise you to take that Level of Detail down to Medium for a better framerate later in the game in the more open-ended areas. People with even quad-core i7s suffer severe frame rate drops in those areas and have to turn down their LOD quality. LOD is pretty dependent on CPU power.

High Precision is another framerate killer for me; I get 5 extra FPS switching it off. Otherwise you're good to go, the 5700 series is a pretty solid workhorse for its age. I use an even older, lousier 4670.
 
My PC Settings:

ATI Radeon HD 5700 Series
[...]

Any suggestions on things I could increase without any dips, or things I should decrease for an increased framerate?

My settings: MBP ;(
 
i5, 6950, 8GB RAM, Win8. Running at 1920x1280

TressFX halves my framerate to about 29 FPS, which is insane. It looks nice and all, but I just can't take that kind of performance hit. Turning it off.
 
You got Boot Camp set up on that thing? Does it have a dedicated graphics card made within the last 3-4 years?

If so, you could probably run it lol. If a Surface Pro with the crappy-ass Intel HD 4000 integrated GPU can run it, your MBP can too lol.

It's a 2011 with an Nvidia 650M, 16GB RAM, i7. Runs at 1440x900 on medium/low settings with frame drops a bit more often than I would like.

Might need to drop it down to 720p.
 
It's a 2011 with an Nvidia 650M, 16GB RAM, i7. Runs at 1440x900 on medium/low settings with frame drops a bit more often than I would like.

Might need to drop it down to 720p.

Your laptop is three to four times the laptop mine wishes it could be lol, and I'm running it at an acceptable level for my expectations of what archaic technology can do.

For your reference:-

Intel Core 2 Duo T9550 2.8 GHz
8 GB RAM
ATI Mobility Radeon 4670 1GB (This card has only got 1.5x the juice of the 360's Xenos)
Windows 7 Pro SP1

My settings are:-

1600x900, Textures at Ultra, Texture Filtering at 8x Anisotropic, AA Off, Shadows Normal, Shadow Resolution Normal, Level of Detail Medium, Post Processing Off, High Precision Off, Tessellation Off, Hair Quality Normal, Reflections Normal, Depth of Field Off, SSAO Off.

I get 30 - 40 FPS with this setup, and for a really old system like mine, I find that acceptable. I was actually amazed my system could run this game, it looks great. This system can't even run Crysis 3 due to the lack of DX 11 hardware.
 
Tried out the benchmark with the 'Ultimate' setting save for FXAA and tessellation: min 42, max 60, average 53 at 1920x1080.

i5 3570k
GIGABYTE 7950
8GB Sam RAM
Windows 7 Pro SP1

I've yet to try those settings out during the game, however. If TressFX really does halve the frame rate, then I'm sitting pretty!
 
With everything maxed but no AA or TressFX I average 54fps; with FXAA I get the same results but a blurry, crap picture; with TressFX and no AA I average 32fps. This is all at 1080p.

GTX 580
I7 2600k
16GB Ram

Need 2014 to get here... Need that 780.
 
Just lower Level of Detail to Normal. The game becomes hugely CPU-limited in large areas with a lot of objects drawn. Lower LOD will use lower-quality or fewer assets in the distance, but the difference isn't particularly noticeable.

This, for sure. I dropped LOD to Normal and get 30-40 in action areas, 50 in quieter areas now, with everything else on Ultra and tessellation/TressFX on. LOD seems to be a real performance killer, and it doesn't really look that different between Ultra and Normal anyway.
 
AF is one of those things that you should set to 16x in your AMD/Nvidia control panel and then forget about.

Yeah, AF level never really seems to impact performance for me at all, so I always crank it up. I'm running this game on my not-exactly-mindblowing 2011 laptop with Ultra textures and 16x AF, and the performance hit compared to Low textures and no AF is minimal (if there is a difference at all). This always seems to be the case for me, which has made me wonder why console games always have such awful AF. Is it a memory thing? My laptop does have 2GB of VRAM, so I guess that's it?
 
Really thought about buying this but I'm already done with the game on Hard with a Game Fly copy. Sorta feel like my Nvidia having ass dodged a bullet here.
 
i7 920 @ 3.36GHz
GTX 480 (slight overclock, 770/1900MHz), 314.14 beta drivers

I'm having pretty good results with these settings (benchmark ran with vsync enabled).

 
Anybody else's Tomb Raider crash when you hit play? Suddenly the game stopped working, and when I hit play it looks like it's going into fullscreen and then closes. I ended up deleting it and am about to reinstall. I didn't change any options since last playing it the other day.
 
Anybody else's Tomb Raider crash when you hit play? Suddenly the game stopped working, and when I hit play it looks like it's going into fullscreen and then closes. I ended up deleting it and am about to reinstall. I didn't change any options since last playing it the other day.

Happened to me. Close Fraps/MSI Afterburner if you have them running. If not, turn off FXAA and select the no-AA option or choose SSAA.

On another note, does Nvidia have an ETA on the new drivers for this?
 
Anybody else's Tomb Raider crash when you hit play? Suddenly the game stopped working, and when I hit play it looks like it's going into fullscreen and then closes. I ended up deleting it and am about to reinstall. I didn't change any options since last playing it the other day.

That happened to me when I played it with a RadeonPro profile. When I stopped using that it worked. Not sure if that helps or not.
 
Happened to me. Close Fraps/MSI Afterburner if you have them running. If not, turn off FXAA and select the no-AA option or choose SSAA.

On another note, does Nvidia have an ETA on the new drivers for this?

Fuck, this is still not working for me. Fraps/MSI Afterburner are closed; I rebooted to make sure. I tried no AA, FXAA and SSAA, and nothing works.
 
Okay, I don't know what happened, but I got it to work. I went to the game's registry location, and there was a value in the "renderapi" field that was "0" for DX11. I changed it to 9 to try DX9 mode and the game came on. I changed it back to 0 for DX11 and it works too. Weird.

EDIT: Spoke too soon. It actually isn't working. I thought I had changed it, but when it's on "0" it doesn't come on.
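For anyone else poking at the same workaround: the DX11/DX9 toggle described above can be captured in a .reg file so you don't have to click through regedit each time. The key path below is a hypothetical placeholder, since the post doesn't name the actual location; search regedit for the "renderapi" value name to find where your install stores it.

```reg
Windows Registry Editor Version 5.00

; HYPOTHETICAL key path -- the thread doesn't name it; locate the real key
; in regedit by searching for the "renderapi" value name.
[HKEY_CURRENT_USER\Software\YourGamePublisher\TombRaider\Graphics]
; 9 = force DirectX 9 mode, 0 = DirectX 11 (per the post above)
"renderapi"=dword:00000009
```

Double-clicking the file merges the value; switch the dword back to 00000000 to return to DX11.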
 
Even with Titan SLI it can't maintain a perfect 60 fps with triple-buffered v-sync + the Ultimate setting @ 3200x1620. But after seeing the hair in motion, I'd say it's worth it. Now, NVIDIA, please show me a new driver so I can run this @ 4K.

 
Even with Titan SLI it can't maintain a perfect 60 fps with triple-buffered v-sync + the Ultimate setting @ 3200x1620. But after seeing the hair in motion, I'd say it's worth it. Now, NVIDIA, please show me a new driver so I can run this @ 4K.

CGI-worthy Lara. Only on NVIDIA's Titans. For two grand.

Yeah, I'm jealous.
 
If Nvidia doesn't have a head start, they're really slow with drivers, wow.

They didn't have that 'special partnership' with CD and Nixxes for this port, so they have to do it the hard way, via experimentation. Just like what AMD has had to deal with all this time with the studios they didn't bribe lol.
 