Tomb Raider PC Performance Thread

By the way, there is a significant difference between Normal and Ultra DoF. The former does not apply DoF to objects very close to the camera.

Normal:
[image: normalm8j65.png]

Ultra:

Also, the Double/Triple Buffering option doesn't really seem to do anything for me. My game is triple buffered regardless of which option I choose.
 
i5 3570k @ 4.20 GHz
GTX 460
8 GB DDR3
Win7 x64

I'm 98% through the game and luckily didn't experience any of the problems that other users were reporting. Running the game fairly well at high settings, 1680x1050 res with Vsync turned on. Getting an average of 60fps most of the time, dipping to 50 in more crowded areas/scenes but nothing unplayable. Even with lowered settings this game looks great.
 
I really like the tessellation in this game. It's very subtle, and not distracting at all. It just improves the geometric detail of the game at relatively little cost. The benchmark may disagree with that, but in the actual game the performance hit is very minor.
 
I'm 20% in according to my save file (I just reached the bridge beyond the radio control room)... and I've had no lock ups. Performance isn't great with everything turned up full (apart from DOF which I have off) in 3D at 720p... but I think I'm going to deal. It's completely playable and every time I turn off one of the other effects to get a framerate boost, I just want to turn it back on. Even high precision.

Also, this sure feels like a proper Tomb Raider game. Great atmosphere and plenty of exploration.
 
I've updated my benchmarks in the OP; I noticed that the AA setting in the launcher defaults to FXAA upon restart, so I wasn't sure if they were actually run with 2x SSAA enabled. (The originals can be found here and here; TressFX enabled and disabled respectively.)

A comparison:

TressFX on:
Old Min: 23.3fps
New Min: 28fps
Diff: +4.7fps

Old Max: 39.6fps
New Max: 54.5fps
Diff: +14.9fps

Old Avg: 31.1fps
New Avg: 44.3fps
Diff: +13.2fps

TressFX off:
Old Min: 38.9fps
New Min: 25.2fps
Diff: -13.7fps

Old Max: 60.3fps
New Max: 70fps
Diff: +9.7fps

Old Avg: 50.5fps
New Avg: 61.9fps
Diff: +11.4fps
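For anyone wanting to sanity-check the deltas above, here's a quick sketch that recomputes them from the posted old/new figures (the 313.96 vs. 314.14 driver runs); the dictionary names are just mine, not anything from the benchmark tool:

```python
# Recompute the driver-to-driver benchmark deltas (new - old),
# rounded to one decimal place to match the posted figures.

def diffs(old, new):
    """Per-metric delta between two benchmark runs."""
    return {k: round(new[k] - old[k], 1) for k in old}

tressfx_on_old  = {"min": 23.3, "max": 39.6, "avg": 31.1}
tressfx_on_new  = {"min": 28.0, "max": 54.5, "avg": 44.3}

tressfx_off_old = {"min": 38.9, "max": 60.3, "avg": 50.5}
tressfx_off_new = {"min": 25.2, "max": 70.0, "avg": 61.9}

print(diffs(tressfx_on_old, tressfx_on_new))    # {'min': 4.7, 'max': 14.9, 'avg': 13.2}
print(diffs(tressfx_off_old, tressfx_off_new))  # {'min': -13.7, 'max': 9.7, 'avg': 11.4}
```

The numbers line up with the post, including the odd minimum-framerate regression with TressFX off.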

Going from the 313.96s to the 314.14s seems to have improved performance across the board, save for the inexplicable minimum framerate drop when TressFX is disabled. I'm waiting for the postman to drop off a second 670, so I'll add some SLI benchmarks later on.

[image: wip4jup.png]


Anger Rising.

Optimistically, typing "late" rather than "later" is an easy mistake to make.
 
The blood textures on her body look like they are just painted on, IMHO. Not all that convincing. Also noticed that the blood streaks and drops are mirrored on her arms, lol. Is there a story reason for that?

Thank Christ I'm not the only one perplexed by this--it's been this way since the very first screens surfaced three years ago. I was beginning to think I was crazy. As much as everyone obsesses over every last pixel, jaggy, and artifact, I'm surprised no one is vexed by this.
 
So I got everything set to high/ultra, TressFX turned off and SSAO set to normal instead of high, and I'm getting close to 60 fps.

Game looks pretty damn slick.

Reading all these Nvidia card owners' reports reminds me of the last 10 years, where every single PC release ran like this on AMD cards for essentially the same reason Tomb Raider isn't working on Nvidia: backroom dealings and politics (money, in other words).
 
Welp, I had my first crash with tessellation off. I'm around 25% through the game, at
the mountain village
for the second time, during the day. I seem to be getting a shitty framerate in general in this area, and I've crashed multiple times. (Twice with tessellation on, once with it off).
 
I'll refer you back to the post you quoted...

Do you think AMD got Nvidia TWIMTBP final code any sooner, given the number of new releases that have driver issues on the red side?

Let me point you to this:

And it's not like Nvidia has ever locked out features using Vendor ID, is it? Oh wait: Batman Arkham Asylum.

To quote the same post: "but everyone has to remember what goes around comes around, and AMD might be performing better on this release and maybe a couple of others, but it won't stop Nvidia doing the same back; the only people who get hurt are the customers."

It's like I said from the start: it's stupid PR bullshit tactics. (Actually, I said it's all part of the PR game.) So all the finger pointing is useless. I'm sure you'll point out where I did exactly that (finger pointing) in this very thread, and you'd be missing my point about AMD playing the little-guy role while doing the same thing Nvidia does.

I care very little about the little games BOTH vendors play. What I do care about is when folks make excuses for it. Just because Nvidia has done and no doubt WILL do this sort of thing in the future, I'm not gonna suddenly forget AMD is doing it too. Especially when they wanna point fingers and whine and complain about unfair practices.

I'm bailing on this thread as it is becoming console warrior territory, but everyone has to remember what goes around comes around, and AMD might be performing better on this release and maybe a couple of others, but it won't stop Nvidia doing the same back; the only people who get hurt are the customers. It's just that AMD customers are a bit more used to it at this point...

Right.
 
Apparently the game is missing all camera effects in Fullscreen or non-exclusive Fullscreen (lens flares, blood splatter, rain drops).
 
What exactly does tessellation do for this game? I can't seem to find any comparisons anywhere.

Improved geometry. Lara's model is far more rounded, trees are more rounded, rock surfaces have higher detail. The touches are fairly subtle, but they're there. If you run the benchmark, you'll notice especially that Lara's shoulders, among other things, are much more rounded with tessellation on. The performance impact in the benchmark is rather large, but I've found that's not representative of the game, where the performance impact is relatively minor.
 
I'm not sure how you got it to DX9, but it might be allowing you to check things that simply don't work on DX9, like tessellation.

AA is different between those 2 shots too. There is like no AA going on Lara in the DX9 shot (actually, no AA on anything).

edit: Also noticing a difference in anisotropic filtering too; distant textures are more blurry in DX9. Definitely not using the same settings.
 
I'm not sure how you got it to DX9, but it might be allowing you to check things that simply don't work on DX9, like tessellation.

Nah, that stuff's grayed out, it only lets you choose what you'd expect to be able to choose; Normal SSAO, Depth of Field, etc. But even knocking all the settings down to the same level as in DX9, my framerate is worse.


AA is different between those 2 shots too. There is like no AA going on Lara in the DX9 shot.

edit: Also noticing a difference in anisotropic filtering too; distant textures are more blurry in DX9. Definitely not using the same settings.

DX11 has SMAA injected, I think. I forgot to turn it off when taking the shot. DX9 had none. Not sure about the AF settings, as Inspector is set to force 16x on every game I play regardless.
 
KyleN just told you.

Missed that. :D

And yeah, those two shots are very different. The main thing I notice is that the "Level of Detail" setting is lower in the DX9 shot. Notice that a lot of things in the background are missing. For one, notice the sloped rope above Lara's head, in the distance. It's not in the DX9 shot.
 
An easy way to get a *massive* FPS boost is to turn Level of Detail from ultra to normal. You lose some fine detail on really far away objects, but it's really not super noticeable unless you're really looking for it. I went from 30ish FPS in shantytown with just SMAA to 60 FPS with SMAA and 2xSSAA.

I found a good spot to highlight the differences in graphics. This is an extreme case, as generally you aren't looking at stuff this far away.

Ultra:

[image: ibp9Mm16LPZXhK.png]


Normal:

[image: iDg0ilu0dB4ui.png]
 
An easy way to get a *massive* FPS boost is to turn Level of Detail from ultra to normal. You lose some fine detail on really far away objects, but it's really not super noticeable unless you're really looking for it. I went from 30ish FPS in shantytown with just SMAA to 60 FPS with SMAA and 2xSSAA.

I found a good spot to highlight the differences in graphics. This is an extreme case, as generally you aren't looking at stuff this far away.

Ultra:

[image: ibp9Mm16LPZXhK.png]


Normal:

[image: iDg0ilu0dB4ui.png]

Yeah I've noticed this. Level of Detail is the single setting that seems largely CPU bound, so it's going to result in big performance boosts even for people with high-end setups. It's definitely not all that noticeable in stills unless you're looking for it, but I'm mostly wondering how noticeable the pop-up is from a lower Level of Detail.
 
Maybe that's what's different in DX9 despite it saying LOD is on Ultra. I honestly feel like playing at Normal settings at 1620p and 60fps is better than playing at 30fps with 4xSSAA. I'll stick to DX9 for now.

Yeah I've noticed this. Level of Detail is the single setting that seems largely CPU bound, so it's going to result in big performance boosts even for people with high-end setups. It's definitely not all that noticeable in stills unless you're looking for it, but I'm mostly wondering how noticeable the pop-up is from a lower Level of Detail.

That setting sounds a lot like Geometry Detail in NFS: Most Wanted that was causing performance issues with little to no difference visually.
 
On DX9 the frame rate drops are basically gone; it runs at a locked 60 everywhere. The problem for me is that SweetFX will not work. I've tried everything and was not able to get it going, and this for me is essential, so I guess I'll just wait for new drivers and a game patch.
 
Maybe that's what's different in DX9 despite it saying LOD is on Ultra. I honestly feel like playing at Normal settings at 1620p and 60fps is better than playing at 30fps with 4xSSAA. I'll stick to DX9 for now.

It is. If you take the screenshots with the same camera angle, you will immediately notice that the DX9 shot has a lot fewer objects in it. Heck, look at the trees in the back; they're not even there in the DX9 shot.

Try DX11 with a lower LOD. You'll probably get the same results.

I'm noticing other things too... the texture of the floorboards on the rightmost structure is far lower quality in the DX9 shot. There's more aliasing too. Way more aliasing. Just look at Lara's model in DX11... it's pristine. Now look at the back of her legs in DX9... tons of aliasing.
 
Had about ten crashes over the last two days. Was able to get in some solid hours crash free but eventually it happens.

HD6870. AMD is not immune.
 
On DX9 the frame rate drops are basically gone; it runs at a locked 60 everywhere. The problem for me is that SweetFX will not work. I've tried everything and was not able to get it going, and this for me is essential, so I guess I'll just wait for new drivers and a game patch.

There's a different .dll needed to run SweetFX in DX9, though I think they're all included anyway, so I dunno.
 
Apparently the game is missing all camera effects in Fullscreen or non-exclusive Fullscreen (lens flares, blood splatter, rain drops).
Can you elaborate on this a little bit?

EDIT: Nevermind, I understand.

I run the game in Exclusive Fullscreen anyway, because it gives me an enormous FPS boost. Just makes it a pain in the ass to alt + tab.
 