
Tomb Raider PC Performance Thread

Got the new patch, cranked everything up to maximum... I got the "censor blur" hair and about 10fps. Turned off TressFX and got 30fps. Turned down to FXAA, back at 60fps, with everything maxed out except Tessellation (waiting until they officially announce that they've fixed it before I turn it on).

i7-2600K @ 3.4GHz (stock), GTX-680.

Tessellation hasn't had any issues for me, on a GTX 680. I'm using the 314.14 drivers with the Hitman trick (delete the Tomb Raider profile in Nvidia Inspector, then add the Tomb Raider EXE to the Hitman Absolution profile).

I think the number of simulation points per strand increases as the camera gets closer in, which makes sense.

I wish they would allow it to stay constant. When the camera doesn't go close up, I get 60fps.
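For anyone curious, here's a rough sketch of the kind of distance-based LOD I mean. The function name, point counts, and distance thresholds are all made up, just to illustrate simulation points scaling with camera distance:

```python
def sim_points_per_strand(camera_dist, max_points=32, min_points=8,
                          near=1.0, far=20.0):
    """Interpolate per-strand simulation points between near and far distances."""
    if camera_dist <= near:
        return max_points          # close-up: full simulation detail
    if camera_dist >= far:
        return min_points          # wide shot: cheapest simulation
    t = (camera_dist - near) / (far - near)
    return round(max_points + t * (min_points - max_points))

print(sim_points_per_strand(0.5))   # close-up -> 32
print(sim_points_per_strand(25.0))  # far away -> 8
print(sim_points_per_strand(10.5))  # halfway -> 20
```

Clamping it to a constant (say, always 16) would give the steady framerate I'm asking for, at the cost of close-up hair detail.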
 
Can someone else with an Nvidia card please start a new game? Do you see the FMV intro?

Edit: OK, I think I have tracked down the problem. SSAA turns FMVs black with the new patch. Switch to FXAA and you will get a picture in FMVs.

Ah ok. Yeah it was a black screen for me too.

670 and the game is still fucked even with the Hitman trick. Made it about 30 seconds out of the cave. Everything max except 2x SSAA and no DoF because I'm stubborn and refuse to play without tessellation.

The TressFX ponytail clips really badly through her back/shoulders now too.

edit: Oh, I forgot to go back to the beta driver, maybe I'll try that in a bit.
 
Tessellation hasn't had any issues for me, on a GTX 680. I'm using the 314.14 drivers with the Hitman trick (delete the Tomb Raider profile in Nvidia Inspector, then add the Tomb Raider EXE to the Hitman Absolution profile).
I would get the beta drivers, if they had made any mention that it did anything. It seems those drivers were made before Nvidia even got the final code, so I doubt they're having any effect on the game.

I don't like the idea of profile-juggling more on principle because I shouldn't have to.
 
Yeah, just checked and I am on
1.00.718.4

Steam never showed it updating the game, though. I guess it all happened in the background. Anyway:
average FPS was 38.4
with TressFX on
high precision off
FXAA
tessellation off
on an i5 3570K @ 3.8GHz and a 6950 2GB
 
I would get the beta drivers, if they had made any mention that it did anything. It seems those drivers were made before Nvidia even got the final code, so I doubt they're having any effect on the game.

I don't like the idea of profile-juggling more on principle because I shouldn't have to.

The drivers add official support for Tomb Raider. There is no profile prior to that AFAIK.
 
The drivers add official support for Tomb Raider. There is no profile prior to that AFAIK.
I know, I'm just saying there's no optimization for the game, I don't think they're having the slightest effect on performance. You could probably switch between drivers all day long and never see a difference.
 
It's not just the intro; every FMV is black with SSAA on. Ugh.

Yes, unfortunately. On the bright side of things, FXAA is not that awful. The game looks slightly better with SSAA, and 2x worked nicely at 60fps with my hardware, but FXAA will do just fine. TressFX looks weird to me. It seems really clippy and ugly in most scenes beyond the benchmark video, so I will gladly disable it.
 
When did the patch go live? I was playing late last night and I swore I was seeing post effects I hadn't seen before. It's too bad I probably only have an hour or so left in the campaign. Will still go back for some collectibles.

Hair looks more realistic in strong wind, I think. It doesn't flap as erratically as before.

Some time between midnight EST last night and 8:30 AM this morning.
 
I know I'm no expert, but why don't video cards have an extra chip only to deal with physics and nothing else? Like a second minor/slave card integrated. That way it wouldn't take power from the "main" card.
 
Post-patch, I played the end of the game, and sorta wish I hadn't. The effects are ridiculous, and turning off Post-Processing disables AO, which is shit.
 
I know I'm no expert, but why don't video cards have an extra chip only to deal with physics and nothing else? Like a second minor/slave card integrated. That way it wouldn't take power from the "main" card.

Because those units could be used for the main processing. GPUs have SIMD-centric architectures, which are suitable for both graphics processing and physics processing. There's no sense in arbitrarily separating the two.
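To illustrate (purely my own sketch, nothing from the game): the same wide, data-parallel pattern that shades thousands of pixels also integrates thousands of particles. Here NumPy arrays stand in for the GPU's SIMD lanes, one instruction stream applied to many data elements at once:

```python
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])

def step_particles(pos, vel, dt):
    """One Euler integration step applied to every particle simultaneously."""
    vel = vel + GRAVITY * dt   # single vector op across all particles
    pos = pos + vel * dt
    return pos, vel

# 1000 particles at rest; after one 0.1s step they all fall identically.
pos = np.zeros((1000, 3))
vel = np.zeros((1000, 3))
pos, vel = step_particles(pos, vel, 0.1)
```

A dedicated physics chip would just sit idle whenever no physics is running; shared compute units get used either way.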
 
I know I'm no expert, but why don't video cards have an extra chip only to deal with physics and nothing else? Like a second minor/slave card integrated. That way it wouldn't take power from the "main" card.

Makes more sense to just add more multipurpose DirectCompute/CUDA/whatever cores that can do anything pretty well, and it's up to the developer to decide whether to use them for physics or something else.
 
Got my first crash ever just now, after the patch, when I got killed by tons of enemies and wolves. On a 6950, i5 2500K and 8GB RAM; I didn't have any crashes during the 5 hours I played before the patch. :(
 
I know I'm no expert, but why don't video cards have an extra chip only to deal with physics and nothing else? Like a second minor/slave card integrated. That way it wouldn't take power from the "main" card.

This used to be the case until unified shaders arrived with DirectX 10 in Vista. Cards used dedicated pipelines that assigned different functions to those portions of the chip, up to the Radeon X1900 series and the NVIDIA 7900 series.

DX10 brought with it the most significant change to the API in ages with the introduction of the geometry shader, which unfortunately is still widely unused lol. DX11, with its all-purpose compute shaders and tessellation, seems to be the more favoured API.

Heck, PhysX used to be hardware-accelerated on a separate PPU card sold by AGEIA ages ago, until they were bought out by NVIDIA and it was integrated into their graphics cards.
 
Well this is sad; the HD 7870 beats the GTX 670. Nvidia better release optimized drivers soon.
But still, Kepler's DirectCompute performance was neglected in order to focus strictly on gaming power, as there were no games that used DirectCompute for any big effects. I guess I should be happy that at least us Nvidia users get somewhat playable framerates with AMD's tech. Imagine playing Batman on an AMD card with PhysX on high.
 
But still, Kepler's DirectCompute performance was neglected in order to focus strictly on gaming power, as there were no games that used DirectCompute for any big effects. I guess I should be happy that at least us Nvidia users get somewhat playable framerates with AMD's tech. Imagine playing Batman on an AMD card with PhysX on high.

Or Mirror's Edge. It'd run fine, then glass would start breaking and it'd tank to 2fps. And it was on by default.
 
PC Gamer speculates that people should get used to the AMD advantage in next-gen titles.

Hopefully Nvidia will be able to support current high-end cards well into the next gen (hopes a GTX 660 owner...)
 
Have they fixed aiming in S3D? it's pretty difficult to aim, which is a shame because the game looks amazing in S3D.
 
PC Gamer speculates that people should get used to the AMD advantage in next-gen titles.

Hopefully Nvidia will be able to support current high-end cards well into the next gen (hopes a GTX 660 owner...)

The 360 had an ATI GPU and GeForce cards performed very well in multiplats even early in the 7th generation. I don't think there is anything to worry about for Nvidia owners.

As for myself, I don't plan on switching to AMD anytime soon, if I am to upgrade I would opt for the GTX 7XX series.
 
Does performance suck at 2560x1440 on a single 7970 with TressFX? I see people posting good results, but I get a 20-25 fps hit with everything maxed, and it drops below 25 fps in cutscene close-ups. I am running a 3570K overclocked to 4.5GHz and my video card is overclocked. Is the problem the resolution, or could it be something else like the overclock?
 
The 360 had an ATI GPU and GeForce cards performed very well in multiplats even early in the 7th generation. I don't think there is anything to worry about for Nvidia owners.

As for myself, I don't plan on switching to AMD anytime soon, if I am to upgrade I would opt for the GTX 7XX series.

I think this time might be different, since Nvidia and AMD have significantly different GPU architectures and AMD's is much faster at DirectCompute. So if devs take advantage of AMD's architecture, Nvidia cards just can't match it (at least current cards). At the moment there aren't really any games that take advantage of those features, but next-gen games will probably use them much more heavily.

One example of this architecture advantage is Bitcoin mining, where even low-end AMD cards beat the best Nvidia cards purely due to the architecture.
 
The 360 had an ATI GPU and GeForce cards performed very well in multiplats even early in the 7th generation. I don't think there is anything to worry about for Nvidia owners.

As for myself, I don't plan on switching to AMD anytime soon, if I am to upgrade I would opt for the GTX 7XX series.

That's mostly thanks to their aggressive TWIMTBP campaign. They were actively courting publishers and devs from the moment the gen began. Also, GeForce cards didn't truly start to take over until the undisputed king at the time, the 8800 series, came out. AMD had no viable answer for that beast of a line until they released the superb 4800 series, and then hit another home run with the 5870, which was the world's first DX11 card.

I still remember NVIDIA's close partnership with Capcom of all publishers. Their logos were plastered everywhere, and their games ran like shiet on AMD's cards at the time. Games like Lost Planet, DMC4, and Resident Evil 5, early-gen ports.

AMD's Gaming Evolved helped them out later on, but it was a bit late. They got Codemasters (DiRT), EA (Battlefield 3), and Squenix on board.
 
Does performance suck at 2560x1440 on a single 7970 with TressFX? I see people posting good results, but I get a 20-25 fps hit with everything maxed, and it drops below 25 fps in cutscene close-ups. I am running a 3570K overclocked to 4.5GHz and my video card is overclocked. Is the problem the resolution, or could it be something else like the overclock?

If you're really maxing everything out, SSAAx4 would be the real killer. What happens when you turn down SSAAx4?
 
If you're really maxing everything out, SSAAx4 would be the real killer. What happens when you turn down SSAAx4?
Oops, everything maxed except SSAA; that is even more of a performance killer. I was using max default settings, which is FXAA. I was curious because some people barely get a performance hit with TressFX and a 7970, although I wonder if they are CrossFired or at 1080p.

edit: Just messed around with a few settings and I get a near-consistent 60 FPS with all settings maxed using FXAA at 1080p with a single 7970. With 2x SSAA it drops down to 40-50 FPS. However, even with FXAA there is still an FPS drop in cutscenes whenever Lara's hair is on display.

I guess a single 7970 can't handle 2560x1440 with TressFX ;p.
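For what it's worth, the numbers roughly line up if you just count shaded samples. This assumes SSAA cost scales linearly with resolution times sample count (a simplification, and FXAA is treated as a near-free post-process):

```python
def shaded_samples(width, height, ssaa=1):
    """Samples the GPU must shade per frame; FXAA counts as ssaa=1."""
    return width * height * ssaa

base = shaded_samples(1920, 1080)                 # 1080p + FXAA baseline
print(shaded_samples(1920, 1080, 2) / base)       # 2x SSAA at 1080p: 2.0x
print(shaded_samples(2560, 1440) / base)          # 1440p + FXAA: ~1.78x
print(shaded_samples(2560, 1440, 2) / base)       # 1440p + 2x SSAA: ~3.56x
```

So 1440p alone is nearly double the pixel work of 1080p, before TressFX even enters the picture, which matches the drops people are reporting.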
 
Got the new patch, cranked everything up to maximum... I got the "censor blur" hair and about 10fps. Turned off TressFX and got 30fps. Turned down to FXAA, back at 60fps, with everything maxed out except Tessellation (waiting until they officially announce that they've fixed it before I turn it on).

i7-2600K @ 3.4GHz (stock), GTX-680.

Weird. I get a solid 60 with TressFX off, everything else on w/FXAA or even 2xMSAA @1080P. 680+3570k

edit: even playable at 1440P with FXAA
 
That's mostly thanks to their aggressive TWIMTBP campaign. They were actively courting publishers and devs from the moment the gen began. Also, GeForce cards didn't truly start to take over until the undisputed king at the time, the 8800 series, came out. AMD had no viable answer for that beast of a line until they released the superb 4800 series, and then hit another home run with the 5870, which was the world's first DX11 card.

I still remember NVIDIA's close partnership with Capcom of all publishers. Their logos were plastered everywhere, and their games ran like shiet on AMD's cards at the time. Games like Lost Planet, DMC4, and Resident Evil 5, early-gen ports.

AMD's Gaming Evolved helped them out later on, but it was a bit late. They got Codemasters (DiRT), EA (Battlefield 3), and Squenix on board.

Yeah, that's true. I hope for the sake of competition that they have something up their sleeve for multiplatform development.

I think this time might be different, since Nvidia and AMD have significantly different GPU architectures and AMD's is much faster at DirectCompute. So if devs take advantage of AMD's architecture, Nvidia cards just can't match it (at least current cards). At the moment there aren't really any games that take advantage of those features, but next-gen games will probably use them much more heavily.
It's not impossible for Nvidia to refocus on DirectCompute; I don't believe they would be dumb enough to let AMD have such an edge in the PC gaming market.
 
I'm guessing there are still problems that you're going to have to wait for Nvidia to fix with new drivers. 600 series users seem the hardest hit.

My 560ti has had no problems. Game hasn't crashed once. Though I play without TressFX.
 
Oddly, my crashes were caused by the instability of my overclock, at least once I applied the Hitman fix. Not like this hasn't happened to me before, since I like to pick at it on occasion, but I've never had the game's .exe crash; usually it's the video card driver.
 
Just downloaded the patch and started playing. Pretty sure at the very least this enabled all those missing effects in fullscreen mode. Feel like I'm noticing a lot more lens flare and "gunk on the camera lens."

Edit to myself: Well no shit, dumbass. Says so right in the patch notes.
 
Tessellation hasn't had any issues for me, on a GTX 680. I'm using the 314.14 drivers with the Hitman trick (delete the Tomb Raider profile in Nvidia Inspector, then add the Tomb Raider EXE to the Hitman Absolution profile).



I wish they would allow it to stay constant. When the camera doesn't go close up, I get 60fps.

Ty, my crashes are gone. My 680 is happy now

WTF

after this patch my save file that was at 90% now says 2% :-\

Same here, I got scared at first lol
 
Cool, the game now runs fine. All we need now is some TressFX optimization for Nvidia cards; it honestly dips down to 5 fps for me in cutscenes and close-ups.

They should add low, medium, and high TressFX settings; considering it's using tessellation for the hair, that shouldn't be a problem. I can take some angular hair.
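Something like this, in spirit (every preset name and number below is invented by me, not from the game; it's just a sketch of how strand count and simulation detail could be tiered):

```python
# Hypothetical TressFX quality tiers: trade strand count and
# per-strand simulation detail for performance.
TRESSFX_PRESETS = {
    "low":    {"strand_fraction": 0.25, "sim_points": 8},
    "medium": {"strand_fraction": 0.50, "sim_points": 16},
    "high":   {"strand_fraction": 1.00, "sim_points": 32},
}

def strand_budget(total_strands, preset):
    """Return (strands to simulate, simulation points per strand)."""
    cfg = TRESSFX_PRESETS[preset]
    return int(total_strands * cfg["strand_fraction"]), cfg["sim_points"]

print(strand_budget(20000, "medium"))  # (10000, 16)
```

Low would be the "angular hair" option: fewer strands and fewer points per strand, but presumably still well above the 5 fps cutscene dips.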
 
So I sucked it up and turned off TressFX and dropped to FXAA just so I had something to do today.

Starting in the area where you first see the deer, all the foliage shadowing looks like there are 50 helicopters in the sky cuz they keep strobing black. I tinkered with every single graphics setting and nothing made it stop. Any ideas?

670
 
Just got a "Tomb Raider has stopped working" crash. First time I ever saw that. Pre-Hitman fix, it would just CTD and that was it. I hope the patch didn't break more stuff.
 