
Tomb Raider PC Performance Thread

Does level of detail make a huge difference in framerate? I have a 660 ti and I'm running it on the highest setting currently, but turned down SSAO and DOF to normal. I'm about 3 hours in and got to an open mountain area and I get drops into the 30-40s from 60 when I move the camera to look at more open areas.
 
Has CD said anything about the missing effects? Are they supposed to not be there? Is this some Hot Coffee situation where someone found stuff that's not supposed to be in the game? Seems very odd that they would not know about this.

For me, I've shelved it until we get some answers and patches.

They are aware of the issue. Nixxes posted that they are investigating the problem, and for now they recommend playing around with the aspect ratio setting, which for some people brings the effects back if you change it and then change it back to the original setting (or something like that).
 
Does level of detail make a huge difference in framerate? I have a 660 ti and I'm running it on the highest setting currently, but turned down SSAO and DOF to normal. I'm about 3 hours in and got to an open mountain area and I get drops into the 30-40s from 60 when I move the camera to look at more open areas.

If you're CPU limited in an area (most likely in the larger areas such as Shantytown), then Level of Detail makes a huge difference. I keep it at normal to maintain 60fps.
 
They are aware of the issue. Nixxes posted that they are investigating the problem, and for now they recommend playing around with the aspect ratio setting, which for some people brings the effects back if you change it and then change it back to the original setting (or something like that).

thanks.

Bioshock coming up. Better get to work, guys.
 
If you're CPU limited in an area (most likely in the larger areas such as Shantytown), then Level of Detail makes a huge difference. I keep it at normal to maintain 60fps.

I'll try dropping it later and see what difference it makes. My CPU is an i5 @ 3.0GHz, so I didn't really expect to hit a bottleneck on that front.
 
I'll try dropping it later and see what difference it makes. My CPU is an i5 @ 3.0GHz, so I didn't really expect to hit a bottleneck on that front.

CPU is a huge bottleneck in Shantytown even with a 4.4GHz 2500K. The good news is the drop in quality isn't very noticeable. You'll see the difference in screenshots, but you won't notice while playing at all.
 
Also, TressFX + 2x SSAA is unplayable. My GTX 680 actually runs out of VRAM if the camera gets too close to Lara (at least, I assume that's what explains the FPS drop to <5fps).

Likely a VRAM bottleneck, yeah. When I was testing out settings, TressFX + 2x SSAA @ 1920x1200 had the game allocating around 2.5GB VRAM from my 4GB 680 (I've since settled on SMAA injection, which works quite nicely).
 
These nvidia issues people are talking about, are they related to specific geforce models or do they hit all models? Specifically, does anyone know if the game works as it should with a geforce gtx 580? Has anyone managed to solve the issues?
 
CPU is a huge bottleneck in Shantytown even with a 4.4GHz 2500K. The good news is the drop in quality isn't very noticeable. You'll see the difference in screenshots, but you won't notice while playing at all.

Am I the only one nonplussed by the fact that such a game brings a very good processor to its knees? My stock 3770 causes Shantytown to drop to the 40s.

I know draw calls are very expensive on PC but still.
 
2. There are various lighting, blood and water effects missing when playing the game in fullscreen, which only show in windowed mode;

I'm surprised this hasn't at the very least been patched yet. Not a fan of windowed gaming myself lol.

Here's one example (for anyone that hasn't seen the difference):

Fullscreen:
Fullscreen_TR_2013-03-07_00006.jpg


Windowed:
windowed_TR_2013-03-07_00005.jpg
 
Is the PC version better than the console version? And does the game have a lot of shooting segments?
And is there a physical PC version?
 
I'm surprised this hasn't at the very least been patched yet. Not a fan of windowed gaming myself lol.

Here's one example (for anyone that hasn't seen the difference):

Fullscreen: [screenshot]

Windowed: [screenshot]

I'm so confused. Not because of the effects, but because there's zero anti-aliasing going on in those pictures. Even with FXAA my picture quality is way above what's featured in those shots.
 
Is the PC version better than the console version? And does the game have a lot of shooting segments?
And is there a physical PC version?

My local supermarket sells a DVD version, so yes to the physical, and depending on how you go at the game it can be stealth kills or full-on dudebro.
 
Is the PC version better than the console version? And does the game have a lot of shooting segments?
And is there a physical PC version?

"Better" in what sense? It controls just as well thanks to the native 360 controller support, and looks noticeably better, especially in the texture department.
 
I don't know if this has been said before, but if you revert the driver to 310.90 the game won't crash with tessellation on. I can confirm it; I've been playing for 3 hours now without any crash. The only options I've lowered are LOD, because of the huge frame rate drops in open areas, and depth of field, which I have off because I hate that effect. It's obvious the game needs to be patched for better performance, but at least like this people should be able to get good IQ at a good performance level.
 
Wow. This game uses 3.5GB VRAM @ 1440p on max settings in full window mode. The good: glad I bought a Titan card. The bad: I need 1 or 2 more.
 
CPU is a huge bottleneck in Shantytown even with a 4.4GHz 2500K. The good news is the drop in quality isn't very noticeable. You'll see the difference in screenshots, but you won't notice while playing at all.

Is the game fully utilizing all cores?
 
Can you elaborate?

Because a game's calculations can't just be split up into equal pieces. That's the magic behind multi-core support: spreading out the calculations. Say one core is rendering the bulk of the game and another core is doing the physics. The core doing the physics, which is a lighter calculation, only has to serve its result when the main core needs it. So it may sit idle until the game asks for a physics calculation, and could potentially see only 30% use.

Same with every other 'split' in calculation: it needs to be calculated and served in the correct order, and as I mentioned, it's a lot of work to split the bulk of a primary calculation into a bunch of pieces, because the calculation also needs to come back in one piece at the end. Some developers do manage to split up some of the primary calculations so all cores see a slice of the work, but it's much easier to just offload most of it onto one core and split up the secondary calculations across the other cores.

That's why the upcoming generation is interesting, with developers having to work with 8 cores. PS3 developers already showed they could stretch out a lot of calculations and make good use of the additional cores. It's also the reason why you rarely ever see a game achieving perfect use across all cores: a 90% / 40% / 20% / 10% spread across the cores isn't out of the ordinary (random numbers).
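The split described above (one core carrying the main loop, another idling until it's handed work) can be sketched in a few lines. This is a toy illustration only, not actual engine code; the names and the every-third-frame scheduling are made up for the example.

```python
import threading
import queue

# Toy sketch of main-thread/worker-thread load splitting.
# The worker sits blocked (idle) on the queue until the "main" loop
# hands it a job -- which is why its core sees far less than 100% use.
physics_jobs = queue.Queue()
results = queue.Queue()

def physics_worker():
    while True:
        job = physics_jobs.get()      # blocks until work arrives
        if job is None:               # sentinel: shut down
            break
        results.put(job * 2)          # stand-in for a real physics step

worker = threading.Thread(target=physics_worker)
worker.start()

# "Main" loop: does the bulk of the work every frame, but only
# needs a physics result every third frame.
for frame in range(10):
    if frame % 3 == 0:
        physics_jobs.put(frame)

physics_results = [results.get() for _ in range(4)]  # frames 0, 3, 6, 9
physics_jobs.put(None)
worker.join()
```

Even in this toy version, the worker only touches 4 of 10 frames; the rest of the time it's parked on `queue.get()`, which is roughly the "30% use" scenario described above.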
 
Wow. This game uses 3.5GB VRAM @ 1440p on max settings in full window mode. The good: glad I bought a Titan card. The bad: I need 1 or 2 more.

Well, hopefully some of the performance issues will be improved with newer Nvidia drivers / a TR patch.
 
Because a game's calculations can't just be split up into equal pieces. That's the magic behind multi-core support: spreading out the calculations. Say one core is rendering the bulk of the game and another core is doing the physics. The core doing the physics, which is a lighter calculation, only has to serve its result when the main core needs it. So it may sit idle until the game asks for a physics calculation, and could potentially see only 30% use.

Same with every other 'split' in calculation: it needs to be calculated and served in the correct order, and as I mentioned, it's a lot of work to split the bulk of a primary calculation into a bunch of pieces, because the calculation also needs to come back in one piece at the end.

That's why the upcoming generation is interesting, with developers having to work with 8 cores. PS3 developers already showed they could stretch out a lot of calculations and make good use of the additional cores. It's also the reason why you rarely ever see a game achieving perfect use across all cores: a 90% / 40% / 20% / 10% spread across the cores isn't out of the ordinary (random numbers).

Thanks for the info. I wish multi-core rendering were used in PC games. It was only somewhat recently that it was added to drivers (at least in Nvidia's drivers), and it's still buggy. The option helped immensely in Far Cry 3, but it was unstable. Consoles have been doing it for a while now.
 
Playing in buggy borderless windowed mode turned out to be too much of a hassle. I will wait for the patch.

I guess that in some twisted sense we should be proud of Nvidia for supporting a game that is playable with only semi-frequent crashes and a substantial graphical downgrade. At least there is no need for voltage hacking to play the game.
 
Playing in buggy borderless windowed mode turned out to be too much of a hassle. I will wait for the patch.

I guess that in some twisted sense we should be proud of Nvidia for providing a game that is playable with only semi-frequent crashes and a substantial graphical downgrade. At least there is no need for voltage hacking to play the game.

This is something AMD card users commonly faced in a lot of other games lol. Be grateful that this is like the only recent game that's f'ked up on NVIDIA's cards. Practically the rest of the console to PC ports are made with NVIDIA's TWIMTBP program in mind, often to the detriment of AMD's cards.

AMD's current partnership with Squenix for their PC ports is working wonders for them at the moment. (Sleeping Dogs, Deus Ex Human Revolution, Hitman and now Tomb Raider). Funnily enough, NVIDIA also had similar arrangements in the past with previous port jobs like The Last Remnant, and FF XIV which were both heavily optimised for NVIDIA's cards under the aforementioned TWIMTBP program.
 
Why the hell doesn't vsync work properly in this game? Forcing it through the Nvidia control panel doesn't work, and neither D3D Overrider nor in-game triple buffering fully removes screen tearing.

I've never had this in any game before. -_-
 
This is something AMD card users commonly faced in a lot of other games lol. Be grateful that this is like the only recent game that's f'ked up on NVIDIA's cards. Practically the rest of the console to PC ports are made with NVIDIA's TWIMTBP program in mind, often to the detriment of AMD's cards.

AMD's current partnership with Squenix for their PC ports is working wonders for them at the moment. (Sleeping Dogs, Deus Ex Human Revolution, Hitman and now Tomb Raider). Funnily enough, NVIDIA also had similar arrangements in the past with previous port jobs like The Last Remnant, and FF XIV which were both heavily optimised for NVIDIA's cards under the aforementioned TWIMTBP program.

But won't somebody think of the children!?

On a serious note, this BS has been going on too long, and I don't want to be a downer for team green, but with all the next-gen consoles being AMD based, I see it not being a good time for Nvidia.
 
But won't somebody think of the children!?

On a serious note, this BS has been going on too long, and I don't want to be a downer for team green, but with all the next-gen consoles being AMD based, I see it not being a good time for Nvidia.

Yeah, I don't think you should be worried about Nvidia. A bit too much doom and gloom because a single game is poorly optimized.
 
Triple buffering activated, no tearing at all here, and I'm very sensitive to tearing (what a plague!!)..... 660 Ti inside.

I confirm VRAM is engulfed by the game! Glad I picked the 3GB MSI 660 Ti OC... Playing on a mix of high and ultra settings, downsampling from 1440p, and 2.2GB used! I said god damn!!
 
Be grateful that this is like the only recent game that's f'ked up on NVIDIA's cards.

Sadly, many of the AAA releases were f'ked up on my NVIDIA cards last autumn. Huge game-breaking bugs (green squares freeze) in Black Ops II, Far Cry 3, Assassin's Creed 3 and Sleeping Dogs. They fixed these bugs about a month ago, I think.
 
Yeah, I don't think you should be worried about Nvidia. A bit too much doom and gloom because a single game is poorly optimized.

Wasn't Sleeping Dogs a bit bad on Nvidia too? I don't remember, I gave that game a miss, but I do honestly think AMD is stepping up to match Nvidia's TWIMTBP program, and with consoles being the lead platforms, AMD reps are going to be on hand for all major title releases next gen.

Sadly, many of the AAA releases were f'ked up on my NVIDIA cards last autumn. Huge game-breaking bugs (green squares freeze) in Black Ops II, Far Cry 3, Assassin's Creed 3 and Sleeping Dogs. They fixed these bugs about a month ago, I think.

Ta, glad I caught that.
 
Wasn't Sleeping Dogs a bit bad on Nvidia too? I don't remember, I gave that game a miss, but I do honestly think AMD is stepping up to match Nvidia's TWIMTBP program, and with consoles being the lead platforms, AMD reps are going to be on hand for all major title releases next gen.

Game worked just fine out of the box for me, the standard SLI profile was a bit wonky but that was quickly updated.
 
I think the key difference between this game and others in AMD's never back down program is the fact that Nixxes had to implement TressFX. This being the first game to do so must have required a significant amount of time, time which on other projects like Sleeping Dogs, DmC etc. would probably have been used to get the game up to snuff on all cards.

I'm not expecting any problem with Bioshock when it comes out and I can't see Nvidia taking a back seat in the GPU market in the next few years based on the fact that next gen consoles are running AMD APUs.
 
Wasn't Sleeping Dogs a bit bad on Nvidia too? I don't remember, I gave that game a miss, but I do honestly think AMD is stepping up to match Nvidia's TWIMTBP program, and with consoles being the lead platforms, AMD reps are going to be on hand for all major title releases next gen.



Ta, glad I caught that.

A lot of the issues I've seen have been related to bad micro stutter as in Sleeping Dogs and Darksiders 2's cases. I've been using MSI Afterburner to limit pretty much every game I play to 30 or 60fps since last summer and have avoided that problem almost entirely.
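For what it's worth, the frame cap trick works by sleeping out the remainder of each frame's time budget so frames are delivered at an even pace instead of in bursts. Here's a minimal sketch of that pacing idea; it's just an illustration of the concept, not how Afterburner actually implements its limiter.

```python
import time

def run_capped(render_frame, target_fps=60, frames=5):
    """Call render_frame repeatedly, sleeping so each iteration takes
    at least 1/target_fps seconds -- evening out frame delivery."""
    period = 1.0 / target_fps
    frame_times = []
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                      # the actual work for this frame
        elapsed = time.perf_counter() - start
        if elapsed < period:
            time.sleep(period - elapsed)    # burn the leftover frame budget
        frame_times.append(time.perf_counter() - start)
    return frame_times

# Even a trivially cheap "frame" gets paced out to ~16.7ms at 60fps.
frame_times = run_capped(lambda: None, target_fps=60, frames=3)
```

The point is that every frame costs roughly the same wall-clock time, which is what kills the uneven frame delivery people perceive as micro stutter.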
 
I can't even leave the cave. My game crashes as soon as it loads, which is after I use the explosives to make a hole in the wall.
 
Why the hell doesn't vsync work properly in this game? Forcing it through the Nvidia control panel doesn't work, and neither D3D Overrider nor in-game triple buffering fully removes screen tearing.

I've never had this in any game before. -_-

Huh. Thought this was only me. It's driving me nuts too. I was especially dismayed when D3D Overrider didn't do shit. I thought maybe it was because I was forcing an fps cap of 30, so I turned off Dxtory, but nah. Tearing still there.
 