
Tomb Raider PC Performance Thread

Why don't review sites and hardware analysis sites break down the FPS cost of each setting? I do like how this particular review breaks down the FPS cost from High to Ultra.

http://www.techspot.com/review/645-tomb-raider-performance/page6.html

If we could get this in the settings, that would be perfect!

There's the solution: leave DOF on Normal! I'll have to try that when I get home.

Yeah, DOF on Normal makes a minimal change to the visuals (it's just a blur effect) but offers a pretty decent performance boost, at least on my GTX 480.
 
It annoys me every time I see benchmarks that don't include 1920x1080. Am I wrong in thinking it's the most common resolution people use?

It's the most popular LCD monitor resolution (unfortunately, it's difficult to buy 16:10 monitors these days) and it's the only option for comfy couch PC gamers.
 
[image: tr_chs.jpg]


Something is wrong with my shadows... I can't see any contact-hardening shadows, and I have the shadow setting set accordingly. :(
 
Just finished the game, running at mostly 45-55 fps. I really wish I didn't get so much screen tearing. I don't know that I'll ever go back to replay it and enjoy all the effects once they're patched in, since there's no NG+ mode.
 
Andy, any ETA on that driver?

No ETA that I can share. We have internal targets, obviously, but if I post that target and we miss it you can imagine the comments. Rest assured, the team is working on it.

Then why are they advertising this? Why would they have cut it?

Probably pulled shortly before launch. Have seen that a lot over the years (was a PC games journalist for 10+ years before I joined NVIDIA), but you rarely hear(d) about it because most tech features aren't detailed in advance.
 
No ETA that I can share. We have internal targets, obviously, but if I post that target and we miss it, you can imagine the comments. Rest assured, the team is working on it.
Thank you for the update!


Probably pulled shortly before launch. Have seen that a lot over the years (was a PC games journalist for 10+ years before I joined NVIDIA), but you rarely hear(d) about it because most tech features aren't detailed in advance.
That's a shame; the effect seems well done. I don't understand why they would cut it from the final release.
 
Thank you for the update!

That's a shame; the effect seems well done. I don't understand why they would cut it from the final release.

Because the effect didn't look good enough? It was too resource-hungry? Buggy code? Driver issues?
 
Hi folks!

What chance do I have of pulling 60fps on "as high as possible" settings on this rig:

Intel Core 2 Duo E6600 (2.4GHz) overclocked to 3.2GHz
8GB DDR2 RAM
XFX Radeon 7850 2GB Black Edition (factory OCed to 975MHz)
Resolution: 1920x1200

I wish this game had a demo so I could try it...
 
Because the effect didn't look good enough? It was too resource-hungry? Buggy code? Driver issues?

We can only speculate. As it is now, even without that effect, the game is taking the piss on NVIDIA's cards.

Yet TressFX remains?

Dat AMD Gaming Evolved™ Partnership. No cookies for you, NVIDIA underclass.

Hi folks!

What chance do I have of pulling 60fps on "as high as possible" settings on this rig:

Intel Core 2 Duo E6600 (2.4GHz) overclocked to 3.2GHz
8GB DDR2 RAM
XFX Radeon 7850 2GB Black Edition (factory OCed to 975MHz)
Resolution: 1920x1200

I wish this game had a demo so I could try it...

Your Core 2 will drag you down later in the game. Expect to have to pull the LOD setting down to Medium or Low during the more open-ended areas of the game. The Core 2 is bottlenecking that card tremendously, mate.
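The bottleneck point can be illustrated with simple frame-time math. This is a rough model with made-up illustrative numbers (not measured from Tomb Raider), assuming the slower of the CPU and GPU stages sets the frame rate:

```python
# Rough frame-time model: the slower pipeline stage limits FPS.
# All millisecond figures below are illustrative, not measured.

def effective_fps(cpu_ms: float, gpu_ms: float) -> float:
    """FPS when the slower of CPU and GPU work per frame sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# A capable GPU paired with an old CPU: the CPU dominates.
print(effective_fps(cpu_ms=25.0, gpu_ms=12.0))  # 40.0 -- CPU-bound
# Lowering LOD mostly reduces CPU-side work (draw calls, scene complexity):
print(effective_fps(cpu_ms=16.0, gpu_ms=12.0))  # 62.5
```

This is why dropping GPU-heavy settings barely helps when the CPU is the limiting stage, while dropping LOD does.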
 
The Core 2 is bottlenecking that card tremendously, mate.
I know. I plan to wait until Haswell comes out, then decide between Haswell and Ivy Bridge. I'll do a motherboard, CPU, cooler, and RAM upgrade. Until then, I'll get on with this current rig (which eats most games, btw, just not the latest power-hungry ones like Crysis 3, etc.). :)

Thanks for the info. When Tomb Raider dips in price, I'll get it and try it.
 
Yeah, DOF on Normal makes a minimal change to the visuals (it's just a blur effect) but offers a pretty decent performance boost, at least on my GTX 480.


Ultra DOF has a big performance impact, but the visual change isn't minimal. Only with Ultra DOF are things very near the camera out of focus. For example, if Lara moves the torch very close to the camera, with Ultra DOF you can see the progressive blur along her arm.
 
Ultra DOF has a big performance impact, but the visual change isn't minimal. Only with Ultra DOF are things very near the camera out of focus. For example, if Lara moves the torch very close to the camera, with Ultra DOF you can see the progressive blur along her arm.

Can definitely live without...
 
Can definitely live without...


Obviously you can. Same with tessellation. Same with Ultra SSAO. But every effect counts. While Normal DOF is an on/off blur, Ultra DOF is progressive and "similar" to the DOF in Metro 2033. It's subtle but very demanding.
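The on/off vs progressive distinction can be sketched in a few lines. This is a conceptual illustration only, assuming a simple depth-based blur model with arbitrary units; it is not the game's actual shader:

```python
# Conceptual sketch of the two DOF behaviours described above.
# Not the game's implementation; depths and thresholds are arbitrary.

def blur_normal(depth: float, focus: float, threshold: float = 5.0) -> float:
    """On/off blur: full-strength blur once a depth threshold is crossed."""
    return 1.0 if abs(depth - focus) > threshold else 0.0

def blur_ultra(depth: float, focus: float, scale: float = 0.1) -> float:
    """Progressive blur: strength grows smoothly with distance from focus."""
    return min(1.0, abs(depth - focus) * scale)

# A torch held toward the camera (depth 1, focal plane at 10):
print(blur_normal(1.0, 10.0))  # 1.0 -- either fully blurred or not at all
print(blur_ultra(1.0, 10.0))   # 0.9 -- partial blur that ramps up
                               # gradually along the arm
```

The progressive version is more expensive because the blur radius varies per pixel instead of being a single fixed-strength pass.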
 
Obviously you can. Same with tessellation. Same with Ultra SSAO. But every effect counts. While Normal DOF is an on/off blur, Ultra DOF is progressive and "similar" to the DOF in Metro 2033. It's subtle but very demanding.

You're the same MaLDo who did that Crysis 2 MaLDo HD mod, right? Love your work, even though I can't personally enjoy it at the moment.
 
I can't play this shit. The crashes to desktop are annoying, but I also get this weird full-screen artifacting quite often. None of the menus are affected. I just came off Crysis 3 with no problems, and I double-checked my temps and they're fine, so WTF. It seems to happen in certain places pretty consistently. Is there a workaround?

It looks kind of like this:

[image: C3pKPUt.gif]


I have a feeling we won't see new NVIDIA drivers until BioShock, and by then I won't have any interest in this game anymore...
 
I can't play this shit. The crashes to desktop are annoying, but I also get this weird full-screen artifacting quite often. None of the menus are affected. I just came off Crysis 3 with no problems, and I double-checked my temps and they're fine, so WTF. It seems to happen in certain places pretty consistently. Is there a workaround?

It looks kind of like this:

[image: C3pKPUt.gif]


I have a feeling we won't see new NVIDIA drivers until BioShock, and by then I won't have any interest in this game anymore...

Give this a try (as linked in the OP) if you haven't already:

http://www.neogaf.com/forum/showpost.php?p=49068705&postcount=979
 
Played the game from start to finish with zero issues.

i5-2500k
GTX 560Ti 448 Core
8GB RAM

Played on all High settings, no TressFX, and tessellation on.
 
Hi folks!

What chance do I have of pulling 60fps on "as high as possible" settings on this rig:

Intel Core 2 Duo E6600 (2.4GHz) overclocked to 3.2GHz
8GB DDR2 RAM
XFX Radeon 7850 2GB Black Edition (factory OCed to 975MHz)
Resolution: 1920x1200

I wish this game had a demo so I could try it...

My PC is quite similar to yours: a Core 2 Duo E7200 clocked at 3.8GHz, 4GB DDR2, and a 1GB GTX 560.

Playing at 1080p I get 40-60fps in most areas, and 25-30fps in the most intensive parts. LOD at Medium is fine for the most part, and while I've been prepared to drop it down to Low, I've never really felt the need.

My settings:
 
My PC is quite similar to yours: a Core 2 Duo E7200 clocked at 3.8GHz, 4GB DDR2, and a 1GB GTX 560.

Playing at 1080p I get 40-60fps in most areas, and 25-30fps in the most intensive parts. LOD at Medium is fine for the most part, and while I've been prepared to drop it down to Low, I've never really felt the need.

My settings:

alyson, with that 1GB GTX 560 you should be able to jack up the texture quality to High or even Ultra most of the time with zero FPS loss. VRAM usage never exceeded 1GB for me, and I use a far inferior card (HD 4670 1GB) with a much lower-clocked Core 2 T9550 @ 2.8GHz. Of course, I have to aim for a much lower performance target of 30-35 FPS @ 1600x900.

Here are my settings:
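As a back-of-the-envelope check on why texture quality is so cheap on VRAM here: an uncompressed RGBA texture costs width x height x 4 bytes, plus roughly a third more for the mip chain. The 4:1 compression ratio below is an assumption based on common block-compressed (DXT-style) formats, not a measured figure from the game:

```python
# Back-of-envelope VRAM cost of one RGBA texture, with mipmaps.
# Illustrative only; games typically use block compression (e.g. DXT),
# which cuts this by 4-8x -- hence Ultra textures fitting in 1 GB.

def texture_mb(width: int, height: int, bytes_per_pixel: int = 4,
               mipmaps: bool = True) -> float:
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base  # full mip chain adds ~1/3
    return total / (1024 * 1024)

print(texture_mb(2048, 2048))      # ~21.3 MB uncompressed
print(texture_mb(2048, 2048) / 4)  # ~5.3 MB at an assumed 4:1 compression
```

Even a few dozen large compressed textures resident at once stays comfortably under 1GB, which matches the usage reported above.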

 
Oh super, thanks. I forgot to check how much VRAM it was using; I was just so pleased to get it running at playable framerates.

I keep intending to upgrade my CPU, but new games keep running pretty okay on it, so I keep delaying...
 
My PC is quite similar to yours: a Core 2 Duo E7200 clocked at 3.8GHz, 4GB DDR2, and a 1GB GTX 560.

Playing at 1080p I get 40-60fps in most areas, and 25-30fps in the most intensive parts. LOD at Medium is fine for the most part, and while I've been prepared to drop it down to Low, I've never really felt the need.
Thanks for this! Hope is high then. :)
 
Give this a try (as linked in the OP) if you haven't already:

http://www.neogaf.com/forum/showpost.php?p=49068705&postcount=979

So I tried the Hitman fix and it still fucks up. I tried without the patch, tried older drivers, and it does that corruption effect every time. I guess tessellation just isn't going to happen unless someone else has an idea.


I don't understand how this game made it through QA. It's not like only some specific setups have issues; the game is flat-out broken.
 
The Tomb Raider patch didn't fix my TressFX issues with the hair glowing repeatedly :(.

Hopefully the next NVIDIA drivers will sort this out so I can actually play with TressFX enabled (GTX Titan @ 5720x1200).
 
The Tomb Raider patch didn't fix my TressFX issues with the hair glowing repeatedly :(.

Hopefully the next NVIDIA drivers will sort this out so I can actually play with TressFX enabled (GTX Titan @ 5720x1200).

TressFX runs fine on my Titan @ 1440p. Don't use 2x/4x SSAA; run with no AA or FXAA.
 