Curufinwe said: Does motion blur affect performance much, and is it an option most of you guys plan to use?
careksims said: Extremely awesome.
Got a B at a constant 59.9FPS
Intel Core 2 Quad CPU
Nvidia GeForce GTX 260
1280x720x32bit
DX 10
2X AA
VSYNC on
Awesome to see it run at that speed with tons of enemies. I have the PS3 version, so this is a mighty step up! Will definitely get it.
Xdrive05 said: Great, now try it with vsync off and see what the game looks like at 200fps! If you're going to vsync-cap it at 720p/60fps, you might as well turn on 16xQ AA. Your fps will still be 60 and it will look much, much better at that res.
Enjoy!
K.Jack said: Hmm, let's see what ye old notebook can muster @ 1680x1050, max settings. Downloading now.
Minsc said: You must feel so outdated with the 2x 280M GTX laptops coming out next month. :lol
Htown said: PC GAF, should I upgrade my card first or my processor? Also, I'm trying to stay with Nvidia and under $150 for my card, as I am just not able to drop 250-300 bucks on a card, especially since I am going to have to upgrade my ASS power supply, too. (250 watts, what were you thinking, HP?) Is a GTS 250 a decent choice?
Shadow780 said: FML, updated to the newest driver, tried CF Xtension on or off. SAME result. :lol :lol
I'm beginning to think this is a CF thing; I never had a problem with the SFIV benchmark, though, it ran perfectly.
http://i26.tinypic.com/2baas1.jpg
http://i26.tinypic.com/t9wc5e.jpg
I like that screen.
Baloonatic said: Stupid question, but what difference does DX10 actually make? Like, what does it improve graphically?
I only ask because I can get 60fps in the gameplay sections using DX9 when I have AA turned off running at 1920x1080 (I don't even really notice jaggies that much at higher resolutions), so I'll probably just stick with that. I can't really see a difference myself, but I guess I would if I knew what to look for.
I think I noticed more bloom on objects. Some said it performs better than DX9.
Dr. Light said: WTF is this garbage?
Screenshots won't show up for some reason, but it drops to about 15fps in Area 3. Only got 37fps in the fixed benchmark (max settings at 1080p), I was expecting 70-80fps.
I have a 1GB 4890 with the latest 9.6 Catalyst version. I got 95fps with everything maxed out in the SF4 benchmark.
Why does this thing hate ATI? I wasn't taking the Nvidia logo seriously; did they actually fuck this up? Wow.
TheExodu5 said: ATi has had some problems with a few games recently. The Last Remnant runs pretty terribly on ATi. No clue why. Far Cry 2 performance isn't stellar either.
Shadow780 said: Last Remnant was fixable. Hopefully the retail version will be too.
Action Jackson said: Laptop benchmark (everything on high, no AA).
TheExodu5 said: Out of curiosity, what was the fix? I'm wondering what the problem stems from in the first place.
Shadow780 said: Well, it's not really a fix per se, more of a compromise. Just turn shadow quality to anything but high; the game ran smoothly for me afterwards.
ChoklitReign said: What's the difference in IQ between DX9 and DX10?
Ikuu said: 65.7 FPS on DX9, highest settings and 4xAA
40-odd on DX10; Area 3 ran like shit for some reason
1920x1200
3.2GHz Q6600
4870 1GB
4GB RAM
Windows 7
Dr. Light said: Wow. I just tried running in DX9 (same settings, 1920x1080 8xAA maxed) and got 70.3fps. Area 3 was smooth as silk.
What the hell is going on here?
Area 3 has some awesome explosions in DX10 mode. My GTX 295 took a hit down to 60fps from the usual 90fps it was getting. I can see where people are coming from.
Ikuu said: There is obviously something wrong with ATi cards running DX10 in Area 3.
It's not only ATI; as I said, Area 3 has some hardcore explosions.
TheExodu5 said: ATi has had some problems with a few games recently. The Last Remnant runs pretty terribly on ATi. No clue why. Far Cry 2 performance isn't stellar either.
Hopefully it's fixed in a coming Catalyst update.