Resident Evil 5 PC Benchmark

Dr. Light said:
The fact of the matter is that I remember playing F/A-18 Interceptor (and Arkanoid and Winter Games and all that shit) on the Amiga back in the late '80s when I was like 7, well before I had a NES. From SimCity 2000 to Quake 3 to WoW to Crysis, I've had 5 generations of PCs. I'm not sure why I don't identify with PCs as much as consoles, maybe because I've been mostly out of the loop for years, maybe because 90% of the games I own are console, but...whatever.

I always knew PCs provided a kind of experience consoles can't match, but I always thought the reverse was true as well. I never dreamed I'd buy a fighting game for PC, but here I am with the PC version of SF4 even though I already had the PS3 version. After seeing the benchmark I just had to have it. I got a new rig because I'm not interested in a 10-year console generation, I want the new hotness. Crysis is awesome, not just graphically; it's a superb FPS. Seeing RE5 at 1080p/8xAA at over 70fps was indeed an eye-opener; it makes me wish I hadn't already bought the 360 version. But you know what? The 360 version still looks great. The fact that the PC version goes even further doesn't change its objective greatness.

The PC fills a hole that existed before, so I'm grateful for that. That doesn't mean that consoles suck or that Uncharted 2 doesn't look great. Better on PC? Sure. But it still looks great.

There is nothing wrong with consoles. They are capable of providing an insane amount of fun and unless I'm mistaken, that's what this hobby is all about. I enjoy my consoles greatly and I don't blame anyone for digging them. That said, one should never argue superiority between PC and consoles in an objective manner. There simply isn't room for argument, only preference. A man sharing his preference is respectable, a man claiming a console a superior device is insane.
 
brain_stew said:
It's for that reason that I do believe PC gaming is actually a better bet for many more gamers than many would ever realise.

I wholeheartedly agree. There's really never been a better time in terms of value. And even if someone doesn't want to go the traditional desktop route, they can always throw together a classy HTPC build, hook it up to their HDTV, and sit on their big comfy couch with a 360 controller.
 
brain_stew said:
Like I've said in the past, I've always traditionally been a console guy (shock horror, I know). What's so great about this gen for me though is that the PC now offers the best of console gaming, only with a multitude of improvements, and all its lovely and complex strategy and simulation games are still there and flourishing because of digital distribution.

The collapse in PC hardware costs, coupled with the already cheaper game prices, and awesome DD sales as well as the free online and free content, honestly means that I'm not really spending much more at all. I can finally afford PC gaming, and I love it!! :D

That you can build an "uber console" that plays any PC game on the market just great, and is enough for RE5 @ 1080p/60Hz, for $399 is God damn astonishing. It's truly a major shift in the market as far as I'm concerned, and it's happening at a time when consoles are more expensive than ever. It's for that reason that I do believe PC gaming is actually a better bet for many more gamers than many would ever realise.

The Capcom games are the sole reason I am interested. I'm not a fanboy, I go to the platform that has the games I want to play. So the more games that come out like this, the greater my interest becomes.

Anyways, I'm still learning the ropes, but I'm paying attention to specs and seeing what I can afford, and looking at what others have. I think this would be a great winter break hobby for me this year, to build a beastly custom PC. I want to build a gaming PC that runs Lost Planet 2 at 60 FPS. Now that would be a merry Christmas, Indeed!*



*Not enough Wesker "indeed!" in this thread. Come on, people, it's RE5.
 
Chiggs said:
I wholeheartedly agree. There's really never been a better time in terms of value. And even if someone doesn't want to go the traditional desktop route, they can always throw together a classy HTPC build, hook it up to their HDTV, and sit on their big comfy couch with a 360 controller.
As I type this I'm on my bed in my bedroom with my wireless M/KB and my lappy hooked up to my 50" TV. Hard to do, some might ask? Literally all I did was plug a $4 HDMI cable into my laptop. Simple as that.

That's why the whole "big comfy couch" thing kinda bugs me.
 
Chiggs said:
I wholeheartedly agree. There's really never been a better time in terms of value. And even if someone doesn't want to go the traditional desktop route, they can always throw together a classy HTPC build, hook it up to their HDTV, and sit on their big comfy couch with a 360 controller.

The fact this is now an option is pretty damn sweet as well. The excellent support of the 360 controller has vindicated Microsoft's GFW initiative as far as I'm concerned. That the platform now has a "standard" gamepad with near universal support is a great bonus.
 
Dr. Light said:
Did you actually say Uncharted was unplayable? I mean...really now.

Red Faction.


Dr. Light said:
Based on this thread, your $399 package with the 4850 might be more like 50fps, with no AA. Still pretty damn good, don't get me wrong.

You'd get what passes as 60fps in console terms at 1080p with higher-than-console settings (no AA). If ~50fps games like COD can pass as "60fps", then 55fps PC games can.
 
Thread is totally derailed now.
Also, PC vs Consoles debate is so passé.

On topic, are there any details for the PC version known yet? Like extra content or something?
 
DarkUSS said:
Thread is totally derailed now.
Also, PC vs Consoles debate is so passé.

On topic, are there any details for the PC version known yet? Like extra content or something?

Three new exclusive costumes and a new Mercs mode with 3x as many enemies is what I know has been announced so far.

GFWL support as well, so you can collect some geek points if that's your thing.

Oh, and full m/kb and controller support as well, they've specifically talked about tuning the m/kb setup so it should be a nice option. Which of course opens the option for Wiimote support for those that enjoyed RE4: Wii Edition.
 
While I love tinkering with my rig, squeezing out extra performance and fiddling with mods, lots of people don't. People I know, people who I respect and understand.

Lots of people don't have the time or money or even desire to research parts, bargain hunt off Newegg and other random sites, hand-build a rig, then bust their knuckles on a work night when they want to play a game but Nvidia's drivers decide to shit out and they have to spend two hours sifting through different versions of third-party kit.

PC gaming is for extreme enthusiasts of tech and community in most cases. You get extra in certain areas, you give extra in certain areas and most of the software types and wants are different. Some folks just want the newest Capcom game or JRPG on time and to put in the disc and have the shit work, other people such as myself want to play the latest Civ, the best versions of Valve software and sub to WoW.

As I type this I'm on my bed in my bedroom with my wireless M/KB and my lappy hooked up to my 50" TV. Hard to do, some might ask? Literally all I did was plug a $4 HDMI cable into my laptop. Simple as that.

That's why the whole "big comfy couch" thing kinda bugs me.

It bugs me too. I ran a VGA cord straight from my PC into my HDTV in my living room and use a Wireless Logitech Duo. Been doing so for years and years. Always baffled at the nonsense.

Not like it even makes clutter. My PC is just behind a door on the entertainment center and half of the shit on it is wireless.
 
ViolentP said:
There is nothing wrong with consoles. They are capable of providing an insane amount of fun and unless I'm mistaken, that's what this hobby is all about. I enjoy my consoles greatly and I don't blame anyone for digging them. That said, one should never argue superiority between PC and consoles in an objective manner. There simply isn't room for argument, only preference. A man sharing his preference is respectable, a man claiming a console a superior device is insane.

This man gets it. The "debate" should end here.
 
Gonna DL/Run this on my $600 PC and see how it does...I would assume it will turn out fine.

EDIT!! First tests done...questions!!

I have a Core 2 Duo 3.0, 4GB DDR2 800, Nvidia 9800 GT 512MB....

I usually never mess with settings in games all that much, but does anyone have a suggestion about what settings I should try to get close to 60fps?

I just ran the benchmark and with 4x AA and most other settings at high I came out around 48 I think.

Just wondering which settings impact performance the most. I changed the resolution a bit and dropped shadow quality down and got it up over 80fps...just trying to find some middle ground.
 
brain_stew said:
Red Faction.

You never bashed Uncharted? Ok then.

You'd get what passes as 60fps in console terms at 1080p with higher-than-console settings (no AA). If ~50fps games like COD can pass as "60fps", then 55fps PC games can.

I was under the impression that TVs don't display intermediate framerates between 30 and 60fps, that's why console games have a tendency to jump suddenly from 30 to 60, instead of just being at 40-50fps. MGS4 and Infamous are obvious examples. COD4 is 50fps? This is the first I've ever heard that. Sure looks 60fps to me.
 
brain_stew said:
Three new exclusive costumes and a new Mercs mode with 3x as many enemies is what I know has been announced so far.

GFWL support as well, so you can collect some geek points if that's your thing.

Oh, and full m/kb and controller support as well, they've specifically talked about tuning the m/kb setup so it should be a nice option. Which of course opens the option for Wiimote support for those that enjoyed RE4: Wii Edition.
Cool!

Double dip, it is then! I don't care about GFWL and points (besides, I already got my Platinum trophy on PS3) but the new costumes, the enhanced Mercs mode with the increased number of enemies, and the superior graphical performance sound too good to pass up. RE5 is among my favorite and most-played PS3 games (screw the haters), thus replaying it on PC with all these improvements is a no-brainer.
 
Dr. Light said:
I was under the impression that TVs don't display intermediate framerates between 30 and 60fps, that's why console games have a tendency to jump suddenly from 30 to 60, instead of just being at 40-50fps. MGS4 and Infamous are obvious examples. COD4 is 50fps? This is the first I've ever heard that. Sure looks 60fps to me.
Neither can PC monitors (plus modern TVs *are* just monitors with built-in TV tuners). But in both cases the image for each frame isn't sent whole: it's fed pixel by pixel, from left to right, line by line. A computer or a console can happily change the image being fed to the monitor/TV before the latter has finished displaying it. This is what causes so-called image tearing. Both PS2 God of War games had tearing even on CRT SDTVs.

And at least COD:WaW looked like solid 60fps to me. I saw no tearing at all, which means it was synced to the TV's refresh rate. But with meticulous timing a game can achieve tear-free 60fps without using v-sync at all, and manage to dip into the 50s if need be, causing temporary tearing instead of halving the framerate.
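
To picture how a tear actually happens, here's a toy sketch of scan-out with an unsynced buffer swap; illustration only (the line counts and buffer names are made up), not how any real driver works:

```python
# Toy model of the scan-out described above: the display reads the front
# buffer line by line, so swapping buffers mid-scan without vsync stitches
# the output together from two different frames -- a visible tear.

LINES = 1080          # scanlines per refresh (made-up 1080p example)
SWAP_AT_LINE = 400    # the game finishes a new frame mid-scan-out

frame_a = ["frame A"] * LINES   # front buffer when scan-out starts
frame_b = ["frame B"] * LINES   # buffer the game swaps in mid-scan

displayed = []
for line in range(LINES):
    # Without vsync the swap takes effect immediately, even mid-scan.
    source = frame_a if line < SWAP_AT_LINE else frame_b
    displayed.append(source[line])

# Everything above line 400 is the old frame, everything below is the
# new one; the seam between them is the tear line.
print(displayed[SWAP_AT_LINE - 1], "->", displayed[SWAP_AT_LINE])

# With vsync, the swap would be deferred until scan-out finishes, so
# every refresh shows exactly one whole frame.
```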
 
Dr. Light said:
I was under the impression that TVs don't display intermediate framerates between 30 and 60fps, that's why console games have a tendency to jump suddenly from 30 to 60, instead of just being at 40-50fps. MGS4 and Infamous are obvious examples. COD4 is 50fps? This is the first I've ever heard that. Sure looks 60fps to me.
TVs display at 60fps, but if a game is running below 60fps, it repeats frames when there isn't a new frame to draw.

For movies, what usually happens is that Frame 1 is displayed twice, Frame 2 is displayed three times, Frame 3 twice, Frame 4 three times, and so on; that's how 24fps content is displayed on a 60fps display (the classic 3:2 pulldown).
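
As a quick sanity check on that repeat pattern, here's a small sketch (the function and names are just for illustration) that spreads 60 refreshes across 24 film frames:

```python
def pulldown_repeats(source_fps=24, display_hz=60):
    """How many consecutive refreshes each source frame is held for,
    spreading the refreshes across the frames as evenly as possible."""
    repeats, shown = [], 0
    for frame in range(1, source_fps + 1):
        # refreshes that should have elapsed by the end of this frame
        target = frame * display_hz // source_fps
        repeats.append(target - shown)
        shown = target
    return repeats

pattern = pulldown_repeats()
print(pattern[:8])   # [2, 3, 2, 3, 2, 3, 2, 3] -- the 3:2 pulldown cadence
print(sum(pattern))  # 60 -- 24 film frames fill exactly 60 refreshes
```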
 
67.1fps average. Never dipped below 60 during gameplay except when there was that molotov cocktail thrown in area 3. Dropped down to 34. Wtf?

Anyways, looks like it'll play solid. Can't wait for this to come out.
 
Littlegator said:
67.1fps average. Never dipped below 60 during gameplay except when there was that molotov cocktail thrown in area 3. Dropped down to 34. Wtf?

Anyways, looks like it'll play solid. Can't wait for this to come out.
What are your PC specs?
 
brain_stew said:
Three new exclusive costumes and a new Mercs mode with 3x as many enemies is what I know has been announced so far.

GFWL support as well, so you can collect some geek points if that's your thing.

Oh, and full m/kb and controller support as well, they've specifically talked about tuning the m/kb setup so it should be a nice option. Which of course opens the option for Wiimote support for those that enjoyed RE4: Wii Edition.
Don't forget naked Sheva/Jill/Excella mods.
 
Labombadog said:
What are your PC specs?

Forgot to mention it was only 4xAA and no motion blur, but it was running at 1680x1050.

Intel Core 2 Quad Q6600 G0 @ 3.42GHz (380 x 9)
ATi Radeon HD4870 512MB on-board
4GB PNY DDR2 800 (PC2-6400)

BTW, anyone else just love watching the AI play?
 
Littlegator said:
Forgot to mention it was only 4xAA and no motion blur, but it was running at 1680x1050.

Intel Core 2 Quad Q6600 G0 @ 3.42GHz (380 x 9)
ATi Radeon HD4870 512MB on-board
4GB PNY DDR2 800 (PC2-6400)

BTW, anyone else just love watching the AI play?


I'm running a similar rig though I'm clocked at 3.0GHz and my 4870 is 1GB.
Ran at 4xAA, no motion blur and 1680x1050 for the sake of comparison:

[benchmark result screenshot]
 
Littlegator said:
Forgot to mention it was only 4xAA and no motion blur, but it was running at 1680x1050.

Intel Core 2 Quad Q6600 G0 @ 3.42GHz (380 x 9)
ATi Radeon HD4870 512MB on-board
4GB PNY DDR2 800 (PC2-6400)

BTW, anyone else just love watching the AI play?
Cool beans! Man, I can't wait to build me a new PC. :D
 
Really surprised at how well my system handled the benchmark.

Intel Quad Core 2.4GHz
4GB RAM
GeForce 8800 512MB VRAM

Pretty much stayed at 60 in 1440 x 900 (my monitor's max).

Factor in the 3x enemies and supposed visual improvements and I'd have probably bought this version had it come out at the same time as the console versions.

Developing all their games on the PC-centric MT Framework engine seems to have been a really good decision on Capcom's part. Japanese publishers have never had much presence on the PC. Bad ports of console games haven't really helped that, but Capcom is really stepping up to it now. When Lost Planet 2 comes out I fully intend to get it on PC. I only wish Konami and Square Enix could take this much initiative (to some extent Kojima is though).

People talk about how Japanese publishers have been slow to adapt to the current gaming generation and how they need to adopt western development practices. If no one else, Capcom seems to have been doing exactly this, and it's totally paying off.
 
*stabs anyone continuing the stupid PC/console flamewar*

topofuji said:
Mercs mode with triple+ enemies? Oh Capcom, why must you deprive the console versions again like you did with DMC4 and the PC's Legendary Dark Knight mode? Can't they at least make them DLC? :(

So have there been any PC vs. 360 visual/video comparisons yet? (Not counting the PS3 version here, as the 360 version was superior in visuals, even though I still like to refer to the PS3 version as my definitive one.)
I posted a bunch of comparisons some pages back: http://www.neogaf.com/forum/showpost.php?p=16700000&postcount=304

The game looks pretty much the same on 360 and PC if you leave all settings at high. I can't see more detailed textures or anything like that, although PC of course gets a big advantage with higher resolution, AA and better framerate. There's an oddity with cutscenes having a narrower FOV compared to the console versions, so you see less, but I'm pretty sure this is specific to the benchmark (official screenshots show cutscenes properly).

This is purely subjective, but I actually think the game looks a lot better with motion blur turned off, although I would probably have kept it on if the framerate was 30 or below.
 
Being new to PC gaming, can someone answer me this: what's the point of having a high frame rate (100+) if your display has a refresh rate of only 60Hz? (especially if you have V-Sync enabled).
 
Miburou said:
Being new to PC gaming, can someone answer me this: what's the point of having a high frame rate (100+) if your display has a refresh rate of only 60Hz? (especially if you have V-Sync enabled).

The higher it is, the less chance you have of it dropping under 60 fps. (When the shit ingame gets "real")

Plus you can show off your e-peen!
 
Miburou said:
Being new to PC gaming, can someone answer me this: what's the point of having a high frame rate (100+) if your display has a refresh rate of only 60Hz? (especially if you have V-Sync enabled).
1) It's a benchmark purely for performance testing. So getting to know the exact framerate with v-sync off is useful info for comparisons.
2) If you're gonna use 3D goggles the ideal fps would be 120.
3) There are people out there who are using a refresh rate higher than 60Hz.

I don't think anyone in this thread is stupid enough to think anything will look better at 100fps if their refresh rate is 60.
 
Sectus said:
1) It's a benchmark purely for performance testing. So getting to know the exact framerate with v-sync off is useful info for comparisons.
2) If you're gonna use 3D goggles the ideal fps would be 120.
3) There are people out there who are using a refresh rate higher than 60Hz.

I don't think anyone in this thread is stupid enough to think anything will look better at 100fps if their refresh rate is 60.

I'm actually not being facetious. I thought maybe I'm missing something and there is a benefit for a frame rate being higher than refresh rate in some cases.

I'm also not quite sure about v-sync. If my refresh rate is 60Hz (I play on my Plasma) does that mean the game will drop to 30fps if it goes below 60fps?
 
Miburou said:
I'm actually not being facetious. I thought maybe I'm missing something and there is a benefit for a frame rate being higher than refresh rate in some cases.

I'm also not quite sure about v-sync. If my refresh rate is 60Hz (I play on my Plasma) does that mean the game will drop to 30fps if it goes below 60fps?

Use D3DOverrider instead of vsync, you'll get better performance and won't have any frame dropping or input lag issues. Remember to set the framerate option to "fixed" if you want to lock it at 60fps.
 
Miburou said:
I'm actually not being facetious. I thought maybe I'm missing something and there is a benefit for a frame rate being higher than refresh rate in some cases.

I'm also not quite sure about v-sync. If my refresh rate is 60Hz (I play on my Plasma) does that mean the game will drop to 30fps if it goes below 60fps?

Yes, which is precisely why having an average fps much higher than 60 is ideal so it won't be in danger of dropping. Vsync = very very good. Use vsync. Love vsync.
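
To make the "drops to 30" behaviour concrete, here's a back-of-the-envelope sketch assuming a strict double-buffered swap chain on a 60Hz display (constant frame times, simplified model):

```python
import math

REFRESH = 1000.0 / 60.0   # ~16.7ms per refresh at 60Hz

for render_ms in (15.0, 17.0, 25.0, 34.0):
    # A frame that misses a refresh waits for the next one, so each frame
    # occupies a whole number of refresh intervals on screen.
    intervals = max(1, math.ceil(render_ms / REFRESH))
    print(f"{render_ms:4.1f}ms/frame -> shown for {intervals} refresh(es)"
          f" -> {60 / intervals:.0f}fps")

# 15.0ms/frame -> shown for 1 refresh(es) -> 60fps
# 17.0ms/frame -> shown for 2 refresh(es) -> 30fps  (just missing 60 halves the rate)
# 25.0ms/frame -> shown for 2 refresh(es) -> 30fps
# 34.0ms/frame -> shown for 3 refresh(es) -> 20fps
```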

Dr. Light said:
Use D3DOverrider instead of vsync, you'll get better performance and won't have any frame dropping or input lag issues. Remember to set the framerate option to "fixed" if you want to lock it at 60fps.

I believe you're referring to triple buffering, which is generally unsupported in D3D at driver level on current cards. It's still a type of vsync, and it doesn't actually reduce input lag. It will minimize fps loss under 60 though.
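
A simplified model of the difference (assuming constant render times; real swap chains are messier than this):

```python
import math

REFRESH_MS = 1000.0 / 60.0   # 60Hz display

def double_buffered_fps(render_ms):
    # The GPU stalls until the next refresh, so the rate is quantized
    # to 60/30/20/15fps.
    return 60.0 / max(1, math.ceil(render_ms / REFRESH_MS))

def triple_buffered_fps(render_ms):
    # The GPU keeps rendering into a spare buffer and each refresh shows
    # the newest completed frame: render rate capped at 60, no quantizing.
    return min(60.0, 1000.0 / render_ms)

for render_ms in (17.0, 20.0, 25.0):
    print(f"{render_ms}ms: double={double_buffered_fps(render_ms):.0f}fps,"
          f" triple={triple_buffered_fps(render_ms):.0f}fps")
# 17.0ms: double=30fps, triple=59fps
# 20.0ms: double=30fps, triple=50fps
# 25.0ms: double=30fps, triple=40fps
```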
 
I'm pleased to have gotten a "B" at 1680x1050 with my 9600GT. Holding out on the upgrade just a little longer...
 
luka said:
I believe you're referring to triple buffering, which is generally unsupported in D3D at driver level on current cards. It's still a type of vsync, and it doesn't actually reduce input lag. It will minimize fps loss under 60 though.

.......That's exactly why you use D3DOverrider, which forces triple buffering in all games. brain_stew made a thread about it somewhere, go look at it.
 
Dr. Light said:
.......That's exactly why you use D3DOverrider, which forces triple buffering in all games. brain_stew made a thread about it somewhere, go look at it.

I'm aware. I'm just pointing out a few inaccuracies.

Triple buffering is useless to me as I have 2 video cards, but I still use it to enable vsync on games that lack the option.
 
Dr. Light said:
You never bashed Uncharted? Ok then.

I was under the impression that TVs don't display intermediate framerates between 30 and 60fps, that's why console games have a tendency to jump suddenly from 30 to 60, instead of just being at 40-50fps. MGS4 and Infamous are obvious examples. COD4 is 50fps? This is the first I've ever heard that. Sure looks 60fps to me.

http://zoome.jp/ps360/diary/303/

I was unhappy with the tearing in Uncharted, absolutely; it's the image artefact I hate above all others, hence my enthusiasm for triple buffering. The fact that Uncharted 2 has zero tearing and is seemingly using triple buffering pretty much vindicated my complaint, and I'm glad I aired it, as it's clearly been heeded for the sequel.
 
AMD Athlon 64 X2 Dual Core @ 2.3GHz
1440x900x32b
3.00GB RAM

Running DX10 on Vista SP2.
2XAA and No Motion Blur
Using ATI Overdrive, GPU 800MHz, MEM 1000MHz

Print Screen gave me a black screen, I need to get an application for that...

Got a B with an average FPS of 40.1. I was hoping to do a little better... Does this sound about right, or should I be able to squeeze out a little more juice?

(Just realized the irony of my avatar and this benchmark thread)
 
Sectus said:
This is purely subjective, but I actually think the game looks a lot better with motion blur turned off, although I would probably have kept it on if the framerate was 30 or below.

Yeah, I'm the opposite. I think motion blur can have very tangible benefits on 60Hz TVs and monitors. It may make things a little less clear and smear some detail, but even on 60Hz monitors there is a fairly noticeable after-image effect during fast motion. I'm not talking about LCD response times, but a natural effect of a low refresh rate like 60Hz: a sort of retinal burn where your eye still sees the previous frame after the screen has shown the next one. This occurs on all screens and becomes less noticeable as you increase the refresh rate. Motion blur helps a lot in covering it up. Try watching the machine gun as Chris fires it: without motion blur you can notice two guns as the recoil kicks it up and down, the current frame and the after-image.

Until 120 and 240Hz refresh rates are commonplace, motion blur will still be useful. Some people may not notice it quite so much, but a lot of people don't care about tearing either.
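
Some rough numbers for that after-image/smear effect; the object speed is just an assumed example, but it shows why higher refresh rates shrink the problem:

```python
# When your eye tracks a moving object on a sample-and-hold display, each
# frame sits still for a full refresh while your eye keeps moving, so the
# image smears across the retina. The smear scales with the hold time.

speed_px_per_s = 960.0   # assumed on-screen speed of a tracked object

for hz in (60, 120, 240):
    hold_s = 1.0 / hz                   # how long each frame is held
    smear_px = speed_px_per_s * hold_s  # perceived smear while tracking
    print(f"{hz:3d}Hz: frame held {hold_s * 1000:.1f}ms -> ~{smear_px:.0f}px smear")

#  60Hz: frame held 16.7ms -> ~16px smear
# 120Hz: frame held 8.3ms -> ~8px smear
# 240Hz: frame held 4.2ms -> ~4px smear
```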
 
Tried to take a screenshot but it just came out all black.

I had the settings at 1920x1200 with everything cranked up to high, AAx4 and motion blur off.
It ran at 59.9 frames per second for pretty much the whole thing, but I got a B overall.

I'm assuming that's a pretty good mark.
 
Sleeker said:
Tried to take a screenshot but it just came out all black.

I had the settings at 1920x1200 with everything cranked up to high, AAx4 and motion blur off.
It ran at 59.9 frames per second for pretty much the whole thing, but I got a B overall.

I'm assuming that's a pretty good mark.
Try running the benchmark without vsync.
 
What the hell is up with area 3?!

I just got my HD4870X2s back from RMA. I ran the benchmark and it went from ~180fps in area 1 to ~50 in area 2 and then ~8fps in area 3; it was so bad I didn't even let it finish. Is there something ATI users have to do to get stable performance?
 
M3d10n said:
*** TO ANYONE USING AMD ATHLON X2 PROCESSORS ***

If you have not already, install the "AMD Dual-Core Optimizer". I was getting sub-30 dips before it (Windows 7 64-bit RC, DX9), but after I installed it I get ~40fps most of the time.

Is this application specific to RE5 or will it improve performance for everything? The lady is rocking a 4400+ on a Dell machine and it's pretty weak. Getting even a 10% performance boost would be pretty awesome. From what Google tells me it seems to disable some throttling, and there is one bloke who tried it on an Opteron and crashed his system.
 
Shambles said:
Is this application specific to RE5 or will it improve performance for everything? The lady is rocking a 4400+ on a Dell machine and it's pretty weak. Getting even a 10% performance boost would be pretty awesome. From what Google tells me it seems to disable some throttling, and there is one bloke who tried it on an Opteron and crashed his system.
It should increase performance for any game designed for multicore CPUs. It actually fixes a design flaw in the X2 line that screwed things up when an application (like a game) needs fine control over the sync between the cores, or something to that effect.

Of course it will crash Opterons, since this app is meant for Athlon 64 X2 CPUs *only*. The Opteron, the Phenoms and their derivatives use different architectures and don't have the design flaw.
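
For illustration, here's a toy simulation of why out-of-sync per-core counters would break game timing, assuming that's the flaw in question (this mimics the symptom; it is not the optimizer's actual mechanism):

```python
# Two cores whose timestamp counters tick at the same rate but disagree
# by a fixed offset. A game thread that reads the counter, migrates to
# the other core, and reads again computes a nonsense frame delta.

core_offset = {0: 0, 1: -5_000_000}   # core 1's counter lags core 0's

def read_counter(true_ticks, core):
    """A per-core counter: correct rate, per-core offset (the flaw)."""
    return true_ticks + core_offset[core]

t0 = read_counter(1_000_000_000, core=0)   # read on core 0
t1 = read_counter(1_001_000_000, core=1)   # 1M ticks later, but on core 1

print(t1 - t0)   # -4000000: time apparently ran *backwards*

# A frame timer fed this delta hitches or misbehaves, which is why fixes
# either sync the counters across cores or pin timing reads to one core.
```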
 
luka said:
Yeah, I'm the opposite. I think motion blur can have very tangible benefits on 60Hz TVs and monitors. It may make things a little less clear and smear some detail, but even on 60Hz monitors there is a fairly noticeable after-image effect during fast motion. I'm not talking about LCD response times, but a natural effect of a low refresh rate like 60Hz: a sort of retinal burn where your eye still sees the previous frame after the screen has shown the next one. This occurs on all screens and becomes less noticeable as you increase the refresh rate. Motion blur helps a lot in covering it up. Try watching the machine gun as Chris fires it: without motion blur you can notice two guns as the recoil kicks it up and down, the current frame and the after-image.

Until 120 and 240Hz refresh rates are commonplace, motion blur will still be useful. Some people may not notice it quite so much, but a lot of people don't care about tearing either.

The problem with mblur in RE5 is that it seems to create many static images, rather than genuinely blur the screen. It really doesn't look nearly as nice as the mblur in games like Crysis, or even games like HL2 Episode 2 / Portal / TF2.
 
TheExodu5 said:
The problem with mblur in RE5 is that it seems to create many static images, rather than genuinely blur the screen. It really doesn't look nearly as nice as the mblur in games like Crysis, or even games like HL2 Episode 2 / Portal / TF2.

Well, it uses the same motion blur method that's been seen in pretty much all current-gen games save Crysis or KZ2. It's not perfect, but it does help cover up the after-image effect, and it looks much nicer in DX10. I find it to be a lot more subtle if you're running at 60fps, and personally I find the alternative slightly more distracting. Shame HL2 doesn't have full object blur yet. Those old videos were such a cock-tease. :|
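
A rough 1-D sketch of that difference, assuming the "many static images" look comes from the blur averaging only a handful of samples along the motion vector:

```python
import numpy as np

def motion_blur_1d(signal, shift_px, samples):
    """Average `samples` copies of the signal spread over `shift_px`."""
    out = np.zeros_like(signal, dtype=float)
    for i in range(samples):
        offset = int(round(i * shift_px / max(samples - 1, 1)))
        out += np.roll(signal, offset)
    return out / samples

frame = np.zeros(32)
frame[4] = 1.0   # a bright 1px object about to move 12px this frame

print(motion_blur_1d(frame, 12, samples=3).round(2))
# three separate spikes -> distinct "ghost" copies of the object
print(motion_blur_1d(frame, 12, samples=13).round(2))
# overlapping samples -> reads as one continuous smear, Crysis-style
```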
 
E8400
9600GSO ($30 card)
I got ~43fps @ 1080p, default settings.

Whenever I changed any setting other than the resolution, it had zero effect on the IQ or FPS (vsync, AA, motion blur, etc.). What's up with that?
 