
Doom 3 benchmarks are here!

dark10x

Digital Foundry pixel pusher
DSN2K said:
MGS2 is not 60fps.

it's 30 in game, 25 in cutscenes.

Ah ha ha ha!!!111

Try again. MGS2 is 60 fps during gameplay and variable (on purpose) during cutscenes. You dare question the framerate god?
 

Izzy

Banned
WOW! Check out preliminary HL2 benches :


[HL2 benchmark charts at 1024x768, 1280x1024, and 1600x1200, pure speed (no AA/AF)]
 

DSN2K

Member
dark10x said:
Ah ha ha ha!!!111

Try again. MGS2 is 60 fps during gameplay and variable (on purpose) during cutscenes. You dare question the framerate god?

yeah I do :p

You sure about that? I swear it was 30fps.
 
"DOA3, Ninja Gaiden, Panzer Orta, Metroid Prime, MGS2, Burnout 3 and GT4 DON'T run at 60 fps?"

You seem to be mixing best looking aesthetically with "pushing the limits of consoles." None of those sans possibly GT4 (which has quite a bit of shortcuts anyway) are really pushing anything. They're just fitting into your idea of an aesthetically pretty game. And MGS2 damned sure isn't one of the best looking games.

So, which are you going to use? Let's try to stick to one or the other.
 
seismologist said:
Just buy the game. Play it. Then upgrade to a 6800GT and you'll really appreciate the difference. You don't need to upgrade your whole PC.

Ah, yes he does. That videocard on a P4 1.5 is a waste of money. It's gonna be severely processor bound and won't come close to seeing its power until a CPU upgrade is made. He could buy a much cheaper card and get the same framerate if being frugal is the agenda.
 

Doc Holliday

SPOILER: Columbus finds America
When MGS2 came out it was by far the best looking game on either console or PC. Of course it looks somewhat dated now.
 

dark10x

Digital Foundry pixel pusher
Tre said:
"DOA3, Ninja Gaiden, Panzer Orta, Metroid Prime, MGS2, Burnout 3 and GT4 DON'T run at 60 fps?"

You seem to be mixing best looking aesthetically with "pushing the limits of consoles." None of those sans possibly GT4 (which has quite a bit of shortcuts anyway) are really pushing anything. They're just fitting into your idea of an aesthetically pretty game. And MGS2 damned sure isn't one of the best looking games.

So, which are you going to use? Let's try to stick to one or the other.

So, a game like Ninja Gaiden with models MORE DETAILED THAN A DEDICATED 3D FIGHTER isn't pushing the machine?!?!?! Those games push shit-tons of geometry, tons of effects, and all run at 60 fps.

I won't argue with you anymore, however simply due to...

And MGS2 damned sure isn't one of the best looking games.

You sure about that? I swear it was 30fps.

I'd bet my life on it. I don't even need to bet, really, because it IS 60 fps (with the occasional bit of tearing and slowdown...so really, it's 60 fps 98% of the time). :p
 

dark10x

Digital Foundry pixel pusher
It's real easy to tell apart 60 from 30... above 60 it gets tricky, at least for me.

Anything over 60 fps is worthless as far as gaming goes. In fact, it can actually OVER buffer the frames and create a slight ghosting effect.
 
"When MGS2 came out it was by far the best looking game on either consoles or pc. Of course it looks somewhat dated now."

Yes, but the PS2 hardware hasn't changed since.
 

jett

D-Member
DSN2K said:
MGS2 is not 60fps.

it's 30 in game, 25 in cutscenes.

You couldn't be MORE wrong. It's 60 fps in game. Like dark10x said, the framerate in the cut-scenes varies; sometimes it runs at 60, others at 24 plus motion blur (to give a film-like effect).
 

tenchir

Member
Izzy said:
WOW! Check out preliminary HL2 benches :


[HL2 benchmark charts at 1024x768, 1280x1024, and 1600x1200, pure speed (no AA/AF)]

I thought at first that there was something really screwy with this benchmark, because the X800 and 6800 hardly budge at all at any resolution. Then I saw it was the pure-speed setting and figured the CPU must have been the bottleneck. I expect things to turn around once AA/AF is turned on.
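A rough sketch of that reasoning, with hypothetical numbers (not the actual benchmark figures):

Code:
# Hypothetical fps results at three resolutions (illustrative only).
# If fps barely drops as resolution rises, the CPU is likely the
# bottleneck; if it falls sharply, the GPU is doing the limiting.
results = {"1024x768": 142, "1280x1024": 140, "1600x1200": 137}

fps = list(results.values())
drop = (fps[0] - fps[-1]) / fps[0]

if drop < 0.10:  # less than a 10% drop from lowest to highest resolution
    print("Frame rate barely changes -> likely CPU bound")
else:
    print("Frame rate falls with resolution -> likely GPU bound")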
 

6.8

Member
Before I saw Ninja Gaiden, MGS2 was still the prettiest game I had played this gen (I hadn't played Metroid Prime, however). The attention to detail was (and still is) unmatched, the facial animations and features are very advanced... Beautiful game. The only thing I didn't find astounding were its textures, and considering the platform it's on, they're pretty good.
 

Doc Holliday

SPOILER: Columbus finds America
Tre said:
When MGS2 came out it was by far the best looking game on either console or PC. Of course it looks somewhat dated now.

Yes, but the PS2 hardware hasn't changed since.

Well, I think you just proved my point... maybe PC guys should spend more time optimizing or figuring out new tricks instead of brute forcing everything. It's easy to program a kick-ass engine when you need brand spanking new hardware to run it.

BTW, I don't really mean this about the guys at id; they're pretty good at getting great visuals out of lower-end systems. I remember playing the shit out of Quake 2 on my P200, hehe.
 
"Well i think you just proved my point...maybe pc guys should spend more time optimizing or figuring out new tricks instead of brute forcing everything."

Err, I wasn't arguing your point so much as I was dark10x's. And your point isn't entirely true. PC games tend to look somewhat lesser because since they have all the different configurations, they have to shoot for a lowest common denominator.
 
ravingloon said:
Ah, yes he does. That videocard on a P4 1.5 is a waste of money. It's gonna be severely processor bound and won't come close to seeing its power until a CPU upgrade is made. He could buy a much cheaper card and get the same framerate if being frugal is the agenda.

You think? So I should buy a new computer, but not necessarily an X800 or 5600?
 

Doc Holliday

SPOILER: Columbus finds America
Stupid question here....

Why can't somebody come up with a way to take advantage of PC CPU speeds the way an Xbox does? 700 MHz versus, say, 2.0 GHz, that's a huge difference... does it all go to Windows? Maybe PCs could use a "game mode" or something.
 

dark10x

Digital Foundry pixel pusher
So, which are you going to use? Let's try to stick to one or the other.

Didn't catch this one, but...what do you mean?

If you are referring to preference, 60 fps + polish + good visuals will ALWAYS be my choice in visuals. I can't say I'm very impressed with most PC games these days because they just aren't where I want them to be...

Far Cry has some amazing aspects, but animates poorly, can't run at a constant 60 fps on any machine, has rough menus and transitions between everything, and feels more like a tech demo than a real game. I expect Half-Life 2 and Doom 3 to buck this trend a bit, but they will still have framerate issues...and the only thing you can do about it is buy more hardware in hopes that you MIGHT be able to get near 60 fps constant.
 
Mrbob said:
I think he meant the 9800XT, which basically all 9800 Pros can overclock to and beyond.

Oops, sorry Doc! :) 9800 Pros can OC to XT speeds, eh? I would need to replace the Pro's heatsink/fan though, right? I'm tempted to start OCing my CPU and 9800 Pro versus dropping 500 large on another vid card.
 
"Why cant somebody come up with a way to take advantage of PC cpu speeds the way an xbox does. 700 mghz versus say 2.0 ghz, thats a huge difference..does it all go to windows? Maybe PC's can use a "game mode" or something."

Because PCs still have a lot going on in the background, services, blah blah.

dark10x: I'm referring to you saying that the best looking console games push the hardware when all the ones you mentioned except a couple would even be close to being considered "pushing the limits." You're flipflopping between technical achievement and what pleases your eye.
 

dark10x

Digital Foundry pixel pusher
Tre said:
"Why cant somebody come up with a way to take advantage of PC cpu speeds the way an xbox does. 700 mghz versus say 2.0 ghz, thats a huge difference..does it all go to windows? Maybe PC's can use a "game mode" or something."

Because PCs still have a lot going on in the background, services, blah blah.

dark10x: I'm referring to you saying that the best looking console games push the hardware when all the ones you mentioned except a couple would even be close to being considered "pushing the limits." You're flipflopping between technical achievement and what pleases your eye.

What do you consider "pushing" then? Those games may not reach for the limits of the machines, but there are VERY few other games that go beyond them.

DOA3 and Ninja Gaiden both feature higher polygon counts than any other 3D fighter yet made (including backgrounds), high-resolution textures, and plenty of effects, all while running at 60 fps. If those games are pushing more polys than the vast majority of games on XBOX while running at 60 fps, I fail to see how they aren't "pushing" the system.

Panzer Orta is one that might not work here, but it is fairly high poly and has a lot of different effects running.

Metroid Prime - High poly, good textures, and a great streaming engine. It just looks and runs beautifully and is easily one of the best looking games on GC.

MGS2 - Tons and tons of effects that have yet to be matched by any other game, efficient usage of polygons, attention to detail everywhere. I personally am still blown away by this game, but if you aren't...whatever.

Burnout 3 - High poly, great reflections, great textures, good effects (the motion blur is very impressive and different), etc. This is pushing the PS2 as much as any racer could be (and even runs in 480p!)

I'd also include...

Rogue Squadron 2 & 3 - These are still some of the most impressive looking GC games to date...yet they run at 60 fps. They use a lot of per-pixel effects that are very uncommon on GameCube and generally just push the system in all directions. RS2 visually blew every other game on the GC away for a long time.

Star Fox Adventures - Another top looker on GC that stands above most of Nintendo's own work. Vertex shaded grass and fur, high-res textures, plenty of geometry, streaming engine, etc. Pushes the GC hard and runs at 60 fps.

I'll think of more in good time...

I want you to tell me, in detail, why you believe these don't push the systems fairly hard. They are some of the best looking games on each of the systems, yes? They also DO push the system.
 

dem

Member
dark10x said:
Anything over 60 fps is worthless as far as gaming goes. In fact, it can actually OVER buffer the frames and create a slight ghosting effect.

Uh.. no.
If you can't tell the difference between 60fps and 120fps in a PC FPS you're blind. BLIND.
 
"I want you to tell me, in detail, why you believe these don't push the systems fairly hard. They are some of the best looking games on each of the systems, yes? They also DO push the system."

Honestly, it'd be impossible for either of us to properly know what does and doesn't push what, because neither of us is familiar with the architecture of any of the three. It'd really just end up being conjecture.
 

robot

Member
goodcow said:
As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4 box with a GeForce 4 MX440 video card and having a surprisingly good gaming experience.

:eek

Whoa, I have that card; definitely wasn't expecting to be able to play this game. Go me.
 
dem said:
Uh.. no.
If you can't tell the difference between 60fps and 120fps in a PC FPS you're blind. BLIND.

Or you don't have a monitor that has a refresh rate of 120Hz at the resolution you play at. I was always under the impression anything over your refresh rate is completely wasted. If I'm wrong, my bad. If not, I'll continue to laugh at tarts who get pumped up that they can run something at 300FPS.
 

dem

Member
ravingloon said:
Or you don't have a monitor that has a refresh rate of 120Hz at the resolution you play at. I was always under the impression anything over your refresh rate is completely wasted. If I'm wrong, my bad. If not, I'll continue to laugh at tarts who get pumped up that they can run something at 300FPS.

Refresh rate only caps your frame rate if you have vsync turned on.
 

Slo

Member
dem said:
Refresh rate only caps your frame rate if you have vsync turned on.

I don't think that's quite right. While your computer may be capable of rendering 300 frames a second, the monitor will only refresh the screen at its specified refresh rate. Vsync coordinates the two for the best visual effect.
 
Slo said:
I don't think that's quite right. While your computer may be capable of rendering 300 frames a second, the monitor will only refresh the screen at its specified refresh rate. Vsync coordinates the two for the best visual effect.

Exactly. I don't have V-Sync turned on. But it doesn't matter if my computer is rendering Quake 3 at 352FPS if the monitor is only updating 85 screens a second. That's what the eye is seeing... the 85 updates per second.
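A toy sketch of that arithmetic, using the numbers quoted above rather than any real measurement:

Code:
# An 85 Hz monitor only shows 85 frames per second, no matter how
# many the GPU renders. The numbers below are the ones from the post.
rendered_fps = 352
refresh_hz = 85

displayed = min(rendered_fps, refresh_hz)
excess = rendered_fps - displayed

print(f"Frames actually shown each second: {displayed}")
print(f"Frames rendered beyond the refresh rate: {excess}")
# Without v-sync those extra frames show up as tearing, not as
# extra smoothness.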
 

Gregory

Banned
ravingloon said:
Exactly. I don't have V-Sync turned on. But it doesn't matter if my computer is rendering Quake 3 at 352FPS if the monitor is only updating 85 screens a second. That's what the eye is seeing... the 85 updates per second.

Yep. Also, if you don't have v-sync on, the game won't be perfectly smooth no matter how many fps it's running at. For that you have to have v-sync on, and the game has to be running over your monitor's refresh rate all the time.

I remember playing Quake 2 with v-sync on; it was buttery smooth, not so with it off.
 

akascream

Banned
If it weren't a technical achievement to have a buttery smooth game, all games would be so. No developer wants a game to have a stuttery or low framerate. And I agree with dark on this one. Gimme smooth 60fps any day. And LCD or not, it's something PC developers have needed to address for some time. Even when games run at hundreds of frames per second, they can still be stuttery. It's amazing they manage it actually. :p
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
"Why cant somebody come up with a way to take advantage of PC cpu speeds the way an xbox does. 700 mghz versus say 2.0 ghz, thats a huge difference..does it all go to windows? Maybe PC's can use a "game mode" or something."

The Xbox runs at 640x480.
Typical PC games run at 1024x768 to 1600x1200 (see the pixel-count sketch below).
PC games have higher-res textures.
PC games have more effects to process: DirectX 9 effects that the Xbox has disabled.
Windows consumes some power.
Developers design the game for the Xbox's exact specifications, so they can tweak some more power out of it.
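To put the resolution point in numbers, here's a quick back-of-the-envelope pixel-count comparison (nothing more than that):

Code:
# Rough pixel-count comparison: the Xbox's 640x480 output versus
# common PC resolutions of the day.
xbox_pixels = 640 * 480  # 307,200 pixels per frame

for w, h in [(1024, 768), (1280, 1024), (1600, 1200)]:
    pixels = w * h
    print(f"{w}x{h}: {pixels:,} pixels, "
          f"{pixels / xbox_pixels:.2f}x the Xbox's fill work")
# 1600x1200 is 1,920,000 pixels, 6.25x as many as 640x480 -- before
# higher-res textures and extra effects are even counted.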


Clockspeed is one of many factors in CPU speed also...

For example, a P4 2.4 800 FSB totally owns a P4 2.0 400 FSB.



As for the initial HL2 benches, turn AA up to 8x and AF to 16x and show me the scores again.
 

Gregory

Banned
akascream said:
Even when games run at hundreds of frames per second, they can still be stuttery. It's amazing they manage it actually. :p

It's all about having v-sync on, but then the game also has to be able to consistently run faster than your monitor's Hz setting, which doesn't happen too often. Plus most gamers don't know this; they play with v-sync off and think they're getting 200fps in their games.

It's better on consoles, where everyone has the same hardware, the same refresh rate on their TVs, etc.
 

dark10x

Digital Foundry pixel pusher
dem said:
Uh.. no.
If you can't tell the difference between 60fps and 120fps in a PC FPS you're blind. BLIND.

Oh, but that's the problem, I CAN tell the difference. 60 fps is better looking.

I can handle 120 Hz refresh rates at certain resolutions and I have tested it many times. There is a distinct difference, and 120 fps is actually too high and creates a slight ghosting effect (as I've said). I -AM- absolutely correct, at least based on the usage of 4 different PC/Monitor combos.

Of course, I had people yesterday claiming that the best LCD panels for the PC have NO ghosting (which is, in fact, bullshit).

The whole v-sync issue is another important thing...

Turning off V-sync will absolutely destroy your image. Setting the DXDIAG limitation to 60 Hz max + v-sync will provide the smoothest possible image (provided your PC can handle it).
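The cap being described boils down to a simple frame limiter: never present frames faster than the target rate. A minimal sketch of that logic (just the general idea, not the actual DXDIAG or driver setting):

Code:
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # roughly 16.7 ms per frame

def run_capped(render_frame, num_frames=600):
    """Call render_frame() repeatedly, never faster than TARGET_FPS."""
    deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
        deadline += FRAME_TIME
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)           # wait out the rest of the frame
        else:
            deadline = time.perf_counter()  # running behind; don't try to catch up

With v-sync on top of a cap like this, each presented frame lines up with a refresh, which is where the smoothness comes from.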
 
I can handle 120 Hz refresh rates at certain resolutions and I have tested it many times. There is a distinct difference, and 120 fps is actually too high and creates a slight ghosting effect (as I've said). I -AM- absolutely correct, at least based on the usage of 4 different PC/Monitor combos.

I've never understood this argument; no matter whether it's 120fps, 60fps, or 30fps, objects still move the exact same amount and at the same speed across the screen. The only difference is that at higher frame rates, more position updates are presented within the span the object moves through.

It's somewhat akin to this, as I see it...

Code:
30fps...

*.....*.....*.....*.....*.....*.....*

60fps

*...*...*...*...*...*...*...*...*...*

120fps
*.*.*.*.*.*.*.*.*.*.*.*.*.*.*.*.*.*.*

If that worked properly (the asterisks are meant to represent new frames, with the periods as the previously displayed frame remaining), you'll see that things don't actually move any farther at higher frame rates; their position is just updated more often on the way to getting where they're going.

I've heard the ghosting argument so many times I'm inclined to believe there may be some truth to it, but I really don't see where it comes from. If anything, I'd suspect it would be easier on the eye than the 'chopping' at lower framerates, where you can clearly see the divide between where something was in one frame versus the next.
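The point the diagram is making can also be put in numbers: an object moving at a fixed speed covers the same distance every second at any frame rate; higher rates just sample its position in smaller steps. A quick worked example with a made-up speed:

Code:
# An object moving 600 pixels per second covers the same ground per
# second at any frame rate; higher fps just means smaller steps.
speed = 600  # pixels per second (made-up figure)

for fps in (30, 60, 120):
    step = speed / fps  # pixels moved between frames
    print(f"{fps:>3} fps: {step:5.1f} px per frame, {step * fps:.0f} px per second")
# 30 fps moves in 20 px jumps, 120 fps in 5 px jumps -- same speed,
# finer sampling.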
 

dark10x

Digital Foundry pixel pusher
The thing is, the human eye cannot perfectly sync up with a monitor. As a result, 30 fps does not look perfectly smooth, as we can actually see between the frames. 60 fps is designed to buffer the image to the point where it almost appears to be in sync and our eyes become incapable of detecting any individual frames.

I'm not necessarily sure about this, but I believe that 120 fps demonstrates ghosting as a result of over-buffering the image. There is simply TOO MUCH data for the human eye and it ends up blending together just a tad.
 

Yusaku

Member
There's so much bullshit, half-truths, and plain made up crap in this thread.

dark10x, you pretty much pulled all of that out of your ass.

Frame rate is a lot like refresh rate in that it's all about perception. We've all met at least one person who can sit in front of a 60Hz monitor for hours and not have a problem with it. Have that person switch to 85Hz for a week or two and they'll start complaining when they go back to 60Hz.

Someone with some insane PC rig that's used to 60+ fps will complain when suddenly they drop to 30 fps, while Mr. Budget PC Gamer doesn't mind 30 fps at all.

Why is it that broadcast (29.97 fps) and film (24fps) don't appear choppy to us, while games do? Because we're so used to it. In fact, 60fps video looks weird; it actually seems LESS real than 24 fps. I know lots of animators who deliberately make their animations at 24 fps even if it's for broadcast, just to give it a "film" look.

One reason why 24 fps in movies is acceptable while it's not for games is motion blur. I'm not talking about the cheesy motion blur effects we've been seeing ever since MGS1, but true motion blur that occurs with cameras and our own eyes. A bouncing ball at 24 fps in a game appears choppy because the ball is perfectly clear in each frame, so the gaps between frames are noticeable. A bouncing ball in film will be blurred between frames, lessening the choppy appearance. Once games can have true real-time motion blur and maintain 30fps, the whole fps debate in games will pretty much disappear.
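One common way games approximate the "true" motion blur described above is temporal accumulation: sample the object's position several times within a single displayed frame and average the results, so fast motion smears across the gap instead of jumping. A rough sketch of that idea (not any particular game's technique):

Code:
# Accumulation-style motion blur: average several positions sampled
# within one displayed frame so fast movement smears instead of jumping.
def blurred_position(position_at, frame_start, frame_time, samples=8):
    """Average `samples` positions across one frame interval.

    position_at(t) is any function giving the object's position at time t.
    """
    total = 0.0
    for i in range(samples):
        t = frame_start + frame_time * i / samples
        total += position_at(t)
    return total / samples

# Example: a ball moving 600 px/s, blurred over one 1/24 s film frame.
ball = lambda t: 600.0 * t
print(blurred_position(ball, frame_start=0.0, frame_time=1 / 24))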
 

dark10x

Digital Foundry pixel pusher
...see, I know all of that already. I'm trying to figure out why 120 fps takes on a ghosted appearance in comparison to 60 fps. You've explained nothing new, you've simply moved on to something different...

At the very least, all of these things are so painfully obvious to me...but I can't really describe WHY.

Why is it that broadcast (29.97 fps) and film (24fps) don't appear choppy to us, while games do? Because we're so used to it.

You just answered your own question below, though. It actually is the motion blur which allows this to appear smooth (and resemble reality).

So can you actually explain the original issue here? Why does 60 fps look better than 120 fps (even if just by a tiny bit)?

Once games can have true real-time motion blur and maintain 30fps, the whole fps debate in games will pretty much disappear.

That isn't true, though...

Take a look at the video intro of Gran Turismo 3, for instance. As you can clearly see, motion blur is present in the same way as we encounter it in CG films. This video, however, displays at 60 fps. At 30 fps, even with motion blur, it still wouldn't look as smooth. 30 fps will become much more acceptable, to be sure, but 60 fps will still be optimal...
 

Gregory

Banned
Tons of stuff on television is shown at 60fps (fields per second): sports, news, cheap TV series, etc., basically everything shot on video.
 