Xbox2 hardware: good, yet disappointing at the same time [from a developer]


You say you'd like Doom 3 on Xbox at a rock-solid 60fps. So would I, but I would also like it to come out this year and have real-time lighting. Would you sacrifice either of those things just to get the framerate higher?

Yes I would: give me fake lighting & monsters that animate like silk, & take as long as you like to get it right. I'm serious too.
 
We should just divide the forum into the 30fps and 60fps forum. Or the "easily satisfied" and "always striving for the best" forum;P
 
I think the goal should always be 60fps. There should never be "settling." But then again, I think videogames are an artform so I like when art is "completed" or "the best it could be."

That said, it's perfectly acceptable to have 30fps in games. It just shouldn't be our primary goal... that should always be set at 60. No settling.
 
Bebpo said:
We should just divide the forum into the 30fps and 60fps forum. Or the "easily satisfied" and "always striving for the best" forum;P

Depends on what you consider the best. It's all about system resources; that extra power could be used to make a game 60 fps or it could be used to up the graphic detail. The best use of that power is subjective.
 
:lol was he reirom? Sure had the same grammar.
That said, it's perfectly acceptable to have 30fps in games. It just shouldn't be our primary goal... that should always be set at 60. No settling.
:lol
 
DCharlie said:
Erm, am I going mad?
Isn't 30fps/60fps a developer decision, not something dictated by hardware?

If the PS3 and Xbox 2 can both handle 100,000,000 polys per sec at 60fps, I can guarantee that some devs will go "hold on guys, if we slice that to 30fps and give ourselves twice the time to render, we can maybe stick in even more polys! Or more lighting! Or..."

I personally think that art and execution are going to be more important this time around, as that's all that's going to separate the machines.

The voice of sanity!
 
segasonic said:
29.97 full frames => 59.94 interlaced (half) frames = about 60 fps


Wrong, the 29.97fps you are watching on normal TV is already interlaced.

480i is INTERLACED. Interlaced means you draw every other line: one frame you draw the odd lines, then the next frame you draw the even lines.
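For anyone who finds that hard to picture, here's a minimal Python sketch of the odd/even split (the helper is made up purely for illustration):

```python
# Illustrative only: treat a frame as a list of scanlines and split it into
# the two fields an interlaced display would draw on alternating passes.
def split_into_fields(frame_rows):
    top_field = frame_rows[0::2]     # lines 1, 3, 5, ... (one pass)
    bottom_field = frame_rows[1::2]  # lines 2, 4, 6, ... (the next pass)
    return top_field, bottom_field

frame = [f"scanline {n + 1}" for n in range(480)]  # one 480-line frame
top, bottom = split_into_fields(frame)
print(len(top), len(bottom))  # 240 240 -- each field is half the picture
```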

Progressive scan is the full picture. I don't know how games handle 480p, but I can tell you for movies that most DVD players probably do a reverse 3:2 pulldown (3:2 pulldown is what they do to take a full-frame 24fps movie and turn it into the interlaced 29.97fps picture for TV).
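As a rough sketch of that 3:2 cadence (idealized to 24fps -> 60 fields/s; real NTSC also slows everything by 1000/1001 to land on 29.97/59.94, and the helper name is hypothetical):

```python
# Repeat film frames as 3 fields, then 2, alternating: 4 film frames
# become 10 fields, i.e. 24 frames/s -> 60 fields/s (~30 interlaced fps).
def pulldown_3_2(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

print(pulldown_3_2(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D'] -- 10 fields = 5 frames
```

Reverse pulldown is just undoing that repetition to recover the original 24 progressive frames.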

I don't know what HDTV programs are being broadcast in, but I can tell you normal TV is NOT 60fps any way you slice it; it's ~30 frames per second, drawn as interlaced half-frames.



*edit*

Not sure if you're talking games or TV broadcast. If it's TV broadcast then my point stands (again, not sure of the frame rate on HDTV broadcasts), but standard broadcasts are already interlaced at 29.97.
 
Thompson said:
Frame rates are very important. I think I can safely say, without fear of ridicule, that the PS2 would have been long buried if MS had gotten PGR2 & Halo 2 running at a solid & detailed 60fps. In fact, Sony's master stroke was to get GT4 running at 60fps at the cost of all other things. Forza could have been a nail in the coffin for Sony going into the next gen.


You should check this out
 
Shin Johnpv said:
Wrong, the 29.97fps you are watching on normal TV is already interlaced.

480i is INTERLACED. Interlaced means you draw every other line: one frame you draw the odd lines, then the next frame you draw the even lines.

Progressive scan is the full picture. I don't know how games handle 480p, but I can tell you for movies that most DVD players probably do a reverse 3:2 pulldown (3:2 pulldown is what they do to take a full-frame 24fps movie and turn it into the interlaced 29.97fps picture for TV).

I don't know what HDTV programs are being broadcast in, but I can tell you normal TV is NOT 60fps any way you slice it; it's ~30 frames per second, drawn as interlaced half-frames.



*edit*

Not sure if you're talking games or TV broadcast. If it's TV broadcast then my point stands (again, not sure of the frame rate on HDTV broadcasts), but standard broadcasts are already interlaced at 29.97.



Are you trying to say that there is no display difference between games that run at 60 fps and 30?

Are there STILL people pushing this argument?
 
captainbiotch said:
If this 30 fps shit is true I won't be there for xbox2 launch. There is a huge difference between 60 and 30, and those that don't see it aren't very hardcore.

Wrong. Some people CAN'T see it. It's analogous to the 'rainbow' issue with DLP technology. Some people see them, some don't.

It doesn't necessarily have anything to do with how hardcore someone is.
 
HokieJoe said:
Wrong. Some people CAN'T see it. It's analogous to the 'rainbow' issue with DLP technology. Some people see them, some don't.

It doesn't necessarily have anything to do with how hardcore someone is.

??!? Any human being is capable of seeing the difference between a 30fps and a 60fps game even if it's on some 2" black and white tv.

Those that do not see it... well some people don't 'see' red lights either, or so they claim.
 
Bebpo said:
??!? Any human being is capable of seeing the difference between a 30fps and a 60fps game even if it's on some 2" black and white tv.

Those that do not see it... well some people don't 'see' red lights either, or so they claim.

RayCharles.jpg


YOU'RE SO INSENSITIVE!
 
morbidaza said:
Are you trying to say that there is no display difference between games that run at 60 fps and 30?

Are there STILL people pushing this argument?

Wow, did you even read what I wrote?

Or did you just reply blindly?

I mean, answer honestly here.

You'll notice I didn't mention games at ALL.

Read it again. NOT AT ALL.

I was talking about NTSC 29.97 through broadcast TV,

which (the point I was making) is already interlaced.


And anyway, 29.97 full frames does not equal 59.94 interlaced frames; for interlaced images, a full frame is still captured/made and is interlaced later down the pipe.


I made no comment on which was better or worse, just clearing up the "facts" that were posted.
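For what it's worth, the exact NTSC numbers being thrown around here come down to this bit of arithmetic:

```python
# NTSC's frame rate is defined as 30000/1001, and each frame is split
# into two fields, so the field rate is double the frame rate.
frame_rate = 30000 / 1001
field_rate = 2 * frame_rate
print(round(frame_rate, 2), round(field_rate, 2))  # 29.97 59.94
```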
 
Bebpo said:
??!? Any human being is capable of seeing the difference between a 30fps and a 60fps game even if it's on some 2" black and white tv.

Those that do not see it... well some people don't 'see' red lights either, or so they claim.

Not really. Some people see shades better, some people see motion better, etc. The human brain, eye, and everything in between operate in a range, and some people have a better range than others.
 
DCharlie said:
Erm, am I going mad?
Isn't 30fps/60fps a developer decision, not something dictated by hardware?
Depends on the hardware. :)
E.g. the NGage is 30fps; there's no 60fps there, no matter how much a developer wants to decide it.
And there's also at least one other piece of hardware out there right now that highly "encourages" 60fps (meaning that 30fps is actually not easy to use, and even requires certain... tradeoffs).
 
Fafalada said:
Depends on the hardware. :)
E.g. the NGage is 30fps; there's no 60fps there, no matter how much a developer wants to decide it.
And there's also at least one other piece of hardware out there right now that highly "encourages" 60fps (meaning that 30fps is actually not easy to use, and even requires certain... tradeoffs).

PSP?
 
Gregory said:
Extra detail at 30fps is all well and good when the game stands still.

Then you start to move, and all the detail is for nothing as everything turns into a big blurry mess due to shitty 30fps. Like PGR2 and Forza, for example. 30fps really kills graphical detail.


You must be freaking cross-eyed, or a member of the golden-eye club who can divine the difference between 60fps and 30fps. Again, not everyone can see the difference. Just because you do doesn't mean it's generalizable to the population at large.
 
Bebpo said:
??!? Any human being is capable of seeing the difference between a 30fps and a 60fps game even if it's on some 2" black and white tv.

Those that do not see it... well some people don't 'see' red lights either, or so they claim.

Well, the data input to their rods and cones may register the difference, but that doesn't mean their eye/brain system will actually discern the difference as problematic.
 
Yeah, it's the BRAIN that is able to sense the difference, and the max discernible difference ever noted was 120fps... anything above that and people can't tell the difference.

Some people can't tell the difference between 30 and 60... on a PC monitor. But on a TV there are interlacing issues that can be seen by all.

30 fps progressive scan should solve that issue, shouldn't it?

Then the only concern is the extra benefit of 60 fps that most people don't talk about (and I don't think it's been talked about in this thread)...

60fps helps fake a blur our brain would create, according to a paper I've read on this topic. Our brain creates a blur effect to help us discern what direction fast objects are moving in. If 30fps on a TV, as in 30 pictures in sequence, doesn't capture or create this blur, our brain won't create it, and things don't look as realistic as they do in real life. Cameras used in television DO capture this blur, because the shutter is open long enough to do so.

60fps games help our brain create this blur because the image moves less like a slide show, and so our brain can create the blur for us.

I can see games running at 30fps with progressive scan and a blur effect looking as good, if not more realistic (realism as compared to what we see on TV), than 60fps games.
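A minimal sketch of that idea, faking a camera-style shutter at 30fps by averaging several sub-frame samples (all names here are made up; a real game would do this on the GPU, e.g. with an accumulation buffer):

```python
import numpy as np

# Average several instants within a simulated shutter interval to smear
# fast motion, the way a TV camera's open shutter does.
def blurred_frame(render, t, shutter=1 / 60, samples=4):
    times = t + np.linspace(0.0, shutter, samples)
    return np.mean([render(ti) for ti in times], axis=0)

# Toy "renderer": a single bright pixel moving along a 1D scanline.
def render(t, width=32, speed=120.0):
    img = np.zeros(width)
    img[int(speed * t) % width] = 1.0
    return img

print(blurred_frame(render, t=0.0).round(2))
# The moving spot is spread across neighboring pixels instead of
# popping between them frame to frame.
```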

There are other advantages to 60fps games, such as the visual feedback we get from the game having a higher temporal resolution, meaning that we can react a bit quicker. That's another issue.

IMO, racers and fighters should be running at 60fps, because of the timing involved and the need for players to see how to react in time. Sure, game controls are polled at 120fps in some games (WipeOut, Forza), but we need to be able to see changes on the screen updated nearly as fast. Everything else can run at 30...

I even think FPS games can run at 30fps as long as they have progressive scan support.
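To put rough numbers on that feedback point (display side only, ignoring input polling and everything else in the pipeline):

```python
# Worst case, a new game event waits a full frame before it can be drawn.
for fps in (30, 60):
    frame_ms = 1000.0 / fps
    print(f"{fps}fps: a new frame every {frame_ms:.1f}ms, "
          f"so up to ~{frame_ms:.1f}ms before an event shows on screen")
# 30fps: a new frame every 33.3ms ...
# 60fps: a new frame every 16.7ms ...
```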
 
I saw that too.

I was too scared to ask what the original tag was.

Kinda like that Pizza Shop line-up scenario...Bish beating down on Thompson...all I could do was watch and keep silent. :P
 
Well, the wheelbarrow's running low on arguments; I'll go take it back into the shed full of monkeys on typewriters and see what they've come up with.



P.S. Is there a video of that link in Bishop's tag? Sounds hilarious.
 
:lol

It must really suck to be one of you framerate whores with the magic eyeballs. We normal people will be getting the most out of the next generation, experiencing the best possible poly counts and effects and loving every moment of it, while you guys are bitching and moaning about what slow, blurry messes videogames have become. I'd pity y'all if it weren't so funny.

How much money would you spend to lose this cursed ability, which serves no purpose other than to ruin your enjoyment? I'd be interested to hear someone defend their ability to tell framerates, too. For the laughs.

Go to this thread. It has a link to a side-by-side F-Zero GX comparison video in the first post. On the left is the 60Hz (Hz = fps???) version we know; on the right is the PAL 50Hz version. I see little difference, other than that the guy playing the 50Hz version isn't at peak speed as much as the guy on the left. But everyone in the thread is like, "WTF it's so different!!" Hilarious.
 
Reducing the number of frames per second gives devs more time to increase poly count, increase detail, etc.

There might be pressure from the publisher to make their game look as good as 'that one from EA' - maybe the developer isn't so tuned to the machine, so they have to take a framerate hit to make that jump.

Lots of reasons to use 30fps. If they made PGR3 in two versions, one 30fps, one 60fps, both otherwise identical - I'll take the 60fps. But if there is only one version, which runs at 30fps, and it's fantastic, I'll take that one rather than nothing.
 
mrklaw, but say the 30fps version of PGR3 had 16 cars on screen and the 60fps version had 8. Which one would you get? Surely 16 cars, from a pure gameplay perspective, is far more important than the extra frames. You have to take for granted that the extra cars on track add to the experience. I'm just trying to point out what developers must consider; I don't think either is the correct choice.
 
RE4 vs. SH4 said:
:lol

It must really suck to be one of you framerate whores with the magic eyeballs. We normal people will be getting the most out of the next generation, experiencing the best possible poly counts and effects and loving every moment of it, while you guys are bitching and moaning about what slow, blurry messes videogames have become. I'd pity y'all if it weren't so funny.

If you can't see the difference, by definition you are the one who is abnormal. And it is not your eyes that are abnormal, but your brain.

Heh.

How much money would you spend to lose this cursed ability, which serves no purpose other than to ruin your enjoyment? I'd be interested to hear someone defend their ability to tell framerates, too. For the laughs.

Hey, maybe to you wearing a sock for a condom doesn't make a difference, because your nervous system can't really tell the difference. To ask someone who has normal functionality of their brain to lose it wouldn't make sense to me, IMO.

Go to this thread. It has a link to a side-by-side F-Zero GX comparison video in the first post. On the left is the 60Hz (Hz = fps???) version we know; on the right is the PAL 50Hz version. I see little difference, other than that the guy playing the 50Hz version isn't at peak speed as much as the guy on the left. But everyone in the thread is like, "WTF it's so different!!" Hilarious.

Yeah, but don't PAL TVs refresh at 25 frames (50 fields) per second? Meaning that at 50Hz it will look solid and there won't be any interlacing problems?

Isn't that the case, RE4 vs. SH4?
 
Fight for Freeform said:
If you can't see the difference, by definition you are the one who is abnormal. And it is not your eyes that are abnormal, but your brain.

Heh.

I guess the definition of 'abnormal' means to be part of the majority. Wow, I am abnormal then.

Hey, maybe to you wearing a sock for a condom doesn't make a difference, because your nervous system can't really tell the difference. To ask someone who has normal functionality of their brain to lose it wouldn't make sense to me, IMO.

Yeah, because seeing the difference between 60 fps and 30 fps is the best protection against VD!

Yeah, but don't PAL TVs refresh at 25 frames (50 fields) per second? Meaning that at 50Hz it will look solid and there won't be any interlacing problems?

Isn't that the case, RE4 vs. SH4?

Hmm... haven't a clue. Anyone know?
 
Lazy8s said:
Though the writer's analysis doesn't seem so insightful, this comment is bothersome if true. I was hoping nVidia's 'new architecture' was really as much of a departure from their current line as they've been saying.

What? Being like the Xenon GPU certainly would be a departure, but from what we know, they actually won't be similar GPUs. For a start, NVidia isn't yet convinced of the merits of a unified shading pipeline. Perhaps the writer means in terms of shading flexibility, feature set, etc. In that sense it'll probably be SM3.0+++, like the Xenon part, but that doesn't mean the architecture won't be new.
 
Pug said:
mrklaw, but say the 30fps version of PGR3 had 16 cars on screen and the 60fps version had 8. Which one would you get? Surely 16 cars, from a pure gameplay perspective, is far more important than the extra frames. You have to take for granted that the extra cars on track add to the experience. I'm just trying to point out what developers must consider; I don't think either is the correct choice.

60fps > 8 more cars.

TOCA 2 on PS2 = 21 cars on screen at 60fps, all while looking pretty damn good. That game does what should be the ideal for all other racing games.
 
Fight for Freeform said:
If you can't see the difference, by definition you are the one who is abnormal. And it is not your eyes that are abnormal, but your brain.

Being unable to detect something doesn't make one "abnormal", unless the normal populace is able to detect it.

I mean, I can hear several high frequencies that the average population can't, but that doesn't mean that folks who are unable to hear the whine from a monitor/light/television/electrical unit are "abnormal".
 
DavidDayton said:
Being unable to detect something doesn't make one "abnormal", unless the normal populace is able to detect it.

This statement has been made a few times, but it doesn't work, because we have no idea if the general gamer can detect the 30-60fps difference. Until there is some kind of study done, we don't know which is the normal and which is the abnormal.
 
Bebpo said:
This statement has been made a few times, but it doesn't work, because we have no idea if the general gamer can detect the 30-60fps difference. Until there is some kind of study done, we don't know which is the normal and which is the abnormal.

Then attack the person who originally tried to define the standard. Even if he's on your side. ;)
 
Being unable to detect something doesn't make one "abnormal", unless the normal populace is able to detect it.

I mean, I can hear several high frequencies that the average population can't, but that doesn't mean that folks who are unable to hear the whine from a monitor/light/television/electrical unit are "abnormal".

The paper on this subject said that most people are able to distinguish up to 90fps. Or was it 100? I honestly forget.

So the people who can't distinguish up to that amount wouldn't be classified as "most" people; in fact, they would be in the minority. Which makes them abnormal, because normality (I'm guessing that's a word :P) is defined by the qualities of the majority.

If you can hear high frequencies, and studies have shown that most can't, that would make you abnormal. Whether that's good or bad I'm not saying; I'm just saying that studies have shown that most people can tell the difference between 30 and 60fps. And again, consider that it was done on a PC monitor, which makes identifying it a bit harder!
 
Fight for Freeform said:
The paper on this subject said that most people are able to distinguish up to 90fps. Or was it 100? I honestly forget.

So the people who can't distinguish up to that amount wouldn't be classified as "most" people; in fact, they would be in the minority. Which makes them abnormal, because normality (I'm guessing that's a word :P) is defined by the qualities of the majority.

If you can hear high frequencies, and studies have shown that most can't, that would make you abnormal. Whether that's good or bad I'm not saying; I'm just saying that studies have shown that most people can tell the difference between 30 and 60fps. And again, consider that it was done on a PC monitor, which makes identifying it a bit harder!

I believe that paper tried to determine what is discernible within human limits. That has no implication on whether or not seeing the difference between 60 and 30fps is normal.

I didn't read the paper, though, so... :D
 
Pug said:
mrklaw, but say the 30fps version of PGR3 had 16 cars on screen and the 60fps version had 8. Which one would you get? Surely 16 cars, from a pure gameplay perspective, is far more important than the extra frames. You have to take for granted that the extra cars on track add to the experience. I'm just trying to point out what developers must consider; I don't think either is the correct choice.

I'll take less polys per car, 16 cars @ 60FPS :D
 
"This statement has been said a few times but it doesn't work because we have no idea if the general gamer can detect 30-60fps difference. Until there is some kind of study done, we don't know which is the normal and which is the abnormal."

I think the big point is the general gamer (non hardcore) doesn't care.

If you pointed out that GTA, MGS3, RE4, Halo 1, Halo 2, Gotham 2 etc all don't run at 60fps i think they'd just shrug and say "looks good to me"

It seems that the general gamers are becoming more hardcore than the hardcore. They don't give a toss what the frame rate is , or whether the game slows down, they just want to play what they like.

Also, i don't think this is gonna be something specific to MS as seems to be being suggested by some people (if it's at all true, which i personally don't think it is)
 
I've tried looking for the link to "the paper", but apparently locking onto that small bit of blue text amidst a sea of black is another thing my eyes aren't trained to do. However, I don't believe there has ever been a study to determine whether most people can read framerates (the issue wouldn't come up over and over again otherwise). And if there has been, it's likely a college paper written by a student for a 101 class; I'd like to see the size of their focus group and what kind of people it was composed of.
 
Would anyone explain to me who this man talking about the next-gen Xbox hardware is, and why we have to trust his words? Where's the NDA?
 
ThirdEye said:
Would anyone explain to me who this man talking about the next-gen Xbox hardware is, and why we have to trust his words? Where's the NDA?

Well, his tag is "insider". Who knows what that means. I'm more puzzled by the fact that his post garnered hardly any attention on that board.
 
Isn't 10MB of eDRAM a little small? Considering the slagging the PS2 got for having 'only' 4MB, and that's mostly for displaying 480i. Xbox2 will be expected to do 720p or even 1080i, which is a much higher resolution (1280x720 or 1920x1080). Surely 10MB isn't enough?

Having said that, if it supports 720p, doesn't it have to do 60fps? Or at least have enough memory for double buffering.
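Some back-of-envelope arithmetic on that, assuming 32-bit color plus a 32-bit Z/stencil buffer and no anti-aliasing (pure speculation about how Xbox2 would actually lay out its eDRAM):

```python
# Framebuffer footprint: one color buffer + one Z buffer, then add a
# second color buffer for double buffering, versus 10MB of eDRAM.
MB = 1024 * 1024
for name, w, h in [("480p", 640, 480), ("720p", 1280, 720), ("1080i field", 1920, 540)]:
    single = w * h * (4 + 4)     # 32-bit color + 32-bit Z per pixel
    double = single + w * h * 4  # add a second color buffer
    print(f"{name}: {single / MB:.1f}MB single-buffered, "
          f"{double / MB:.1f}MB double-buffered")
# 480p:        2.3MB / 3.5MB
# 720p:        7.0MB / 10.5MB  -- right at the 10MB limit
# 1080i field: 7.9MB / 11.9MB
```

So a plain 720p color+Z setup roughly fits, but double buffering or AA pushes past 10MB, which is presumably where tiling or other rendering tricks would have to come in.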


As for the TV comments, the image actually updates 60/59.94 times per second, so the movement is that smooth; it just takes twice as long to display the whole image. It's a nice compromise: fast updates for sports etc., good image quality for slower-moving stuff.
 