Xbox2 hardware: good, yet disappointing at the same time [from a developer]

mrklaw said:
isn't 10MB edram a little small? considering the slagging PS2 got for having 'only' 4MB, and that's mostly for displaying 480i. Xbox2 will be expected to do 720p or even 1080i, which is a lot higher resolution (1280x720 or 1920x1080). Surely 10MB isn't enough?

Having said that, if it supports 720p, doesn't it have to do 60fps? Or at least have enough memory for double buffering.

Having read a variety of posts on this on different boards, I'm quite confused by this myself. PS2 had 4MB on the GS, but Xbox didn't have any memory right on the graphics card, did it? Can't main memory be used for the framebuffer?
 
DCharlie said:
I think the big point is the general gamer (non hardcore) doesn't care.

In my experience, that's exactly it for the vast majority of gamers. As long as it doesn't chug and freeze, or have an extraordinarily inconsistent frame rate, most people don't care, in general. How else would the top-selling games of the last two generations be so universally hailed as great games, when most of them don't have even a purist's hard-locked 30fps at the minimum? For most normal, non-super-jaded motherfuckers...if the game is fun, that feeling will supersede even the clunkiest and most technically spotty games out there.
 
gofreak said:
Having read a variety of posts on this on different boards, I'm quite confused by this myself. PS2 had 4MB on the GS, but Xbox didn't have any memory right on the graphics card, did it? Can't main memory be used for the framebuffer?
I guess it could be possible to use main memory as the framebuffer, unless you are somehow required to use the eDRAM because it's the only "link" between the GPU and the TV encoder chip. But using such ultra-fast memory for the framebuffer is of course a big plus, especially if you are doing post-processing on the full image. Unlike the PS2, where the framebuffer and the textures had to be in VRAM to be used (please correct me if I'm wrong), on Xbox 2 the eDRAM region is used only as temporary storage for the framebuffer, z-buffer and a few other things that need little memory, but fast memory. Textures will be stored directly in main memory, where the GPU has direct access like on Xbox. So there is no texture streaming or whatever; the GPU can just access both memory pools by itself.
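For a rough sense of scale, here is a back-of-the-envelope sketch of what a plain 720p framebuffer plus z-buffer would occupy. The 32-bit color and 32-bit depth/stencil formats are illustrative assumptions, not confirmed Xbox 2 specs:

```cpp
#include <cstdio>

int main() {
    // Rough sketch only: 32-bit color and 32-bit depth/stencil per pixel are
    // assumed formats for illustration, not confirmed Xbox 2 specs.
    const int width = 1280, height = 720;        // 720p render target
    const int bytesColor = 4, bytesDepth = 4;    // e.g. RGBA8 + D24S8
    const double mb = 1024.0 * 1024.0;

    double color = width * height * bytesColor / mb;   // ~3.5 MB
    double depth = width * height * bytesDepth / mb;   // ~3.5 MB
    std::printf("color %.2f MB + depth %.2f MB = %.2f MB of a 10 MB eDRAM pool\n",
                color, depth, color + depth);
    return 0;
}
```

Under those assumptions a plain 720p color + depth target comes to about 7MB; anything that multiplies the buffers, like multisampling, is where tiling (discussed below) would come in.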
 
mrklaw said:
isn't 10MB edram a little small? considering the slagging PS2 got for having 'only' 4MB, and that's mostly for displaying 480i.
Different uses, PS2 eDram has to store all rendering and display buffers, as well as serve as temporary texture storage.
If you stick with just rendering buffers (leaving textures and other data in main memory, as well as the front buffer that is actually displayed), GCN gets away with only 2MB for 480i.

And with the front buffer in main memory, you can always render in large tiles if you can't fit the whole screen size in the eDram.
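To put some illustrative numbers on that tiling idea (a sketch only, assuming 4x multisampling and 8 bytes of color+depth per sample; none of these figures are confirmed hardware specs):

```cpp
#include <cstdio>

int main() {
    // Illustration only: if 4x multisampling (assumed) multiplies the
    // color+depth footprint per pixel by four, a full 720p target no longer
    // fits in 10 MB, so the frame is split into horizontal tiles that do.
    const long long width = 1280, height = 720;
    const long long bytesPerPixel = 4 + 4;           // color + depth, assumed
    const long long samples = 4;                     // assumed 4x AA
    const long long edram = 10LL * 1024 * 1024;      // 10 MB

    long long bytesPerRow = width * bytesPerPixel * samples;        // one scanline
    long long rowsPerTile = edram / bytesPerRow;                    // scanlines per tile
    long long tiles = (height + rowsPerTile - 1) / rowsPerTile;     // round up
    std::printf("%lld rows per tile -> %lld tiles per frame\n", rowsPerTile, tiles);
    return 0;
}
```

With those made-up numbers the frame splits into a handful of tiles, each of which fits the eDRAM on its own.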
 
I have the perfect solution for us gameplay whores who'd like to quit paying for the graphic-whores of the world: Have developers do a streamlined version of PC options. You don't have to confuse the end user by giving too many options, but ship with a default option and a menu option that turns the game to 60fps by disabling effects. This way people who value gameplay could have their cake and so would the graphic tarts. You don't like the reduced effects required for 60fps, don't frickin' enable it.
 
gofreak:
What? Being like the Xenon GPU certainly would be a departure,
I'm hoping for more than just those evolutionary changes to flexible data access and programmability, though. More along the lines of alternative rendering schemes. Of course, I've never expected much in the way of that since I realized it was nVidia behind the GPU, though (and, just the opposite, was actually worried about going in that direction when I thought Sony and Toshiba were going to be behind it.)
 
MightyHedgehog said:
In my experience, that's exactly it for the vast majority of gamers. As long as it doesn't chug and freeze, or have an extraordinarily inconsistent frame rate, most people don't care, in general. How else would the top-selling games of the last two generations be so universally hailed as great games, when most of them don't have even a purist's hard-locked 30fps at the minimum? For most normal, non-super-jaded motherfuckers...if the game is fun, that feeling will supersede even the clunkiest and most technically spotty games out there.

No one is arguing otherwise.

If a game is a kickass game then it will sell tons and be loved by gamers even if it looks like a shitty PS1 game running at 15fps. Halo 2 would've still sold the same amount if it was running at 10-15fps.

The difference is that 60fps makes a good game even better. It's like icing on the cake. Ninja Gaiden is a great game because of the gameplay, but it's also the best looking game on Xbox because of the 60fps + tons of enemies + great textures and effects. If Ninja Gaiden was 30fps it still would have been a great game loved by everyone, but people would stop using it as an example of amazing Xbox graphics.
 
Enigma said:
I have the perfect solution for us gameplay whores who'd like to quit paying for the graphic-whores of the world: Have developers do a streamlined version of PC options. You don't have to confuse the end user by giving too many options, but ship with a default option and a menu option that turns the game to 60fps by disabling effects. This way people who value gameplay could have their cake and so would the graphic tarts. You don't like the reduced effects required for 60fps, don't frickin' enable it.
If only it was that simple. Console games are developed with one piece of hardware in mind, so the way the timing is handled in a game can vary a lot. A PC game is developed with lots of different hardware in mind, so the developers have to handle all of the game's internal timings without tying them to the output. A PC game will have a certain update rate internally, and will try to output as many frames as the hardware can. If that's 1 FPS, the game will still be in sync; unplayable, but in sync. If someone disables all the GPU-heavy features, it might then run at 60 fps, but internally the game world behaves the same.
This is not the best way to get the most out of the hardware, though: you lose a lot of CPU cycles with such a method, and you have no "simple" way to prevent slowdowns or anything, since you basically let the GPU act on its own. So console developers can, for example, sync everything with the framerate. This lets them control what's happening more tightly and easily, but then you are locked to that framerate; if something goes wrong, the whole game logic is affected.
It's been a long time since I've been actively researching such stuff, so maybe these techniques are not really used anymore. There are of course many ways to handle game timing, but on the console side you usually don't do things the way you would on a PC. It's not efficient enough.
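A minimal sketch of the two timing styles described above, with made-up function names and a 60Hz step just to make the contrast concrete; this is not any real engine's or console's code:

```cpp
#include <chrono>

// Generic sketch of the two timing styles described above. The function
// names and the 60 Hz figure are made up for illustration; this is not any
// particular engine's or console's actual code.

void UpdateWorld(double dt);   // advance game logic by dt seconds (stub below)
void RenderFrame();            // draw the current state (stub below)

// Console-style: logic is tied to the display rate. Tight and predictable,
// but if a frame misses vsync the whole game world slows down with it.
void FixedStepLoop() {
    const double dt = 1.0 / 60.0;          // one logic step per 60 Hz frame
    for (;;) {
        UpdateWorld(dt);
        RenderFrame();                     // assumed to wait for vsync
    }
}

// PC-style: logic scales by measured frame time, so the simulation stays in
// sync whether the hardware manages 1 fps or 200 fps.
void VariableStepLoop() {
    auto last = std::chrono::steady_clock::now();
    for (;;) {
        auto now = std::chrono::steady_clock::now();
        double dt = std::chrono::duration<double>(now - last).count();
        last = now;
        UpdateWorld(dt);                   // advance by however long the last frame took
        RenderFrame();                     // runs as fast as the hardware allows
    }
}

void UpdateWorld(double) {}                // empty stubs so the sketch builds
void RenderFrame() {}
int main() { return 0; }                   // the loops are illustrative, not called
```

The fixed-step version is what "syncing everything with the framerate" looks like; the variable-step version is the PC approach where a drop to 1 fps stays in sync but becomes unplayable.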
 
Enigma said:
I have the perfect solution for us gameplay whores who'd like to quit paying for the graphic-whores of the world: Have developers do a streamlined version of PC options. You don't have to confuse the end user by giving too many options, but ship with a default option and a menu option that turns the game to 60fps by disabling effects. This way people who value gameplay could have their cake and so would the graphic tarts. You don't like the reduced effects required for 60fps, don't frickin' enable it.

And here I thought all the framerate whores were complaining because 30 fps makes everything a blurry mess. I guess all those posts weren't really from framerate whores, but fake framerate whores trying to make you guys look bad.

Nice how you can lump people into a negative stereotype whenever they have an opposing view, isn't it?

BTW, getting double the polygon power by staying at 30 fps could enhance the gameplay far more than bumping it up to 60 fps. You could have double the enemies, double the racers, double a lot of things that can dramatically change the way you play. Weird that someone like you, who's so interested in gameplay, would deny that for smoother animation. Or are your intentions not as noble as you like to make them sound?
 
Bebpo said:
No one is arguing otherwise.

If a game is a kickass game then it will sell tons and be loved by gamers even if it looks like a shitty PS1 game running at 15fps. Halo 2 would've still sold the same amount if it was running at 10-15fps.

The difference is that 60fps makes a good game even better. It's like icing on the cake. Ninja Gaiden is a great game because of the gameplay, but it's also the best looking game on Xbox because of the 60fps + tons of enemies + great textures and effects. If Ninja Gaiden was 30fps it still would have been a great game loved by everyone, but people would stop using it as an example of amazing Xbox graphics.

I agree. I love 60fps. However, it isn't necessary and I certainly would not dock any game points for having only a decent 30fps. 60fps, to me, is, like you've said, icing.
 
Lazy8s said:
gofreak:

I'm hoping for more than just those evolutionary changes to flexible data access and programmability, though. More along the lines of alternative rendering schemes. Of course, I've never expected much in the way of that since I realized it was nVidia behind the GPU, though (and, just the opposite, was actually worried about going in that direction when I thought Sony and Toshiba were going to be behind it.)


Yeah, I think alternate rendering schemes are an expectation too far ;) Maybe PS4?
 
RE4 vs. SH4 said:
And here I thought all the framerate whores were complaining because 30 fps makes everything a blurry mess. I guess all those posts weren't really from framerate whores, but fake framerate whores trying to make you guys look bad.

Nice how you can lump people into a negative stereotype whenever they have an opposing view, isn't it?

BTW, getting double the polygon power by staying at 30 fps could enhance the gameplay far more than bumping it up to 60 fps. You could have double the enemies, double the racers, double a lot of things that can dramatically change the way you play. Weird that someone like you, who's so interested in gameplay, would deny that for smoother animation. Or are your intentions not as noble as you like to make them sound?

If they're gonna call the 60FPS camp dumbass graphic-whores (which they always do, including this thread), then I'm gonna shoot back and place that hated moniker on the camp where it actually belongs. This is always their argument and it belongs with them.

As for people complaining that 30fps makes things look blurry, if that's the argument they're using, they're an even bigger dumbass than the pricks who are trigger-happy to call the 60fps camp graphic whores.
 
Enigma said:
If they're gonna call the 60FPS camp dumbass graphic-whores (which they always do, including this thread), then I'm gonna shoot back and place that hated moniker on the camp where it actually belongs. This is always their argument and it belongs with them.

As for people complaining that 30fps makes things look blurry, if that's the argument they're using, they're an even bigger dumbass than the pricks who are trigger-happy to call the 60fps camp graphic whores.

I've never seen anyone call a framerate whore a graphics whore, but I've seen PLENTY of people content with 30 fps being labeled as graphics whores. Regardless, I agree that throwing around the insulting term is stupid, which is why your post irked me.
 
Just go back a couple pages (Well at least the 1st half) in this thread and you'll see it.

Still, your third point didn't immediately connect with me about the polygons and gameplay. Maybe gameplay whore was a bad phrase. But I truly do feel 60fps gives a much more natural feeling of control... and I associate control with gameplay. Looking at your typical review, most people separate the two when reviewing a game. To me it's central to the gameplay, but that's a matter of semantics.
 
Lazy8s
I'm hoping for more than just those evolutionary changes to flexible data access and programmability, though. More along the lines of alternative rendering schemes. Of course, I've never expected much in the way of that since I realized it was nVidia behind the GPU, though (and, just the opposite, was actually worried about going in that direction when I thought Sony and Toshiba were going to be behind it.)
Have you seen that image based rendering demo for the latest graphics cards? These things are possible through the clever use of pixel shaders and raw speed - both of which should be aplenty in the future GPUs.
 
Enigma said:
Just go back a couple pages (Well at least the 1st half) in this thread and you'll see it.

Still, your third point didn't immediately connect with me about the polygons and gameplay. Maybe gameplay whore was a bad phrase. But I truly do feel 60fps gives a much more natural feeling of control... and I associate control with gameplay. Looking at your typical review, most people separate the two when reviewing a game. To me it's central to the gameplay, but that's a matter of semantics.

Yeah. The problem with this whole thing is, 60 fps is never going to be a given. Developers will always be able to sacrifice it to free up resources. That's crappy because the debates will never stop. I don't care about 60 fps, but I do want everyone to be happy.
 
Shin Johnpv said:
wow did you even read what I wrote

or did you just reply blindly

I mean answer honestly here

you'll notice I didn't mention games at ALL

read it again NOT AT ALL

I was talking about the NTSC 29.97 through broadcast TV

which the point i was making is already interlaced


and anyway 29.97 full frames does not equal 59.94 interlaced frames for interlaced images a full frame is still captured/made and is interlaced later down the pipe


I made no comment on which was better or worse just clearing up "facts" that were posted


Yes I read what you wrote, and if you're referring simply to broadcast TV, and not to games, then the post was altogether pointless. Instead you came across as saying that NTSC TVs can only display 30 pictures per second (in any sense of the word, be it fields or frames), which is simply not the case. That would imply that the difference between 30 and 60 fps is only discernible inside the Xbox and makes no difference in the on-screen display, which is, quite simply, not true.
 
gofreak:
Yeah, I think alternate rendering schemes are an expectation too far ;) Maybe PS4?
The change in rendering architecture would not have to be as pronounced as some REYES or Renderman set-up; it could just be something more subtle like tile-based deferred texturing or a strongly cross-bred immediate mode renderer. I think there's a chance it'll be an IMR hybridized to a good degree, so my anticipation is high.

Marconelly:
Have you seen that image based rendering demo for the latest graphics cards? These things are possible through the clever use of pixel shaders and raw speed - both of which should be aplenty in the future GPUs.
Haven't had the chance to see it. Yeah, you don't need raytracing or anything in particular to do realistic looking effects with enough ingenuity, yet certain schemes facilitate associated areas of graphics like lighting and IQ much better for the average developer.
 
HokieJoe said:
You must be freaking cross-eyed or a member of the golden-eye club who can divine the difference between 60fps and 30fps. Again, not everyone can see the difference. Just because you do, doesn't mean it's generalizable to the population at large.

OF COURSE everyone can see the difference. It's just that most people don't know what it is. Why do you think most developers strive for 60fps? Why do you think they made GT3/4 60fps, when GT1/2 was 30fps? Because only some "special" people can see the difference?
 
Gregory said:
OF COURSE everyone can see the difference. It's just that most people don't know what it is. Why do you think most developers strive for 60fps? Why do you think they made GT3/4 60fps, when GT1/2 was 30fps? Because only some "special" people can see the difference?

Pretty much. The human eye can see WELL above 60 fps (I believe around 74 is the "saturation point" though... kind of the point of diminishing returns), and I find it hard to believe that there are people out there whose eyes do not send signals back to their brain at over 30fps. Most people simply don't know what they're looking at when they see the difference. I guess people expect 60fps to look specifically less choppy than 30, but it doesn't. It just looks a bit... clearer, I guess would be the best word for it.
 
Someone from TXB posted this (so take with a grain of salt):

EPe9686518 said:
Anyway this guy is clearly biased towards Xbox 2 and really has no clue what he is talking about. The fact that this guy stated that "no developers are shooting for 60 fps" on a system that has basically 3 CPUs and can do 6 processing threads at once is simply insane. They can dedicate one whole CPU core to helping the GPU with geometry and still have 2 full CPU cores left for physics, AI and frame rate.

As some of you know I work for a gaming web site; I talk with developers a good bit and have listened to what they had to say about Xbox 2. Every single one I have spoken with has been nothing less than extremely impressed and amazed at the system. From the fully developed tool sets to the architecture itself, developers have been extremely happy with what MS has put together for Xbox 2. This guy who made this post is about to shoot himself in the foot big time, as he is dead wrong on many things and will completely lose the small amount of credibility he had to start with.

When you guys see Xbox 2 at E3 this year you will be blown away. The jump in graphics is not going to be the same as from PSX to PS2. It's going to be closer to the jump made from software rendering to Voodoo 2. I am not being the least bit biased when I say people will literally not believe what they are seeing in terms of graphics and gameplay. The games look so good that there would be a ton of posts on the forums, once you finally get to see them, debating whether they are real-time or pre-rendered CG movies. I say "would be" because you will see these being played in real time and that will end any debates about that.

We should see PS3 first before we see Xbox 2. But expect to be blown away when you finally get to see Xbox 2, as they have done a great job on the hardware and have some damn impressive games on tap.
 
Most people simply don't know what they're looking at when they see the difference
Yep, it often takes showing them the same thing at different fps running side by side before they start realizing what it is they were seeing.

Anyway, there are many subtle graphical touches in games that are subject to similar scrutiny. Most people have no idea they are there (including many of those that like to argue about such stuff on forums), but pretty much everyone will notice something missing if you take them away.
 
I don't understand why no one EVER talks about the Nintendo Revolution. For all anyone knows, Nintendo could have the best console of next gen.

I honestly hope Sony have the best hardware next gen. Sony have some of the best developers on their side, and it would be incredible to see what developers like Konami, Square and others could do, since they don't develop as much on other consoles as they do with Sony.
 
Fight for Freeform said:
Yeah but don't PAL TVs refresh at 25 fields per second? Meaning that at 50 it will look solid, and there won't be any interlacing problems?

PAL: 50 fields per second, 25 frames per second, 50Hz. Compared to NTSC's 60Hz: 60fps/30fps.

So a slightly lower framerate, which is mostly noticeable in 30fps games, which then run at 25 in PAL. 60fps becomes 50fps, but the difference in smoothness is very small; 50fps games still look silky smooth. The biggest difference can be that games run slightly slower if not PAL optimized.

Confusing enough ? ;)

But most PAL games, at least on Xbox, have a 60Hz option these days, so people who want to can play in 60Hz.
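The "slightly slower" part is easy to quantify under the assumption that the game steps its logic once per displayed frame (a sketch, not a measurement of any particular title):

```cpp
#include <cstdio>

int main() {
    // Illustrative arithmetic: if game logic runs one update per displayed
    // frame, a title tuned for NTSC's 60 Hz that is simply played back at
    // PAL's 50 Hz without optimization runs proportionally slower.
    const double ntscHz = 60.0, palHz = 50.0;
    double speed = palHz / ntscHz;                             // ~0.83
    std::printf("un-optimized PAL speed: %.0f%% of NTSC (about %.0f%% slower)\n",
                speed * 100.0, (1.0 - speed) * 100.0);
    return 0;
}
```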
 
psycho_snake said:
I don't understand why no one EVER talks about the Nintendo Revolution. For all anyone knows, Nintendo could have the best console of next gen.

I honestly hope Sony have the best hardware next gen. Sony have some of the best developers on their side, and it would be incredible to see what developers like Konami, Square and others could do, since they don't develop as much on other consoles as they do with Sony.

Because we don't really know anything about it... or even have many rumors to go on... or even care (for a lot of us).

Here... I'll get the ball rolling. I've heard from somebody that knows somebody who washes floors at Nintendo that it will have 14 GPUs and 800 CPUs all running at greater than a googlehertz, and will shine lasers into your eyes to read your mind to control your characters.

And it can be worn like a back pack for easy transportation...or can be folded to fit in a purse.

Discuss.
 
Last gen most games were a solid 30 fps if they were lucky. This gen the percentage of 60fps games increased. I fully expected that to happen again with these new consoles. Games like Pro Evolution, F-Zero and Monkey Ball wouldn't be as nice to control at 30 fps. I hope it's just a launch issue and not a trend.
 
Cpiasminc said:
BTW, I should also note that based on what I'm hearing from every studio I've been to, I'd have to say that, at least for the first generation of games on next-gen consoles, you will not see anything running at 60 fps. There is not one studio I've talked to who hasn't said they're shooting only for 30 fps. Some have even said that for next-gen, they won't shoot higher than 30 fps ever again.
Evidently this guy doesn't talk to anyone at Tecmo or Namco. ;)
 
Haven't had the chance to see it. Yeah, you don't need raytracing or anything in particular to do realistic looking effects with enough ingenuity
Unfortunately I can't find the one I was thinking of (which had *really* high quality motion blur on top of all things), but here's one ATI made available on their site, which is not bad either:

http://www.ati.com/developer/demos/R9700.html

It's called 'rendering with natural light' and is made for the card two generations behind their current tech.

*Edit* MArsomega has an avatar with a screen from the demo I was thinking of:

[image: avatar8.jpg]
 
bishoptl said:
and that's where you should have left it.

yeah because it was so much more biased than the original post :|

The guy is from worthplaying - a good info site on all consoles.
 
Marconelly, that demo can be found here:

Ah yes, that's it! Thanks :)

Certainly chock-full of 'dev art' but amazingly impressive from the technology standpoint.

[image: 00.jpg]


*edit* From the Wreckless dev? Well, doesn't surprise me!
 
Marconelly said:
Unfortunately I can't find the one I was thinking of (which had *really* high quality motion blur on top of all things), but here's one ATI made available on their site, which is not bad either:

http://www.ati.com/developer/demos/R9700.html

It's called 'rendering with natural light' and is made for the card two generations behind their current tech.

*Edit* MArsomega has an avatar with a screen from the demo I was thinking of:

[image: avatar8.jpg]

So we can expect games to look like this or better next gen... I really liked the bear demo :lol
 
I wouldn't be surprised if we see that level of graphics in games. That demo again runs on hardware 3-4 generations behind what Xbox 2 will have, and PS3 could have even newer stuff in it (frankly, I have no idea what ATI or nVidia are cooking).

yeah because it was so much more biased than the original post :|

The guy is from worthplaying - a good info site on all consoles.
Yeah, but the original post at least didn't read like some incoherent rambling of a pre-school kid.
 
morbidaza said:
Yes I read what you wrote, and if you're referring simply to broadcast TV, and not to games, then the post was altogether pointless. Instead you came across as saying that NTSC TVs can only display 30 pictures per second (in any sense of the word, be it fields or frames), which is simply not the case.

no that's not what I said

and in talking about broadcast TV the post was not pointless, because in the original post that eventually led to mine, the poster said they enjoy watching sports on broadcast TV at 60 FPS, which is wrong because regular broadcast TV is 30fps interlaced

if you read me talking about the difference between interlaced frames and non-interlaced frames as saying that NTSC TVs could only do 30fps, then you read way more into it than I put into the post. Nowhere did I say NTSC TVs can only do 30fps; all I talked about was that the broadcast signal for regular NTSC TV is 29.97 fps interlaced video
 
Marc said:
I wouldn't be surprised if we see that level of graphics in games.
Depends on just how much of a pixel monster the console chips will be.
This thing is pretty much entirely fixed-cost pixel processing (lots of image post-processing), so this demo scales pretty much linearly with resolution - around 54fps on a 6800GT at 640x480, around 14fps @ 1280x960, etc.

The problem is though, that while fixed cost, these processes don't pipeline with the rest of your rendering, so other stuff you'll do will come on top of it. For HDTV resolutions, you'll need massive pixel resources to go crazy like this and still run a game with it.

But at the very least, you ought to be able to pull off a pool or pinball game looking like this :D
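A quick sketch of that linear scaling, using only the quoted ~54fps at 640x480 figure as the reference point and assuming the workload stays purely fill-bound (which a real game with other work on top won't):

```cpp
#include <cstdio>

int main() {
    // Fill-bound estimate as described above: if the work is almost entirely
    // per-pixel, fps scales roughly inversely with pixel count. The only
    // input is the quoted ~54 fps at 640x480 on a 6800GT; the rest is derived
    // under that linear-scaling assumption.
    const double refFps = 54.0;
    const double refPixels = 640.0 * 480.0;

    const int res[][2] = { {640, 480}, {1280, 960}, {1280, 720}, {1920, 1080} };
    for (const auto& r : res) {
        double pixels = double(r[0]) * double(r[1]);
        double fps = refFps * refPixels / pixels;   // linear scaling assumption
        std::printf("%4dx%-4d -> ~%4.1f fps\n", r[0], r[1], fps);
    }
    return 0;
}
```

The 1280x960 row lands around 13-14 fps, matching the quoted measurement; HDTV resolutions drop into single digits, which is the point about needing massive pixel resources.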
 
Shin Johnpv said:
no that's not what I said

and in talking about broadcast TV the post was not pointless, because in the original post that eventually led to mine, the poster said they enjoy watching sports on broadcast TV at 60 FPS, which is wrong because regular broadcast TV is 30fps interlaced

No, it's not wrong. Most broadcast is shown at 60 interlaced fps, just as how 60fps in games is achieved (keeping progressive scan out of this). NFL2K5, for example, runs at the same framerate as an NFL broadcast (or basically any sports broadcast, for that matter).

Here you go.

"Broadcasters can choose between three formats:

* 480i - The picture is 704x480 pixels, sent at 60 interlaced frames per second (30 complete frames per second).
* 480p - The picture is 704x480 pixels, sent at 60 complete frames per second.
* 720p - The picture is 1280x720 pixels, sent at 60 complete frames per second.
* 1080i - The picture is 1920x1080 pixels, sent at 60 interlaced frames per second (30 complete frames per second).
* 1080p - The picture is 1920x1080 pixels, sent at 60 complete frames per second. "

http://www.chrisguinn.com/byopc/avspecs.html
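A small sketch that just walks the quoted formats, separating temporal updates per second from complete frames per second (nominal 60/30 rates; the exact NTSC figures are 59.94 fields / 29.97 frames per second):

```cpp
#include <cstdio>

int main() {
    // Walks the formats quoted above: an interlaced signal carries 60 fields
    // (half-pictures) per second but only 30 complete frames, while a
    // progressive signal carries 60 complete frames.
    struct Fmt { const char* name; int w, h; bool interlaced; };
    const Fmt fmts[] = {
        { "480i",   704,  480, true  },
        { "480p",   704,  480, false },
        { "720p",  1280,  720, false },
        { "1080i", 1920, 1080, true  },
        { "1080p", 1920, 1080, false },
    };
    for (const Fmt& f : fmts) {
        int updatesPerSec = 60;                         // temporal samples per second
        int framesPerSec  = f.interlaced ? 30 : 60;     // complete pictures per second
        std::printf("%-5s %4dx%-4d  %d updates/s, %d complete frames/s\n",
                    f.name, f.w, f.h, updatesPerSec, framesPerSec);
    }
    return 0;
}
```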
 
[image: tekken4_0327_78.jpg]

Especially this one. Imagine a level that looked like that, with the trees blowing in the breeze, boats in the harbour, clouds moving in the sky.
 
HyperionX said:
Xenon's lack of eDRAM is the next-gen's version of this gen's PS2's lack of VRAM. I'd say this is a fair assessment.


you mean Xenon's *reported* but unconfirmed 10 MB of eDRAM is comparable to PS2's 4 MB of eDRAM. well, yes, I would tend to agree with that. even the Dolphin | Gamecube was supposed to have 8 to 16 MB of embedded 1T-SRAM back circa late 1999 early 2000. And the 16 enhanced GS's in GSCube *Each* have 32 MB eDRAM.

hopefully that Xenon document was indeed old, and the VPU is getting significantly more than 10 MB. but I wouldn't be surprised if it doesn't
 
the eDRAM is for the framebuffer, dudes.

It's not the video RAM. You know... so it can fit a high-resolution image on screen with FSAA and object occlusion without taking much of a hit at all?
 