soundwave05 said: Why do I get the feeling that if Microsoft could have used the West Coast (formerly ArtX) team, they would have opted for them over the East Coast team?
AB 101 said: Will this give us Toy Story graphics now?
Need to check B3D out. They eat this shit for breakfast.
Whoa whoa whoa... B3D eats shit for breakfast?
Duckhuntdog said: I believe the amount is roughly just 10-13 MB.
GhaleonEB said: What does this mean?
Tenacious-V said: It means basically free anti-aliasing with almost no impact on the GPU, with an output of 720p.
gofreak said: You sure? There's still the computational side of things to take care of, even if we ignore memory issues. Though said computation may be negligible in comparison; I guess I'm just asking to be sure.
mrklaw said: Come on, 10MB is enough for 720p and it's all smiles? What about the PS2's 4MB eDRAM and all the bitching? Even if you've finally worked out that the arguments were stupid, we could at least have some MS hate, just for old times' sake.
Pug said: I nicked this quote from Jawed over at B3D:
"The beauty is, this bus is totally separate from the bus that connects the R500 to the CPU and system RAM (where the textures, vertices and shader code, etc. come from), which runs at 33GB/s.
So it seems there's a vast amount of bandwidth in XBox 360, with the frame buffer's bandwidth "isolated" from the bandwidth involved in texturing, for example."
Maybe the computational stuff needed for AA will be done on the eDRAM module?
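To put rough numbers on why that isolation matters, here's a back-of-envelope sketch. Every figure in it (sample fill rate, bytes per sample, read-modify-write factor) is an illustrative assumption, not a confirmed Xbox 360 spec; the point is only that framebuffer traffic with 4xMSAA could dwarf a 33GB/s shared bus all by itself.

```c
#include <stdio.h>

int main(void)
{
    /* Illustrative assumptions only -- NOT confirmed Xbox 360 specs. */
    double samples_per_sec  = 4.0e9; /* assumed peak AA-sample fill rate  */
    double bytes_per_sample = 8.0;   /* 4 bytes color + 4 bytes Z/stencil */
    double rmw_factor       = 2.0;   /* blending reads, then writes back  */

    double traffic = samples_per_sec * bytes_per_sample * rmw_factor;
    printf("Framebuffer traffic: ~%.0f GB/s\n", traffic / 1e9); /* ~64 GB/s */
    return 0;
}
```

Under these made-up numbers, the frame buffer alone would want roughly double the entire 33GB/s system bus, so giving it a private bus to the eDRAM makes sense.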
gofreak said: Is eDRAM bandwidth ever shared?
Possible, but from the sounds of it it's "just" a memory chip. I doubt the computation's too expensive anyway, relative to memory/bandwidth requirements.
NEC signs up to provide Xbox 360 eDRAM
Rob Fahey 10:08 27/04/2005
Another blue-chip supplier joins the ranks as Xbox 360 unveiling looms near
Japanese semiconductor giant NEC has announced that it has been tapped to provide embedded DRAM chips for Microsoft's next-generation Xbox console, which will be used as part of the graphics solution in the system.
The deal sees NEC providing eDRAM "macros" for the Xbox 360, which are described as a vital component in enabling the console to output high-definition visuals, and which will presumably integrate with the system's ATI-designed graphics chipset.
"NEC Electronics' cutting-edge embedded DRAM technology plays a vital role in enabling our graphics engine's performance," explained Xbox Product Group corporate VP Todd Holmdahl, "while its manufacturing process provides a reliable resource that can deliver the volumes required to support what will be an extremely popular gaming platform."
NEC is just the latest in a string of blue-chip suppliers to be announced for the Xbox 360 manufacturing process - with the two highest profile partners, of course, being ATI and IBM. ATI is supplying a next-generation graphics chipset design for the console, while a number of IBM's Power-series processors will lie at the heart of the system.
Pug said: Gofreak, if it is a separate chip connected via a separate bus, I don't think it could be shared. It's straight framebuffer, AA and blending. This really would open up the bandwidth of the GPU and CPU.
gofreak said: I see that, but I was just wondering: is it surprising that the bandwidth isn't shared? Whenever I think of eDRAM I think of it having its own bandwidth.
Also, does anyone know what the bandwidth/latency will be like on this?
nightez said: Microsoft decides to copy all Nintendo technology partners.
If I were Nintendo I would have signed exclusivity agreements with ATI, IBM, NEC, Rambus etc. long ago.
It doesn't even take a 6 year old to see the Xbox 360 is more GameCube than Xbox 2.
MightyHedgehog said: Are you 6 years old?
nightez said: Why the dramatic change in architecture and design philosophy?
Because Xbox was slapped together last second with off-the-shelf parts, and MS now has tons more experience and feedback from their own and 3rd party game devs as to what bits are important to them.
Duckhuntdog said: Nothing is free, especially AA. AA has more to do with fill-rate than with how much RAM you have for a framebuffer.
The MS docs said that 10-13 MB is enough for a 720p framebuffer.
They use the term 'EDRAM' module, and it is quite clearly a separate chip. Also, confusingly, they use 'E'DRAM as opposed to 'e'DRAM, even though the 'leak' text says the 'E' is for 'Embedded' DRAM, while some call it 'Enhanced' DRAM. It's really irrelevant, as it's still a separate custom chip with custom logic and eDRAM. In this case there's far less logic than usual.
The R500 seems designed around AA and this eDRAM module, so that you essentially get 720p with 4xMSAA for free, with compression saving bandwidth to the framebuffer. Of the 10 MB, ~7.4 MB is used for the framebuffer and the rest to make this scheme work, I guess.
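For what it's worth, that ~7.4 MB figure lines up with a simple calculation, if you assume (my assumption, not something from the docs) a 32-bit color buffer plus a 32-bit Z/stencil buffer:

```c
#include <stdio.h>

int main(void)
{
    /* 720p framebuffer size, assuming 32-bit color + 32-bit Z/stencil. */
    long width = 1280, height = 720;
    long bytes_per_pixel = 4 + 4;                  /* RGBA8 + Z24S8     */
    long bytes = width * height * bytes_per_pixel; /* 7,372,800 bytes   */

    printf("720p color+Z: %.1f MB\n", bytes / 1e6); /* prints ~7.4 MB   */
    return 0;
}
```

Note that an uncompressed 4xMSAA buffer would be four times that (~29.5 MB), which is why the compression mentioned above has to work for this to fit in 10 MB.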
The AA multisamples are generated by the GPU at the same time as the pixel fragments. It seems the GPU is generating four AA samples per pixel, per clock.
The pixel fragments and their accompanying AA samples are then sent off for blending and AA. That work is done by a unit that has a dedicated frame buffer memory, with its own private bus. That's the EDRAM.
The EDRAM unit doesn't generate the AA samples, it just manipulates them. It isn't just EDRAM; it's eDRAM plus some extra circuitry to do the blending and decide how to use the AA samples.
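As a rough mental model of that split (the structure and names below are hypothetical, mine rather than anything from the leaked docs): the GPU emits a fragment carrying four samples, and the eDRAM-side logic depth-tests and blends them into the stored buffer, then averages them down to one pixel at scanout.

```c
#include <stdint.h>

/* Hypothetical sketch of the GPU/eDRAM split described above. */
typedef struct {
    uint32_t color[4]; /* 4 multisample colors produced by the GPU */
    uint32_t depth[4]; /* matching per-sample depth values         */
} Fragment;

/* eDRAM-side logic: per-sample Z test and write, no GPU involvement. */
static void edram_blend(uint32_t fb_color[4], uint32_t fb_depth[4],
                        const Fragment *f)
{
    for (int s = 0; s < 4; s++) {
        if (f->depth[s] < fb_depth[s]) { /* sample passes the Z test  */
            fb_depth[s] = f->depth[s];
            fb_color[s] = f->color[s];   /* opaque write, for brevity */
        }
    }
}

/* Resolve: average the 4 stored samples down to 1 display pixel. */
static uint32_t edram_resolve(const uint32_t fb_color[4])
{
    uint32_t r = 0, g = 0, b = 0;
    for (int s = 0; s < 4; s++) {
        r += (fb_color[s] >> 16) & 0xFF;
        g += (fb_color[s] >>  8) & 0xFF;
        b +=  fb_color[s]        & 0xFF;
    }
    return ((r / 4) << 16) | ((g / 4) << 8) | (b / 4);
}
```

The key point is that both steps live next to the memory, so none of that per-sample traffic ever touches the GPU's bus.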
Azih said: Oh dear, I foresee a future where xbots and psydrones will spend the next four years arguing how the 360's superior IQ does or does not alleviate the power difference between it and the PS3, and whether the 360 really has better IQ anyway, with each side providing copious amounts of graphical evidence to support their side.
I mean hell it's kept ATI and nVidia fanboys busy for the longest time.
Sony has always had an IQ problem, and that was the main reason I didn't purchase the PS1 or PS2, since IQ is really important to me. Hopefully they have solved that issue with the PS3. It doesn't matter how many polygons you can push around if the polygons are ugly.
Tenacious-V said: So in essence, what all this means is the Xbox 360 will output 720p with 4xMSAA (multisample anti-aliasing) essentially FREE!!!!!!!!!
Tenacious-V said: Who cares what idiot fanboys think; both sides arguing which is this and which is that just to make their dicks seem longer is pathetic to me. I'm a tech head, and the fact that the Xbox 360 can output at 720p with FREE 4xAA due to this design is for all intents and purposes AMAZING!!
Fanboys are just sorry excuses for gamers. I don't understand why a person can't own one console and be happy with it without having to bitch and moan about the competition to try and justify their purchase. I mean fuck, just enjoy the games on it.
sonycowboy said: Isn't this essentially referring to the absolute last stage of the graphics pipeline, immediately before the image is sent to your TV? Yes, it's great that the system doesn't have to go back to the CPU or to main memory to access anything, but it doesn't magically make the rest of the system better, does it?
Vince said: As I believe Dave Barron, a founder of B3D (as opposed to Baumann, who is there now), used to say: Nothing Comes For Free In 3D.
In this case, they did have to trade off logic area... so you trade added peak performance (perhaps added pipeline/shader concurrency) for fixed AA.
Duckhuntdog said: As far as the RAM being linked to another component: from the docs I have seen, that doesn't seem to be the case. The MS docs refer to it as the framebuffer.
Nothing is free.
Tenacious-V said: Which docs are you referring to? The ones leaked last year, or the patent leaked recently?
This is what the recent patent states:
Video graphics system that includes custom memory and supports anti-aliasing and method therefor
Abstract
A method and apparatus for supporting anti-aliasing oversampling in a video graphics system that utilizes a custom memory for storage of the frame buffer is presented. The custom memory includes a memory array that stores the frame buffer as well as a data path that performs at least a portion of the blending operations associated with pixel fragments generated by a graphics processor. The fragments produced by a graphics processor are oversampled fragments such that each fragment may include a plurality of samples. If the sample set for a particular pixel location can be compressed, the compressed sample set is stored within the frame buffer of the custom memory circuit. However, if such compression is not possible, pointer information is stored within the frame buffer on the custom memory, and a sample memory controller included on the graphics processor maintains a complete sample set for the pixel location within a sample memory. When the sample memory controller maintains a complete sample set for a pixel location, the frame buffer stores a pointer corresponding to the location of the sample set.
...
...
The present invention provides a method and apparatus for performing video graphics anti-aliasing in a system that utilizes a custom memory. The custom memory includes a memory array, which may be a DRAM memory array, that stores the frame buffer for the system. Also included within the custom memory is a portion of the blending circuitry used to blend newly generated pixel fragments with stored pixel data. In cases where the blending performed within the DRAM results in data that can be reduced to a compressed sample set, further blending operations are not required. However, if the results of the first pass blending cannot be compressed, the appropriate information is provided back to the graphics processor such that the complete sample set for the particular pixel location can be stored within a sample memory for subsequent use. As such, the present invention provides a lossless compression technique that supports an over-sampled video graphics-processing environment. Such oversampling can help reduce undesirable aberrations within displayed images.
Yes, it serves as a frame buffer, but not exclusively that. It's a frame buffer with logic that in essence does blending/Z ops/AA/etc. It works alongside the GPU, performing tasks so the GPU is freed to do other things. So yes, 4xAA is free because this eDRAM module performs tasks the R500 would normally have to do.
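Reading between the lines of that abstract: if a pixel's four samples are all identical (the pixel is fully covered by one polygon), a single color suffices; only edge pixels need the full sample set, which gets spilled to a side memory and referenced via a pointer. A toy sketch of that idea, with hypothetical names of my own rather than the patent's:

```c
#include <stdbool.h>
#include <stdint.h>

/* Toy model of the patent's compressed frame buffer (names are mine). */
typedef struct {
    bool     compressed; /* true: all 4 samples identical              */
    uint32_t payload;    /* color if compressed, else sample-set index */
} FbEntry;

#define MAX_SPILLED 4096
static uint32_t sample_memory[MAX_SPILLED][4]; /* full sets, edge pixels */
static uint32_t next_slot; /* no bounds check; toy code only            */

/* Store a pixel's 4 samples: compress in place when possible,
   otherwise spill to sample memory and leave a pointer behind. */
static void store_pixel(FbEntry *fb, const uint32_t samples[4])
{
    if (samples[0] == samples[1] && samples[1] == samples[2] &&
        samples[2] == samples[3]) {
        fb->compressed = true;     /* interior pixel: one color is enough */
        fb->payload    = samples[0];
    } else {
        fb->compressed = false;    /* edge pixel: keep every sample       */
        for (int s = 0; s < 4; s++)
            sample_memory[next_slot][s] = samples[s];
        fb->payload = next_slot++; /* "pointer" into the sample memory    */
    }
}
```

Since most pixels in a typical frame are interior pixels, most of the buffer stays compressed, which is presumably how 4x sample data squeezes toward the ~7.4 MB footprint.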
Duckhuntdog said: From docs leaked a few months ago.
Tenacious-V said: But since this is a separate component from the GPU, it leaves the R500 to fully utilize its die space.
midnightguy said: FSAA won't be free; it will still cost pixel fillrate, unless I am mistaken.
And I don't consider 4x AA on next-gen consoles to be anything close to 'amazing'; the Xbox can do 4xAA, even though it costs a lot.
32x AA or even 16x AA would be amazing, but not 4x. Current PC GPUs pull off 6x and 8x AA, and have done so for the last couple of years (at least 6x).