
Microsoft Deploys NEC Embedded DRAM Technology in Its Next-Generation Xbox

Why do I get the feeling that if Microsoft could have used the West Coast (formerly Art-X) team they would have opted for them over the East Coast team?
 
That's great to hear, especially if it improves AA, which I think is the biggest difference between realtime graphics and that pre-rendered look.
 
Come on, 10MB is enough for 720p and it's all smiles? What about the PS2's 4MB of eDRAM and all the bitching? Even if you've finally worked out that the arguments were stupid, we could at least have some MS hate, just for old times' sake.
 
Tenacious-V said:
basically free Anti Aliasing with almost no impact on the GPU

You sure? There's still the computational side of things to take care of, even if we ignore memory issues. Though said computation may be negligible in comparison, I guess I'm just asking to be sure :)
 
gofreak said:
You sure? There's still the computational side of things to take care of, even if we ignore memory issues. Though said computation may be negligible in comparison, I guess I'm just asking to be sure :)

Maybe the computational stuff needed for AA will be done on the eDRAM module?
 
mrklaw said:
Come on, 10MB is enough for 720p and it's all smiles? What about the PS2's 4MB of eDRAM and all the bitching? Even if you've finally worked out that the arguments were stupid, we could at least have some MS hate, just for old times' sake.

You obviously do not know what that embedded RAM is intended for, or did you choose to ignore the fact that the X360 also has that UMA-designed RAM pool that can be accessed by the GPU and CPUs?
 
I nicked this quote from Jawed over at B3D

"The beauty is, this bus is totally separate from the bus connects the R500 to the CPU and system RAM (where the textures, vertices and shader code, etc. come from) which runs at 33GB/s.

So it seems there's a vast amount of bandwidth in XBox 360, with the frame buffer's bandwidth "isolated" from the bandwidth involved in texturing, for example."
 
Pug said:
I nicked this quote from Jawed over at B3D

"The beauty is, this bus is totally separate from the bus connects the R500 to the CPU and system RAM (where the textures, vertices and shader code, etc. come from) which runs at 33GB/s.

So it seems there's a vast amount of bandwidth in XBox 360, with the frame buffer's bandwidth "isolated" from the bandwidth involved in texturing, for example."

Is eDRAM bandwidth ever shared?


Maybe the computational stuff needed for AA will be done on the eDRAM module?

Possible, but from the sounds of it, it's "just" a memory chip. I doubt the computation's too expensive anyway, relative to memory/bandwidth requirements.
 
gofreak said:
Is eDRAM bandwidth ever shared?




Possible, but from the sounds of it, it's "just" a memory chip. I doubt the computation's too expensive anyway, relative to memory/bandwidth requirements.

Yes, exactly. And I thought I read that the eDRAM unit will have some sort of circuitry for processing AA and framebuffer stuff, but don't quote me :)



Also, a GI.biz article on the main topic:

NEC signs up to provide Xbox 360 eDRAM
Rob Fahey 10:08 27/04/2005

Another blue-chip supplier joins the ranks as Xbox 360 unveiling looms near

Japanese semiconductor giant NEC has announced that it has been tapped to provide embedded DRAM chips for Microsoft's next-generation Xbox console, which will be used as part of the graphics solution in the system.

The deal sees NEC providing eDRAM "macros" for the Xbox 360, which are described as a vital component in enabling the console to output high-definition visuals, and presumably integrate with the system's ATI-designed graphics chipset.

"NEC Electronics' cutting-edge embedded DRAM technology plays a vital role in enabling our graphics engine's performance," explained Xbox Product Group corporate VP Todd Holmdahl, "while its manufacturing process provides a reliable resource that can deliver the volumes required to support what will be an extremely popular gaming platform."

NEC is just the latest in a string of blue-chip suppliers to be announced for the Xbox 360 manufacturing process - with the two highest profile partners, of course, being ATI and IBM. ATI is supplying a next-generation graphics chipset design for the console, while a number of IBM's Power-series processors will lie at the heart of the system.
 
Gofreak, if it is a separate chip connected via a separate bus, I don't think it could be shared. It's straight framebuffer, AA and blending. This really would open up the bandwidth of the GPU and CPU.
 
Microsoft decides to copy all of Nintendo's technology partners.
If I were Nintendo, I would have signed exclusivity agreements with ATI, IBM, NEC, Rambus, etc. long ago.
 
Pug said:
Gofreak, if it is a separate chip connected via a separate bus, I don't think it could be shared. It's straight framebuffer, AA and blending. This really would open up the bandwidth of the GPU and CPU.

I see that, but I was just wondering: is it surprising that the bandwidth isn't shared? Whenever I think of eDRAM I think of it having its own bandwidth.

Also, does anyone know what the bandwidth/latency will be like on this?
 
gofreak said:
I see that, but I was just wondering: is it surprising that the bandwidth isn't shared? Whenever I think of eDRAM I think of it having its own bandwidth.

Also, does anyone know what the bandwidth/latency will be like on this?


It'll be interesting to compare Xenon's eDRAM latency to the GameCube Flipper's 1T-SRAM latency.
(1T-SRAM is not a true SRAM; it's actually DRAM wrapped in an SRAM-like interface, which gives it better latency than normal DRAM.)
 
nightez said:
Microsoft decides to copy all of Nintendo's technology partners.
If I were Nintendo, I would have signed exclusivity agreements with ATI, IBM, NEC, Rambus, etc. long ago.

If you were Nintendo, you'd be a large building and you still couldn't afford that.
 
nightez said:
Why the dramatic change in architecture and design philosophy?
Because the Xbox was slapped together at the last second with off-the-shelf parts, and MS now has tons more experience and feedback from their own and third-party game devs as to what bits are important to them?
 
Tenacious-V said:
It means basically free Anti Aliasing with almost no impact on the GPU, with an output of 720p.

Nothing is free, especially AA. AA has more to do with fill rate than with how much RAM you have for a framebuffer.

The MS docs said that 10-13 MB is enough for a 720p framebuffer.
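
For anyone who wants to sanity-check that figure, here's the back-of-envelope math; the 32-bit colour and 32-bit Z/stencil formats are illustrative assumptions on my part, not something the docs confirm:

```python
# Rough framebuffer size at 720p, single-sampled.
# 4 bytes/pixel colour (e.g. RGBA8) plus 4 bytes/pixel Z/stencil (e.g. D24S8)
# are illustrative assumptions, not confirmed formats.
width, height = 1280, 720
bytes_per_pixel = 4 + 4  # colour + depth/stencil

size_mb = width * height * bytes_per_pixel / 1_000_000
print(f"{size_mb:.2f} MB")  # ~7.37 MB, comfortably inside 10 MB
```

That ~7.4 MB also lines up with the framebuffer figure quoted further down the thread.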
 
nightez said:
It doesn't even take a 6-year-old to see the Xbox 360 is more GameCube than Xbox 2.
Why the dramatic change in architecture and design philosophy?


One reason is because the Xbox had a horrible bottleneck which was a real disadvantage compared to the PS2 and Cube: all CPU, audio and GPU data had to be crammed through the shared 6.4 GB/s pipeline, giving the Xbox far less graphics bandwidth than the other two consoles. This must have been a nightmare for developers. Even though the 6.4 GB/s bandwidth on the Xbox is more than the PS2's 3.2 GB/s or the GC's 2.6 GB/s main memory bandwidths, the Xbox didn't have the tens of GB/s of graphics bandwidth that the PS2 and Cube enjoyed.
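
To put rough numbers on how quickly framebuffer traffic alone can eat a shared bus, here's a back-of-envelope sketch in the same spirit; the overdraw, read-modify-write and AA factors are assumptions for illustration, not measured figures:

```python
# Back-of-envelope: framebuffer traffic vs. Xbox's shared 6.4 GB/s bus.
# Overdraw, read-modify-write and AA factors are illustrative assumptions.
bus_gb_s = 6.4
pixels_per_sec = 640 * 480 * 60   # 480p at 60 fps
bytes_per_pixel = 4 + 4           # RGBA8 colour + D24S8 depth (assumed)
overdraw, rmw = 3, 2              # assumed scene overdraw; read-modify-write

for aa_samples in (1, 4):
    traffic = pixels_per_sec * bytes_per_pixel * overdraw * rmw * aa_samples / 1e9
    print(f"{aa_samples}x: {traffic:.1f} GB/s "
          f"({traffic / bus_gb_s:.0%} of the bus, before textures, CPU and audio)")
```

Under those assumptions, 4x multisampling alone claims over half the shared bus, which is one way to see why the consoles with dedicated graphics bandwidth had an easier time.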
 
Duckhuntdog said:
Nothing is free, especially AA. AA has more to do with fill rate than with how much RAM you have for a framebuffer.

The MS docs said that 10-13 MB is enough for a 720p framebuffer.

That is specifically what this eDRAM module is meant to resolve. It's designed specifically to take the burden of doing AA away from the GPU. This is indeed a separate chip from the GPU. We know the ATi R500 GPU is being fabbed at TSMC using their 90nm process, while this eDRAM module is being fabbed by NEC on their most advanced 90nm process. At first, from the layout leaked last year, we thought it was 10MB embedded directly on the GPU; now we know it's a separate chip consisting of eDRAM plus blending/Z/stencil/MSAA/compression/decompression logic.

Here's a more specific explanation from the tech heads at B3D, who have been studying the patent leaks like hawks.

They use the term 'EDRAM' module, and it is quite clearly a separate chip. Also, confusingly, they use 'E'DRAM as opposed to 'e'DRAM, even though the 'leak' text says the 'E' is for 'Embedded' DRAM (though some call it 'Enhanced' DRAM). It's really irrelevant either way, as it's still a separate custom chip with custom logic plus eDRAM. In this case there's far less logic than usual.

The R500 seems designed around AA and this eDRAM module, so that you essentially get 720p with 4xMSAA for free, with compression saving bandwidth to the framebuffer. Out of the 10 MB, ~7.4 MB is used for the framebuffer and the rest to make this scheme work, I guess.

The AA multisamples are generated by the GPU at the same time as the pixel fragments are generated. It seems the GPU is generating 4xAA samples per clock, per pixel.

The pixel fragments and their accompanying AA samples are then sent off for blending and AA. That work is done by a unit that has a dedicated frame buffer memory, with its own private bus. That's the EDRAM.

The EDRAM unit doesn't generate the AA samples, it just manipulates them. It isn't just EDRAM, it's some extra circuitry to do the blending and decide how to use the AA samples.

So in essence, what all this means is the Xbox 360 will output 720p with 4xMSAA (multisample anti-aliasing) essentially FREE!!!!!!!!!
 
Oh dear, I foresee a future where xbots and psydrones will spend the next four years arguing over how the 360's superior IQ does or does not alleviate the power difference between it and the PS3, and whether the 360 really has better IQ anyway, with each side providing copious amounts of graphical evidence to support their side.

I mean, hell, it's kept ATI and nVidia fanboys busy for the longest time.
 
Azih said:
Oh dear, I foresee a future where xbots and psydrones will spend the next four years arguing over how the 360's superior IQ does or does not alleviate the power difference between it and the PS3, and whether the 360 really has better IQ anyway, with each side providing copious amounts of graphical evidence to support their side.

I mean, hell, it's kept ATI and nVidia fanboys busy for the longest time.

Who cares what idiot fanboys think? If both sides have to argue which is this and which is that just to make their dicks seem longer, that seems pathetic to me. I'm a tech head, and the fact that Xbox 360 can output at 720p with FREE 4xAA due to this design is for all intents and purposes AMAZING!!

Fanboys are just sorry excuses for gamers. I don't understand why a person can't own one console and be happy with it without having to bitch and moan about the competition to try and justify their purchase. I mean, fuck, just enjoy the games on it.
 
Azih said:
Oh dear, I foresee a future where xbots and psydrones will spend the next four years arguing over how the 360's superior IQ does or does not alleviate the power difference between it and the PS3, and whether the 360 really has better IQ anyway, with each side providing copious amounts of graphical evidence to support their side.

I mean, hell, it's kept ATI and nVidia fanboys busy for the longest time.
Sony has always had an IQ problem, and that was the main reason I didn't purchase the PS1 or PS2, since IQ is really important to me. Hopefully they have solved that issue with the PS3. It doesn't matter how many polygons you can push around if the polygons are ugly.
 
Tenacious-V said:
It means basically free Anti Aliasing with almost no impact on the GPU, with an output of 720p.

Isn't this essentially referring to the absolute last stage of the graphics pipeline, immediately before the image is sent to your TV? Yes, it's great that the system doesn't have to go back to the CPU or to main memory to access anything, but it doesn't magically make the rest of the system better, does it??

(I'm seriously asking here, I have no f'ing clue).

That is, I don't think that framerate problems or resolution problems are solely a factor of the framebuffer, which is essentially all this refers to, correct?

(It's still great news, as every prior system did have issues with this; I just don't know that it's the godsend some people think it is, or that it will magically make all games run at maximum resolution with no frame hiccups.)

Thanks in advance to whoever can explain this better.
 
Tenacious-V said:
So in essence, what all this means is the Xbox 360 will output 720p with 4xMSAA (multisample anti-aliasing) essentially FREE!!!!!!!!!
Tenacious-V said:
I'm a tech head, and the fact that Xbox 360 can output at 720p with FREE 4xAA due to this design is for all intents and purposes AMAZING!!


As I believe Dave Barron, a founder of B3D (as opposed to Baumann, who is there now), used to say: Nothing Comes For Free In 3D.

In this case, they did have to trade off logic area... so you trade off added peak performance (perhaps added pipeline|shader concurrency) for fixed AA.
 
Tenacious-V said:
Who cares what idiot fanboys think? If both sides have to argue which is this and which is that just to make their dicks seem longer, that seems pathetic to me. I'm a tech head, and the fact that Xbox 360 can output at 720p with FREE 4xAA due to this design is for all intents and purposes AMAZING!!

Fanboys are just sorry excuses for gamers. I don't understand why a person can't own one console and be happy with it without having to bitch and moan about the competition to try and justify their purchase. I mean, fuck, just enjoy the games on it.

As far as the RAM being linked to another component: from the docs I have seen, that doesn't seem to be the case. The MS docs refer to it as the framebuffer.

Nothing is free, not even AA.
 
sonycowboy said:
Isn't this essentially referring to the absolute last stage of the graphics pipeline, immediately before the image is sent to your TV? Yes, it's great that the system doesn't have to go back to the CPU or to main memory to access anything, but it doesn't magically make the rest of the system better, does it??

No, it doesn't make the rest of the system better, but in layman's terms: if you no longer have to rely on the CPU/GPU to generate everything the eDRAM module handles, it frees them up to perform other tasks more often, resulting in better performance. Does that answer your question?
 
Vince said:
As I believe Dave Barron, a founder of B3D (as opposed to Baumann, who is there now), used to say: Nothing Comes For Free In 3D.

In this case, they did have to trade off logic area... so you trade off added peak performance (perhaps added pipeline|shader concurrency) for fixed AA.

Yes, that would be the case if this were still embedded in the GPU itself. But since this is a separate component from the GPU, it leaves the R500 free to fully utilize its die space.
 
Duckhuntdog said:
As far as the RAM being linked to another component: from the docs I have seen, that doesn't seem to be the case. The MS docs refer to it as the framebuffer.

Nothing is free.

Which docs are you referring to? The ones leaked last year, or the patent leaked recently?

This is what the recent patent states:

Video graphics system that includes custom memory and supports anti-aliasing and method therefor

Abstract

A method and apparatus for supporting anti-aliasing oversampling in a video graphics system that utilizes a custom memory for storage of the frame buffer is presented. The custom memory includes a memory array that stores the frame buffer as well as a data path that performs at least a portion of the blending operations associated with pixel fragments generated by a graphics processor. The fragments produced by a graphics processor are oversampled fragments such that each fragment may include a plurality of samples. If the sample set for a particular pixel location can be compressed, the compressed sample set is stored within the frame buffer of the custom memory circuit. However, if such compression is not possible, pointer information is stored within the frame buffer on the custom memory, and a sample memory controller included on the graphics processor maintains a complete sample set for the pixel location within a sample memory. When the sample memory controller maintains a complete sample set for a pixel location, the frame buffer stores a pointer corresponding to the location of the sample set.
...
...
The present invention provides a method and apparatus for performing video graphics anti-aliasing in a system that utilizes a custom memory. The custom memory includes a memory array, which may be a DRAM memory array, that stores the frame buffer for the system. Also included within the custom memory is a portion of the blending circuitry used to blend newly generated pixel fragments with stored pixel data. In cases where the blending performed within the DRAM results in data that can be reduced to a compressed sample set, further blending operations are not required. However, if the results of the first pass blending cannot be compressed, the appropriate information is provided back to the graphics processor such that the complete sample set for the particular pixel location can be stored within a sample memory for subsequent use. As such, the present invention provides a lossless compression technique that supports an over-sampled video graphics-processing environment. Such oversampling can help reduce undesirable aberrations within displayed images.

Yes, it serves as a frame buffer, but not exclusively. It's a frame buffer with logic that in essence handles blends/Z ops/AA/etc. It works alongside the GPU, performing tasks so the GPU is freed to do other things. So yes, 4xAA is free due to this eDRAM module performing the tasks the R500 would normally have to do.
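
Going by that abstract, the per-pixel store path boils down to a simple decision: if the multisample set compresses losslessly (the simplest case being a fully covered pixel whose samples all match), it stays in the eDRAM framebuffer; otherwise the full set goes to the GPU-side sample memory and only a pointer is left behind. Here's a rough Python sketch of that flow, where the names, data layout and compression test are my own illustrative assumptions rather than the actual hardware logic:

```python
# Illustrative sketch of the compress-or-pointer flow the patent describes.
# Names and the "all samples identical" compression test are assumptions.

frame_buffer = {}    # eDRAM array: pixel -> compressed sample set or pointer
sample_memory = {}   # GPU-side sample memory for uncompressible pixels

def compressible(samples):
    # Simplest lossless case: a fully covered pixel, all samples identical.
    return len(set(samples)) == 1

def store_pixel(pixel, samples):
    """samples: the blended 4xMSAA samples for one pixel location."""
    if compressible(samples):
        frame_buffer[pixel] = ("compressed", samples[0])
    else:
        # Can't compress: keep the complete sample set on the GPU side
        # and store only a pointer to it in the eDRAM frame buffer.
        sample_memory[pixel] = list(samples)
        frame_buffer[pixel] = ("pointer", pixel)

# Interior pixel: all samples agree, so it stays compressed in eDRAM.
store_pixel((100, 100), [0xFF0000] * 4)
# Edge pixel: samples differ, so the full set spills to sample memory.
store_pixel((101, 100), [0xFF0000, 0xFF0000, 0x0000FF, 0x0000FF])
print(frame_buffer)
```

The payoff would be that for the vast majority of pixels (triangle interiors) the eDRAM stores only one sample's worth of data, which is presumably how 4x oversampling fits the 10 MB budget without being lossy.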
 
Tenacious-V said:
Which docs are you referring to? The ones leaked last year, or the patent leaked recently?

This is what the recent patent states:



Yes, it serves as a frame buffer, but not exclusively. It's a frame buffer with logic that in essence handles blends/Z ops/AA/etc. It works alongside the GPU, performing tasks so the GPU is freed to do other things. So yes, 4xAA is free due to this eDRAM module performing the tasks the R500 would normally have to do.

From docs leaked a few months ago.

Even with that patent, I doubt AA will be "free".
 
Duckhuntdog said:
From docs leaked a few months ago.

Well, if it's the one I quoted above (not the full patent, of course) then the explanations I gave are correct; if it isn't that patent, then it's the old one, which is outdated now.
 
Tenacious-V said:
Who cares what idiot fanboys think? If both sides have to argue which is this and which is that just to make their dicks seem longer, that seems pathetic to me. I'm a tech head, and the fact that Xbox 360 can output at 720p with FREE 4xAA due to this design is for all intents and purposes AMAZING!!

Fanboys are just sorry excuses for gamers. I don't understand why a person can't own one console and be happy with it without having to bitch and moan about the competition to try and justify their purchase. I mean, fuck, just enjoy the games on it.


FSAA won't be free. It will still cost pixel fillrate, unless I am mistaken.

And I don't consider 4x AA on next-gen consoles to be anything close to 'amazing' - the Xbox can do 4xAA, even though it costs a lot.

32x AA or even 16x AA would be amazing, but not 4x. Current PC GPUs pull off 6x and 8x AA, and have done so for the last couple of years (at least 6x).
 
midnightguy said:
FSAA won't be free. It will still cost pixel fillrate, unless I am mistaken.

And I don't consider 4x AA on next-gen consoles to be anything close to 'amazing' - the Xbox can do 4xAA, even though it costs a lot.

32x AA or even 16x AA would be amazing, but not 4x. Current PC GPUs pull off 6x and 8x AA, and have done so for the last couple of years (at least 6x).

640×480 + 4x FSAA max (2x more realistic)
vs.
1280×720 + 4x FSAA... mmhhh :D
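
To put plain numbers on that comparison (nothing but pixel and sample counts):

```python
# Raw pixel/sample counts: 480p vs. 720p, both at 4x multisampling.
for name, w, h in (("640x480", 640, 480), ("1280x720", 1280, 720)):
    pixels = w * h
    print(f"{name}: {pixels:,} pixels, {4 * pixels:,} samples at 4xAA")
# 720p has 3x the pixels of 480p, so 4xAA at 720p is ~3x the sample work.
```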
 
midnightguy said:
FSAA won't be free. It will still cost pixel fillrate, unless I am mistaken.

And I don't consider 4x AA on next-gen consoles to be anything close to 'amazing' - the Xbox can do 4xAA, even though it costs a lot.

32x AA or even 16x AA would be amazing, but not 4x. Current PC GPUs pull off 6x and 8x AA, and have done so for the last couple of years (at least 6x).

Yes, you are mistaken, my friend :). If you check the B3D forums as well as some of the leaked documentation, you will notice that the fillrate for the Xbox 360 is the same at 0xAA and at 4xAA.

And I do consider it amazing. 4xAA on the Xbox was a huge hit on the GPU, and that was at 640x480; this time around, 4xAA is free, and with an output of 1280x720 (720p).

Sure, PCs have been doing that for years, but not the way the X360 is, and not without their own performance hits.
 