
new details on Xbox 360's ATI GPU - It's called Xenos !!!!

I like that name. Xenos. Very fitting for a GPU that is part of a console codenamed Xenon.


http://www.hardocp.com/article.html?art=NzcxLDE=



Xbox 360 GPU Features : More facts make their way to the surface concerning the Xbox 360 graphics processor, codenamed XENOS. This time ATI’s Vice President of Engineering chimes in on the unique technology that is powering the pixels behind Microsoft’s new gaming console.

Introduction

Our goal here is to give you a better working knowledge of the video technology inside the Xbox 360 in “plain English.” While there is going to be some technogeek babble, we will try to keep it to a minimum. There will surely be more in-depth articles posted in the coming days and as usual, we will be linking those on the HardOCP news page, so keep your eyes peeled if we have only piqued your inner geek interests.

Earlier this week we got to speak candidly with Bob Feldstein, VP of Engineering at ATI, and lead sled dog on the Xbox 360 GPU development team. While Microsoft owns the technology that powers the graphics of the Xbox 360, ATI very much engineered the GPU internally codenamed XENOS. After a 3,200 mile round trip to the Microsoft offices last week, I came home a bit stunned to have spent the day with their Xbox 360 team and not seen any sort of gaming demo on their new gaming console. While there are tons of cool features embodied in the Xbox 360, it is a gaming platform…thankfully. And thankfully this week at E3, we have been given more than a few sneak peeks at the upcoming games that will be available for the Xbox 360 platform.

Without a doubt, if you have broadband, and are a gamer or computer enthusiast, you owe it to yourself to head over to FileShack and check out their E3 2005 Hi Def Coverage. The level of graphic detail in these upcoming titles looks to be generations ahead of the current Xbox and other aging consoles. As for comparing it to PC titles, I will have to make that call in November when the Xbox 360 is due for release and see what games and hardware are out for the PC at that time. If I had to make a call now though, I would have to say that the Xbox 360 graphics I have seen are just as impressive as any 3D PC game title I have ever seen. Then again, we have to remember that currently we are being thoroughly kicked in the head by the size 15 Microsoft Marketing Boot, so of course we are not going to be shown crappy clips of games. As for how much content, we are hearing from several industry insiders that there will be between 25 and 45 titles available for the Xbox 360 at launch. A lofty goal, say many insider naysayers. But let's get back on topic…

A GPU is a GPU, Right?

If I have read it once, I have read it a thousand times, “The Xbox 360 is just ATI’s next generation (fill in favorite code name here) GPU.” The simple answer to that is, “No it is not.” While most modern GPUs share many architectural similarities, Bob Feldstein and Chris Evenden of ATI went out of their way to explain to me, no matter how hard I tried to convince them otherwise, that the Xbox 360 GPU is very much an original creation. While some will try to tell you that it is simply a souped-up DirectX 9 GPU, you might be interested to learn that the only API spec that the Xbox 360 hardware meets is its own API. That is correct, the Xbox 360 GPU only meets its own Xbox 360 API specifications. While of course some lessons learned in DX9 and upcoming DX10 were applied, the GPU of the Xbox 360 is very much its own and comparing it directly to anything in the PC world is simply “not right” according to Mr. Feldstein. Obviously the Xbox 360 GPU can be thought of as a very innovative solution specifically for the Xbox and only the Xbox.

One interesting thing that was said to me during our conversation was that 3D game content developers were relied on along the way as the GPU was designed. Not consulted with or talked to once in a while, but relied upon for their GPU functionality requests and feedback as to the GPU’s implementation. Also keep in mind that Microsoft owns this technology and while there is certainly a good amount of technology sharing between ATI and Microsoft, Microsoft does have the ability to make their own changes and take the part anywhere in the world to be fabricated. So while it is ATI’s design, it is Microsoft’s GPU, and all of this makes the Xbox 360 GPU even more its own.





About the Hardware

While we have tried our best to get better pictures of the Xbox 360 internals, we have pulled up short. While folks at Microsoft and ATI will definitely not like to have the comparison made, there really is “just” a PC inside the Xbox 360...that happens to be a Mac. You can find a full table of Xbox 360 specifications here, but in the interest of speeding things up a bit, below is the short list covering video.


GPU & Northbridge in One!

Many of you read that the Xbox 360 will have 512MB of GDDR3 RAM and that is 100% correct. But how exactly does this work with the CPU and GPU? Once you learn that the Xbox 360 GPU also acts as the system’s memory controller, much like the Northbridge in an Intel PC, the picture becomes a bit clearer. ATI has been making and designing chipsets for a good while now that use GDDR3 RAM. Add to this that Joe Macri (go-kart racing fiend extraordinaire), who was a pivotal factor in defining the GDDR3 RAM specification at JEDEC, is also a big fish at ATI, and it only makes sense that ATI could possibly put together one of the best GDDR3 memory controllers in the world. So while it might seem odd that the Xbox 360 Power PC processor is using “graphics” memory for its main system memory and a “GPU” as the “northbridge,” once you see the relationship between the three and the technology being used it is quite simple. So, we have the 700MHz GDDR3 RAM acting as both system RAM and as GPU RAM, connected to the GPU via a traditional GDDR3 bus interface that can channel an amazing 25 Gigabytes per second of data.
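A quick back-of-envelope check on that 25 GB/s figure. The article gives the 700MHz clock but not the interface width, so the 128-bit bus below is an assumption, purely for illustration:

```python
# Rough GDDR3 bandwidth estimate. The bus width is an assumption --
# the article quotes ~25 GB/s but never states the interface width.
clock_hz = 700e6   # 700 MHz GDDR3, per the article
ddr_factor = 2     # GDDR3 transfers data on both clock edges
bus_bytes = 16     # assumed 128-bit (16-byte) interface

bandwidth = clock_hz * ddr_factor * bus_bytes   # bytes per second
print(bandwidth / 1e9)   # 22.4 -> in the neighborhood of the quoted 25 GB/s
```

Close enough to suggest the quoted figure is just clock times width times the double-data-rate factor, give or take rounding.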

Now between the GPU and the CPU things get a bit fuzzier. And by “fuzzier,” I mean that they would not tell me much about it at all. The bus between the CPU and GPU was characterized as unique and proprietary. Mr. Feldstein did let on that the bus could shuttle up to 22 Gigabytes of data per second. Much like GDDR3, this would be a full duplex bus, or one that “goes both ways” at one time. Beyond that, not much was shared.

Viewing the Xbox 360 GPU as the “northbridge” should give you a better idea of how the Xbox works and answer some of the overall architecture questions. It is my own opinion that it is very likely that the CPU/GPU bus is very similar to the GPU/RAM bus, as it was stressed to me by Mr. Feldstein that the CPU/RAM pathway was very free of any latency bottlenecks. The world may never know for sure...till some crazy Mac guys hack the thing and run a Stream benchmark.





GPU Features & Buzzwords

There are always tons of buzzwords flying around in the technology and gaming communities, but what we wanted to dig into here is what exactly all these Xbox 360 technologies do for you. The three that you are going to hear the most are “Smart / Intelligent 3D Memory,” “Adaptable / Unified Shader Approach,” and the “Modeling Engine.” Another buzzword that you are going to hear a lot of is “Fluid Reality.” While this is not a new approach in the PC world, it is new to consoles. This Fluid Reality refers to the way the fabric of clothing might flow with movement, or how hairs on a character's head fall into place, or how a monster’s fur may rustle as it stomps toward you. It also refers to lifelike facial animations that have been recently made famous by games like Half Life 2.


Smart / Intelligent 3D Memory

Notice the slash above? Not even ATI had a solid name for this technology, but for the sake of this explanation we are just going to call it “Smart 3D Memory.” Smart 3D Memory is the biggest standout and innovative feature I saw inside the entire Xbox 360. To give you an idea of what it would look like first hand, think of any normal GPU you might see, something much like this Mobility Radeon X700 chipset. That is pretty much what any modern GPU looks like. Now think of that same chipset as having a single piece of DRAM sitting off to one side, much like seen in this ATI slide below, but with one less piece of RAM (and no arrows).



Keep in mind, ATI is not a stranger to adding memory to a chipset, but remember this is “smart” memory.

The Xbox 360 Smart 3D Memory is a relatively small piece of DRAM sitting off to the side of the GPU but yet on the same substrate. The Smart 3D Memory weighs in at only 10MB. Now the first thing that you might think is, “Well what the hell good is 10MB in the world of 512MB frame buffers?” And that would be a good line of questioning. The “small” 10MB of Smart 3D memory that is currently being built by NEC will have an effective bus rate between it and the GPU of 2GHz. This is of course over 3X faster than what we see on the high end of RAM today.

Inside the Smart 3D Memory is what is referred to as a 3D Logic Unit. This is literally 192 Floating Point Unit processors inside our 10MB of RAM. This logic unit will be able to exchange data with the 10MB of RAM at an incredible rate of 2 Terabits per second. So while we do not have a lot of RAM, we have a memory unit that is extremely capable in terms of handling mass amounts of data extremely quickly. The most incredible feature that this Smart 3D Memory will deliver is “antialiasing for free” done inside the Smart 3D RAM at High Definition levels of resolution. (For more on just what HiDef specs are, you can read here.) Yes, the 10MB of Smart 3D Memory can do 4X Multisampling Antialiasing at or above 1280x720 resolution without impacting the GPU. So all of your games on Xbox 360 are not only going to be in High Definition, but all will have 4XAA applied as well.
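A rough sanity check on that 10MB figure. Assuming 32-bit color and 32-bit Z/stencil per sample (sizes the article doesn't spell out), a 720p frame with 4xAA is considerably bigger than the eDRAM:

```python
# Rough 720p framebuffer size with 4x multisampling.
# Per-sample sizes are assumptions, not from the article.
width, height = 1280, 720
samples = 4
bytes_per_sample = 4 + 4   # assumed 4B color + 4B Z/stencil

total_mb = width * height * samples * bytes_per_sample / (1024 ** 2)
print(total_mb)   # 28.125 -> well over 10MB, so presumably the hardware
                  # relies on compression and/or tiling tricks to pull this off
```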

The Smart 3D Memory can also compute Z depths, occlusion culling, and also does a very good job at figuring stencil shadows. You know, the shadows in games that will be using the DOOM 3 engine, like Quake 4 and Prey?

Now remember that all of these operations are taking place on the Smart 3D Memory and having very little workload impact on the GPU itself.
So what exactly will the GPU be doing?

Adaptable / Unified Shader Approach

First off, we reported on page 2 in our chart that the capable “Shader Performance” of the Xbox 360 GPU is 48 billion shader operations per second. While that is what Microsoft told us, Mr. Feldstein of ATI let us know that the Xbox 360 GPU is capable of doing two of those shaders per cycle. So yes, if programmed for correctly, the Xbox 360 GPU is capable of 96 billion shader operations per second. Compare this with ATI’s current PC add-in flagship card and the Xbox 360 more than doubles its abilities.

Now that we see a tremendous amount of raw shader horsepower, we have to take into account that there are two different kinds of shader operations that can be programmed by content developers. There are vertex shaders and pixel shaders. These are really just what they sound like. Vertex shader operations are used to move vertices, which shape polygons, which make up most objects you see in your game, like characters or buildings or vehicles. Pixel shader operations dictate what groups of pixels do, like bodies of water or clouds in the sky or maybe a layer of smoke or haze.

In today’s world of shader hardware, we have traditionally had one hardware unit to do pixel shaders and one hardware unit to do vertex shaders. The Xbox 360 GPU breaks new ground in that the hardware shader units are intelligent as well. Very simply, the Xbox 360 hardware shader units can do either vertex or pixel shaders quickly and efficiently. Just think of the Xbox 360 shaders as being agnostic SIMD shader units (Single Instructions carried out on Multiple Data).

The advantage of this would not be a big deal if every game were split 50/50 in terms of pixel and vertex shaders. That is not the case though. While most games are vertex shader bottlenecked, some others are pixel shader bottlenecked. When you combine the Xbox 360 Unified Shader Approach and its massive shader processing power, you end up with a GPU that is built to handle gaming content far beyond what we see today in terms of visual quality.
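To see why a unified pool beats a fixed 50/50 split, here is a toy model. All workload numbers are invented for illustration; the scheduler is not how Xenos actually allocates units:

```python
# Toy model: fixed 50/50 shader split vs. a unified pool,
# for a vertex-heavy scene. All numbers are made up.

def time_to_finish(vertex_work, pixel_work, vertex_units, pixel_units):
    """Completion time is gated by whichever unit type is more bottlenecked."""
    return max(vertex_work / vertex_units, pixel_work / pixel_units)

TOTAL_UNITS = 48

# Vertex-heavy scene: 80 units of vertex work, 20 of pixel work.
fixed = time_to_finish(80, 20, TOTAL_UNITS // 2, TOTAL_UNITS // 2)

# Unified pool: allocate units roughly in proportion to the work.
unified = time_to_finish(80, 20, 48 * 80 // 100, 48 * 20 // 100)

print(fixed > unified)   # True: the unified pool finishes the frame sooner
```

With a fixed split, half the units sit underused whenever the workload is lopsided; the unified approach soaks up the imbalance.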






Modeling Engine

The Xbox 360 modeling engine is quite frankly something that is really not explainable in layman’s terms. At least I am not smart enough to explain most of it. That said, I can share with you some of the things it does while being able to endlessly loop and manipulate shader data.

Some of the global illumination effects you might have seen in years past at SIGGRAPH have been put into motion on the Xbox 360 GPU in real time. For the most part, global illumination is necessary to render a real world picture quality 3D scene. The Xbox 360 also makes using curved surfaces possible in-game, meaning that it can calculate the polygons from the proper curved surface math in order to draw it on your screen “correctly.” Much in line with this is the ability to do high order surface deformation. And if you are familiar with high order surfaces, you are likely familiar with what gamers and hardware enthusiasts commonly refer to as “LOD” or Level of Detail. Mr. Feldstein shared with us that the Xbox 360 GPU has some “really novel LOD schemes.” So we are thinking that the days of cars in the distance with pentagonal shaped wheels in new Grand Theft Auto titles are a thing of the past.

GPU Power

ATI would not share with us the power dissipation numbers for the Xbox 360 GPU, but if you have seen the heatsink on it in our low quality picture linked here (heatsink on the right), you will know that the total power dissipation is not much by today’s standards. In fact I would compare the heatsink on that GPU to the ones we used to see on Voodoo 3 3000 video cards back in 1999 that were running a whopping 200MHz when overclocked. ATI has implemented “lots” of their mobile power saving features in this Xbox 360 GPU. Clock gating is even more efficient in this chipset than what we were aware of being implemented in Mobility Radeons of the recent past. Sorry, but here again we were light on specifics.

Conclusions

ATI took their Xbox 360 GPU project from the ground floor to a working piece of silicon in 2 years. It is obvious that there is much more to this GPU than what is being shared with us here, but even considering just the features shared here, a 2 year project time seems amazing. This can easily be considered a groundbreaking GPU in terms of 3D gaming regardless of platform.

We should expect to see many of the Xbox 360 GPU technologies shared with the PC desktop market. Unfortunately it does not look like Smart 3D Memory will be one of the things that make the crossover, at least immediately. A shame really, but the impact of a desktop Unified Shader Model will be hugely beneficial to the PC gamer as well.

From what I have seen coming from this year’s Electronic Entertainment Expo coverage, there is no doubt that the future of the Xbox 360 in terms of pure gaming fun looks very promising. But of course the marketing fluff always does. The truth will be known sometime before the Christmas buying season. It is highly likely though that we will not see the real fruits of ATI’s and Microsoft’s labor till a couple of years have passed and game content developers have had ample time to learn to exploit the incredible power of the Xbox 360. Programmability and flexibility are two of ATI’s Xbox 360 GPU features that will not be realized immediately and it looks as if the GPU has plenty of both.

I fully expect that we will see more details float to the surface throughout the rest of the year as this is not the full Xbox 360 GPU story.
 

3rdman

Member
Inside the Smart 3D Memory is what is referred to as a 3D Logic Unit. This is literally 192 Floating Point Unit processors inside our 10MB of RAM. This logic unit will be able to exchange data with the 10MB of RAM at an incredible rate of 2 Terabits per second.

Don't know if that really is impressive being a non-techie kind of guy...but it sure sounds like it is...WOW.

the Xbox 360 GPU is 48 billion shader operations per second. While that is what Microsoft told us, Mr. Feldstein of ATI let us know that the Xbox 360 GPU is capable of doing two of those shaders per cycle. So yes, if programmed for correctly, the Xbox 360 GPU is capable of 96 billion shader operations per second. Compare this with ATI’s current PC add-in flagship card and the Xbox 360 more than doubles its abilities.

Was this known before?
 

gofreak

GAF's Bob Woodward
3rdman said:
Was this known before?

I do believe they made a mess of that.

R500 can do 2 shader ops per alu per clock.

48*2 = 96 shader ops per cycle total.

96 shader ops * 500MHz = 48bn shader ops per second.

The 48bn figure accounts for that.
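gofreak's arithmetic checks out, using the 500MHz clock he cites:

```python
# 48 ALUs x 2 shader ops each = 96 ops per cycle; at 500MHz that is
# 48 billion ops per second -- matching MS's figure, not doubling it.
alus = 48
ops_per_alu_per_clock = 2
clock_hz = 500e6

ops_per_cycle = alus * ops_per_alu_per_clock   # 96
ops_per_second = ops_per_cycle * clock_hz
print(ops_per_cycle, ops_per_second / 1e9)     # 96 48.0
```

In other words, the "2 per cycle" detail is already baked into the 48 billion/s number; multiplying again double-counts it.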
 

Izzy

Banned
8 pipes
48 ALUs
96 shader ops/clock


3rdman said:
Don't know if that really is impressive being a non-techie kind of guy...but it sure sounds like it is...WOW.

2 Terabits/s = 256 GB/s - as announced.
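Spelling out Izzy's conversion: getting exactly 256 GB/s out of 2 Terabits/s requires reading the prefixes as binary (2 Tb = 2048 Gb); with decimal prefixes you would get 250 GB/s:

```python
# 2 Terabits/s -> GB/s, using binary prefixes as the announced
# 256GB/s figure implies (2 Tb = 2048 Gb).
terabits = 2
gigabits = terabits * 1024    # 2048 Gb
gigabytes = gigabits / 8      # 8 bits per byte
print(gigabytes)              # 256.0
```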
 
According to the newest information released by Microsoft, they were hiding information to throw Sony off, and the Xbox 360 GPU is indeed more powerful than that of the PS3.

Just some stuff from the hardocp article that supports this:

"The bus between the CPU and GPU was characterized as unique and proprietary. Mr. Feldstein did let on that the bus could shuttle up to 22 Gigabytes of data per second. Much like GDDR3, this would be a full duplex bus, or one that “goes both ways” at one time.

Inside the Smart 3D Memory is what is referred to as a 3D Logic Unit. This is literally 192 Floating Point Unit processors inside our 10MB of RAM. This logic unit will be able to exchange data with the 10MB of RAM at an incredible rate of 2 Terabits per second. So while we do not have a lot of RAM, we have a memory unit that is extremely capable in terms of handling mass amounts of data extremely quickly. The most incredible feature that this Smart 3D Memory will deliver is “antialiasing for free” done inside the Smart 3D RAM at High Definition levels of resolution. (For more of just what HiDef specs are, you can read here. Yes, the 10MB of Smart 3D Memory can do 4X Multisampling Antialiasing at or above 1280x720 resolution without impacting the GPU. So all of your games on Xbox 360 are not only going to be in High Definition, but all will have 4XAA applied as well.

The Smart 3D Memory can also compute Z depths, occlusion culling, and also does a very good job at figuring stencil shadows. You know, the shadows in games that will be using the DOOM 3 engine, like Quake 4 and Prey."

Microsoft skipped specs to throw off sony,
First off, we reported on page 2 in our chart that the capable “Shader Performance” of the Xbox 360 GPU is 48 billion shader operations per second. While that is what Microsoft told us, Mr. Feldstein of ATI let us know that the Xbox 360 GPU is capable of doing two of those shaders per cycle. So yes, if programmed for correctly, the Xbox 360 GPU is capable of 96 billion shader operations per second. Compare this with ATI’s current PC add-in flagship card and the Xbox 360 more than doubles its abilities.
 
Ghost of Bill Gates said:
According to the newest information released by Microsoft they were hiding information to throw Sony off, and the Xbox 360 GPU is indeed more powerful than that of the Ps3.

Please don't send this thread down the shitter so early. I want to see some good discussion from some techies before the thread goes down that route.

Edited for clarification.
 
Sal Paradise Jr said:
Please don't send this thread down the shitter so early. I want to see some good discussion from some techies before the thread goes down that route.

The info you're looking for may never be found..


Just read the hardocp article yourself..that's if you know how to read.
 

Blimblim

The Inside Track
Wow, very nice article. I did not know about the "3D Memory". Very cool technology, it's like a small GPU inside the GPU.
 

gofreak

GAF's Bob Woodward
Ghost of Bill Gates said:
Microsoft skipped specs to throw off sony,
First off, we reported on page 2 in our chart that the capable “Shader Performance” of the Xbox 360 GPU is 48 billion shader operations per second. While that is what Microsoft told us, Mr. Feldstein of ATI let us know that the Xbox 360 GPU is capable of doing two of those shaders per cycle. So yes, if programmed for correctly, the Xbox 360 GPU is capable of 96 billion shader operations per second. Compare this with ATI’s current PC add-in flagship card and the Xbox 360 more than doubles its abilities.


This bit, as pointed out above, is not true. They mixed up per cycle and per second calculations.
 
Sal Paradise Jr said:
Please don't send this thread down the shitter so early. I want to see some good discussion from some techies before the thread goes down that route.

So quoting from hardocp is sending the thread down the shitter?
 
Ghost of Bill Gates said:
The info you're looking for may never be found..


Just read the hardocp article yourself..that's if you know how to read.

I can read fine, and it's a great, exciting article. You just need to drop the "X360 is indeed more powerful than PS3" schtick. We don't know enough about the RSX to make such proclamations.
 
gofreak said:
This bit, as pointed out above, is not true. They mixed up per cycle and per second calculations.
These numbers from both Sony and MS are confusing a lot of people..

But I'm excited about the "Smart 3D Memory" that was just unearthed by hardocp ..MS just left that out of the specs altogether! I wonder why...
 

shpankey

not an idiot
jboldiga said:
Can someone please explain to me how the hell you can do anti-aliasing in memory??? This sounds like BS.
Inside the Smart 3D Memory is what is referred to as a 3D Logic Unit. This is literally 192 Floating Point Unit processors inside our 10MB of RAM. This logic unit will be able to exchange data with the 10MB of RAM at an incredible rate of 2 Terabits per second. So while we do not have a lot of RAM, we have a memory unit that is extremely capable in terms of handling mass amounts of data extremely quickly. The most incredible feature that this Smart 3D Memory will deliver is “antialiasing for free” done inside the Smart 3D RAM at High Definition levels of resolution. (For more of just what HiDef specs are, you can read here. Yes, the 10MB of Smart 3D Memory can do 4X Multisampling Antialiasing at or above 1280x720 resolution without impacting the GPU. So all of your games on Xbox 360 are not only going to be in High Definition, but all will have 4XAA applied as well.
 
Izzy said:
8 pipes
48 ALUs
96 shader ops/clock




2 Terabits/s = 256 GB/s - as announced.


yep.

this above info has been known since Feb 2004 (Xenon System Block Diagram) I think.
or no later than June 2004 (Leaked Xenon Hardware Overview)
 
anandtech has a new article up, on Xbox360 GPU and PS3's Nvidia RSX GPU


page 1:

With a relatively light schedule thanks to the small size of the show, we were able to spend quite a bit of time digging deeper on the two highlights of this year's E3 - ATI's Xbox 360 GPU, and NVIDIA's RSX, the GPU powering the PlayStation 3.

Given that both of the aforementioned GPU designs are very closely tied to their console manufacturers, information flow control was dictated by the console makers, not the GPU makers. And unfortunately, neither Microsoft nor Sony was interested in giving away more information than their ridiculously light press releases.

Never being satisfied with the norm, we've done some digging and this article is what we've managed to put together. Before we get started, we should mention a few things:

1) Despite our best efforts, information will still be light because of the strict NDAs imposed by Microsoft and Sony on the GPU makers.

2) Information on NVIDIA's RSX will be even lighter because it is the more PC-like of the two solutions and as such, a lot of its technology overlaps with the upcoming G70 GPU, an item we currently can't talk about in great detail.

With those items out of the way, let's get started, first with what has already been announced.

The Xbox 360 GPU, manufactured by ATI, is the least PC-like of the two GPUs for a number of reasons, the most obvious being its 10MB of embedded DRAM. Microsoft announced that the 10MB of embedded DRAM has 256GB/s of bandwidth available to it; keep this figure in mind, as its meaning isn't as clear cut as it may sound.

The GPU operates at 500MHz and has a 256-bit memory interface to 512MB of 700MHz GDDR3 system memory (that is also shared with the CPU).

Another very prominent feature of the GPU is that it implements ATI's first Unified Shader Architecture, meaning that there are no longer any discrete pixel and vertex shader units, they are instead combined into a set of universal execution units that can operate on either pixel shader or vertex shader instructions. ATI is characterizing the width of the Xbox 360 GPU as being 48 shader pipelines; we should caution you that these 48 pipelines aren't directly comparable to current 16-pipeline GPUs, but rest assured that the 360 GPU should be able to shade and texture more pixels per clock than ATI's fastest present-day GPU.

Now let's move on to NVIDIA's RSX; the RSX is very similar to a PC GPU in that it features a 256-bit connection to 256MB of local GDDR3 memory (operating at 700MHz). Much like NVIDIA's Turbo Cache products, the RSX can also render to any location in system memory, giving it access to the full 256MB of system memory on the PS3 as well.

The RSX is connected to the PlayStation 3's Cell CPU by a 35GB/s FlexIO interface and it also supports FP32 throughout the pipeline.

The RSX will be built on a 90nm process and features over 300 million transistors running at 550MHz.

Between the two GPUs there's barely any information contained within Microsoft's and Sony's press launches, so let's see if we can fill in some blanks.

page 2

More Detail on the Xbox 360 GPU

ATI has been working on the Xbox 360 GPU for approximately two years, and it has been developed independently of any PC GPU. So despite what you may have heard elsewhere, the Xbox 360 GPU is not based on ATI's R5xx architecture.

Unlike any of their current-gen desktop GPUs, the 360 GPU supports FP32 from start to finish (as opposed to the current FP24 spec that ATI has implemented). Full FP32 support puts this aspect of the 360 GPU on par with NVIDIA's RSX.

ATI was very light on details of their pipeline implementation on the 360's GPU, but we were able to get some more clarification on some items. Each of the 48 shader pipelines is able to process two shader operations per cycle (one scalar and one vector), offering a total of 96 shader ops per cycle across the entire array. Remember that because the GPU implements a Unified Shader Architecture, each of these pipelines features execution units that can operate on either pixel or vertex shader instructions.

Both consoles are built on a 90nm process, and thus ATI's GPU is also built on a 90nm process at TSMC. ATI isn't talking transistor counts just yet, but given that the chip has a full 10MB of DRAM on it, we'd expect the chip to be fairly large.

One thing that ATI did shed some light on is that the Xbox 360 GPU is actually a multi-die design, referring to it as a parent-daughter die relationship. Because the GPU's die is so big, ATI had to split it into two separate die on the same package - connected by a "very wide" bus operating at 2GHz.

The daughter die is where the 10MB of embedded DRAM resides, but there is also a great deal of logic on the daughter die alongside the memory. The daughter die features 192 floating point units that are responsible for a lot of the work in sampling for AA among other things.

Remember the 256GB/s bandwidth figure from earlier? It turns out that that's not how much bandwidth is between the parent and daughter die, but rather the bandwidth available to this array of 192 floating point units on the daughter die itself. Clever use of words, no?
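One back-of-envelope way to read that internal figure. The clock of the daughter die's logic is an assumption here (the article only gives the 2GHz parent-daughter link):

```python
# If the daughter die's FPU array ran at the GPU's 500MHz (an assumption),
# 256GB/s of internal bandwidth works out to a very wide per-cycle path.
internal_bw = 256e9    # bytes/s, taking the announced figure as decimal GB
clock_hz = 500e6

bytes_per_cycle = internal_bw / clock_hz
print(bytes_per_cycle)   # 512.0 bytes per clock -- versus 32 bytes/clock
                         # for a conventional 256-bit bus at the same clock
```

Which is why quoting it next to external memory bandwidth numbers is, as Anand says, clever use of words.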

Because of the extremely large amount of bandwidth available both between the parent and daughter die as well as between the embedded DRAM and its FPUs, multi-sample AA is essentially free at 720p and 1080p in the Xbox 360. If you're wondering why Microsoft is insisting that all games will have AA enabled, this is why.

ATI did clarify that although Microsoft isn't targeting 1080p (1920 x 1080) as a resolution for games, their GPU would be able to handle the resolution with 4X AA enabled at no performance penalty.

ATI has also implemented a number of intelligent algorithms on the daughter die to handle situations where you need more memory than the 10MB of DRAM on-die. The daughter die has the ability to split the frame into two sections if the frame itself can't fit into the embedded memory. A z-pass is done to determine the location of all of the pixels of the screen and the daughter die then fetches only what is going to be a part of the scene that is being drawn at that particular time.
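A rough sketch of the memory math behind that tiling step. Per-sample sizes are assumptions (the article doesn't give formats), so the exact tile count is illustrative only:

```python
# Why the frame gets split: a 720p frame with 4x AA exceeds the 10MB of
# eDRAM, so it must be rendered in sections that each fit on the daughter
# die. Assumed 4B color + 4B Z/stencil per sample; no compression modeled.
import math

edram_bytes = 10 * 1024 ** 2
width, height = 1280, 720
samples = 4
bytes_per_sample = 8

frame_bytes = width * height * samples * bytes_per_sample
tiles = math.ceil(frame_bytes / edram_bytes)
print(tiles)   # 3 with these assumptions; real counts depend on formats
               # and compression, hence the article's "two sections"
```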

On the physical side, unlike ATI's Flipper GPU in the Gamecube, the 360 GPU does not use 1T-SRAM for its on-die memory. The memory on-die is actually DRAM. By using regular DRAM on-die, latencies are higher than SRAM or 1T-SRAM but costs should be kept to a minimum thanks to a smaller die than either of the aforementioned technologies.

Remember that in addition to functioning as a GPU, ATI's chip must also function as a memory controller for the 3-core PPC CPU in the Xbox 360. The memory controller services both the GPU and the CPU's needs, and as we mentioned before the controller is 256-bits wide and interfaces to 512MB of unified GDDR3 memory running at 700MHz. The memory controller resides on the parent die.



the rest here: http://www.anandtech.com/tradeshows/showdoc.aspx?i=2423&p=3
 

gofreak

GAF's Bob Woodward
Ghost of Bill Gates said:
These numbers from both Sony and MS are confusing a lot of people..

But i'm excited about the "Smart 3D Memory" that was just unearth by hardocp ..MS just left that out the specs all together! I wonder why...

This is the first time we've had detail on it, but most people guessed the eDram module would have such logic. Makes too much sense not to.

Anand has an article up on the GPUs too:

http://www.anandtech.com/tradeshows/showdoc.aspx?i=2423

Apparently the eDram bandwidth is internal bandwidth, not GPU-to-eDram bandwidth. It's the bandwidth between the eDram's own logic and the eDram memory.

edit - heh, midnight beat me to it.
 

3rdman

Member
Just found out that the eDRAM bandwidth between the shader core and eDRAM core is 256GB/s, real bandwidth (not extrapolated) - the interlink between the two is running at 2GHz.
-Dave Bauman at B3D


What does this mean?
 
I just got done talking with the ATI guys at their booth.

When I got there, and after fully absorbing the new cool realtime demo for X360, I was like "I want to know more about the R500 in the X360. Can I ask some Q's?" And they were like "It's not a PC part". And I was like "Yeah, not the R520, but the R500 of the X360". And they were like "Stop calling it that. It's got nothing to do with the next gen PC part". And I was like "I'm stupid. Help me." :lol

So yeah guys, we can stop calling it R500 now. It's Xenos. And it's totally bitchin'. Totally XBox. ;)

The ATI booth is the only place at E3 where actual Xenos is doing demos. They are showing 2 Plasmas surrounding a beta kit with XBox 360 displayed on top, a new Ruby demo that was originally made for the R520 PC part that they ported over in the last month to show off Xenos. As such, the demo is far from taking advantage of native Xenos features. Even with that, it's an awesome showcase for the X360 that should have been pimped from the get-go this week. What were they thinking?!?

I made a quick video with my digicam (only capable of 320x240 @ 15fps), but I have no way to transfer it to this laptop at one of the free web access areas. It's fucking cool guys. Stay tuned!
 

AB 101

Banned
Boy, MS would have been well served to have shown their own explosion demos and rubber duckies in a bathtub. :)
 

arhra

Member
So... wait. They have final hardware available. They have tech demos running on final hardware. They didn't showcase these demos during their press conference.

WTF are they thinking!?
 

3rdman

Member


More logs for the fire...

http://techreport.com/etc/2005q2/xbox360-gpu/index.x?pg=1

On chip, the shaders are organized in three SIMD engines with 16 processors per unit, for a total of 48 shaders. Each of these shaders is comprised of four ALUs that can execute a single operation per cycle, so that each shader unit can execute four floating-point ops per cycle.

500MHz x 48 x 4 = 96 billion shader ops.

Again, not a techie. I pulled this from B3d. Seems interesting if accurate...
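The B3D arithmetic above checks out. As a quick sanity-check sketch (variable names are mine; the 3×16×4 organization is as reported by TechReport):

```python
# Peak programmable shader throughput as quoted above:
# 3 SIMD engines x 16 shader units = 48 units, 4 ALU ops each per cycle.
clock_hz = 500_000_000   # 500 MHz GPU core clock
shader_units = 3 * 16    # 48 unified shader units
ops_per_unit_per_cycle = 4

peak_ops = clock_hz * shader_units * ops_per_unit_per_cycle
print(peak_ops / 1e9)  # 96.0 -> 96 billion shader ops/sec
```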
 

thorns

Banned
FiringSquad: Do you know if it supports dual HD displays?

ATI: No it doesn’t. I know the NVIDIA chip does, but that’s only because PC products do. It doesn’t seem to have a real use inside the living room, but maybe you differ with me on that.

FiringSquad: Well, on the Sony console, I think they’re looking at applications that go beyond just a console in the living room don’t you think?

ATI: Yeah I really think it’s just an accident because, well you know, last summer they had to change their plans. They found out that Cell didn’t work as well as they wanted to for graphics. Remember originally you had two or three Cell processors doing everything and then in August last year they had to take an NVIDIA PC chip. And as you know, all PC chips do this, and so it [dual HD display outputs] just came for free.

interesting.
 

thorns

Banned
xbox 1 has a custom gpu, not a pc-like one. It's also based on geforce 3 tech, which I'm not sure if it supported dual displays or not in the first place.
 

arhra

Member
mashoutposse said:
...which is why XBOX 1, with its PC-like GPU also has two TV outputs.

Or not.
Dual outputs were limited to expensive Matrox cards when the Xbox was being designed. Since then it's become a pretty much standard feature.
 

Panajev2001a

GAF's Pleasant Genius
Ghost of Bill Gates said:
These numbers from both Sony and MS are confusing a lot of people..

But I'm excited about the "Smart 3D Memory" that was just unearthed by hardocp... MS just left that out of the specs altogether! I wonder why...

What? People have been talking for MONTHS about having the ROPs/rasterization logic moved onto the DRAM core.

You also have to consider that 3D-RAM is not an ultra-new concept; in fact it is patented by Mitsubishi, IIRC ;).

What ATI did IS cool, very cool: it was in the specs all along though ;).
 
mashoutposse said:
And the RSX is not "custom?" Come on.

I'm not a techie and it's clear that RSX is custom, but given the info (or lack thereof), it looks like ATI's work for MS is more of a custom job than nVidia's part for Sony. There's still more info to be released about both, so it's not fair to call either something more (or less) than the other yet... Still, from my limited hardware tech understanding, the Xenos seems amazingly state of the art with some cool innovations in hardware, more than what is currently known about nVidia's RSX.

And though it's especially dangerous to make assumptions given that not all information has been disclosed as of yet and I'm not qualified to make one, I'll just say that I think that MS made a kick-ass system that, despite being raked over the coals for being less than Sony/nVidia's numbers-ass-raping at the Sony PS3 conference, looks to be as capable as the PS3 in every important way...so, no, PS3 does not bitch slap X360 in any absolute way...IMO. :lol
 

Panajev2001a

GAF's Pleasant Genius
I do not get their bandwidth numbers.

((700 MHz * 2 (it is DDR) * 256 bits)/8)/1,024 = 43.75 GB/s = more or less 44 GB/s.


((700 MHz * 2 (it is DDR) * 128 bits)/8)/1,024 = 21.875 GB/s = more or less 22 GB/s as announced.

Which is what (same question for RSX and for Xenos) ?
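Panajev's figures are just the standard DDR bandwidth formula. A minimal sketch (the function name is mine; the divide-by-1,024 step matches his MB-to-GB convention):

```python
def ddr_bandwidth_gb_s(clock_mhz: float, bus_bits: int) -> float:
    """Effective DDR bandwidth: clock x 2 (double data rate) x bus width.
    Result starts in MB/s, then is divided by 1,024 to give GB/s,
    matching the figures in the post above."""
    mb_per_s = clock_mhz * 2 * bus_bits / 8
    return mb_per_s / 1024

print(ddr_bandwidth_gb_s(700, 256))  # 43.75 GB/s (~44 GB/s)
print(ddr_bandwidth_gb_s(700, 128))  # 21.875 GB/s, the announced ~22 GB/s
```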
 

Panajev2001a

GAF's Pleasant Genius
Panajev2001a said:
I do not get their bandwidth numbers.

((700 MHz * 2 (it is DDR) * 256 bits)/8)/1,024 = 43.75 GB/s = more or less 44 GB/s.


((700 MHz * 2 (it is DDR) * 128 bits)/8)/1,024 = 21.875 GB/s = more or less 22 GB/s as announced.

Which is what (same question for RSX and for Xenos) ?

The link for both is 128 bits wide: 256 bits was an error by Anandtech's guys.
 

Panajev2001a

GAF's Pleasant Genius
FiringSquad: Where does the 2-terabit number that’s been floating around come from?

ATI: The 2-terabit (256GB/sec) number comes from within the EDRAM, that’s the kind of bandwidth inside that RAM, inside the chip, the daughter die. But between the parent and daughter die there’s a 236Gbit connection on a bus that’s running in excess of 2GHz. It has more than one bit obviously between them.

OK, so the link between the parent and daughter chips has about 30 GB/s of bandwidth, not 256 GB/s; Dave understood it wrong this time, it seems.
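That reading follows directly from the quoted numbers: the 236 Gbit/s parent-to-daughter figure converts to roughly 30 GB/s, while the 2 Tbit/s figure lives entirely inside the eDRAM die. A quick sketch of the conversion:

```python
# Converting the quoted inter-die link figure from bits to bytes:
link_gbit_s = 236            # parent-to-daughter bus, as quoted by ATI
link_gb_s = link_gbit_s / 8
print(link_gb_s)             # 29.5 -> "about 30 GB/s"

# By contrast, the 2-terabit figure is internal to the eDRAM daughter die:
print(2048 / 8)              # 256.0 GB/s
```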
 

Pug

Member
Pana, it was never going to be 256GB/s, was it? Still, Dave was told by ATI that's what it was when he specifically asked that question; he even clarified. I'm with you though, ~34GB/s.
 
teiresias said:
OK, when Pana starts answering his own questions in some weird-ass 3rd person kind of way I get a little freaked out.

He is here, there, and everywhere.

The PlayStation Tech God speaks. We listen, for our future.

[This message was brought to you by the PlayStation Party of GAF]
 
Anyone who doubts [H]ardOCP on what they say about hardware is stupid. Hardware is what they do, it is what they know, it is what they rip apart if it doesn't add up.

So to just cast off what [H]ardOCP says about the GPU as "propaganda" or "misinformation" is just stupid.
 

Panajev2001a

GAF's Pleasant Genius
teiresias said:
OK, when Pana starts answering his own questions in some weird-ass 3rd person kind of way I get a little freaked out.

That is because Panajev sometimes disagrees with Panajev...


Uhm... no... *Panajev runs away from the guys in the large white jackets who try to stop him*
 
OK, there are so many threads and so much info that I am having trouble keeping up. What do the tech heads here think of the revelation that Xenos / C1 / R500 has 4 ALUs in each of the 48 shader pipes (I thought the 48 shader pipes themselves were ALUs) for a total of 192 ALUs? o_O
 
as far as bandwidth

GPU to CPU is ~25 GB/sec

GPU to external memory is ~22 GB/sec

main GPU die to daughter die is ~32 GB/sec

daughter die internal memory bandwidth is 256 GB/sec


please correct where wrong.
 

FightyF

Banned
Ghost of Bill Gates said:
Anyone who doubts [H]ardOCP on what they say about hardware is stupid. Hardware is what they do, it is what they know, it is what they rip apart if it doesn't add up.

So to just cast off what [H]ardOCP says about the GPU as "propaganda" or "misinformation" is just stupid.

Yeah, but a "misunderstanding" is always possible. Especially when so much is being kept secret...
 
MightyHedgehog said:
I'm not a techie and it's clear that RSX is custom, but given the info (or lack thereof), it looks like ATI's work for MS is more of a custom job than nVidia's part for Sony. There's still more info to be released about both, so it's not fair to call either something more (or less) than the other yet... Still, from my limited hardware tech understanding, the Xenos seems amazingly state of the art with some cool innovations in hardware, more than what is currently known about nVidia's RSX.

And though it's especially dangerous to make assumptions given that not all information has been disclosed as of yet and I'm not qualified to make one, I'll just say that I think that MS made a kick-ass system that, despite being raked over the coals for being less than Sony/nVidia's numbers-ass-raping at the Sony PS3 conference, looks to be as capable as the PS3 in every important way...so, no, PS3 does not bitch slap X360 in any absolute way...IMO. :lol


Yes, it is very ironic, isn't it: Xenos is less PC-like and RSX is more PC-like.

The 360's Xenos GPU has eDRAM, whereas PC GPUs do not. The PS3's RSX GPU has no eDRAM.

This is basically the TOTAL OPPOSITE of the current generation, where the PS2's graphics chip has eDRAM while the Xbox GPU has none.
 
thorns said:
Xbox 1 has a custom GPU, not a PC-like one. It's also based on GeForce 3 tech, and I'm not sure whether that supported dual displays in the first place.


Yes, the Xbox 1 GPU was custom, but it *was* still very PC-like, unlike the Xenos GPU in the Xbox 360.
 