WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

The E6760 is 35W for PC. If we remove all the extra components used for PC cards, does it drop to 20W or even less? So if someone wants to believe Nintendo that the Wii U is all about "graphics" this time... the E6760 is a good candidate at that consumption! I find it hard to believe that an RV7xx chip rated at 95W could be pulled down to 15W and still deliver only slightly fewer GFLOPS than the original design. It's heavily modified, I accept that, but they could have used the E6760's power draw and tech, combined with high performance.
 
Well, it's perfectly possible that current games simply don't use all of the Wii U's potential, i.e. with future games utilizing it more, the power usage could rise.
What sort of stands against this is that the system appears to never go above 33 or 34W (can't remember which) for any game.
My memory may be fuzzy (I just woke up), but didn't some devs say there were features of the GPU that they were still locked out of at launch?
 
Another thing off about his math is that TDP is not how much the GPU draws, but the limit of the design's power draw; it always draws less than that. The R700 series has a 320:16:8 chip called "Mario" (RV730): the HD 4670, clocked at 750MHz, draws 59W at 55nm. IBM says that a die shrink yields a ~40% reduction in power draw, so an HD 4670 at 40nm would be roughly a 36W TDP. Lowering the clock to 550MHz should also be possible at less voltage, drastically lowering the TDP, but even if the voltage stays the same, you are looking at mid-20W TDPs... which is likely what Wii U's GPU is. Beyond3D has been speculating a 30W TDP for Latte, and while that is possible, I doubt it is that high; a 25W TDP also fits the embedded 55nm part, the E4690, which is also "Mario" (RV730) with 320 SPs at 600MHz.
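For anyone who wants to poke at the numbers, here's the same scaling as a quick script. The 40% shrink figure and the assumption that power scales roughly linearly with clock at fixed voltage are just the reasoning above, not measurements:

```python
# Back-of-the-envelope TDP scaling for a 320:16:8 RV730-class part.
# All inputs are the assumptions from the post above, not measured values.

hd4670_tdp_55nm = 59.0   # W, HD 4670 @ 750 MHz on 55nm
shrink_reduction = 0.40  # claimed ~40% power reduction from a 55nm -> 40nm shrink

tdp_40nm = hd4670_tdp_55nm * (1 - shrink_reduction)  # ~35.4 W at 750 MHz

# Assume dynamic power scales roughly linearly with clock at fixed voltage,
# so dropping 750 MHz -> 550 MHz (Latte's clock) without touching voltage:
tdp_550mhz = tdp_40nm * (550.0 / 750.0)              # ~26 W

print(f"Estimated 40nm TDP @ 750 MHz: {tdp_40nm:.1f} W")
print(f"Estimated 40nm TDP @ 550 MHz (same voltage): {tdp_550mhz:.1f} W")
```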

You can guess at the numbers or look at what an AMD GPU at 320:16:8 on 40nm does at 550MHz: a Radeon HD 5550. Its TDP is 39 watts. That's nowhere near mid 20s.

You are looking at around 13-14 watts at most for the GPU core itself. I just don't see any way in the world to make these cards work, even using AMD 40nm binned parts.

Now, dropping 320 to 160 fits perfectly in the power range and also every other fact we have on the chip itself. Everything fits...
 
So you are looking at an R800 GPU, completely ignoring the R700 320SP parts, and you don't even mention the embedded chip, the E4690. Good job.
 
So this new information means Broadway is actually more efficient than bobcat, contrary to your previous analysis?
If you like to put it that way. I thought my previous analysis was rather clear in its claims.

Apropos, that's just a particular workload test - it's not drawing any holistic conclusions. I made it mainly to verify the 'Espresso's fp is particularly weak! Nintendo should have used Bobcats/Venezuelan beavers, etc' opinions circulating on the net (and because I have free access to those two CPUs, apparently).
 
Yes, sorry for oversimplifying. I should have realized it was just on one specific test, that's what I get for doing this on my phone while switching between busses :P

From my perspective it's not noobish at all since it's not like we comb over die shots on a daily basis. That said it doesn't seem that's due to the photography, but I know I can't say that as fact either.
Thanks for the reply, was afraid people weren't answering because I was missing something obvious :P
I'll see if I can read up on the subject a bit.

And mad props to Marcan for getting down and dirty on Wii ;)
 
So, from what I can tell from reading this thread, we don't know the exact numbers of anything. ALU and SPU counts so far are, for the most part, guesses. We're not 100% sure of the shader count. We don't know what the dedicated silicon is for. GFLOPS are 320, but if the dedicated silicon is fixed function, then that would have to be taken into account. Makes me wonder if we'll ever get to the bottom of this thing.
 
I thought this was based on Iwata's comment about the system having a 75 watt PSU, how that was interpreted, and software averaging 33 watts. Is this actually the upper limit, or simply the limit we've seen so far? Even in the Eurogamer test they said they observed a spike above 33 watts, so I don't know why it's being treated as an absolute max.

I thought they said 33W *was* the spike - that it was usually 32 watts, and 33 was the max it ever went to.

And this has been discussed to death a few pages back, by the way; the tl;dr of it is that consoles always over-provision the power supply by a lot, and in Nintendo's case it's typical to have a power rating nearly double what is actually drawn, for whatever reason (efficiency, aging, etc.). And consoles are notoriously bad at stepping down power at low load; all the current consoles have nearly the same draw at the start screen as when playing a game, for instance.
The variance is bigger on some consoles, but those are dealing with far higher wattage to start with.
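For what it's worth, plugging the 75W PSU figure into that rule of thumb lands right about where the measurements sit (the roughly-double factor is just the rule of thumb above, nothing official):

```python
# Rough illustration of the over-provisioning rule of thumb above.
# The 75 W rating comes from Iwata's comment mentioned earlier; the ~2x
# provisioning factor is the forum rule of thumb, not an official figure.

psu_rating = 75.0          # W, rated output of the Wii U power brick
provisioning_factor = 2.0  # assumed: rating is roughly double the real max draw

expected_max_draw = psu_rating / provisioning_factor
print(f"Expected max system draw: ~{expected_max_draw:.0f} W")  # ~38 W, near the 32-33 W measured
```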

Besides, if there was a high power mode for when the system is under higher load, why would that not already be activated with some games already dipping below 30fps?
 
That's unlikely. What is more likely is that a circuit that is not optimally loaded (as in 'is stalling a good deal of the time') does not reach its max power draw. A game running at sub 30fps does not mean it's utilizing the GPU at 100%.
 
That could be it, but there's still very little variance in most consoles' power draw. It's bigger in some, but those are dealing with far higher numbers than 33 watts to start with, so the variance in watts rather than percentage would be higher.

[image: powerconsumption-next-g.jpg — power consumption comparison of current-gen consoles]

The Wii's minimum was just 2 watts from its load. The PS3's and 360's averages weren't far from their max.
 
I'm aware the variance is low on consoles. As low as it might be, though, it's not totally absent. Thing is, there are far too few titles on the platform yet (let alone titles that have been power-measured) to claim the ABSOLUTE MAX of the system sans peripherals is N watts, pardon the all-caps.
 
Wouldn't the better comparison for us be a launch title vs. a later title that pushes the hardware?
 
The Xbox 360 Slim has been shown to peak at as low as 82W in some games and as high as 90W in others. Also, the last test I can find was on a 2010 game, so it could peak even higher in newer titles.

I'd be very interested to see someone test a selection of the best-looking 360 titles from launch right up to today (Halo 4) and see what the total difference is. I wouldn't be surprised at all to see a 15% difference from the lowest peak power usage to the highest.
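For scale, 82W to 90W is roughly a 10% swing; if the Wii U's 32-33W launch readings are the same kind of early-title figure, a similar swing would put a later peak in the mid-to-high 30s. Pure speculation, but here's the arithmetic:

```python
# The 360 Slim numbers quoted above, plus a purely hypothetical Wii U
# projection that just applies the same percentage swing to its launch figure.

slim_low_peak = 82.0   # W, lowest observed peak (earlier title)
slim_high_peak = 90.0  # W, highest observed peak (2010 title)

swing = (slim_high_peak - slim_low_peak) / slim_low_peak  # ~9.8%

wiiu_observed_peak = 33.0                                 # W, launch-window measurements
wiiu_projected_peak = wiiu_observed_peak * (1 + swing)    # ~36 W if the same swing applied

print(f"360 Slim peak-to-peak swing: {swing:.1%}")
print(f"Hypothetical Wii U peak with the same swing: {wiiu_projected_peak:.1f} W")
```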
 
As soon as backward compatibility is involved, QA costs are a huge part of R&D costs. Are you trying to tell me that Nintendo managed not to reuse the original tech verbatim but to get there through significant variations rather than straight-up extensions, still saved the cost of the correspondingly complex regression testing that ensuring compatibility would require, and still ended up with perfect BC? Sometimes I think people imagine this process being way too simple.

I think you're missing that the GPU and CPU were in development for over two years, so of course I don't think it's as simple as you're making it out to be. That's plenty of time to R&D a BC-capable CPU/GPU in that manner, and with Nintendo's apparent attempt to keep this small, I see them doing just that, since it would allow them to manufacture a "less complicated" chip for future cost reduction. To me it's obvious BC was important to the point that sacrifices were made just for it. So I do believe a decent amount of the R&D budget went towards designing the hardware for BC, and having engineers familiar with the previous hardware allowed them to cut parts that would otherwise have been thought necessary. I'm not saying this is fact, but looking at all the info available, no 1:1 Wii components is the most logical conclusion IMO.

So, from what I can tell from reading this thread, we don't know the exact numbers of anything. ALU and SPU counts so far are, for the most part, guesses. We're not 100% sure of the shader count. We don't know what the dedicated silicon is for. GFLOPS are maybe 320, but if there is dedicated silicon for "boosting" performance, then that would have to be taken into account. Makes me wonder if we'll ever get to the bottom of this thing.

Fixed. And with the surface barely being scratched even at this point, I know I've given up on seeing the bottom, haha. Seeing duplicate blocks in Latte that are singles in other GPUs made me tap out. A new question for me was: why duplicate these blocks instead of using the space to increase the number of SIMDs? I'd really like to know what they do for them to make that choice.
 
We are not going to find out the answer to the ALUs until someone comes forward with polygon numbers or something we can count. If it is 160 ALUs, it is a very, very strange GPU; those clusters just shouldn't be that big.
 
Good question. I could guess that it'd be for the GamePad: it wasn't just "tacked on" at the end of the Wii U's development, they knew about it a long time ago, so having duplicated logic so that there isn't a big performance hit when rendering a separate scene to it makes a lot of sense.

While PC chips can render to multiple targets, you get the expected performance hit for rendering out those extra pixels.
 
Yet they said in the future it would be possible for games to use TWO pads plus the TV. And I have two HD monitors hooked up to my HD4xxx* all the time, for graphics work and 3D... I doubt a 480p tablet needs "double" of anything.

(*admittedly, it's a 4890 lol)
 
^ This explanation as well.

We are not going to find out the answer to the ALUs until someone comes forward with polygon numbers or something we can count. If it is 160 ALUs, it is a very, very strange GPU; those clusters just shouldn't be that big.

I agree on all accounts.

Good question. I could guess that it'd be for the GamePad: it wasn't just "tacked on" at the end of the Wii U's development, they knew about it a long time ago, so having duplicated logic so that there isn't a big performance hit when rendering a separate scene to it makes a lot of sense.

While PC chips can render to multiple targets, you get the expected performance hit for rendering out those extra pixels.

I still have a tough time seeing that since Wii U's pad is a performance hit as well. Obviously at this point I can't say it's guaranteed not to be that and I think we both would agree the controller was a BoM hit. That just seems to be even more than I would expect just for the controller.
 
You can guess at the numbers or look at what an AMD GPU at 320:16:8 on 40nm does at 550MHz: a Radeon HD 5550. Its TDP is 39 watts. That's nowhere near mid 20s.

You are looking at around 13-14 watts at most for the GPU core itself. I just don't see any way in the world to make these cards work, even using AMD 40nm binned parts.

Now, dropping 320 to 160 fits perfectly in the power range and also every other fact we have on the chip itself. Everything fits...

What will you do if it ends up being 320? You are always so confident until you are proven wrong.

I'm not even saying you are wrong but you're so one-track minded it must make for some interesting mental calisthenics when you end up on the wrong side of an argument.
 
He'll twist it around and claim he was right all along.

See: GPGPU.

I still have a tough time seeing that since Wii U's pad is a performance hit as well. Obviously at this point I can't say it's guaranteed not to be that and I think we both would agree the controller was a BoM hit. That just seems to be even more than I would expect just for the controller.

Not to mention that it would be a horrific waste of silicone to have blocks solely dedicated to rendering to the gamepad. That die space could be much better used for extra SIMDs and dynamically allocated to gamepad output when needed.
 
Well, if anything, the benchmarks Blu posted seem to indicate the CPU just might not need all that much help from the GPU. So at least, that's that.
 
We are not going to find out the answer to the ALUs until someone comes forward with polygon numbers or something we can count. If it is 160 ALUs, it is a very, very strange GPU; those clusters just shouldn't be that big.
It is a bizarre GPU either way. If it is 160 ALUs, that will just raise more questions. I agree with your conclusion.
 
Not to mention that it would be a horrific waste of silicone to have blocks solely dedicated to rendering to the gamepad. That die space could be much better used for extra SIMDs and dynamically allocated to gamepad output when needed.

Silicone? Is that the special sauce?

I kid, I kid...
 
You guys are great for wanting to donate the money. That was my initial desire, but I worried that not enough people would be down to make it worth it. I'll contact people privately and see where we stand.

It is a bizarre GPU either way. If it is 160 ALUs, that will just raise more questions. I agree with your conclusion.

The more I look at it, the less bizarre it actually seems. I mean, did you see that Brazos die shot? haha.

Actually, I'm working on a sort of "Theory of Everything" for the die, and then I think I'm gonna just leave it be and let the games do the talking. Also working on an article for NES on what an exhilarating and unique experience it has been working with you guys to get some answers about this thing.
 
Btw, I wish Nintendo would release high-quality videos of the Bird Demo and Zelda HD demo. ... Fuck! Better yet, let us download them from the eShop and play around with the camera angles and day/night thing. Nintendo needs to boost confidence in consumers that this kit has enough technology to output impressive games 4 years from now.
 
I know. If the actual final machine is as powerful as (or more so than) what was used for those demos, it'd be great to see them now.
 
You can guess at the numbers or look at what an AMD GPU at 320:16:8 on 40nm does at 550MHz: a Radeon HD 5550. Its TDP is 39 watts. That's nowhere near mid 20s.

You are looking at around 13-14 watts at most for the GPU core itself. I just don't see any way in the world to make these cards work, even using AMD 40nm binned parts.

Now, dropping 320 to 160 fits perfectly in the power range and also every other fact we have on the chip itself. Everything fits...


Why would the GPU be 13-14W? Total power consumption is 33 or 34W (that does include PSU inefficiency, but that's true for every single piece of hardware). With the GPU at 13 or 14W, the CPU would probably be at significantly less than 5W, I'd say ~3W, which would leave the rest of the system with ~half of the Wii U's total power consumption. And I really don't see anything in there that would need that much. I'd say the GPU should be in the ~20W range, the CPU in the ~5W range, and the other ~8W go to the rest of the components (WiFi, flash, DDR3, etc.).

Also please keep in mind that the initial measurements for, say, an HD 5550 might not necessarily hold anymore. The 40nm process might be much, much better by now (AMD at one point sort of re-released the HD 5850, and reviewers noted that its power consumption was significantly lower than what they got in the original reviews some 12-18 months earlier, just as an example).
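Laying the two scenarios from that first paragraph side by side (all figures are the rough guesses above, nothing measured):

```python
# Sanity check of the budget argument above; the component splits are the
# rough guesses from this post, not measured numbers.

total_draw = 33.0  # W at the wall (includes PSU inefficiency)

# If the GPU were only 13-14 W and the CPU ~3 W:
gpu_low, cpu_low = 13.5, 3.0
rest_low = total_draw - gpu_low - cpu_low
print(f"Low-GPU scenario leaves {rest_low:.1f} W for everything else")  # ~16.5 W, about half the total

# The split this post finds more plausible:
gpu, cpu, rest = 20.0, 5.0, 8.0
print(f"Alternative split sums to {gpu + cpu + rest:.0f} W")            # 33 W
```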
 
Btw, I wish Nintendo would release high-quality videos of the Bird Demo and Zelda HD demo. ... Fuck! Better yet, let us download them from the eShop and play around with the camera angles and day/night thing. Nintendo needs to boost confidence in consumers that this kit has enough technology to output impressive games 4 years from now.

That is actually a really cool idea. Why the heck not? We never even got that Tokyo driving demo. Did that actually release in Japan, or did it become that Google thing? Because that looks much choppier. A few tech demos to make up for the lack of any games coming out certainly can't hurt. Throw us a friggin' bone here!
 
AAMOF, I suggested we make a petition for those demos some time ago in the Nintendo downloads threads.
 
The CPU is more likely at least ~6W, though I would agree that 13-14W for the GPU is a very, very low estimate. For a start, I seriously doubt 33W is the maximum power usage of the Wii U. I've said it many times, but for example, the Xbox 360 Slim has games that hit about 80W max, so looking at some games you could claim the 360 Slim is an 80W console. But later games (measured by the same people in the same way) hit 90W, and that's just from launch to 2010.
 
Here is the math that we came up with in the Wii U tech thread:
33 watts max from the wall @ ~90% PSU efficiency ≈ 30 watts

Disc drive: ~4 watts
CPU: ~8 watts

What's left: ~18 watts

My guesses (anyone have hard numbers?):
2GB DDR3 RAM: ~2 watts
WiFi: ~0.5 watts
Flash storage: ~0.5 watts

Leaves about 15 watts for the whole GPU chip
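And the same budget as a quick script, in case anyone wants to plug in better numbers (every figure, including the 90% PSU efficiency, is a guess from the breakdown above):

```python
# The wall-power budget from the list above, written out. Every figure is a
# guess from that breakdown (PSU efficiency, drive, CPU, RAM, WiFi, flash).

wall_draw = 33.0       # W, max measured at the wall
psu_efficiency = 0.90  # assumed

dc_power = wall_draw * psu_efficiency  # ~29.7 W available on the DC side

components = {
    "disc drive": 4.0,
    "CPU": 8.0,
    "2GB DDR3": 2.0,
    "WiFi": 0.5,
    "flash storage": 0.5,
}

gpu_budget = dc_power - sum(components.values())
print(f"DC-side power: {dc_power:.1f} W")
print(f"Left for the whole GPU chip: {gpu_budget:.1f} W")  # ~15 W with these guesses
```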
 
I think the only number that really seems like it may be out of spec would be the optical drive. I doubt it needs that many watts to keep the disc spinning. From what I understand, the Wii U keeps the drive spinning at all times (it doesn't spin up/down). Acceleration from lower speeds is what draws the most power on optical drives; keeping it spinning takes very little (as people who drive a motor vehicle would know, you ease off the gas pedal once you reach the speed you want). Just my thoughts on it.

*edit* I'd love to take a Wii U apart and actually tap the power usage of the individual components, like the optical drive; that could be interesting. That one would be the easiest, and the actual console usage (on the DC side of the brick) would be easy as well, but the rest of the components would be a lot more challenging.
 
If you look back at the post I linked to in the Wii U thread, they have a photo of the Wii U drive itself.

Here it is:

No clue if this is useful:
[image: uGvd4aN.jpg — photo of the Wii U disc drive]
 
And...?

That doesn't mean it's actually drawing that when the people who tested the power usage took their readings.

We know for sure those numbers are upper-bound limits, and we also know that drives consume the most power when accelerating.

I should also mention that it's easy to find Blu-ray drives that can be powered solely off a USB port, so I don't know why it'd take 5W to power the Wii U's at typical use.
 
Might be missing something important - but didn't they say in a ND that normal usage would consume about 45w? Where did 33w come from?
 
I think the only number that really seems like it may be out of spec would be the optical drive. I doubt it needs that many watts to keep the disc spinning. From what I understand, the Wii U keeps the drive spinning at all times (it doesn't spin up/down). Acceleration from lower speeds is what draws the most power on optical drives; keeping it spinning takes very little (as people who drive a motor vehicle would know, you ease off the gas pedal once you reach the speed you want). Just my thoughts on it.

*edit* I'd love to take a Wii U apart and actually tap the power usage of the individual components, like the optical drive; that could be interesting. That one would be the easiest, and the actual console usage (on the DC side of the brick) would be easy as well, but the rest of the components would be a lot more challenging.

I'm pretty certain Wii U's optical drive is CAV (as both the GameCube's and Wii's were), which, as you say, would mean a lower power draw.
 