WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Status
Not open for further replies.
We've been over this back in September of '12. Wasn't that in Japanese?

I have stated facts. Go check your Wii U power brick; it will say 75 watts on it. Mass consumer products never run the PSU at more than about 50%.

Like others have said, it takes power to convert power, so the Wii U uses less than 33 watts.

This has been covered many times before...this is nothing new.


You sure about that?

(serious question)
 
I can see how some would consider cloth physics in a game like this to be a nice surprise. In contrast, that Uncharted 3 video had a lot of heavily scripted parts and clearly had budget spent on making the scene very cinematic.

I do not believe I am undervaluing it at all. I fully agree that there are few developers out there who actually take the time to model and animate additional details like that. I am merely pointing out that it has been done before, as my response was to someone who said they never saw it in the current generation, not that it was just "a nice surprise."

The fact that Uncharted was heavily scripted has no bearing on the fact that, in gameplay, Drake's suit jacket flares out exactly like Link's tunic, which is the point of contention, not spending more money for a more cinematic feel.
 
If a power supply says it can output 70w, isn't that what it can output? I always thought the "efficiency" part applies to how much power it requires (input) to output that 70w. In this case, say it draws 100w to output that maximum 70w, making it 70% efficient; the other 30% is lost as heat, etc. AFAIK, the power adapter has to display the exact power it can output.

That's how PC PSUs work, anyhow. You buy an 80%-rated 700w PSU, and it outputs 700w, not 80% of 700w. It probably draws around 875w from the wall (700 / 0.8).

And even if the Wii U's PSU is only drawing 70w and outputting something else, it's likely at least 70% of that (~50w). This is 2013... PSUs aren't 50% efficient nowadays.
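For anyone wanting to sanity-check that, the conversion is simple to sketch (assuming the efficiency rating really does describe DC output vs. wall draw, as described above):

```python
def wall_draw(dc_output_watts: float, efficiency: float) -> float:
    """Power drawn from the wall to deliver a given DC output at a given PSU efficiency."""
    return dc_output_watts / efficiency

# An 80%-rated 700w PSU delivering its full 700w:
print(wall_draw(700, 0.80))  # 875.0
# The 70%-efficient example above (70w out, 100w in):
print(wall_draw(70, 0.70))   # 100.0
```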

FWIW: Iwata's comments (when detailing the Wii U's hardware) were that the Wii U uses a maximum of 70w, but normal use will be ~40w.


Disclaimer: I'm not claiming to know anything about the subject, just rambling. Probably miles off :D

This is EXACTLY how I imagined it would work. Same method I used when I bought my PSU for my PC.
 
Not that it really matters/is reliable, but TVtropes just added this bit of info (seems kinda off though):

Changed lines 31-32 from:

* GPU: AMD Radeon High Definition processor based on the Radeon 6850 [[note]](the chip was going to be based on the Radeon 4870 back when the system was first showed of in [=E3=] 2011, but was updated to a 6850 in order to cut costs due to the age of the chip)[[note]] however, it has been rumored that the GPU has been over clocked to 800 [=MHz=] after the 3.0.0 update[[/note]] codenamed "Latte" with an eDRAM cache built onto the die reportedly clocked at 550 [=MHz=] with 320 shaders, resulting in a performance of 0.352 TFLOPS.

To:

* GPU: AMD Radeon High Definition processor based on the Radeon 6850 [[note]](the chip was going to be based on the Radeon 4870 back when the system was first showed of in [=E3=] 2011, but was updated to a 6850 in order to cut costs due to the age of the chip)[[note]] however, it has been rumored that the GPU has been over clocked to 800 [=MHz=] after the 3.0.0 update[[/note]] codenamed "Latte" with an eDRAM cache built onto the die reportedly clocked at 550 [=MHz=] with 960 shaders, resulting in a performance of 1.488 TFLOPS.

Link: http://tvtropes.org/pmwiki/article_history.php?article=Main.WiiU

Not that I believe it's valid, but where do they get this info? A Wii U gpu that has 960 shaders and does 1.488 TFLOPS? Outrageous...
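For what it's worth, those TFLOPS figures can be sanity-checked with the usual formula for AMD shader parts: 2 FLOPs (a multiply-add) per shader per clock. Notably, 1.488 TFLOPS only falls out of 960 shaders at the HD 6850's stock 775 MHz, not at the 550 MHz the same edit quotes:

```python
def tflops(shaders: int, clock_mhz: float) -> float:
    # 2 FLOPs (multiply-add) per shader ALU per clock cycle
    return 2 * shaders * clock_mhz * 1e6 / 1e12

print(tflops(320, 550))  # 0.352 -- the commonly cited Latte figure
print(tflops(960, 550))  # 1.056 -- 960 shaders at 550 MHz
print(tflops(960, 775))  # 1.488 -- a stock Radeon HD 6850
```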
 
The Wii U has more than 320 shaders; I have counted them in game. Plus, if the Wii U was just 3 overclocked Wiis with a Radeon 4650, then the Wii U would just be a 7th gen system, not an 8th.

That's what the guy said.

I smell bullshit.
 
The fan + heatsink in the Wii U is kinda cute and pathetic compared to what's in the PS3.

[photos: Wii U fan and heatsink]

vs.

[photos: PS3 fan and heatsink]

Ummm, when it comes to heatsinks, the bigger, the more pathetic.
 

If you need a bigger heatsink, that means that your hardware produces more heat and uses more energy. That is a bad thing, not a good thing.

The fact that the Wii U only needs a small heat sink means that it is extremely efficient, or, as many posters have attested when talking about the Wii U's power draw, "it blows cool air". That is great.

Pathetic is needing a monster heatsink like the older 360s if you want to keep them from red ringing. http://www.youtube.com/watch?v=gPhiTYKm3PU This is pathetic.

One thing that Nintendo can brag about is that they have the most heat/energy efficient console of the next gen.

The bigger the heatsink the more pathetic. How he came to the conclusion that having a small heatsink was pathetic is beyond me. Ideally, you would want hardware that doesn't need a heatsink at all. We are far from nano computers though.
 

For the love of god, chill out :p It's a typical "look at that, it's a monster! oh, the Wii U is diddy!" kind of comment. No malice, so don't get your back up. Also, "bigger is worse" is a terribly vague statement. More heat isn't inherently a bad thing; it can mean either inefficient and overpushed, or that it's doing a lot more work.
The Wii U is pretty efficient/low power, and that isn't necessarily a good thing.
 

The gift that keeps on giving!

lol

So how does the Wii U compare to the iPad? Not even a FAN in the iPad!

Yeah, the PS4/Xbone are terribly energy inefficient. 28nm vs. 40/45nm; wonder who wins...
 

[image: bless-this-post.jpg]
 

The Wii U isn't a tablet...

Also, I already detailed the context of heat consumption in power draw in the post. Most older computers didn't use heatsinks at first; some of them didn't even use fans. They were extremely weak and incapable, though.

The key word in all of this, which I've stated endlessly in this thread when people start talking about power, is "efficiency": what is it capable of for the amount of power it consumes?
 
Interestingly, I just checked the bottom of my Wii U and it says:

Rating 15V 5A
AC Output 15V 5A

So why would the console rating also be 75w if it can only draw 33w max from the PSU?
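The rating on that label is just volts times amps, which is where the 75w comes from; a quick sketch, using the 33w figure from the same post:

```python
volts, amps = 15.0, 5.0
rated_watts = volts * amps           # what the label advertises
print(rated_watts)                   # 75.0

measured_watts = 33.0                # the oft-quoted gameplay wall reading
print(measured_watts / rated_watts)  # 0.44 -- the PSU loafs along at ~44% of its rating
```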
 

Pathetic in an itsy-bitsy sort of way. I'm not bashing the design choice.

It's unlikely the Wii U is more energy efficient than the XB1/PS4, however, at least from a perf/W perspective. The GPU is going to be the biggest power drain, and the XB1/PS4 are a full process node ahead of the Wii U in that regard.
 

The max it can draw is 45 watts. These were the words of Iwata himself, and I see no reason to disregard them.
 

Efficiency is not using less power; it is performance per watt. The Wii U has an uphill battle against the other next-gen consoles because both are on 28nm, which boosts their efficiency way above what you can do on 40nm.

It doesn't draw 33w; it draws less than that. The power consumption readings include the PSU. Even the best PSUs only convert 90% in consumer products, so 33w at 90% means the Wii U is really using 29.7 watts.
 

Do you actually read what you respond to?
 
Indeed, bigger heat sinks mean the hardware is likely having to dissipate more heat, which could be bad... but if you look at markets like GPUs, low-end cards are often passively cooled or have tiny heatsink/fans, while high-end cards always have massive heatsink/fans.

If you want performance, generally you need to bump up the clocks and add more transistors, which generates more heat. Efficiency could be identical while performance is totally different, but if efficiency is identical, the faster part needs a bigger heatsink/fan. It's pretty simple...

If Nintendo had some magic CPU/GPU with >100% efficiency, then you'd have a point, but given that the GPU is AMD tech, as in the PS4/Xbone/PC, I doubt the efficiency is much better, if at all, compared to those parts. So a smaller HSF means less performance (obviously). Compared to the Xbox 360/PS3, the HSF is smaller on the Wii U, but since those have high-clock CPUs and GPUs that are generations older, this is to be expected.

Edit: oh, and to reply directly to what you said in the quote, bigger heat sinks aren't pathetic. Bigger heat sinks give better dissipation, which means lower temperatures, which can mean better reliability or even better performance, since they allow for higher clocks. Personally, I'd prefer the Wii U to have a bigger HSF so that its CPU/GPU could be clocked higher...
 

Just 'cause it's massive doesn't mean it's efficient. When it comes to OC'ing these days, you need a better form of cooling than air alone, be it CPU or GPU. You can still do a lot with air, but there is a clear wall to the performance it offers.
 

You would have to believe in fairy tales to believe that an update will turn the GPU from 0.35 TFLOPS to 1.5 TFLOPS. I mean, seriously, the guy probably writes fan fiction about this shit.
 
Does this mean they have a new source or something? It sounds really good, but I really want to know their sources.
I suspect that guy's just a troll. What kind of justification is this???
The Wii U has more than 320 shaders; I have counted them in game. Plus, if the Wii U was just 3 overclocked Wiis with a Radeon 4650, then the Wii U would just be a 7th gen system, not an 8th.
 
Yes, but it's clear you don't understand it.

It's better if you go back to posting screenshots. Lol

Is that so?

Efficiency is not using less power. It is performance per watts.

In the very text you quoted.
What is it capable of for the amount of power it consumes.

That redundancy made it clear you didn't, and combined with your response just now, it has told me all I ever need to know about you.
 
I agree, I just find it really odd. Where did we get the 160 shaders from? (Or whatever else we currently believe it is.)

That is a theory, though only the person who originally put it forward in this thread is supporting it for a legitimate reason. The rest push for the possibility to be accepted as absolutely unquestionable fact, without challenge, for no other reason than that it gives them the ability to mispromote the Wii U as having fewer shaders than the 360, and thus forward the misconception that it is using weaker hardware with inferior graphics capabilities, which is far from the case even if the theory turns out to be true.

They know most people are just going to look at the number and go more = stronger, lower = weaker, like they did with the Wii U's CPU clock and RAM clock.
 
No, Sony at the start of this gen was a console company, with nearly all the other divisions posting losses.
Two years ago, it was an insurance company, with all the other divisions posting losses.

After the restructuring of the company, other divisions have become profitable, but not the videogame one (which is why some shareholders wanted to close that division). Sony lost nearly as much money during the PS3 generation as it earned during the PS1 + PS2 ones, and yet nobody here suspects them of quitting the business.

In fact, the whole videogame division has also been restructured, because the philosophies changed too. While Sony always wanted customized hardware for their consoles, both the Vita and the PS4 use off-the-shelf parts with zero customization. Hell, next year we will see the PS4 APU (maybe cut in half) being sold in tons of mobile PCs, and the PS Vita is just a generic mobile phone with more CPU cores and a modern mobile GPU. All of that is done to cut expenses and save money.

Keep your facts and reality out of this thread, sir. ;)
 

People don't seem to recognize there is a difference between x86 and PPC.

So, what is your theory on the specs on CPU and GPU?
 
The max it can draw is 45 watts. These were the words of Iwata himself, and I see no reason to disregard them.

He said 70w max, 40w average use.

It doesn't draw 33w; it draws less than that. The power consumption readings include the PSU. Even the best PSUs only convert 90% in consumer products, so 33w at 90% means the Wii U is really using 29.7 watts.

That's...not how it works.

In any case, Anandtech confirmed the Wii U (as in, the console) uses ~33w at boot and while playing various launch games. Not that the PSU uses 33w; the console sucks in 33w from the PSU.

What I was pointing out was that it says on the console "Rating 15v 5a. AC Output 15v 5a", which I thought was interesting, as everyone has only ever looked at the PSU.
 
Guys, stop. The Wii U GPU isn't even close to 1.5tf or whatever that site says.

The laws of physics and thermodynamics never lie. Ever. It's best not to pay attention to garbage like that and to trust the physics.
 
It is how it works. As someone in this thread even did, just hook your Wii U up to a power meter.
 

I'm not the best choice of people to ask, because I am far from being as knowledgeable as the big analysts in this thread like blu, Fourth Storm, Zomie and BG Assassin.

From what I can tell, though (and the reason I don't roll with the 160-shader theory), the SP count per component should be around 24, 28, or 32 (what the 360's is), due to the components on the GPU being 90% larger than AMD's 20-SP components.

The 160-shader theory is founded on the belief that there are 8 TMUs in the Wii U and that all the blocks are 20-ALU components. I just find that unlikely given that they are nearly twice the size, because one of the key facts we got from the devs in the Iwata Asks on the Wii U is that there is no wasted silicon and that they targeted a low power draw. 20-ALU components that are 90% larger than stock 20-ALU components would be one hell of a waste of silicon and energy.

There are a lot of other theories supporting it, though. The biggest is power draw, which is why the max power consumption is being discussed.

I'm looking toward the possibility that the TMUs on Latte have more shaders packed per square micrometer than the standard ones, due to the tech being more modern and more advanced.

The die size is also still kind of up in the air as well, so that could be a contributing factor.
 
It is how it works. As someone in this thread even did, just hook your Wii U up to a power meter.


Why are people using 33w to base their TDP calculations on, then?

Are we really assuming a 75w PSU can only actually deliver 29w to the device?

That's an efficiency rating of 38%.
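One way to keep the two numbers straight: the 75w on the label is the maximum the PSU can deliver, while efficiency only relates actual input to actual output at whatever the load happens to be. A rough sketch, assuming the ~90% figure floated above:

```python
def dc_load(wall_watts: float, efficiency: float = 0.90) -> float:
    """Estimated DC power reaching the console for a given wall-meter reading."""
    return wall_watts * efficiency

# ~33w measured at the wall during gameplay:
print(dc_load(33.0))  # 29.7
# The 75w label is a capability ceiling, not a draw; 29.7/75 isn't an
# efficiency figure at all -- it's just headroom below the rating.
```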
 
He said 70w max, 40w average use.
Thanks for the correction.
Here is a source.
...banned source... sigh... well. I think it's in a Nintendo Direct somewhere.

Interestingly, the source says that the Wii U draws 40w at the dash and can pull up to 75, which doesn't make sense... because during gameplay you hit ~33w at the plug... meaning the console itself (minus the PSU) is pulling around 25w or so...
Notsureifsarcasm? Just in case, I was asking a genuine question, not assuming/implying anything :)
No sarcasm. As I said above... the PSU is capable of 75 watts, but it's never anywhere near that.
 

USC is right. It measures 33w at the wall, so you have to grant a few watts for the PSU itself, unless I'm missing something. Nothing connects to the Wii U itself without the PSU...

The Wii U also has 4 USB slots, so in theory you could connect 4 USB HDDs at once, which is why the PSU rating leaves so much breathing room.
 

It's either 2 Y-connected or 4 with their own power supplies... so it shouldn't draw 75w with 4 HDDs, or with 2 Y-connected HDDs.
 