WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Status
Not open for further replies.
I believe the Wii U GPU needs a re-review. What if the Wii U GPU uses AMD-based GPU technology (not ATI)? I point this out because on every page I read about the RV7... But it conflicts, so there is confusion. Is the entire area of the GPU explained? Or might it hide more SPUs or other features?

ATI and AMD are the same company. Another review wouldn't change anything about our base assumptions.
 
and what about this http://www.nintendo.com/wiiu/features/tech-specs/

and this http://www.amd.com/uk/products/desktop/graphics/ati-radeon-hd-5000/Pages/ati-radeon-hd-5000.aspx

Maybe they're pointing to the base chipset they used for it.

And how about that

I believe the Wii U GPU needs a re-review. What if the Wii U GPU uses AMD-based GPU technology (not ATI)? I point this out because on every page I read about the RV7... But it conflicts, so there is confusion. Is the entire area of the GPU explained? Or might it hide more SPUs or other features?
ATI has been absorbed into AMD, so I'm not sure where you're going with that.

From what we are seeing, it is best not to get too caught up in whatever the base of the chip is, because the GPU has been customized into something very different. We know that development for the GPU started in 2008/2009 (which matches the time when the R7xx series was AMD's focus), but we are seeing some parts that look similar to ones in Brazos/Llano, which were released in 2011. We are still unsure about several parts of the GPU, like the location or lack of ROPs. Unless an actual Wii U dev gives out more info, we are near the limit of what we can figure out with the info we have now.
 
Given all that I have seen, I'm leaning heavily towards this being derived from the HD 5550. The numbers just fall right into place.

I don't understand why people are so hung up on insisting that it is a 4XXX chip. It's like they don't even want to humor the idea of it being a more capable chip than the low-end guess. That entire line of options is just completely dismissed without study.

For this basic comparison I will be using the HD 46XX, since Digital Foundry was so "certain" that this lower-end chip is the basis, and since most people I've encountered believe it to be the 100% unquestionable, end-of-story truth... It's as if people have some stock in Latte being as bad as possible.

http://www.amd.com/uk/products/desk...hd-5000/hd-5550/Pages/hd-5550-overview.aspx#2
http://www.amd.com/us/products/desk.../Pages/ati-radeon-hd-4600-specifications.aspx

Just a short comparison.

1. Latte is 40 nm

46XX - 55 nm
5550 - 40 nm

2. Latte uses DX11-equivalent graphics

46XX - DX10.1
5550 - DX11

3. Latte is clocked at 550 MHz

4650 - 600/725 MHz
5550 - 550 MHz

4. Latte is theorized at 352 GFLOPS

46XX - 384 GFLOPS
5550 - 352 GFLOPS

5. While neither chip is as low on energy consumption as the Wii U, the HD 5550 has a lower watt draw, and we know that Nintendo was targeting energy efficiency and lower cost.

6. The HD 5550 has much larger and more efficient multi-display support.

I also remember some mention of 320 stream processors, which the HD 5550 has as well, but I can't find it.
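For what it's worth, the GFLOPS figures in point 4 follow directly from the shader count and clock, assuming the usual two FLOPs (one multiply-add) per stream processor per cycle on these VLIW5 Radeon parts. A quick sanity check:

```python
# Peak single-precision throughput for a Radeon-style GPU:
# each stream processor issues one multiply-add (2 FLOPs) per cycle.
def gflops(stream_processors, clock_mhz, flops_per_cycle=2):
    return stream_processors * clock_mhz * flops_per_cycle / 1000.0

print(gflops(320, 550))  # Latte theory / HD 5550 at 550 MHz -> 352.0
print(gflops(320, 600))  # HD 4650 at 600 MHz core           -> 384.0
```

So the theorized 352 GFLOPS is exactly what 320 SPs at 550 MHz produce, which is why the HD 5550's numbers line up so neatly.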

In the first post it lists "Trisetup and rasterizer (R800 dropped that and delegated the workload to SPs)", but we know this is a heavily customized chip, so that could very well be the result of customization to lower cost and make it more compatible with the Wii.

Now, this is a long shot in the dark right here, but another thing of note is that the RAM supported by the HD 46XX was GDDR3/DDR3/DDR, while for the HD 5550 it was GDDR5/DDR3. GDDR3 was used for the Wii, making it the more natural choice for the Wii U, but there is no GDDR3 HD 5550 model. DDR3 would have been the cheaper and more energy-efficient of the supported RAM types. This may explain why they went with DDR3 over GDDR3. The DDR3 version of the HD 4650 seems to have the worst energy consumption and the highest clock.
http://www.cnet.com/graphics-cards/ati-radeon-hd-4650/4507-8902_7-33780428.html

The biggest thing that makes me lean towards the HD 5550 is performance per watt. The HD 5550 is all-around superior; the HD 46XX is just too power hungry. Finally, the actual "price" of the HD 5550 is far lower than the HD 46XX's. The lowest I saw an HD 4650 for was $95, while the highest I saw an HD 5550 for was $59. Why would Nintendo go for a higher-cost, less energy-efficient card with inferior multi-display tech and lower real-world visual performance? Combine this with the specs and it's just common sense to me.

Well, that's my very non-professional analysis.
 
Doesn't matter. That doesn't change performance.

It's based on R700, and there are 40 nm versions of this card. What we know about the features the GPU supports matches this feature set completely.

I do agree it would likely be really close to HD 5550 performance if it does have 320 ALUs.

DX11-equivalent functionality is just PR talk; it doesn't mean anything.
 
Now I'm really confused, haha. What did I ask?

I must have missed something, as I don't recall Nintendo claiming specific costs for parts. And whose results are you talking about?

...


Given all that I have seen, I'm leaning heavily towards this being derived from the HD 5550. The numbers just fall right into place.

It's clear the chip was heavily altered, so those numbers from stock cards don't mean much anyway. If we're ever going to find out what it's capable of, it will have to be from how games look, or from a dev leaking some info.
 
Doesn't matter. That doesn't change performance.

It's based on R700, and there are 40 nm versions of this card. What we know about the features the GPU supports matches this feature set completely.

I do agree it would likely be really close to HD 5550 performance if it does have 320 ALUs.

DX11-equivalent functionality is just PR talk; it doesn't mean anything.
The GPU apparently has an extended instruction set compared to a regular R700, so it's actually a custom and unique superset of R700 - just like Espresso is a superset of the PPC750. The R700 feature set is just a baseline.
 
LOL. Why didn't you just quote that to begin with? I was trying to find out who Orionas was referring to when he said "Because ur results", so I'm assuming he was talking about someone specifically.

To annoy you... Seriously, I don't know. I either thought the discussion was still about that, or I just messed up with the quote. Though, you could have guessed I was talking about that, no? It was just one post higher. But anyway, never mind.
 
To annoy you... Seriously, I don't know. I either thought the discussion was still about that, or I just messed up with the quote. Though, you could have guessed I was talking about that, no? It was just one post higher. But anyway, never mind.

LOL. How could I guess at something I didn't even remember? Let this be a lesson to people: don't overwork yourself for extended periods of time. Your memory will suffer.
 
The GPU apparently has an extended instruction set compared to a regular R700, so it's actually a custom and unique superset of R700 - just like Espresso is a superset of the PPC750. The R700 feature set is just a baseline.

Is this based on some new info or just what's already publicly known? I mean, I've always assumed it would have extra features over R700, and comments from certain sources in the past have suggested this, but I'd be interested to know if you've heard something new on the subject.
 
Did anyone ever come up with a solid theory as to what the '30%' of unknown chip space may bring to the full picture?

First post here by the way. Thanks for having me.
 
I guess it could mean that they've built in something new/different/unconventional that we'll find out about at some point. But it seems more likely to me that he's referring to developers tapping into their creativity and coming up with excellent art styles to complement whatever a console is capable of.
 
I think this quote from Iwata needs to be looked into a little bit more.

"The other point is that many of our third-party software developers have been dedicated to technologies like shaders. As Wii U is designed to bring out their real strengths"

"Wii U designed to bring out their real strengths"???? What does that mean... What could it mean? Is there a way to do shaders better? "Bring out their real strengths"? Man, Nintendo has been dropping hint after hint and we can't decipher them.
Mate, I think you're looking into this way too much for something that isn't there. Why do people always assume that Nintendo is hiding something super-secret that is going to blow their socks off when it's revealed? Most of the time, this isn't the case.
 
I think this quote from Iwata needs to be looked into a little bit more.

"The other point is that many of our third-party software developers have been dedicated to technologies like shaders. As Wii U is designed to bring out their real strengths"

"Wii U designed to bring out their real strengths"???? What does that mean... What could it mean? Is there a way to do shaders better? "Bring out their real strengths"? Man, Nintendo has been dropping hint after hint and we can't decipher them.

Miyamoto said that, not Iwata.

Mate, I think you're looking into this way too much for something that isn't there. Why do people always assume that Nintendo is hiding something super-secret that is going to blow their socks off when it's revealed? Most of the time, this isn't the case.

Nintendo has this weird habit of keeping important things secret.
 
I think this quote from Iwata needs to be looked into a little bit more.

"The other point is that many of our third-party software developers have been dedicated to technologies like shaders. As Wii U is designed to bring out their real strengths"

"Wii U designed to bring out their real strengths"???? What does that mean... What could it mean? Is there a way to do shaders better? "Bring out their real strengths"? Man, Nintendo has been dropping hint after hint and we can't decipher them.
He (Miyamoto) is probably just saying hey, we support shaders this time (unlike Wii's TEV)!

Even if they had DX11-level stuff, the performance will probably keep its use to a minimum compared to the others.

And anything more custom than that probably won't be used much by third parties just like TEV's fancier stuff anyway.
 
All he's saying is that third parties have been dealing with programmable shaders for years, which wasn't possible on the Wii. Now the Wii U has programmable shaders, which will play to the developers' strengths, since they've been doing this for many years.
 
Totally random, but I tried to match similar blocks between Latte and Llano by looking at the basic structure, and especially the dual-port SRAM cells. Evidently, dual-port SRAM appears in a different, darker color compared to single-port SRAM, so it's easy to spot on the Llano die shot. So far, I found two that seem similar enough:

- Block D (Llano: bottom left corner)
- Block F (Llano: bottom right corner)
 
Totally random, but I tried to match similar blocks between Latte and Llano by looking at the basic structure, and especially the dual-port SRAM cells. Evidently, dual-port SRAM appears in a different, darker color compared to single-port SRAM, so it's easy to spot on the Llano die shot. So far, I found two that seem similar enough:

- Block D (Llano: bottom left corner)
- Block F (Llano: bottom right corner)

Assuming it's the same "part" on Llano and Espresso, what does that mean in the end?
 
I never said anything about secret sauce; maybe it's him spouting PR BS... But that seems like a weird way to word it. Why say it that way? Why not just say the Wii U uses shaders now, compared to the Wii? And like someone said below, Nintendo is secretive. We still don't know the specs of the Wii U, but we know what the PS4 will have and pretty much what the next Xbox will have.

The implications of the statement were pretty clear. The big deal with the Wii not being modern was its lack of programmable shaders...
 
NO ONE DOES. Plain and simple. Nintendo developed this thing so no one would know what it is. Complete custom job.

Maybe I should simplify: I don't care about how much GigaUltraFloppersTripleA+ it can put out; I, like most normal folk, care about how it'll perform. So:

1) Can it hold its own against Orbirango?
2) Can visuals look significantly better on it than PS360?
 
So you don't care about its specs, but want to know how it will perform?

1)No
2)Somewhat
 
Maybe I should simplify: I don't care about how much GigaUltraFloppersTripleA+ it can put out; I, like most normal folk, care about how it'll perform. So:

1) Can it hold its own against Orbirango?
Against Orbis: not as wide as Wii -> PS3, but still quite considerable.

Against Durango: we don't know yet, but it seems like Wii U is almost a stop gap between last gen and Durango.

2) Can visuals look significantly better on it than PS360?
PS4 demos didn't look significantly better than the last gen to me, so probably no?

---
In the end, however, most things will probably boil down to how much Wii U manages to sell and what type of customers it can snatch.

If it sells rather well, and to core gamers in particular, we will most probably see most multiplats on the system; it seems to be that capable, at least for the first few years.
 
Maybe I should simplify: I don't care about how much GigaUltraFloppersTripleA+ it can put out; I, like most normal folk, care about how it'll perform. So:

1) Can it hold its own against Orbirango?
2) Can visuals look significantly better on it than PS360?

No, and no; noticeably better, sure, but significantly better, probably not.

PS4 demos didn't look significantly better than the last gen to me, so probably no?

I'd say they did. Diminishing returns are starting to hit, but I'd still place them at significantly better.
 
Yes, ground-up games (Nintendo first and second party) will look significantly better. But just look at Need for Speed U. All they did was use the extra RAM and apply some PC textures, and we have a game that (to me) looks noticeably better than its PS360 counterpart. I like the fact that the GPU and CPU are custom and can't be recognized by (so-called) tech experts. Just like the developers of Need for Speed said, it punches WAY above its weight (I added the WAY; they just said "above its weight"). I expect ground-up games from Nintendo that are developed to show off graphics (Zelda, Metroid, F-Zero, Star Fox, Smash Bros.) to look amazing and significantly better than this gen, not just because of power, but because Nintendo knows their system better than anyone else, and how to take advantage of its custom parts and art style.

Do you think built from the ground up Nintendo games (or any game that pushes the Wii U hardware) could look graphically as good as a PS4 game or a little worse?
 
Do you think built from the ground up Nintendo games (or any game that pushes the Wii U hardware) could look graphically as good as a PS4 game or a little worse?
Not very likely, but who knows? We don't really understand the chip after all, and two generations ago, Nintendo had a platform that looked rather weak on paper, with much smaller chips and much lower power consumption, yet it managed to keep up very well.

Either way, I really don't think we've seen what the system is truly capable of.
 
Maybe I should simplify: I don't care about how much GigaUltraFloppersTripleA+ it can put out; I, like most normal folk, care about how it'll perform. So:

1) Can it hold its own against Orbirango?
2) Can visuals look significantly better on it than PS360?

My thoughts? The average consumer won't be able to tell the difference between a Wii U and an Orbirango title without doing a side-by-side comparison or having someone point it out. Average consumer = minimal difference.

Those who play lots of games but aren't really into technology will be able to tell the difference fairly easily. However, the difference won't necessarily be great enough to heavily influence purchasing decisions. These people = moderate difference.

Those who are really into graphics & technology; those who can easily tell what techniques & effects are being used from screenshots & short video clips will say the difference in quality is significant and fairly large.

The size of these differences will also depend on how much effort developers put into each platform. I expect most 3rd parties to put more effort in Orbirango titles than Wii U ones. However, I don't think we'll see a PS3/XBOX360 vs. Wii situation. A few 3rd parties will at least attempt to treat the platforms equally.

As for 1st parties, I expect Sony's developers to push the system. Nintendo's won't. They'll have their artists do the heavy lifting instead. Nintendo games, while often beautiful, never really make you question how they got their system to do x,y,& z. You realize what can be achieved when a studio makes good use of the system's features, but that's it.
 
Not very likely, but who knows? We don't really understand the chip after all, and two generations ago, Nintendo had a platform that looked rather weak on paper, with much smaller chips and much lower power consumption, yet it managed to keep up very well.

Either way, I really don't think we've seen what the system is truly capable of.

I agree, that's why it really makes me think of how 3D Mario will look on Wii U.
 
My thoughts? The average consumer won't be able to tell the difference between a Wii U and an Orbirango title without doing a side-by-side comparison or having someone point it out. Average consumer = minimal difference.

Those who play lots of games but aren't really into technology will be able to tell the difference fairly easily. However, the difference won't necessarily be great enough to heavily influence purchasing decisions. These people = moderate difference.

Really? Not able to tell the difference between systems that are 4-5x as powerful? Maybe you could say that about the first-generation games, especially the cross-gen ones. Some may say they won't be able to tell the difference between the PS4 and PS3 versions! We'll see when Watch Dogs and AC4 come out; the difference on those titles may be hard for casuals to see. Come the third-gen games, it will be night and day. Even more night and day when the games don't even launch on the Wii U because it wouldn't be able to run these third- and fourth-gen games at all. Not only won't the GPU be able to handle them at a framerate above 10 fps, but you also can't fit 6+ GB of assets in 1-1.5 GB of RAM.

The Wii U is a lot closer to PS3/360 than it is to the next-gen systems. The Wii U is basically 2x as powerful as PS3/360, whereas the next-gen systems are 8-10x as powerful as their predecessors. It would be a damn shame if the same games could run on the Wii U, because that means they could run on PS3/360 too. Cross-gen games for the whole gen... yeah, that would be disappointing.
 
Really? Not able to tell the difference between systems that are 4-5x as powerful? Maybe you could say that about the first-generation games, especially the cross-gen ones. Some may say they won't be able to tell the difference between the PS4 and PS3 versions! We'll see when Watch Dogs and AC4 come out; the difference on those titles may be hard for casuals to see. Come the third-gen games, it will be night and day. Even more night and day when the games don't even launch on the Wii U because it wouldn't be able to run these third- and fourth-gen games at all. Not only won't the GPU be able to handle them at a framerate above 10 fps, but you also can't fit 6+ GB of assets in 1-1.5 GB of RAM.

The Wii U is a lot closer to PS3/360 than it is to the next-gen systems. The Wii U is basically 2x as powerful as PS3/360, whereas the next-gen systems are 8-10x as powerful as their predecessors. It would be a damn shame if the same games could run on the Wii U, because that means they could run on PS3/360 too. Cross-gen games for the whole gen... yeah, that would be disappointing.

I think part of his point is that Joe Blow isn't even going to see the difference between PS3/360 and the PS4/720. The fact WiiU falls in the middle means they won't see the difference there either.

If you think otherwise, you're overestimating how much the average person knows and what they look for.
 
Here's something that's been bugging me. Chipworks says it's completely custom, but Digital Foundry was claiming up and down that they were right about it being a 4680-based GPU. So, did DF take their word back?
 
My thoughts? The average consumer won't be able to tell the difference between a Wii U and an Orbirango title without doing a side-by-side comparison or having someone point it out. Average consumer = minimal difference.

Those who play lots of games but aren't really into technology will be able to tell the difference fairly easily. However, the difference won't necessarily be great enough to heavily influence purchasing decisions. These people = moderate difference.

Those who are really into graphics & technology; those who can easily tell what techniques & effects are being used from screenshots & short video clips will say the difference in quality is significant and fairly large.

The size of these differences will also depend on how much effort developers put into each platform. I expect most 3rd parties to put more effort in Orbirango titles than Wii U ones. However, I don't think we'll see a PS3/XBOX360 vs. Wii situation. A few 3rd parties will at least attempt to treat the platforms equally.

As for 1st parties, I expect Sony's developers to push the system. Nintendo's won't. They'll have their artists do the heavy lifting instead. Nintendo games, while often beautiful, never really make you question how they got their system to do x,y,& z. You realize what can be achieved when a studio makes good use of the system's features, but that's it.
I would agree if you were talking about Wii U vs. PS360, like Need for Speed, where if you compare them side by side you can see the texture/lighting improvements.

Now we are talking about systems that have 10x the power of those systems.
 
Here's something that's been bugging me. Chipworks says it's completely custom, but Digital Foundry was claiming up and down that they were right about it being a 4680-based GPU. So, did DF take their word back?

Digital Foundry does not know shit about the Wii U tech. The guys were saying that the Wii U version of Trine 2 was the same sub-par HD and identical to the Xbox 360 version of the game. I cannot provide those articles because they took them down after Frozenbyte said that their game was the definitive version, running at 720p/30fps.

edit: I found it; they put up an update after Frozenbyte informed them about their "mistakes" regarding the Wii U version. I do not know how many of you have both versions of the game, but they look NOTHING alike: Wii U version >>>>>>>>>>>>> 360, PS3. IQ, framerate, and native resolution are far better on the Wii U. Why did Digital Foundry, being such experts at more complicated analyses of more complicated games, make those "mistakes"?

http://www.eurogamer.net/articles/digitalfoundry-trine-2-face-off

I will put a small quote.

All three console versions render in 720p [Update: Frozenbyte has now confirmed dynamic resolution scaling on PS3 and 360 to sustain frame-rate vs. locked native resolution on Wii U], so resolution certainly isn't the issue with regards to the overly soft image on the PS3. Instead, the distinct Vaseline-style blur is a result of the anti-aliasing method used in the game. NVIDIA'S FXAA is present on all three consoles, but a cheaper implementation is used on Sony's console, which causes the screen to be covered by a heavy smudging that robs the game's artwork of fine detail and softens the look of foliage dramatically. The FXAA pass is also performed after the HUD has been rendered, so we find that the various on-screen elements are smoothed over too, but particularly heavily on the PS3.

So sub-HD on Xbox 360 and PS3 is now "dynamic resolution"... These guys at Eurogamer are hggggg.
 
Digital Foundry does not know shit about the Wii U tech. The guys were saying that the Wii U version of Trine 2 was the same sub-par HD and identical to the Xbox 360 version of the game. I cannot provide those articles because they took them down after Frozenbyte said that their game was the definitive version, running at 720p/30fps.

edit: I found it; they put up an update after Frozenbyte informed them about their "mistakes" regarding the Wii U version. I do not know how many of you have both versions of the game, but they look NOTHING alike: Wii U version >>>>>>>>>>>>> 360, PS3. IQ, framerate, and native resolution are far better on the Wii U. Why did Digital Foundry, being such experts at more complicated analyses of more complicated games, make those "mistakes"?

http://www.eurogamer.net/articles/digitalfoundry-trine-2-face-off

I will put a small quote.



So sub-HD on Xbox 360 and PS3 is now "dynamic resolution"... These guys at Eurogamer are hggggg.

That's kind of messed up, actually. It's sub-HD if Nintendo's console is involved, but when it's the other consoles, it's "dynamic resolution." Do I smell bias?
 
Here's something that's been bugging me. Chipworks says it's completely custom, but Digital Foundry were just claiming up and down they were right about it being a 4680 base GPU. So, did DF take their word back?

That's kind of messed up, actually. It's sub-HD if Nintendo's console is involved, but when it's the other consoles, it's "dynamic resolution." Do I smell bias?

Don't get me started on the comparison of Call of Duty: Black Ops 2. The lies there skyrocket to the next level. The guys even copy/pasted posts from GAF to explain the "innards of the Wii U" after the shot of the die GAF paid to have analyzed. These guys are totally biased.
 
That's kind of messed up, actually. It's sub-HD if Nintendo's console is involved, but when it's the other consoles, it's "dynamic resolution." Do I smell bias?

There is a difference. Dynamic resolution switches between a higher res and lower ones in order to maintain performance whereas sub-HD is generally used to refer to a fixed rendering resolution lower than 720p. Unless you're talking about an example in which a game with dynamic resolution on the Wii-U was referred to flatly as sub-HD without qualifying that the resolution is dynamic...
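To illustrate the distinction, here's a minimal, hypothetical sketch of the kind of feedback loop a dynamic-resolution renderer runs (the names and thresholds are made up for illustration, not Frozenbyte's actual code): the render scale drops when a frame blows its budget and creeps back toward native when there's headroom.

```python
TARGET_MS = 33.3  # frame budget for 30 fps
MIN_SCALE = 0.5   # never render below half resolution

def update_render_scale(last_frame_ms, scale):
    """Return the render scale for the next frame (1.0 = native 720p)."""
    if last_frame_ms > TARGET_MS and scale > MIN_SCALE:
        return round(scale - 0.05, 2)  # missed budget: shed pixels
    if last_frame_ms < TARGET_MS * 0.9 and scale < 1.0:
        return round(scale + 0.05, 2)  # headroom: recover resolution
    return scale

print(update_render_scale(40.0, 1.0))   # slow frame -> 0.95
print(update_render_scale(25.0, 0.95))  # fast frame -> 1.0
```

A fixed sub-HD game, by contrast, simply renders at one resolution below 720p all the time, which is why the two labels describe genuinely different behavior.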
 
With all the GPU discoveries and confirmations, one thing that has gone unmentioned is the GX2 API used for it. I am most certainly curious whether it's truly a match for OpenGL 3.3 or higher. I'd also love to hear about the Wii U's support for compute shaders, and whether we may see a port of the TressFX tech.
 
Do you think built from the ground up Nintendo games (or any game that pushes the Wii U hardware) could look graphically as good as a PS4 game or a little worse?

I thought Super Mario Galaxy was comparable to 360 and PS3. With custom tech much closer in power, I wouldn't doubt it if we see games look as good, seeing as Nintendo is the best at getting the most out of hardware.

I might be one of the few who think there is a lot of secret sauce in this GPU. What kind of sauce, who knows, but it is definitely a secret.
 