WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

It was said 720p was chosen because it performs better at that, but it would still perform better because 720p uses fewer resources than 1080p. Unless you really love 720p above all else, there is no reason not to include 1080p.

?! You're making less and less sense.
 
It was said 720p was chosen because it performs better at that, but it would still perform better because 720p uses fewer resources than 1080p. Unless you really love 720p above all else, there is no reason not to include 1080p.

I can't tell if you're being intentionally obtuse or what. 720p requires less powerful hardware. Less powerful hardware is generally smaller/cheaper. When you put the hardware together, your GPU, available bandwidth, etc. are going to determine how well games of various complexities and asset qualities will run at whatever resolution.

In the Wii U's case 720p is ideal, but nothing is stopping someone from making a 1080p (or 480p or whatever) game if it runs well at that resolution. It's not really any different than the PS4 not going 4K over 1080p.
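For a rough sense of scale, here's a back-of-the-envelope sketch (the 480p width assumes 16:9 framing); shading and fill cost scale roughly with pixel count, all else being equal:

```python
# Per-frame pixel counts at common output resolutions, relative to 720p.
resolutions = {"480p": (854, 480), "720p": (1280, 720), "1080p": (1920, 1080)}

base = resolutions["720p"][0] * resolutions["720p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the 720p load)")
```

1080p works out to 2.25x the pixels of 720p, which is roughly the extra fill/shading headroom you'd need to render the same scene.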
 
In the Wii U's case 720p is ideal, but nothing is stopping someone from making a 1080p (or 480p or whatever) game if it runs well at that resolution. It's not really any different than the PS4 not going 4K over 1080p.


You know, I'd love for a developer to add a 480p or custom res option. It'd be nice to trade quality for framerate sometimes.

Resolution is highly overrated IMO; it's the fidelity at said resolution that matters. We watched SD movies for decades on DVD and they looked stunning.
 
I can't tell if you're being intentionally obtuse or what. 720p requires less powerful hardware. Less powerful hardware is generally smaller/cheaper. When you put the hardware together, your GPU, available bandwidth, etc. are going to determine how well games of various complexities and asset qualities will run at whatever resolution.

In the Wii U's case 720p is ideal, but nothing is stopping someone from making a 1080p (or 480p or whatever) game if it runs well at that resolution. It's not really any different than the PS4 not going 4K over 1080p.
Then I'm confused how this whole 720p debacle started. I know it uses fewer resources. Creating a console that doesn't focus on 720p will still give you the same results; just don't use 1080p (but then you end up with worse graphics).


Nintendo isn't holding back anything or anyone, the tools simply weren't ready in time. Happened to Sony as well back in 2006.
So from CPU to tools; why was the former even mentioned by Iwata if it clearly has no importance?
 
What coprocessors? The audio DSP is official, but what else?

A fucking GBA is good for mixing 16 channels of CD-frequency audio.

Pretty stupid statement considering the GBA had only an 8-bit DAC at 32.8 kHz, of which often only half or a quarter of the range was used to save on processing power. And most GBA games still used between 25 and 50% of the (single) CPU for sound unless they chose to additionally use the old GB synthesizer.
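To put rough numbers on that CPU cost, here's a minimal cycle-budget sketch; the ARM7 clock is the GBA's stock 16.78 MHz, the output rate is the 32.8 kHz mentioned above, and the per-channel mixing cost is an assumption:

```python
# Rough budget for software-mixing N channels on a GBA-class CPU.
cpu_hz = 16_780_000        # stock ARM7TDMI clock
sample_rate = 32_768       # output rate discussed above (often halved/quartered)
channels = 16
cost_per_channel = 8       # assumed cycles per channel per output sample

budget = cpu_hz / sample_rate            # ~512 CPU cycles available per sample
mixing = channels * cost_per_channel     # cycles spent mixing each output sample
print(f"budget: {budget:.0f} cycles/sample, mixing: {mixing} cycles "
      f"(~{mixing / budget:.0%} of the CPU)")
```

Even at only ~8 cycles per channel per sample, 16 channels eat about a quarter of the CPU, which lines up with the 25-50% figure above.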
 
Sorry I went to sleep not long after posting.

I was going by faulty intel with part of that. I thought Starbuck handled more of the OS load than it apparently does.

The memory architecture is interesting because of how many separate pools with separate speeds there are in the system. I mean, you've got the anemic MEM2 pool of 2 GB DDR3 at 12.8 GB/s, while MEM1/MEM0 is either the 32 MB eDRAM at 70.6 GB/s or the 3 MB SRAM at 100+ GB/s.

It's an odd memory design. Far from the UMA design most devs seem to prefer.


When was memory bandwidth concluded for MEM0/1? What's odd is 3 MB @ 100+ GB/s; don't you think that should be the other way around?
 
Well... those are more projections than actual figures.

The 3 MB SRAM is apparently on a very wide bus, which is where its speed comes from. Necessary for Wii emulation.
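As a worked example of how a wide on-die bus gets you those numbers, here's a small sketch; the bus widths and the ~550 MHz clock are assumptions picked to land near the figures quoted earlier in the thread:

```python
# Peak bandwidth = (bus width in bytes) * (transfers per second).
def peak_bw_gb_s(bus_width_bits, clock_hz, transfers_per_clock=1):
    return bus_width_bits / 8 * clock_hz * transfers_per_clock / 1e9

gpu_clock_hz = 550_000_000  # assumed, roughly the reported Latte clock

print(f"32 MB eDRAM, 1024-bit (assumed): {peak_bw_gb_s(1024, gpu_clock_hz):.1f} GB/s")
print(f"3 MB SRAM, 1536-bit (assumed):  {peak_bw_gb_s(1536, gpu_clock_hz):.1f} GB/s")
print(f"2 GB DDR3-1600 on a 64-bit bus: {peak_bw_gb_s(64, 800_000_000, 2):.1f} GB/s")
```

The point is just that a tiny pool sitting right next to the GPU can be given an absurdly wide interface, which is why the small SRAM can out-bandwidth the big DDR3 pool.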

It's running natively on the Wii U's hardware, so it's not emulation. But yeah, they need it to run Wii software.

Also, GameCube VC could basically run without much port work, since even those games should run natively on the hardware. Just add all the Wii U-specific features and you're good to go. (If that's not the case, feel free to correct me.)
 
It's running natively on the Wii U's hardware, so it's not emulation. But yeah, they need it to run Wii software.

Also, GameCube VC could basically run without much port work, since even those games should run natively on the hardware. Just add all the Wii U-specific features and you're good to go. (If that's not the case, feel free to correct me.)
No it's still emulation.

Just purely hardware based. Any time you're running software on another distinct platform you're in essence emulating the hardware it ran on. Most of the time that's through software means. Nintendo for two generations now literally has a lot of legacy hardware in their systems to allow easier hardware based emulation.

This means those games will run natively and unchanged.
 
When was memory bandwidth concluded for Mem 0/1? What's odd is 3MB @ 100+GB/s, don't you think that should be the other way around.

Not really, small pools of cache with incredibly high bandwidth are pretty much a staple of processor design.

i7 cache for example
 
No it's still emulation.

Just purely hardware based. Any time you're running software on another distinct platform you're in essence emulating the hardware it ran on. Most of the time that's through software means. Nintendo for two generations now literally has a lot of legacy hardware in their systems to allow easier hardware based emulation.

This means those games will run natively and unchanged.

So what you are saying is DSi emulates DS, DS emulates GBA, GBA emulates GBC/GB even though those platforms have the hardware to run those platforms natively?

Emulation is different than what you are saying, AFAIK; it is when you have to emulate on hardware the code was not written for. Having hardware that is natively compatible with the code means you are not emulating. Even hardware emulation requires you to trick the code into running on the hardware in ways it was not intended. This might be true for Wii U since components have changed, but the Wii ran GameCube code natively without having to perform any "tricks".
 
So what you are saying is DSi emulates DS, DS emulates GBA, GBA emulates GBC/GB even though those platforms have the hardware to run those platforms natively?

Emulation is different than what you are saying, AFAIK; it is when you have to emulate on hardware the code was not written for. Having hardware that is natively compatible with the code means you are not emulating. Even hardware emulation requires you to trick the code into running on the hardware in ways it was not intended. This might be true for Wii U since components have changed, but the Wii ran GameCube code natively without having to perform any "tricks".
The DSi is literally a DS with extra functionality. That's not exactly the same thing.

Any time you're talking about architectural changes... even minor ones like increased clocks you are emulating hardware. Nintendo for the past couple of generations has just had settings for that hardware that limits clock speeds, or aligns the hardware in a similar configuration to their prior hardware.

I consider that a form of hardware emulation. It's a strange way to do it (leaving an abundance of legacy hardware in new hardware) but I still consider it emulation. Even if it doesn't follow a strict definition.
 
You know, I'd love for a developer to add a 480 or custom res options. Be nice to trade quality for framerate sometimes.

Resolution is highly overrated IMO; it's the fidelity at said resolution that matters. We watched SD movies for decades on DVD and they looked stunning.
On a good CRT I agree. Unfortunately, most of us game on fixed-pixel displays that require upscaling. I would say that 480p is tolerable at best but far from ideal on my Panasonic plasma.
 
So what you are saying is DSi emulates DS, DS emulates GBA, GBA emulates GBC/GB even though those platforms have the hardware to run those platforms natively?

Emulation is different than what you are saying, AFAIK; it is when you have to emulate on hardware the code was not written for. Having hardware that is natively compatible with the code means you are not emulating. Even hardware emulation requires you to trick the code into running on the hardware in ways it was not intended. This might be true for Wii U since components have changed, but the Wii ran GameCube code natively without having to perform any "tricks".

The DS literally has GBA hardware inside it. The Wii U simply has hardware made to run the same code and act in the exact same manner (roughly) as the Wii. It doesn't have Wii hardware in it.
 
The DSi is literally a DS with extra functionality. That's not exactly the same thing.

Any time you're talking about architectural changes... even minor ones like increased clocks you are emulating hardware. Nintendo for the past couple of generations has just had settings for that hardware that limits clock speeds, or aligns the hardware in a similar configuration to their prior hardware.

I consider that a form of hardware emulation. It's a strange way to do it (leaving an abundance of legacy hardware in new hardware) but I still consider it emulation. Even if it doesn't follow a strict definition.

Ah, I see; yeah, your definition is widely accepted. Wii U might be slightly different in this regard, as the GameCube/Wii hardware is not directly present; it's more of a blending of Radeon technology and Flipper. It should be quite confusing for us, and I think that is why this thread has had so much off-topic discussion; there's just nothing we can really make out of the GPU any longer.
 
Well, we can throw my SPU+TEV theory out the window. Marcan has tweeted that there is logic (and even a tiny 8-bit CPU) on Latte that translates Wii code into stuff a modern Radeon can process. I kind of suspected this as a possibility when the CPU performing translation was ruled out, but I guess my optimism got the best of me once again. I'm still confident in the rest of my analysis, though. Well, having a couple texture coordinate processors in a modern GPU might still be a bit out there, but until those Q blocks are positively identified, they remain a possibility in my eyes. They could also be related to video output, although it would be an odd place for them.
 
Well, we can throw my SPU+TEV theory out the window. Marcan has tweeted that there is logic (and even a tiny 8-bit CPU) on Latte that translates Wii code into stuff a modern Radeon can process. I kind of suspected this as a possibility when the CPU performing translation was ruled out, but I guess my optimism got the best of me once again. I'm still confident in the rest of my analysis, though. Well, having a couple texture coordinate processors in a modern GPU might still be a bit out there, but until those Q blocks are positively identified, they remain a possibility in my eyes. They could also be related to video output, although it would be an odd place for them.

Are you ruling out shader customization entirely?
 
I'm not in any position to say, but it seems like it would be more tweaks at this point. Like how devs were saying that it does some stuff beyond DirectX10.1/SM4.0 but not fully DirectX11/SM5.0.

So essentially we're dealing with a modern Radeon core, tuned for efficiency and compatibility, with a slight to moderate power advantage over previous generation consoles? I guess we'll have to learn to deal.
 
Well, we can throw my SPU+TEV theory out the window. Marcan has tweeted that there is logic (and even a tiny 8-bit CPU) on Latte that translates Wii code into stuff a modern Radeon can process. I kind of suspected this as a possibility when the CPU performing translation was ruled out, but I guess my optimism got the best of me once again. I'm still confident in the rest of my analysis, though. Well, having a couple texture coordinate processors in a modern GPU might still be a bit out there, but until those Q blocks are positively identified, they remain a possibility in my eyes. They could also be related to video output, although it would be an odd place for them.

It is interesting how the issue with the TEVs is solved by just using some logic and a very tiny 8-bit CPU as a translator.
You're getting warmer

You sound as if you know exactly what is going on. I thought all the info you were hiding was now out to the public :D
 
You're getting warmer

I'm with lwilliams. If you have some info, spill it! :D And hey, when did you lose your tag?

So essentially we're dealing with a modern Radeon core, tuned for efficiency and compatibility, with a slight to moderate power advantage over previous generation consoles? I guess we'll have to learn to deal.

In essence, it seems like they are squeezing out ~Xbox360/PS3 performance using less but more sophisticated logic in combination with larger/faster memory pools.

Here's hoping that 3D Mario/Mario Kart/X/Smash make dealing easier, but I still think Nintendo deserves a smackdown for aiming so low.

It is interesting how the issue with the TEVs is solved by just using some logic and a very tiny 8-bit CPU as a translator.

It certainly is. It seems somewhat odd that they still needed to add some additional logic, but it probably does end up amounting to less silicon/complexity in the end than just including a shrunk down Flipper on die. I'd like to try and identify where that 8-bit CPU is out of sheer curiosity. Shame that there doesn't seem to be more interest in Wii U homebrew.
 
The TEVs aren't exactly very complicated compared to a modern GPU.
They're not very complicated, but as far as I understand, they can handle certain operations in one cycle that would require several cycles on modern GPUs. So even if there's some translation going on, the GPU needs to have those shortcuts in physical form.
 
They're not very complicated, but as far as I understand, they can handle certain operations in one cycle that would require several cycles on modern GPUs. So even if there's some translation going on, the GPU needs to have those shortcuts in physical form.

True, but perhaps it is possible if the work is distributed among all the shaders. How many shaders is one TEV equal to?
 
Resolution is highly overrated IMO; it's the fidelity at said resolution that matters. We watched SD movies for decades on DVD and they looked stunning.

This is kind of a false equivalence because those DVDs are captured from film which stores anything between 2k and 8k worth of information (there's also the fact that real life doesn't have any aliasing). The result is quite different from a video captured at 480p from the get go. In a game that information simply doesn't exist when rendering at a lower resolution; MSAA will tidy up the image but doesn't resolve extra detail uniformly. SSAA will, but why bother with supersampling a lower res instead of just increasing it?
 
Yes, and graphics are just a few JPGs. You're really underestimating game audio massively. Here, maybe you want to read a bit:

http://www.fmod.org/fmod-studio.html
http://www.audiokinetic.com/products/208-wwise/
I know, for the most part, what those audio libraries do: they provide event-triggered playback, sound mixing, EQ, and simple effects that run at comparably low quality (compared to an offline studio-quality audio processor) so that they run quickly for real-time purposes. Complexities can of course arise if the game goes a step beyond standard positional audio and calculates audio bouncing and occlusion. I don't think many games do this, but you'd think that a game like BF3, which is so taxing on all fronts, wouldn't be able to afford such audio extravagance if it were very CPU-demanding.
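For illustration, here's a minimal sketch of the kind of per-voice work meant by "standard positional audio": distance attenuation plus a crude occlusion factor. The function, parameters, and numbers are all invented for the example; real middleware layers panning, doppler, per-bus EQ, and so on on top of this.

```python
import math

def voice_gain(listener, source, occluded=False,
               ref_dist=1.0, rolloff=1.0, occlusion_db=-12.0):
    """Illustrative per-voice gain: inverse-distance rolloff plus occlusion."""
    dist = math.dist(listener, source)
    gain = ref_dist / (ref_dist + rolloff * max(dist - ref_dist, 0.0))
    if occluded:                       # e.g. a wall between source and listener
        gain *= 10 ** (occlusion_db / 20)
    return gain

print(voice_gain((0, 0, 0), (4, 0, 3)))                 # clear line of sight
print(voice_gain((0, 0, 0), (4, 0, 3), occluded=True))  # behind geometry
```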
 
In essence, it seems like they are squeezing out ~Xbox360/PS3 performance using less but more sophisticated logic in combination with larger/faster memory pools.

Here's hoping that 3D Mario/Mario Kart/X/Smash make dealing easier, but I still think Nintendo deserves a smackdown for aiming so low.

Well, the way I look at it, you have to make your peace with it or move on from Nintendo. That being said, I am willing to forgive a lot of things if we get some unique and interesting game concepts out of the hardware. If nothing else I'm mighty impressed by the streaming tech and I think there are a lot of fun things that can come out of it if it's leveraged in the right way. Potential is worth a hill of beans, so I guess we'll have to see what plays out.
 
Well, the way I look at it, you have to make your peace with it or move on from Nintendo. That being said, I am willing to forgive a lot of things if we get some unique and interesting game concepts out of the hardware. If nothing else I'm mighty impressed by the streaming tech and I think there are a lot of fun things that can come out of it if it's leveraged in the right way. Potential is worth a hill of beans, so I guess we'll have to see what plays out.

I don't want to get too off topic, but I do largely agree. As for moving on, I may personally do so if Nintendo merely release Super Mario Galaxy 3 and another Zelda that sticks to the OoT formula. Of course, I already have the U and will be picking up whatever games strike my fancy. There is the real possibility, though, that this miscalculation with Wii U, its image, and its level of horsepower will really hurt Nintendo. As Adam Sessler stated in that vlog he did on Nintendo not having a big E3 conference, that's just not a good thing for the industry as a whole. Without Nintendo and its unique combination of software/hardware offerings, there would be a huge gap to fill, and I'm not sure any company is up to the task of filling its boots.
 
I don't want to get too off topic, but I do largely agree. As for moving on, I may personally do so if Nintendo merely release Super Mario Galaxy 3 and another Zelda that sticks to the OoT formula. Of course, I already have the U and will be picking up whatever games strike my fancy. There is the real possibility, though, that this miscalculation with Wii U, its image, and its level of horsepower will really hurt Nintendo. As Adam Sessler stated in that vlog he did on Nintendo not having a big E3 conference, that's just not a good thing for the industry as a whole. Without Nintendo and its unique combination of software/hardware offerings, there would be a huge gap to fill, and I'm not sure any company is up to the task of filling its boots.

No matter what, the Wii U will not sink Nintendo. They have already pre-paid for a ton of inventory and with the yen finally losing ground to the dollar, they will likely turn a profit even if sales remain anemic. They also have a relatively strong year coming with the 3DS.

It's really a matter of them turning out software that attracts attention. I'm not one who thinks the industry sinks or swims at E3, but I think Sessler is right insofar that Nintendo is losing the war for attention from the gaming media, and that will be damaging long term. I'm interested to see the impressions of their E3 Media Event. Personally, I think what they are saying with their E3 schedule is that they think the value of the Wii U is better explained hands on than from a stage.

The risks inherent in not holding a giant E3 event against two console launches are not difficult to discern. It may be more tactical than it appears at face value, however. I think Nintendo would be smart to monopolize as much media time as possible in the two weeks leading to E3. If they hold ten Nintendo Directs, all 10-20 minutes and focused tightly on individual titles or future OS upgrades and platform features, they could end up getting attention for their stories on news sites for a longer period than they normally would by doing rapid-fire announcements at a large presser.
 
No matter what, the Wii U will not sink Nintendo. They have already pre-paid for a ton of inventory and with the yen finally losing ground to the dollar, they will likely turn a profit even if sales remain anemic. They also have a relatively strong year coming with the 3DS.

It's really a matter of them turning out software that attracts attention. I'm not one who thinks the industry sinks or swims at E3, but I think Sessler is right insofar that Nintendo is losing the war for attention from the gaming media, and that will be damaging long term. I'm interested to see the impressions of their E3 Media Event. Personally, I think what they are saying with their E3 schedule is that they think the value of the Wii U is better explained hands on than from a stage.

The risks inherent in not holding a giant E3 event against two console launches are not difficult to discern. It may be more tactical than it appears at face value, however. I think Nintendo would be smart to monopolize as much media time as possible in the two weeks leading to E3. If they hold ten Nintendo Directs, all 10-20 minutes and focused tightly on individual titles or future OS upgrades and platform features, they could end up getting attention for their stories on news sites for a longer period than they normally would by doing rapid-fire announcements at a large presser.

If you think about last year's E3, people generally liked the Nintendo Directs but were critical of the stage event due to:

1) A lot of the information was already covered in the pre-E3 Nintendo Directs (which proved that the media was generally paying attention to them).

2) They tried to address too many types of gamers in the same conference.

They tried to cover so many things that they didn't really completely satisfy anyone. They even had to shift a lot of the 3DS stuff into a special ND (which was generally well received). In retrospect, Iwata was probably like, "Instead of spending so much time on Nintendo Land, we could have done that with a 20 min ND," and so on. With this new approach, Nintendo can potentially set up hours of stuff about 3DS and Wii U and split it into different E3 events with the appropriate audience.

Anyway, back on topic.

I know, for the most part, what those audio libraries do: they provide event-triggered playback, sound mixing, EQ, and simple effects that run at comparably low quality (compared to an offline studio-quality audio processor) so that they run quickly for real-time purposes. Complexities can of course arise if the game goes a step beyond standard positional audio and calculates audio bouncing and occlusion. I don't think many games do this, but you'd think that a game like BF3, which is so taxing on all fronts, wouldn't be able to afford such audio extravagance if it were very CPU-demanding.

Actually... that could have been a reason why the FB engine was having trouble running on the Wii U. The system was designed to let its DSP handle most sound tasks, while that engine was likely not optimized to use it.
 
At this point, I see the Wii U as an Xbox 360 with better textures/filtering and more advanced graphical features (DirectX 10.1+) sprinkled on top.

I'm fine with that I guess.
 
Actually... that could have been a reason why the FB engine was having trouble running on the Wii U. The system was designed to let its DSP handle most sound tasks, while that engine was likely not optimized to use it.
Possibly. Large parts of the audio engine would need to be rewritten to use the DSP. They'd also need to use Nintendo's proprietary sample format (GCADPCM); any other format would destroy the CPU.
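To show the kind of per-sample work being offloaded, here's a rough sketch of decoding one frame of a GCADPCM-style 4-bit predictive codec. The frame layout, coefficient handling, and rounding are assumptions drawn from homebrew documentation, so treat the details as illustrative; the point is that this multiply/accumulate loop is what the DSP does in hardware for every voice, and what the CPU would otherwise have to do itself.

```python
def clamp16(x):
    return max(-32768, min(32767, x))

def sign4(n):                  # sign-extend a 4-bit nibble
    return n - 16 if n >= 8 else n

def decode_frame(frame, coefs, hist1, hist2):
    """Decode one assumed 8-byte frame (header + 7 data bytes -> 14 samples)."""
    shift = frame[0] >> 4                 # scale exponent (assumed layout)
    c1, c2 = coefs[frame[0] & 0x0F]       # predictor coefficient pair (assumed)
    out = []
    for byte in frame[1:]:
        for nibble in (byte >> 4, byte & 0x0F):
            delta = sign4(nibble) << shift
            sample = clamp16(((delta << 11) + c1 * hist1 + c2 * hist2 + 1024) >> 11)
            out.append(sample)
            hist1, hist2 = sample, hist1  # carry prediction history forward
    return out, hist1, hist2
```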
 
Actually... that could have been a reason why the FB engine was having trouble running on the Wii U. The system was designed to let its DSP handle most sound tasks, while that engine was likely not optimized to use it.
In all honesty, though, it's not like they had trouble: they tested it with FB2, realized the architecture wasn't of the same nature as PS3/X360, and called it a day, all before assigning proper resources to it or investing any time.

That's kinda like when you test drive a car you don't want just to enforce your made-up opinion. I call that barely trying.


If they stuck with it who knows what kind of trouble they'd run into, but preliminarily they were just lazy (and perhaps a little bit prejudiced), and their parent company just let them because it seems to fit their agenda nicely.

I'm guessing FB3 on PS3/X360 isn't a walk in the park either, but EA would force them to do it regardless of their opinion regarding it. (BF3 on PS3 has all that input lag, far from an optimal port)
 
At this point, I see the Wii U as an Xbox 360 with better textures/filtering and more advanced graphical features (DirectX 10.1+) sprinkled on top.

I'm fine with that I guess.


No, I see it as an extension of the GameCube design and philosophy, which now allows for HD gaming.
 
The TEVs aren't exactly very complicated compared to a modern GPU.
True that. You could probably do a completely simplistic lookup-based "emulation" of TEV stage config => shader code. Worst case you're probably looking at 5 hardware instructions per emulated TEV stage*. The resulting shader code won't be optimal, but the GPU can just burn through it with brute force; it's much wider and higher clocked anyway. Something like 80 cycles per pixel at worst vs 32 cycles on a real Hollywood in the same worst case.

Maybe that's what the speculated/rumored "8 bit CPU" is doing: remap TEV stage configs to shader instructions in the simplest possible way.

If they aim for cycle-perfect emulation, they'd probably need some extra logic in the GPU to throttle its pixel shading speed down to Hollywood levels.


*Not sure about actual hardware details, but I haven't seen constant pre-scaling/post-scaling (factors of 1/8, 1/4 ... 2, 4, 8) operations exposed in anything past GeForce FX 5xxx/Radeon 8500; these alone can account for two ops blown per TEV stage. Then you have to account for a texcoord perturb + dependent read for EMBM. That's another two. And then the "main" color combine part of the op for the fifth.

Even controlled degradation of channel precision can be emulated with shader OPs, but I'd wager the hardware can do this internally, with trivial logic overhead, and it's probably not even necessary for a pixel-perfect result. Alpha test fail/pass conditions based on math with a very specific channel precision is the only case where "too much" precision can cause artifacts IME.
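To make the lookup idea concrete, here's a toy sketch of expanding one TEV stage config into that kind of instruction sequence. The field names and op mnemonics are invented for illustration; a real translator would target actual Radeon shader instructions.

```python
# Toy expansion of one TEV stage config into generic shader ops, following the
# ~5-instructions-per-stage estimate above. Everything here is illustrative.
def expand_tev_stage(stage):
    ops = []
    if stage.get("indirect"):                           # EMBM-style indirect stage
        ops.append(("ADD", "tc", "tc", "bump_offset"))  # perturb the texcoord
        ops.append(("TEX", "tex", "sampler", "tc"))     # dependent texture read
    if stage.get("prescale", 1) != 1:                   # no free input scale on target
        ops.append(("MUL", "a", "a", stage["prescale"]))
    # core combine: roughly d + lerp(a, b, c), the canonical TEV operation
    ops.append(("LRP_ADD", "out", "a", "b", "c", "d"))
    if stage.get("postscale", 1) != 1:                  # 2x/4x/0.5x output scaling
        ops.append(("MUL", "out", "out", stage["postscale"]))
    return ops

# Worst case: indirect stage with both scales -> 5 ops; 16 stages -> 80 ops.
print(expand_tev_stage({"indirect": True, "prescale": 2, "postscale": 0.5}))
```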
 
At this point, I see the Wii U as an Xbox 360 with better textures/filtering and more advanced graphical features (DirectX 10.1+) sprinkled on top.

I'm fine with that I guess.

Using that logic, the PS4 would be an Xbox 360 with even better textures/filtering, graphical features, and DX11 sprinkled on top.
 
Well, we can throw my SPU+TEV theory out the window. Marcan has tweeted that there is logic (and even a tiny 8-bit CPU) on Latte that translates Wii code into stuff a modern Radeon can process. I kind of suspected this as a possibility when the CPU performing translation was ruled out, but I guess my optimism got the best of me once again. I'm still confident in the rest of my analysis, though. Well, having a couple texture coordinate processors in a modern GPU might still be a bit out there, but until those Q blocks are positively identified, they remain a possibility in my eyes. They could also be related to video output, although it would be an odd place for them.

Even if this is true, how does this logic change your view on the GPU from before? Would this logic take up a lot of die space?
 