WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

I slightly disagree with Log4Girlz. While I don't expect improvements from 3rd parties with games such as Assassin's Creed, which are most likely pushing the console, I do expect some improvements from Nintendo. IMO, Super Mario 3D World looks good because of its art style, but other than that it doesn't seem to push the Wii U at all. In fact, the same applies to nearly all Nintendo titles, except for the Monolith Soft RPG.
 
Off topic, but I am pretty sure the PS4 will have pretty mind-blowing performance gains over time. I don't think any of the launch titles even use GPGPU or parallel processing or whatnot.

Yes, and I am not saying it won't. In the end the PS4 will have the best graphics of all the platforms; I was referring to gains relative to current results and the hardware. But saying the Wii U is maxed out is nuts.
 
Even though devs are familiar with modern shaders and graphical features, the Wii U does have a unique architecture. It will still take time for developers to tap the full capabilities of the hardware. The original SDKs for the Wii U were rough (GPGPU features not utilized, some launch titles unable to use all of the CPU cores, etc.), so that has to be considered too.

I slightly disagree with Log4Girlz. While I don't expect improvements from 3rd parties with games such as Assassin's Creed, which are most likely pushing the console, I do expect some improvements from Nintendo. IMO, Super Mario 3D World looks good because of its art style, but other than that it doesn't seem to push the Wii U at all. In fact, the same applies to nearly all Nintendo titles, except for the Monolith Soft RPG.

The crazy thing about Nintendo is that they are still getting used to modern shaders and their capabilities. Unlike other companies, they didn't have a decade of experience working on that type of hardware. This means we should see even more impressive games from Nintendo as time goes on.
 
While I don't expect improvements from 3rd parties with games such as Assassin's Creed, which are most likely pushing the console...

http://www.neogaf.com/forum/showpost.php?p=90372446&postcount=1042

How do you explain that? Assassin's Creed 3 vs. Assassin's Creed 4, on Wii U. Less than 12 months in between. No improvements? So much for pushing the console (in the case of AC3).

Ubisoft has mentioned that, moving forward, the Assassin's Creed series will become more "next-generation" vs. this cross-generation title. Let's see how the Wii U version stacks up then. I'd imagine that since you're expecting huge strides on the order of AC3 => AC4 to continue, and since AC4 is not coming close to maxing the Wii U out, future titles should run juuuust fine. Just wait for those huge increases in resolution, geometric detail, and physics that'll blow AC4 out of the water.

Funny that, after being proven wrong, you move your goalposts. In less than 12 months we just saw a noticeable step up. But all of a sudden THIS is now supposed to be the best we can expect. Hilarious.
 
You can call it 'magic', 'special sauce' or whatever you want but this isn't a normal GPU. All you have to do is look at the die shot and see that the ALUs are 90% larger than they should be to realise that.

Are people taking into account that A) not all fabrication processes at the "same" node are actually the same size, i.e. Intel's 45nm is still smaller than TSMC's or GloFo's 40nm, because they are measured differently, laid out differently, and optimized for different things? You can optimize for speed, size, power draw, or make it less prone to failures/bad yields, all of which trade off some of the other things. And B) that this is a relatively unknown fab for the electronics we usually buy?


Even within the same fabrication process at the same factory, you can have dramatically different layout optimizations. And the node size only describes the minimum feature size; it's not as if every single transistor in there is that size.

So I'm not convinced that 90% bigger means secret sauce.
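
To put rough numbers on why "node" comparisons mislead, here's a toy scaling calculation (a sketch assuming ideal scaling, which real processes from different fabs don't follow, which is exactly the point):

Code:
# Toy node-scaling calculator. "Node" names are marketing labels and don't
# map 1:1 to real feature sizes across fabs, so treat this as illustrative.
def ideal_area_ratio(old_nm: float, new_nm: float) -> float:
    # Ideal scaling: linear dimensions shrink by new/old,
    # so area shrinks by the square of that.
    return (new_nm / old_nm) ** 2

print(ideal_area_ratio(45, 40))  # ~0.79: the "same" block, 21% smaller on paper
print(ideal_area_ratio(45, 28))  # ~0.39
# In practice Intel's 45nm can be denser than a competitor's 40nm because
# metal pitch, cell libraries, and design goals (speed vs. density) differ.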
 
I wonder what games will be the graphical showpieces for the Wii U.

I'm thinking:

FPS = Metroid?
Third-person action adventure = possibly Zelda, depending on the art style



I don't know if Metroid will be able to look better than Halo 4, or if the new Zelda will look better than The Witcher 2, Uncharted 2, etc.

I don't think it's entirely about hardware either; you have to take budget and dev resources into account. Does Nintendo have devs like 343 Industries that can create AAA shooters, or devs like Naughty Dog and Sony Santa Monica that can create AAA 3rd-person action games?

As impressive as X is on the Wii U, it does not have the features that make the difference between a next-gen game and a current-gen game. I'm talking about advanced next-gen lighting, high-res textures, increased polycount, next-gen particles, and other effects such as smoke and fire, fluid and material simulation, and physics.
 
Are people taking into account that A) not all fabrication processes at the "same" node are actually the same size, i.e. Intel's 45nm is still smaller than TSMC's or GloFo's 40nm, because they are measured differently, laid out differently, and optimized for different things? You can optimize for speed, size, power draw, or make it less prone to failures/bad yields, all of which trade off some of the other things. And B) that this is a relatively unknown fab for the electronics we usually buy?


Even within the same fabrication process at the same factory, you can have dramatically different layout optimizations. And the node size only describes the minimum feature size; it's not as if every single transistor in there is that size.

So I'm not convinced that 90% bigger means secret sauce.

Even if the size of the space can be explained as you've suggested, that doesn't explain how a 160-ALU part with a power draw of 20 watts can produce the eye candy seen in Pikmin 3, The Wonderful 101, Super Mario 3D World, Mario Kart 8, X, Bayonetta 2 and SSBU. And these are just late first-generation and early second-generation titles, and a fair few of them are 720p native, 60fps, and have v-sync enabled.

A bog-standard GPU with 160 ALUs drawing 20 watts of juice shouldn't be able to do that. I'm still in the fixed-function camp and have been for some time. If anyone can come up with an alternative reason for the GPU clearly punching well above its weight, I'd love to hear it.
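
For context, here is the back-of-the-envelope math behind these figures (a sketch; the 160-ALU count is this thread's working assumption, and FMA-counted peak FLOPS say nothing about fixed-function throughput):

Code:
# Rough peak-FLOPS arithmetic. Assumes each ALU does one fused multiply-add
# (2 FLOPs) per cycle and ignores fixed-function hardware entirely.
def peak_gflops(alus: int, clock_mhz: float, flops_per_alu: int = 2) -> float:
    return alus * clock_mhz * flops_per_alu / 1000.0

print(peak_gflops(160, 550))  # Latte at the assumed 160 ALUs: 176 GFLOPS
print(peak_gflops(240, 500))  # Xenos (360) for comparison: 240 GFLOPS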
 
As impressive as X is on the Wii U, it does not have the features that make the difference between a next-gen game and a current-gen game. I'm talking about advanced next-gen lighting, high-res textures, increased polycount, next-gen particles, and other effects such as smoke and fire, fluid and material simulation, and physics.
While this is probably from a cutscene, it's clearly generated by the game engine, and not CG. I think that we'll see everything in this game that you just mentioned, though I would imagine they'll use high-resolution textures sparingly due to the size of the environments.

 
I wonder what games will be the graphical showpieces for the Wii U.

I'm thinking:

FPS = Metroid?
Third-person action adventure = possibly Zelda, depending on the art style



I don't know if Metroid will be able to look better than Halo 4, or if the new Zelda will look better than The Witcher 2, Uncharted 2, etc.

I don't think it's entirely about hardware either; you have to take budget and dev resources into account. Does Nintendo have devs like 343 Industries that can create AAA shooters, or devs like Naughty Dog and Sony Santa Monica that can create AAA 3rd-person action games?

As impressive as X is on the Wii U, it does not have the features that make the difference between a next-gen game and a current-gen game. I'm talking about advanced next-gen lighting, high-res textures, increased polycount, next-gen particles, and other effects such as smoke and fire, fluid and material simulation, and physics.

I don't know....
Watch the trailers again. There is definitely lighting and some effects I haven't seen on last-gen consoles, especially the scene in the hangar with the mechs and the outdoor locations. Also don't forget that the draw distance is insane and, still, the backgrounds are highly detailed.
 
Even if the size of the space can be explained as you've suggested, that doesn't explain how a 160-ALU part with a power draw of 20 watts can produce the eye candy seen in Pikmin 3, The Wonderful 101, Super Mario 3D World, Mario Kart 8, X, Bayonetta 2 and SSBU. And these are just late first-generation and early second-generation titles, and a fair few of them are 720p native, 60fps, and have v-sync enabled.

A bog-standard GPU with 160 ALUs drawing 20 watts of juice shouldn't be able to do that. I'm still in the fixed-function camp and have been for some time. If anyone can come up with an alternative reason for the GPU clearly punching well above its weight, I'd love to hear it.

160 shaders, but on a much more modern architecture than the previous generation, with a probably many times better tessellator, would alone be able to do things the previous gen could not (even if I were convinced those games are definitely undoable on the PS360). I'm still not buying secret shader sauce.
 
http://www.neogaf.com/forum/showpost.php?p=90372446&postcount=1042

How do you explain that? Assassin's Creed 3 vs. Assassin's Creed 4, on Wii U. Less than 12 months in between. No improvements? So much for pushing the console (in the case of AC3).



Funny that, after being proven wrong, you move your goalposts. In less than 12 months we just saw a noticeable step up. But all of a sudden THIS is now supposed to be the best we can expect. Hilarious.

At what point did I claim the Wii U was maxed out at launch and wouldn't show an iota of improvement? When did I say AC4 was not an improvement over AC3? My claim is that there is little to no improvement left to be made, as AC4 is probably close to its max. You are expecting further improvement on that level.
 
160 shaders, but on a much more modern architecture than the previous generation, with a probably many times better tessellator, would alone be able to do things the previous gen could not (even if I were convinced those games are definitely undoable on the PS360). I'm still not buying secret shader sauce.
The problem is that the tessellator has yet to be put to work in those first-generation engines, and I don't think they're implementing geometry shaders yet.
The modern architecture allows for different techniques that could enable more efficient approaches, but at this point I think those DOF effects are more probably related to those SPUs having something that normal SPUs lack than to the engines being built entirely around those new features.

Log4Girlz said:
At what point did I claim the Wii U was maxed out at launch and wouldn't show an iota of improvement? When did I say AC4 was not an improvement over AC3? My claim is that there is little to no improvement left to be made, as AC4 is probably close to its max. You are expecting further improvement on that level.
AC4 was designed around old-generation hardware, and as I said to you in a previous post:
In other words, could AC4 be rebuilt using geometry shaders, tessellation, and deferred rendering in order to improve the polygon count and the number of real-time lights?
Yes, that could be done on the Wii U, and it is not supported by current-gen engines.
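
To illustrate why deferred rendering changes the light budget, here's a toy cost model (not engine numbers; every constant is made up for illustration):

Code:
# Toy comparison of shading work: classic multi-pass forward lighting
# re-shades the geometry once per light, while deferred shading fills a
# G-buffer once and then shades each light only over the pixels it covers.
def forward_cost(draw_calls: int, lights: int) -> float:
    return draw_calls * lights * 1.0  # one shaded pass per draw per light

def deferred_cost(draw_calls: int, lights: int, light_coverage: float,
                  screen_pixels: int, cost_per_pixel: float = 1e-6) -> float:
    gbuffer_fill = draw_calls * 1.0
    lighting = lights * light_coverage * screen_pixels * cost_per_pixel
    return gbuffer_fill + lighting

# 500 draws, 50 small lights each covering ~5% of a 720p screen:
print(forward_cost(500, 50))                     # 25000 "units"
print(deferred_cost(500, 50, 0.05, 1280 * 720))  # ~502 "units"

The constants are arbitrary, but the shape is the point: forward cost grows with draws times lights, deferred with draws plus lights.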
 
At what point did I claim the Wii U was maxed out at launch and wouldn't show an iota of improvement? When did I say AC4 was not an improvement over AC3? My claim is that there is little to no improvement left to be made, as AC4 is probably close to its max. You are expecting further improvement on that level.
You have a weak basis for even that (post-goalpost-move) claim.

Tipoo seems to mostly agree with you, but he's being far more rational and level-headed about it, while your style more closely resembles console warz drivel. He's at least leaving the door open to being proven wrong eventually, while you're dealing in absolutes.

This is a technical thread, and shitting it up with this kind of barely-founded (I won't completely call it unfounded) speculation isn't helping anybody.
 
At what point did I claim the Wii U was maxed out at launch and wouldn't show an iota of improvement? When did I say AC4 was not an improvement over AC3? My claim is that there is little to no improvement left to be made, as AC4 is probably close to its max. You are expecting further improvement on that level.
A multi-platform game designed to look basically the same on over 5 different platforms, and one that is probably not even pushing the current-gen systems to their breaking point? Perhaps you need to define what you mean by "little improvement."

There will not be as much improvement on the Wii U as there was with the 360/PS3, because developers already have modern shader experience (except for Nintendo), but to make that claim is a bit far-fetched.
 
Ubisoft has mentioned that, moving forward, the Assassin's Creed series will become more "next-generation" vs. this cross-generation title. Let's see how the Wii U version stacks up then. I'd imagine that since you're expecting huge strides on the order of AC3 => AC4 to continue, and since AC4 is not coming close to maxing the Wii U out, future titles should run juuuust fine. Just wait for those huge increases in resolution, geometric detail, and physics that'll blow AC4 out of the water.

I pretty much doubt that the Wii U will run any next-gen titles, especially once developers start to design their games around the XO's limitations.
 
A multi-platform game designed to look basically the same on over 5 different platforms, and one that is probably not even pushing the current-gen systems to their breaking point? Perhaps you need to define what you mean by "little improvement."

There will not be as much improvement on the Wii U as there was with the 360/PS3, because developers already have modern shader experience (except for Nintendo), but to make that claim is a bit far-fetched.

Yep, it is a multi-plat. Like I've said in other posts, TLoU and the Uncharted series look better than anything on Wii U (seeing as they are PS3 exclusives, it shouldn't be surprising that they look better). The Wii U is perfectly capable of those graphics, with better textures and a slightly higher resolution given the right budget. That to me is a slight improvement over last gen, which is exactly what the Wii U is: a slight improvement on the last generation. The best I expect to see are games that look like they could have been done last gen but with slightly better textures and resolution, and we're almost there.

I do not anticipate the same leap in graphics on the Wii U that we saw from gen-1 PS3 games to TLoU. It is not getting an equivalent graphics increase over time, as devs already have a huge amount of knowledge of that level of hardware and are able to put it to much more efficient use out of the gate vs. a normal console cycle like the PS3's. Others in this thread, meanwhile, are expecting that kind of vast improvement and don't think the Wii U is even breaking a sweat.
 
Ubisoft has mentioned that, moving forward, the Assassin's Creed series will become more "next-generation" vs. this cross-generation title. Let's see how the Wii U version stacks up then. I'd imagine that since you're expecting huge strides on the order of AC3 => AC4 to continue, and since AC4 is not coming close to maxing the Wii U out, future titles should run juuuust fine. Just wait for those huge increases in resolution, geometric detail, and physics that'll blow AC4 out of the water.

I'm dismissing what you have to say going forward. You were given a link that clearly shows a difference, yet here you are, spitting in the wind. Your entire argument is based on things that have yet to come. When given "proof" of the opposite, you dismiss it and set a higher bar. Suffice it to say, you have an agenda to validate your position regardless of what anyone states. You're not contributing to the discussion and are in fact detracting from it.

There's no basis for your opinions, yet you continue to post.
 
3D World locked at 60 fps while 4 players are going nuts should end all the discussions about how underpowered the Wii U is.

Should.


Please return to actual tech analysis.
 
While this is probably from a cutscene, it's clearly generated by the game engine, and not CG. I think that we'll see everything in this game that you just mentioned, though I would imagine they'll use high-resolution textures sparingly due to the size of the environments.


I don't think anything we've seen of X so far should be taken as indicative of what the final game's graphics will look like or what the Wii U is capable of.
 
Yep, it is a multi-plat. Like I've said in other posts, TLoU and the Uncharted series look better than anything on Wii U (seeing as they are PS3 exclusives, it shouldn't be surprising that they look better). The Wii U is perfectly capable of those graphics, with better textures and a slightly higher resolution given the right budget. That to me is a slight improvement over last gen, which is exactly what the Wii U is: a slight improvement on the last generation. The best I expect to see are games that look like they could have been done last gen but with slightly better textures and resolution, and we're almost there.

I do not anticipate the same leap in graphics on the Wii U that we saw from gen-1 PS3 games to TLoU. It is not getting an equivalent graphics increase over time, as devs already have a huge amount of knowledge of that level of hardware and are able to put it to much more efficient use out of the gate vs. a normal console cycle like the PS3's. Others in this thread, meanwhile, are expecting that kind of vast improvement and don't think the Wii U is even breaking a sweat.

You don't think the Wii U will be able to make use of improved lighting and special effects over current-gen? Those more modern features alone can make a notable difference (look at WW HD).

Btw, your argument seems to change between paragraphs. The second paragraph is closer to what I said earlier.
 
And we know for certain it absolutely never dips like other 60fps games? Did anyone do a frame rate test?

Considering v-sync is enabled in nearly every Wii U game, 60fps is quite impressive even if it did dip a few frames here or there (not that I've seen any indication that it will, since the game clearly has a bunch of headroom; just look at the cherry power-up).
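
On the v-sync point, the frame-time arithmetic is worth spelling out (a sketch of plain double-buffered v-sync; triple buffering behaves differently):

Code:
import math

# With double-buffered v-sync at 60Hz, a frame can only be swapped in on a
# refresh boundary, so any frame that misses the ~16.7ms budget is held
# until the next refresh and displays for a multiple of the period.
REFRESH_MS = 1000.0 / 60.0  # ~16.67ms

def displayed_interval_ms(render_time_ms: float) -> float:
    return math.ceil(render_time_ms / REFRESH_MS) * REFRESH_MS

print(displayed_interval_ms(15.0))  # 16.67ms -> a solid 60fps
print(displayed_interval_ms(17.0))  # 33.33ms -> that frame drops to 30fps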
 
You don't think the Wii U will be able to make use of improved lighting and special effects over current-gen? Those more modern features alone can make a notable difference (look at WW HD).

Btw, your argument seems to change between paragraphs. The second paragraph is similar to what I said earlier.

Yeah. The more modern GPU isn't stressed enough; tessellation, more RAM, and better shaders and lighting are all less resource-heavy than on the previous consoles, and to top it off you have everything running with v-sync to avoid screen tearing (my most hated graphical error from last gen). That really should put this debate to bed, but most people don't understand what I've just said even in this tech thread (not that it is very technical at all), so the debate will continue until we stop showing up for it.

Oops, sorry for dp.
 
I was looking at Shin'en's Twitter just now and there's an interesting (well, for a luddite like me) exchange about their texture work on Fast Racing Neo.

Someone should ask if they're targeting 1080p and if they're using any kind of AA.
 
The difference between games that launched with the 7th gen and, say, The Last of Us is monstrous. Absolutely huge. The Wii U will see nowhere near that improvement. It is nearly maxed out.
That's just hyperbole. It has been out for a year.

If what you are trying to say is that we will not see a difference as big as last gen's between launch and late-gen titles on the Wii U, that's likely true, as last-gen developers had to adjust to programmable shaders and parallel computing.

If what you are trying to say is that we will not see massive differences, as in the Wii U all of a sudden being able to run far more complex games than last gen at 1080p/60, then you are definitely right.

That does not mean that the Wii U is already maxed out, nor does it mean that we will not see the normal progression you can expect from a new console.
 
Log4Girlz sounds jaded. The guy's just bitter the Wii U didn't turn out to be more in line with the more optimistic projections. It's terribly obvious when you look at most of his posts regarding the Wii U. It doesn't really matter either way, so I don't know why you guys are so focused on changing his mind. This is coming from someone who doesn't necessarily disagree too much with him (though I definitely don't think AC4 is in any way maxing out the Wii U).
 
Yes, we don't have a proper test, but I have read several times that it's "silky smooth 60fps even when it gets hectic".

Considering Mario Galaxy can be modded to play 2 players directly on a Wii with little/no performance hit, I don't doubt for a moment that 3D World has quite a bit of headroom.

Let's be fair, it didn't even turn out to be in line with the more pessimistic projections.....

Very true, considering the 40/45nm chips, I doubt there was really a problem with silicon budget either... The only way this makes sense to me is if Wii U is designed to shrink down drastically in a few years and be the basis for a handheld architecture. Would make a lot of sense for Nintendo to take their tools along with them to their next dev cycle. This also has the added benefit of pushing Wii U as an R&D stage for their next hardware cycle as they would be laying down all the work for that now.
 
Very true, considering the 40/45nm chips, I doubt there was really a problem with silicon budget either... The only way this makes sense to me is if Wii U is designed to shrink down drastically in a few years and be the basis for a handheld architecture. Would make a lot of sense for Nintendo to take their tools along with them to their next dev cycle. This also has the added benefit of pushing Wii U as an R&D stage for their next hardware cycle as they would be laying down all the work for that now.

Makes a lot of sense, as you can basically downscale it a bit if necessary for a handheld, or just scale it up for more power in a next console, thus sharing tools between both and maybe even some games. They have actually said they want to merge handheld and console tools. I don't buy the hybrid thingy.
 
Log4Girlz sounds jaded. The guy's just bitter the Wii U didn't turn out to be more in line with the more optimistic projections. It's terribly obvious when you look at most of his posts regarding the Wii U. It doesn't really matter either way, so I don't know why you guys are so focused on changing his mind. This is coming from someone who doesn't necessarily disagree too much with him (though I definitely don't think AC4 is in any way maxing out the Wii U).

I'm sure he's bitter. Two years ago he was a regular poster in the Wii U topics. Never had any problems with him. Then last year, with the first evidence that the Wii U was not what bgassassin had us believe, he started trolling it. He was banned for a while; I can only assume that had something to do with it, but I'm not sure. But I have no problem with him not liking the Wii U, thinking it is not powerful, or whatnot. I do have a problem with broken logic and moving the goalposts every time you're proven wrong. Launch games were made on poorly documented hardware, there was a dev saying only one core of the CPU was available, and ports were made by small teams (the guys from Darksiders said the number of people they had was insanely low, I think it was 4 or 5), usually outsourced and new to the hardware... Obviously, games would see drastic improvements over time. His notion that the hardware design is almost identical to that of the 360 is just too crazy. What have we been doing the past year in this thread, if not scratching our heads because the setup was unlike anything anybody had thought of?
 
Very true, considering the 40/45nm chips, I doubt there was really a problem with silicon budget either... The only way this makes sense to me is if Wii U is designed to shrink down drastically in a few years and be the basis for a handheld architecture. Would make a lot of sense for Nintendo to take their tools along with them to their next dev cycle. This also has the added benefit of pushing Wii U as an R&D stage for their next hardware cycle as they would be laying down all the work for that now.

That thought crossed my mind too. The thing is already drawing only 33 watts; take out the disk drive and ports and you're lower already, and 45nm is still two generations of fabs behind what we have *today*, let alone if they wait for 14nm or some such. That would drop the power draw quite a lot.

I wonder what these chips would draw with some optimization on 28nm.

I'd also think of an Nvidia Shield-like implementation, in that it could still attach to a TV over HDMI or wireless standards, or some sort of wireless base station that houses nothing but video-streaming capabilities, and have dual screens that way, and then still have the whole console's power while you are away.

Seems like they will keep the 3DS around for a while though; it's their money printer, next to the loss leader that is the Wii U.
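
The usual first-order way to guess at a shrink's power saving (a very rough sketch; real results depend on leakage and voltage headroom, and all the inputs here are guesses):

Code:
# First-order dynamic power estimate: P is proportional to C * V^2 * f.
# Capacitance C roughly tracks area, which ideally scales with (new/old)^2.
# Leakage, I/O, and the disc drive are ignored; all inputs are made up.
def scaled_power(p_old_w: float, old_nm: float, new_nm: float,
                 v_old: float, v_new: float) -> float:
    cap_scale = (new_nm / old_nm) ** 2  # smaller transistors -> less capacitance
    volt_scale = (v_new / v_old) ** 2   # power scales with voltage squared
    return p_old_w * cap_scale * volt_scale

# Say ~20W of the console's draw is the GPU at 45nm/1.1V, moved to 28nm/1.0V:
print(scaled_power(20.0, 45, 28, 1.1, 1.0))  # ~6.4W at the same clock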
 
Considering Mario Galaxy can be modded to play 2 players directly on a Wii with little/no performance hit, I don't doubt for a moment that 3D World has quite a bit of headroom.

Very true, considering the 40/45nm chips, I doubt there was really a problem with silicon budget either... The only way this makes sense to me is if Wii U is designed to shrink down drastically in a few years and be the basis for a handheld architecture. Would make a lot of sense for Nintendo to take their tools along with them to their next dev cycle. This also has the added benefit of pushing Wii U as an R&D stage for their next hardware cycle as they would be laying down all the work for that now.

Hmm. Would be interesting...

Shrink down the Wii U into a handheld as-is (at 14nm or some such). Immediate compatibility with most of the Wii U's catalog, plus new handheld-centric games.

Then release a new "proper" console ("U2" or something) using almost the same hardware, only adding a shitload of CPU cores/cache, RAM, and ALUs, with broader buses across the board.

This console is compatible with all Wii U and UDS (or whatever) software by shutting off all the extra shit, but can reach into the PS4's ballpark in performance for new games.

Maybe some games could even support two modes (e.g. 1080p60 with ultra textures/AA/AF for the U2, and 720p30 with shit textures and no AA/AF) to work on both levels of hardware.

All that would probably also cause confusion for consumers... As usual...
 
...and 45nm is still two generations of fabs behind what we have *today*, let alone if they wait for 14nm or some such. That would drop the power draw quite a lot.

The die shot of Latte shows they're close to being pad-limited (perimeter I/O). If they shift to IBM fabs entirely, maybe they can integrate the CPU as well (at 32nm), but I'm not so certain they can realistically shrink beyond that.

Also bear in mind that eDRAM nodes lag conventional fab processes. IBM seems to be the only choice left for pushing nodes there, and that's not going to be inexpensive for a long time.
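
For anyone wondering what "pad-limited" means in numbers, here's a hypothetical sketch (the pad count and pitch are invented for illustration, not measured from the die shot):

Code:
# A pad-limited die can't shrink below the edge length its perimeter I/O
# pads require, no matter how small the logic gets.
def min_pad_limited_edge_mm(pad_count: int, pad_pitch_um: float) -> float:
    perimeter_mm = pad_count * pad_pitch_um / 1000.0  # total pad-ring length
    return perimeter_mm / 4.0                         # square die, 4 sides

# 400 perimeter pads at a 60um pitch:
print(min_pad_limited_edge_mm(400, 60.0))  # 6.0mm edge -> a 36mm^2 floor
# Shrink the logic below that and you're just paying for dead silicon.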
 
The die shot of Latte shows they're close to being pad-limited (perimeter I/O). If they shift to IBM fabs entirely, maybe they can integrate the CPU as well (at 32nm), but I'm not so certain they can realistically shrink beyond that.

Also bear in mind that eDRAM nodes lag conventional fab processes. IBM seems to be the only choice left for pushing nodes there, and that's not going to be inexpensive for a long time.

Would the OpenPOWER consortium 'thing' allow others to have a go at that integration?
 
Would the OpenPOWER consortium 'thing' allow others to have a go at that integration?
They'd probably want the engineering team from IBM to do the redesign, though. After that is done, sure, they can fab elsewhere. Even so, there aren't many foundries to pick from (e.g. Global Foundries) that have appropriate eDRAM or compatible processes.
 
The die shot of Latte shows they're close to being pad-limited (perimeter I/O). If they shift to IBM fabs entirely, maybe they can integrate the CPU as well (at 32nm), but I'm not so certain they can realistically shrink beyond that.

I'm no expert whatsoever, but I can't believe there wouldn't be a way around something like that.
 
They'd probably want the engineering team from IBM to do the redesign, though. After that is done, sure, they can fab elsewhere. Even so, there aren't many foundries to pick from (e.g. Global Foundries) that have appropriate eDRAM or compatible processes.

The eDRAM isn't the biggest power draw; the CPU and GPU cores are. Intel's eDRAM only takes a few watts, and that's 128MB. The eDRAM could be left a process node behind the rest of the chip but still shrunk if need be.

I'm no expert whatsoever, but I can't believe there wouldn't be a way around something like that.

Yes, this. Chips with far more I/O exist in smaller packages; I'm not sure why one would just look at the Wii U chip and say it can't shrink any more because of it.
 
Yes, this. Chips with far more I/O exist in smaller packages; I'm not sure why one would just look at the Wii U chip and say it can't shrink any more because of it.
Some chip IO scales down worse than others.
 
This was on GameFAQs, anything to it?
Wii U CPU:
- Tri-core IBM 45nm PowerPC 750CL/G3/Power7 hybrid
- L2 cache: Core 0: 512KB, Core 1: 2MB, Core 2: 512KB
- Clocked at 1.24GHz
- 4-stage pipeline; not a Xenon/Cell-style CPU-GPU hybrid
- Produced at IBM's advanced CMOS fabrication facility
- Can use eDRAM as cache(!?)
- Dual-core ARM for background OS tasks, clocked at 1GHz with 64KB of L1 cache per core

Wii U Memory:
- DDR3: 2GB, 1GB for the OS, 1GB for games (12.8GB/s is speculation)
- eDRAM: 32MB + 4MB of memory/VRAM/cache (unified; 4MB for the GamePad)
- Clocked at 550MHz
- The CPU uses the eDRAM as cache and the GPU uses it as VRAM
- The eDRAM acts as unified memory, similar to AMD's hUMA/HSA in function/behaviour(?!)
- Little to no latency between CPU and GPU with the eDRAM cache/memory/VRAM

Wii U GPU:
- VLIW5/VLIW4, Radeon HD 5000/6000 family
- DirectX 11 / OpenGL 4.3 / OpenCL 1.2 / Shader Model 5.0 feature level
- Supports GPGPU compute/offload
- Customized, using a custom Nintendo API codenamed GX2 (GX1 was the GameCube's)
- Clocked at 550MHz
- Produced at TSMC
- Uses eDRAM as VRAM
DirectX can only be used by Microsoft, so that silicon/hardware/feature set is removed and replaced with transistors for performance, while maintaining full OpenGL 4.3/OpenCL 1.2/Shader Model 5.0/GPGPU compatibility

Wii U Notes:
- Can't use DirectX 11, but supports OpenGL 4.3, which is ahead of DX11
- Easier to program, with no bottlenecks causing trouble like on the Xbox 360 and PlayStation 3
- More efficient hardware with no bandwidth bottlenecks
- Most early ports, and even current ones, use 2 cores, and the 3rd is barely used #fact
- The Wii U CPU has much higher operations per cycle than the Xbox 360/PlayStation 3
- It maybe has, in fact, the best performance per watt in the world among 45/40nm chips
- The Power7 architecture allows shaving 10% off texture size without loss of quality (special compression)

It is ahead of the 7th generation, and developers will learn to handle and optimize for the Wii U properly. I was digging around the internet for various information for a month, and what you read above is what I compiled and gathered. It can handle 1080p because it has 3.5 times more eDRAM than the Xbox 360, and it also has none of the severe memory/bandwidth bottlenecks that the Xbox 360/PlayStation 3 have all over their systems with their choking FSBs. The Xbox 360's GDDR3, which has a theoretical 22-28GB/s, is reduced to 10GB/s thanks to a piss-poor FSB.

The Wii U has the same amount of eDRAM as the Xbox One, remember that. Also, the Wii U's CPU utilizes the Power7 architecture a bit, so maybe, just maybe, the Wii U's CPU can use the eDRAM as CPU cache while the GPU uses it as VRAM, so in a way something like AMD's hUMA/HSA is possible on the Wii U.
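
One way to sanity-check the quoted memory numbers is the standard DDR bandwidth arithmetic (the bus widths below are the commonly assumed figures, not confirmed specs):

Code:
# Peak theoretical bandwidth = bus width in bytes * transfer rate.
def peak_bandwidth_gb_s(bus_width_bits: int, transfers_mt_s: float) -> float:
    return bus_width_bits / 8 * transfers_mt_s / 1000.0

print(peak_bandwidth_gb_s(64, 1600))   # assumed Wii U DDR3-1600, 64-bit: 12.8 GB/s
print(peak_bandwidth_gb_s(128, 1400))  # Xbox 360 GDDR3, 128-bit: 22.4 GB/s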
 
The eDRAM isn't the biggest power draw; the CPU and GPU cores are. Intel's eDRAM only takes a few watts, and that's 128MB. The eDRAM could be left a process node behind the rest of the chip but still shrunk if need be.

The eDRAM has to be on a process-compatible node for it to be on the same die (it's the same wafer, after all). Crystalwell is a separate chip from Haswell.

Well, is there anything in particular we can discern about the Espresso/Latte I/O that makes it special, that screams "this can't be shrunk!"?

Hence, "not so certain beyond 32nm". At some point the signal integrity needs to be maintained, so it's hard to say without knowing more. Even if the GP I/O could be shrunk accordingly, the DDR3 interface will not shrink much. Folks were proposing 14nm, which may simply be unfeasible considering just how much of a shrink that entails.

There is a lot of dead space on Espresso already padding the chip just to accommodate the I/O on it.

---

That said, if they were just going to do a 14nm shrink and pad the area with dead space, I'd have to question the economics of it, considering how long it will take for new nodes to arrive (and then become commonplace). At least 45nm is very old by now. *shrug*

Anyways, just a thought. :)

---

If the GP I/O is mostly for the CPU, then sure, it becomes a non-issue with CPU integration (apart from making sure it's 100% functionally equivalent, e.g. the replacement FSB on the 360 Slim).
 