WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

I wonder if vsync is a requirement set by Nintendo, like the early failed 720p requirement set by Microsoft for the 360? Three frame buffers would take a fair chunk of that eDRAM; I wonder if dropping the requirement later (if there is one) would add anything substantial to what could be done with the memory config.
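As a rough back-of-the-envelope sketch of what "a fair chunk" could mean, assuming a 1280x720 target with 4-byte color and a single 4-byte depth/stencil buffer (my assumptions, not confirmed Latte usage):

# Hypothetical footprint of triple buffering at 720p vs. the 32 MiB eDRAM pool.
width, height = 1280, 720
color_buffer = width * height * 4            # 4 bytes per pixel, ~3.5 MiB per buffer
depth_buffer = width * height * 4            # assumed 4-byte depth/stencil buffer
total = 3 * color_buffer + depth_buffer      # three color buffers plus one depth buffer
print(total / 2**20, "MiB")                  # ~14 MiB, leaving roughly 18 MiB of the eDRAM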

If it is a requirement, then how does that explain Darksiders 2? I would wager it is simply incredibly easy to do (with THQ being too cash-strapped to bother implementing it).
 
Darksiders 2 is by far the worst port on the Wii U.

It was ported by a handful of people (I believe someone said four) in less than 6 months, on the budget of a bankrupt company that was going out of business. They likely did as much as they could, then released it as is. I doubt there was time or money for anything more with THQ being auctioned off piece by piece.

As Fourth said, it's in "almost" all games. There are exceptions.
 
Edit: Looks like Shin'en sorta answered my question from before (from the HD Warriors link)
It's why I keep trying to get insight from them. They usually give a kernel here and there that helps people on this thread see things a little more clearly, or help to put their thoughts together on certain topics. Also, comparatively speaking in regard to the more knowledgeable here, it's the only way I can contribute to these kinds of threads :)
 
My knowledge of game development is still preliminary, so bear with me, but getting high-quality assets into a game takes lots of time. There's modeling, texture mapping, baking the maps, animation (i.e. mo-cap, skin weighting), QA testing (can't have something breaking the game), etc.

But it's all relative. For example, I don't expect them to pull off worlds like GTA5's in such a short amount of time, because of the above.

Me neither; I doubt you will ever see something on that scale from them. I get what you're saying: they're not for big-budget games.
 
Could it be something that was introduced with a later SDK? Darksiders was one of the earliest to start, IIRC.

I figured vsync was now mandatory because it had something to do with keeping the GamePad stream synced with the main screen? Not sure that makes sense though.

Yeah those guys are great. If only everyone so openly talked!
 
Some more insight from Shin'en http://hdwarriors.com/why-the-wii-u-is-probably-more-capable-than-you-think-it-is/

'Theoretical RAM bandwidth in a system doesn’t tell you too much because GPU caching will hide a lot of this latency. Bandwidth is mostly an issue for the GPU if you make scattered reads around the memory. This is never a good idea for good performance.

I can’t detail the Wii U GPU but remember it’s a GPGPU. So you are lifted from most limits you had on previous consoles. I think that if you have problems making a great looking game on Wii U then it’s not a problem of the hardware.'
Yeah, that Shin'en game will probably shit on many games for what it will be. When are they gonna show it?
 
Pseudo-static eDRAM, the stuff Nintendo uses, has the exact same latency as real SRAM. The refreshes are hidden using additional buffers. Pseudo-static RAM actually refreshes every cycle, except for the bank currently accessed. To do that, PSRAM requires complex additional logic (unlike SRAM, which requires no logic at all), and this is also the reason PSRAM isn't used for very small memory pools. You'd save no or very little space because of the logic, and the manufacturing process is more complicated, so it makes more sense to use SRAM. And this is why the 1MB MEM0 pool on Latte is SRAM.

Ah, I suppose that makes sense.

Regarding that, if the memory is 1T-SRAM as suspected: on the GC it had a 10 ns latency at 162 MHz, which is 1 cycle of sustained latency. Is it possible to go lower than 1 cycle of latency?

10 ns isn't really that impressive, as standard DDR3 has that these days, but then again, the clock frequency is several times higher than that of the RAM you mentioned. Edit: Forgot to mention that the latency is more than one cycle. High clock rate and all.
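The conversion both of you are doing implicitly is just latency divided by clock period. A minimal sketch; the 800 MHz DDR3 bus clock is my assumed comparison figure, not a quoted Wii U spec:

def latency_cycles(latency_ns, clock_mhz):
    # one clock period in nanoseconds is 1000 / MHz
    return latency_ns / (1000.0 / clock_mhz)

print(latency_cycles(10, 162))   # GameCube 1T-SRAM: ~1.6 cycles at 162 MHz
print(latency_cycles(10, 800))   # the same 10 ns against an 800 MHz DDR3 bus: ~8 cycles

Same nanoseconds, more cycles at a higher clock, which is the point being made above.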
 
But those are games. :/

Of course they are. Maybe I wrote it wrong; I meant the next Direct that isn't concentrated on a single game. The Directs I mentioned were dedicated to a single game (the Wii Fit one, in the end, to two games). I should have added the word "single" to my first post.

Wonderful 101 is also an outstanding game btw
 
I wonder if vsync is a requirement set by Nintendo, like the early failed 720p requirement set by Microsoft for the 360? Three frame buffers would take a fair chunk of that eDRAM, I wonder if dropping the requirement later (if there is one) would add anything substantial to what could be done with the memory config.

If v-sync is mentioned in the Lot Checks, it would be Recommended, not Required.
 
Your assumption is that it doesn't allow ports to run better. Reality is that it isn't being used in order to try and make them run better.

That really wasn't what I was assuming. I didn't write that out correctly, sorry.

Well, one thing we do see is Vsync in almost all Wii U games. That's likely being achieved by triple buffering and that almost certainly is using the eDRAM.

Thanks.

I know you were asking lherre, but I thought I'd chip in that the performance issues seen in some of the first round of ports seemed to be related to the CPU more than anything, at least going by DF's analysis. The 360 and Wii U CPUs are different beasts with different strengths. A game built to take advantage of one might take some work to take advantage of the other.

Yeah, I get that, but it's still puzzling. I posed the question to him because the guy shows up talking about Wii U being a "current gen +" or a refined/perfect "current gen" system... and for all the obvious improvements to the hardware, offers zero explanations as to why "current gen ports" are mediocre. Zero insight into potential pros and cons. I know the guy is a dev and wants to protect his job, but his comments are cryptic and more just statements of casual observation.
 
Sorry Shin'en, make some bigger games, then you can talk about power. About the only good-looking game you made was Nano Assault, and that is definitely not a game to be bragging about.

Why not? Having big budget games is a prerequisite for programming talent and knowledge now? Who better to talk about a console's power than people who are known for extremely tight code? People who have done consistently amazing things with incredibly small resources (in both hardware and manpower)? These are the guys that crammed Nano Assault Neo into 50MB. 50MB!! I mean, who does that?! Doing things as efficiently as they do requires a good understanding and use of the hardware. Sloppy code and brute force won't cut it.

Can't you even imagine what guys like these could do (technically) if given the money and a larger team? Knocking them because their games have been on very small budgets, while ignoring the fact that they are among some of the most talented programmers in the industry, comes across as really silly.
 
I would love it if they took the chance. Maybe start with a bigger project via Kickstarter, going from a very small game to a B-tier game; they would definitely have my money.
 
Don't take what I said as the way I feel. I'm saying the large majority of fanboys will feel this way. I'm going to be fine with what they deliver next, I know this much. I'm going to post a comment from someone on another site regarding this topic.


Sorry Shin'en, make some bigger games, then you can talk about power. About the only good-looking game you made was Nano Assault, and that is definitely not a game to be bragging about.

Not saying the Wii U isn't capable, just saying that coming from Shin'en it doesn't mean much.

I am all for developers giving some insight, but when you are a developer who barely even has games to push a system anyway, it is kind of pointless.

More so, some of the games Shin'en have made were really crappy.


Call it whatever you want, but this is the way a lot of gamers feel. If this was Crytek, Epic, Ubisoft, or someone bigger, it wouldn't be a problem. But most people don't know who Shin'en is beyond a bunch of eShop gamers, so their words mean very little to them. That's why I said what I said. If their next game isn't major eye candy and doesn't step firmly beyond the 7th gen, those gamers will call bullshit on all the Wii U talk.

I didn't really have a problem with what you said. I just didn't see the need for the obligatory negative statement just to please them.

For those particular people you mentioned, it wouldn't matter even if Epic, Crytek or any other big, well-known dev said it. It wouldn't change their opinion in the least. They are a word I will refrain from using, as I don't like to use insults in place of real points, but it begins with an f. Bayonetta 2 is a clear example. They will dismiss or ignore any claim that is positive about the Wii U, and no matter what the game looks like, they will A. say it doesn't look "impressive" (i.e. they will never find anything on Nintendo hardware impressive, simply for the fact that it is on Nintendo hardware) and B. say that the PS3/360 can do the exact same thing. They will completely ignore statements by the devs of Trine 2: DC and Scribblenauts Unlimited, who straight up said the PS3/360 couldn't do what the Wii U could.

Honestly, it's better to just ignore them. Just as there are people who still say the PS1 was stronger than the N64 and the PS2 was stronger than the GC, there will be people who insist until their dying breath that the last-gen consoles are stronger than the Wii U, or that the Wii U is not much better than them. Nothing will change their mind.

Shin'en and the indie devs are the best sources for info on the Wii U, as they are not money-locked to publishers and companies that govern their paycheck/opinion.

You're wasting your time. They will not listen to any opinion from any dev that is positive about the hardware. If a big dev says something, they'll write it off as PR talk, just the same as they write off a smaller dev for not making "big" (undefined) games, or they'll write them off as something related to money investments, like lherre did on the last page. They'll cook up some excuse to dismiss what the dev says, guaranteed.

The devs of Shin'en are some of the best developers I have ever seen. They do things with hardware, with their small teams and budget, that the biggest companies couldn't manage.

They are the best source of info for the Wii U GPU that we have. It will be nice to see what they do with the tessellator in the next game.
 
Speaking of Crytek, the new CryEngine supports Wii U. I hope this engine gets used more and we see something on Wii U soon; it would be good for judging the system as well.

It is a shame Crysis 3 did not come to Wii U.
 
Well, we do have comments from Epic and DICE devs. Really a mixed message at best.

But the games speak for themselves in the end. It's really not fair to the Wii U, though, because the next-gen consoles are launching. Not really a fair fight.

Art direction could make up for the lower-powered parts. Even the Wii had some great-looking games because of the art direction, even though at the technical level they were not impressive compared to the PS3/360.
 
Then this can be said about any insider on this board. Everyone has money involved in something. :P

And this is what we know as "irony", though it's a lot more than that in some of the cases. But enough of this; it's time to get back to the GPU.

I'm really interested in identifying block K and the small unlabeled block beside the eDRAM.

[Image: FSLatte.jpg]
 
I'm still convinced that Nintendo have evolved the TEV Unit to give the GPU a standard rendering pipeline while keeping the fixed functions, albeit in a form that will integrate well with third-party engines. I'd say that HDR lighting, shadows and depth-of-field effects would be good candidates. No idea where they would be on the die shot, though.

I say this for the same reason I think that Nintendo are doing something devious with the memory - they knew ages ago that the PS4 was supposed to have 4GB of RAM and the One was supposed to have 8GB, and that both CPUs and GPUs were going to be much more powerful, so they must have a few things going on to bridge the gap in power so that third parties could port to and from all 3 consoles.
 
Nobody knew what the PS4 and Xbox One would have in them last year. Not even Sony or Microsoft. Even now you are still hearing about revisions being made. Half of the tech they are using didn't even exist when the Wii U was announced.

The info about the PS4/Xbox One memory count at the time was just as much a rumor as the Wii U having 768 MB of RAM total. As far as the power gap is concerned, that has never been the true reason for any port being on a console or not; it is usually just the most convenient one. Even now we are hearing devs complain about the disparity between the Xbox One and the PS4. I expect there to be a lot more variance among games this gen and a lot more devs going bankrupt.

As for the TEV, there is next to no chance it is on the die. It is too big to miss. Also, it was more or less ruled out by the Iwata Asks for the Wii U. There is no Wii hardware in the Wii U; it's more than likely just being emulated on the GPU shaders.

Agreed, but we know enough that those comments were based on early devkits. Hell, there are some people (the "400 dollar 360" guy) who came out later and said they never really had hands-on experience with the Wii U and were just going off of what they heard.

Actually, most of those comments were based on publisher bias, at least in Crytek's case. Almost all of those devs went back and ate their words, in case people forgot.
 
That's why I said an evolution of the TEV Unit; bunging in the TEV Unit from Flipper/Hollywood would give the Wii U the same problem that the Wii had with a nonstandard rendering pipeline. "Free" use of HDR and depth of field, amongst other things, would be a real boon for developers.
 
This makes no sense; you're better off putting the new space towards more programmable shaders, not ancient fixed-function tech.
 
The statement on bandwidth is interesting. I would have guessed that the statement would have been more in regards to latency - that latency would be an issue with scattered reads. blu? If you're around, would you care to comment?
Blu reporting.

Yes, latency is the major factor for scattered reads. BW also plays a role there, though, depending on whether you could or could not bypass the GPU caches in cases of extreme scattering. Consider the following scenario: you have a GPU cache feeding the pipeline. You have scattered reads fetching a few texels from here, a few from there. Now, texture caches don't work in texels, they work in tiles (at least when they work optimally), which are the texture-cache equivalent of CPU cache lines. A tile can have various sizes based on texture format, GPU cache implementation, etc., but the point is, for a given texture format, a given GPU will use the same tile size (if the texture is tiled in the first place). Some of the AMD GPUs I've worked with have had cache tile sizes as big as 32x32. Now, when said scattered texel reads hit a cached tile, it's all fine and dandy; when they hit outside of a cached tile, though, a cache miss will incur a tile read. Now imagine your scattered reads were so bad that on average they hit a tile just once before they incurred a tile miss. From there on you can imagine the consequences for the BW, I believe ; )
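A toy model of that worst case. The 32x32 tile size, 4-byte texels, texture size and cache capacity are all illustrative assumptions, not measured Latte behaviour:

import random

TILE = 32                                  # texels per tile edge
TILE_BYTES = TILE * TILE * 4               # 4 KiB fetched on every tile miss
TEXEL_BYTES = 4

def fetched_bytes(reads, tex_size, cached_tiles=64):
    cache, fetched = [], 0
    for _ in range(reads):
        x, y = random.randrange(tex_size), random.randrange(tex_size)
        tile = (x // TILE, y // TILE)
        if tile not in cache:              # miss: pull in the whole tile
            fetched += TILE_BYTES
            cache.append(tile)
            if len(cache) > cached_tiles:
                cache.pop(0)               # crude FIFO eviction
    return fetched

reads = 100000
useful = reads * TEXEL_BYTES               # bytes the shader actually wanted
print(fetched_bytes(reads, 4096) / useful) # roughly 1000x traffic amplification

With fully random reads over a large texture, nearly every read misses and drags in a full tile, so the memory traffic dwarfs the texels actually consumed; coherent reads reuse each fetched tile many times and the ratio collapses back toward 1.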

Well, one thing we do see is Vsync in almost all Wii U games. That's likely being achieved by triple buffering and that almost certainly is using the eDRAM.
Traditionally (as in Flipper & Xenos), embedded console framebuffers have held only the back buffer - all the front buffers sit in main RAM, at the expense of the BW required to copy a fb out of eDRAM. I don't know how Latte handles that - it could very well be a matter of developer's choice.
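For scale, the copy-out blu describes is cheap in absolute terms. A quick sketch assuming a 720p buffer at 4 bytes per pixel, resolved 60 times a second (my assumed numbers):

frame = 1280 * 720 * 4                 # one 720p color buffer, ~3.5 MiB
print(frame * 60 / 1e6, "MB/s")        # ~221 MB/s of main-RAM write traffic to copy out 60 frames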
 
That's why Nintendo's use of compression technology was brought up.

FROM IBM:
Active Memory Expansion™, a unique capability that uses memory compression technology to make the physical memory on the system appear to the application as if it were up to twice as large as it actually is. Active Memory Expansion technology dynamically adjusts the amount of compressed memory based on a workload's memory needs

And/or from S3TC:
Nintendo Signs S3TC Technology License Agreement with S3 Graphics

S3TC texture compression uses an advanced compression algorithm that achieves up to six-fold compression of complex textures and images that are used in today’s hardware accelerated gaming titles. This reduces memory bandwidth and expands the amount of texture imagery that can be stored and processed through onboard graphics memory, without compromising visual clarity.

So, if Nintendo used compression technology from S3 (GPU), possibly in combination with AME (CPU)... how large does 32MB of eDRAM look to the system and developers?
Since we now know that the 32MB of eDRAM is accessible by both the CPU and GPU.

And we have reports from developers that the Wii U has a lot of memory to work with.
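To put the "six-fold" figure in context: the block sizes below are standard DXT1/S3TC numbers, and the 32 MB pool is purely an illustrative budget, not a claim about how Latte's memory is actually partitioned.

raw_block = 4 * 4 * 3                   # a 4x4 block of RGB8 texels = 48 bytes
dxt1_block = 8                          # DXT1 encodes the same block in 8 bytes
ratio = raw_block / dxt1_block          # 6.0, the "six-fold" figure
budget_mb = 32
print(ratio, budget_mb * ratio)         # a 32 MB pool holds ~192 MB worth of raw RGB texture data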
 
AMD's tessellation performance pre-GCN was horrible, so I wouldn't put much faith in the Wii U's tessellator being used for much, if at all.

Shin'en's next game uses tessellation. They already announced that on Twitter. Nintendo doesn't implement stuff into their chips if it isn't usable. If it has tessellation, it will be used in games.
 
Of course it's usable; that doesn't mean the performance in comparison to GCN is any good.
 
No one knows how it performs. Seeing as the GPU is basically a custom job it may be faster, equal or slower. Who knows?

maybe ask Shin'en on Twitter, they do respond. Sometimes at least.
 
It's a big relief, then, that the Wii U GPU was finished AFTER the GCN architecture went on sale; in other words, the Wii U GPU's silicon was finished at least 3-4 months after the HD 7000 series silicon was finished.
 
Let's assume it's all better because we know nothing about it, right? We should assume it's GCN3 as well?
 
Let's just assume that it is more modern silicon than vanilla GCN, that's all. What makes no sense at all is to assume that there will be useless parts on a customized design made specifically for Nintendo.
If there's a tessellation unit, then it is capable enough to produce results worth including it for.
Then there are tons of other factors to consider, like the smaller area of the Latte GPU, which of course limits what can be crammed onto it; but what doesn't make any sense is to think that its performance per transistor and per clock won't be higher than that of vanilla designs older than it.

Latte won't be GCN3. Latte is Latte, a GPU whose silicon was finished in approximately January 2012, so there's no reason to think its innards are ancient 2007-2008 tech. If there is a part made of 2007-2008 tech, then it's because it fit their design better than current tech (performance per transistor is not the only factor that determines GPU advancements on the PC scene, which may lead to less effective designs; VLIW4, for instance, came as an evolution of the hardware-wise superior VLIW5 because VLIW5's asymmetric design was inefficient in a PC scenario and the drivers weren't able to feed all the units).

So, all in all, it's impossible to make assumptions beyond obvious things, like Nintendo not wasting a single mm^2 of area on "useless units" like the ones you claim this GPU is filled with.
 
You made the assumption it might not be 'any good' compared to GCN. The other posters have been saying it was apparently good enough to be used by Shin'en in some (one?) of their upcoming WiiU titles. Who's making the assumptions here?
 
So, all in all, it's impossible to make assumptions beyond obvious things, like Nintendo not wasting a single mm^2 of area on "useless units" like the ones you claim this GPU is filled with.

I never said it was useless; I'm merely disagreeing with people who think Nintendo sprinkles magic fairy dust over a GPU and it suddenly gets 10x more efficient. Most of the modifications to the Wii U's GPU honestly seem more centred around getting 100% BC through H/W, not around having "0 useless parts" on it. The sooner people come to terms with that, the better.

You made the assumption it might not be 'any good' compared to GCN. The other posters have been saying it was apparently good enough to be used by Shin'en in some (one?) of their upcoming WiiU titles. Who's making the assumptions here?

Everyone, because we know nothing. And Nintendo is a company that knows more about making GPUs than a GPU company, obviously. Until someone can show some actual evidence of all these magical modifications Nintendo apparently made, I can't really be called out for "making assumptions".
 
I read my post multiple times. Can you point out where I made the assumption, please? Because I can't find it.

All I said is: "No one knows how well and how fast the tessellator works compared to other chips".
 
Welcome to my world.

This is what happens when you say things that even remotely sound like they could be good about the hardware.

You guys are aware that GCN is the architecture the XB1 and PS4 GPUs are based on right? Every bit of evidence we have suggests that Latte is derived from R700.

Not the last I saw. Every single bit of evidence we have suggests that the Wii U GPU is custom-made and could be derived from any (or multiple) GPUs released up until 2012, with Brazos providing the most component matches. The only link to the R700s is that the early dev kits used the highest-end R700.
 
Show us the evidence, I would love to see it.
 
Show you evidence of our findings in this thread about the GPU? Insert >> "this thread" >> end line. There you go.

Now show us evidence of all the claims you've made and all of the naysaying you've done over the last 19 posts.
 
It would appear to me, based off the die images at least, that it's 99% the same as an R700 part; the rest could probably easily be attributed to wanting to get that 100% H/W BC for the Wii/GC games.

Now could I see your evidence, please?
 