WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

I have one niggling question regarding Latte.

Why is it being compared to Brazos parts again? I really don't see any resemblance in purpose. One of them is a low-end APU with a strict TDP budget (shared by necessity between CPU and GPU); the other is a highly custom stand-alone GPU with a reasonably generous power budget (for a low-powered part).

I'd find it more accurate to compare Latte to AMD's own 40 nm mobile parts (RV7x0) from the era Latte is suspected to originate from. Even better if someone would volunteer such a laptop for ritual sacrifice and innards reading by a techno-haruspex.

It really comes down to the fact that we have a high-quality die photo of Brazos and, better yet, an annotated diagram of the die from AMD. It would be nice if we had the same for an R700-series chip, but we take what we can get.
 

Thanks. Yeah, I know it seems more CPU-related, but it would heavily influence the GPU too, so I thought it was worth mentioning. As wsippel noted, Latte may not be an APU, but it's not really a standalone GPU either.
 
Guys, I found this recently: http://playeressence.com/is-the-wii-u-even-powerful-enough-to-handle-gta-v/

This Eyeofcore comes from GameFAQs, and we know that board has a lot of fakes.
So is it plausible, or is it another bullshit thing?

Wii U is light years ahead of current generation and that is factual
Welp, there goes any effort I had in taking him/her seriously. Well, it was either that or him/her bragging about the 45nm thing. To be honest, he sounds like someone's alt account. That, or there must be a Wii U cloning factory somewhere, because these posts all sound the same.
 
Welp, there goes any effort I had in taking him/her seriously. Well, it was either that or him/her bragging about the 45nm thing. To be honest, he sounds like someone's alt account. That, or there must be a Wii U cloning factory somewhere, because these posts all sound the same.

You're telling me. Hell, I'M a Wii U optimist and even I had to facepalm several times while reading that.
 
Guys, I found this recently: http://playeressence.com/is-the-wii-u-even-powerful-enough-to-handle-gta-v/

This Eyeofcore comes from GameFAQs, and we know that board has a lot of fakes.
So is it plausible, or is it another bullshit thing?

Some of it sounds believable, other parts not so much.

Huge mistakes like "Wii U has the same amount of eDRAM as Xbox One" are glaring.

The X1 doesn't have eDRAM; it has eSRAM.
eDRAM is 1T per bit while eSRAM is 6T per bit, and they have different functionality, even if some of it overlaps.

Also, the claim of hUMA is bullshit. The CPU is on a completely different die. How would the CPU or GPU snoop each other's caches if they aren't even near each other?
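
To put rough numbers on the 1T vs. 6T difference, here's a back-of-the-envelope cell count for a 32 MB array (my own arithmetic, bit cells only; sense amps, decoders and refresh logic are ignored, so real totals run higher on both sides):

```c
#include <stdio.h>

/* Rough cell-transistor comparison for a 32 MB embedded array. */
int main(void) {
    const long long bits = 32LL * 1024 * 1024 * 8;   /* 32 MB in bits */
    printf("eDRAM, 1T/bit: %lld transistors\n", bits);      /* ~268M */
    printf("eSRAM, 6T/bit: %lld transistors\n", bits * 6);  /* ~1.6B */
    return 0;
}
```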
 
The CPU and GPU are both on the same MCM, and they do have tech that allows them to communicate directly with one another (I don't know the specifics). I think it was mentioned in the Iwata Asks, and Shin'en also made comments about the CPU/GPU connectivity.

I'm sure he meant embedded RAM, which eDRAM and eSRAM both are; the specific type is easily confused. Both consoles have 32 MB of embedded RAM plus DDR3.

It is poorly written, but not entirely inaccurate.
 
eDRAM is 1T per bit while eSRAM is 6T per bit, and they have different functionality, even if some of it overlaps.
Would you care to expand on that? I mean, from what we know, you can store and read anything you want in both of those memories. So even though the Wii U is much more dependent on that pool of memory than the Xbox One, what are the functional differences between those memories?
 
Sorry, but I can't answer most of the questions or give more data.

But I know that some people have problems with the Wii U (porting games, the same games that run fine on PS360 without any optimization), others don't, etc.; it depends on the game and the things you need to do. I know that this info is not very relevant, but I can't give you more details.

The documentation... well, let's say it's similar to the GC one.

Edited (a mistake with can and can't)

Where are those games coming from before they land on PS3 and Xbox 360? Are we talking about a PS3 game being ported to 360, or vice versa, before being ported to Wii U?
 
Some of it sounds believable, other parts not so much.

Huge mistakes like "Wii U has the same amount of eDRAM as Xbox One" are glaring.

The X1 doesn't have eDRAM; it has eSRAM.
eDRAM is 1T per bit while eSRAM is 6T per bit, and they have different functionality, even if some of it overlaps.

Also, the claim of hUMA is bullshit. The CPU is on a completely different die. How would the CPU or GPU snoop each other's caches if they aren't even near each other?

Maybe I'm wrong, but isn't the only difference between the 6T SRAM the Bone uses and the 1T-SRAM the Wii U uses the fact that 6T can currently be made on smaller processes than 1T?
 
Maybe I'm wrong, but isn't the only difference between the 6T SRAM the Bone uses and the 1T-SRAM the Wii U uses the fact that 6T can currently be made on smaller processes than 1T?
I also thought the reason Microsoft went with eSRAM is that eDRAM wasn't available at 28 nm. But if different process nodes could be mixed on the same chip, that explanation wouldn't make much sense: they could have gone for 40-45 nm eDRAM (which should still be smaller than 28 nm eSRAM at those capacities) and coupled it with the 28 nm CPU and GPU, couldn't they?
 
But I do agree with him that someone questioning whether the Wii U could handle GTA V is just sad at this point. Nintendo has some work to do to get people to at least acknowledge that the Wii U is superior to the 2005/2006 consoles... The problem is they don't push specs the way other companies do.

People will put their PS3 GTA V discs in their PS4 this holiday and be up in arms that it doesn't work. You can get yourself into a lot of trouble overestimating people's intelligence. Most people get their information secondhand and don't really question it, mostly from people who got it secondhand from somebody else who didn't know what the fuck they were talking about.
 
People will put their PS3 GTA V discs in their PS4 this holiday and be up in arms that it doesn't work. You can get yourself into a lot of trouble overestimating people's intelligence. Most people get their information secondhand and don't really question it, mostly from people who got it secondhand from somebody else who didn't know what the fuck they were talking about.

I don't think people are that naive.
 
Where are those games coming from before they land on PS3 and Xbox 360? Are we talking about a PS3 game being ported to 360, or vice versa, before being ported to Wii U?

At least one of them (one with problems) is being ported to all the consoles at the same time.
 
Maybe I'm wrong, but isn't the only difference between the 6T SRAM the Bone uses and the 1T-SRAM the Wii U uses the fact that 6T can currently be made on smaller processes than 1T?

That is one thing, but SRAM is also better suited if and when you want to use it as a cache. Being static also allows for lower latency, another caching plus. In graphics terms, eDRAM is better used as a "scratchpad", though it is also used for main RAM.

I think it's because it's higher density: you can put more memory in less space. The downside is that it has to be refreshed. Of course, you CAN use eDRAM as a sort of L4 cache, but you then have to create circuitry to constantly refresh the "cache" so it doesn't lose its data.
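
To put an illustrative number on that refresh burden, here's the math with commodity-DRAM-style figures (64 ms retention, 8192 rows; these are generic textbook values, not anything known about Latte, and embedded DRAM retention is typically much shorter, which only increases the burden):

```c
#include <stdio.h>

int main(void) {
    const double retention_ms = 64.0;   /* typical retention window */
    const int rows = 8192;              /* rows that must be refreshed */
    /* every row must be rewritten within the retention window, so the
       refresh logic has to issue one row refresh roughly every: */
    printf("one row refresh every %.2f us\n",
           retention_ms * 1000.0 / rows);   /* ~7.81 us */
    return 0;
}
```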
 
That is one thing, but SRAM is also better suited if and when you want to use it as a cache. Being static also allows for lower latency, another caching plus. In graphics terms, eDRAM is better used as a "scratchpad", though it is also used for main RAM.

I think it's because it's higher density: you can put more memory in less space. The downside is that it has to be refreshed. Of course, you CAN use eDRAM as a sort of L4 cache, but you then have to create circuitry to constantly refresh the "cache" so it doesn't lose its data.

What is the difference between static and pseudo-static then?
 
Some more insight from Shin'en: http://hdwarriors.com/why-the-wii-u-is-probably-more-capable-than-you-think-it-is/

'Theoretical RAM bandwidth in a system doesn’t tell you too much because GPU caching will hide a lot of this latency. Bandwidth is mostly an issue for the GPU if you make scattered reads around the memory. This is never a good idea for good performance.

I can’t detail the Wii U GPU but remember it’s a GPGPU. So you are lifted from most limits you had on previous consoles. I think that if you have problems making a great looking game on Wii U then it’s not a problem of the hardware.'
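
That caching point is easy to demonstrate on any machine with a cache hierarchy. A minimal C sketch, nothing Wii U specific (timings and array size are arbitrary; the ratio is what matters): the same number of reads gets dramatically slower once the pattern is scattered, because caches and prefetchers can no longer hide memory latency:

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)  /* 16M elements: big enough to defeat typical caches */

int main(void) {
    int *a = malloc(N * sizeof *a);
    size_t *idx = malloc(N * sizeof *idx);
    if (!a || !idx) return 1;

    long long sum = 0;
    for (size_t i = 0; i < N; i++) { a[i] = 1; idx[i] = i; }

    /* Fisher-Yates shuffle turns the index list into a scattered
       pattern (rand() bias doesn't matter for a demo) */
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = rand() % (i + 1);
        size_t t = idx[i]; idx[i] = idx[j]; idx[j] = t;
    }

    clock_t t0 = clock();
    for (size_t i = 0; i < N; i++) sum += a[i];       /* sequential reads */
    clock_t t1 = clock();
    for (size_t i = 0; i < N; i++) sum += a[idx[i]];  /* scattered reads  */
    clock_t t2 = clock();

    printf("sequential %.3fs, scattered %.3fs (sum=%lld)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC,
           (double)(t2 - t1) / CLOCKS_PER_SEC, sum);
    free(a); free(idx);
    return 0;
}
```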
 
That is one thing, but SRAM is also better suited if and when you want to use it as a cache. Being static also allows for lower latency, another caching plus. In graphics terms, eDRAM is better used as a "scratchpad", though it is also used for main RAM.

I think it's because it's higher density: you can put more memory in less space. The downside is that it has to be refreshed. Of course, you CAN use eDRAM as a sort of L4 cache, but you then have to create circuitry to constantly refresh the "cache" so it doesn't lose its data.
Pseudo-static eDRAM, the stuff Nintendo uses, has the exact same latency as real SRAM. The refreshes are hidden using additional buffers. Pseudo-static RAM actually refreshes every cycle, except for the bank currently accessed. To do that, PSRAM requires complex additional logic (unlike SRAM, which requires no logic at all), and this is also the reason PSRAM isn't used for very small memory pools. You'd save no or very little space because of the logic, and the manufacturing process is more complicated, so it makes more sense to use SRAM. And this is why the 1MB MEM0 pool on Latte is SRAM.
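
A toy model of the scheme described above; the mechanics here are my own guess for illustration, not Nintendo's actual design. Every cycle, each bank except the one currently being accessed refreshes a row, so refresh never collides with an access and the array looks like SRAM from the outside:

```c
#include <stdio.h>

#define BANKS 4
#define CYCLES 6

int main(void) {
    int next_row[BANKS] = {0};
    /* which bank the CPU/GPU happens to hit on each cycle */
    int accessed[CYCLES] = {0, 0, 1, 3, 2, 1};

    for (int c = 0; c < CYCLES; c++) {
        printf("cycle %d: access bank %d; refreshing", c, accessed[c]);
        for (int b = 0; b < BANKS; b++) {
            if (b == accessed[c]) continue;  /* busy bank skips refresh */
            printf(" bank%d/row%d", b, next_row[b]);
            next_row[b]++;  /* wraps around in real hardware; omitted */
        }
        printf("\n");
    }
    return 0;
}
```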
 
Could both the CPU and GPU having access to the 32 MB of eDRAM be the reason why Ancel said the amount of memory is almost unlimited...? No idea how any of that would work; it's been years since I did any programming.
 
That is one thing, but SRAM is also better suited if and when you want to use it as a cache. Being static also allows for lower latency, another caching plus.
Regarding that: if the memory is 1T-SRAM as suspected, then on the GC it had a 10 ns latency at 162 MHz, which is one cycle of sustained latency. Is it possible to go lower than one cycle where latency is concerned?
 
Oh shit, the hammer has been dropped. But to stay neutral: Shin'en's next Wii U game had better be fucking eye candy... They have been saying a lot without us seeing anything new to back up their claims. If their next game comes out and isn't a degree better than 7th gen, people are gonna laugh and discredit anything else they have to say.

I agree. Or maybe we will hear that the scale of the game is not too big, blah blah blah.
 
I agree. Or maybe we will hear that the scale of the game is not too big, blah blah blah.

Or maybe... if the embedded memory is so much larger than the 360's... perhaps we can start to see games built for Xbox 360 actually run better on Wii U? What's the point if all the first-party games take advantage of the hardware while the rest of the games struggle to match the PS360?
 
Oh shit, the hammer has been dropped. But to stay neutral: Shin'en's next Wii U game had better be fucking eye candy... They have been saying a lot without us seeing anything new to back up their claims. If their next game comes out and isn't a degree better than 7th gen, people are gonna laugh and discredit anything else they have to say.

Forcing a negative critique is not the same as staying neutral. Eye candy and a good-looking game are not the same thing. As for the statement, that has always been my opinion, ever since the original Wii.


Shin'en, with its small team and small budget, did this:
http://jettrocket.wordpress.com/
[Jett Rocket screenshots]

Activision, with their huge teams and millions, made this.

I'll take the words of devs like Shin'en over devs who haven't demonstrated a product that is even half as good any day of the week.
 
You don't have to worry about the quality of the eye candy for a Shin'en game. They always look top-notch.
They're not really a big studio, though. I think their past games benefited from this, where you didn't need a lot of people to get the best out of the Game Boy, for example.

It will be interesting to see how they work around this on the Wii U. They'll probably have a field day with the lighting and textures (or whatever requires the least manpower) but could end up limited in actual environments and characters (compared to other AAA games on last gen), although I heard they do outsourcing.


Also, wtf at that Krizzx example.
 
They're not really a big studio, though. I think their past games benefited from this, where you didn't need a lot of people to get the best out of the Game Boy, for example.

It will be interesting to see how they work around this on the Wii U. They'll probably have a field day with the lighting and textures but could end up limited in actual environments and characters, although I heard they do outsourcing.


Also, wtf at that Krizzx example.

You mean how he's comparing Jett Rocket to GoldenEye?
 
What's even more impressive about Jett Rocket is that Shin'en managed to squeeze all of that into 40 MB... although Nintendo's mental compression algorithms should take a fair bit of credit for that, I think.

That compression is a big part of the reason why I'm not too worried about the RAM difference between the Wii U and the PS4/One.
 
What's even more impressive about Jett Rocket is that Shin'en managed to squeeze all of that into 40 MB... although Nintendo's mental compression algorithms should take a fair bit of credit for that, I think.

That compression is a big part of the reason why I'm not too worried about the RAM difference between the Wii U and the PS4/One.

That is another thing. I remember some devs raging about the 40 MB limit (after they had released a game, not before?), but it was no problem for Shin'en. They made three games under it, and those did quite well according to them.

How good a game looks and runs is more often than not a result of the devs' capability rather than the hardware's. Yet there are always going to be people who will take the worst-looking games they can find and try to pin the hardware as the reason they don't look or run so great. To me, the fact that Latte was able to match the quality of last-gen games, and even push beyond some of them, under development conditions as horrible as the ones at launch speaks more for it having great performance than low performance.

On a side note, I always find it amazing that no matter what I post, the target of ridicule is always me rather than the content of my statement. Then again, personal attacks have always been the preferred method of those with negative intentions.
 
They're not really a big studio, though. I think their past games benefited from this, where you didn't need a lot of people to get the best out of the Game Boy, for example.

It will be interesting to see how they work around this on the Wii U. They'll probably have a field day with the lighting and textures but could end up limited in actual environments and characters (compared to other AAA games on last gen), although I heard they do outsourcing.

Also, wtf at that Krizzx example.

I don't think environments or characters will suffer; I assume you're talking about polygon counts and the number of characters on screen. I think the fact that Black Ops 2's Ground War matches the current-gen consoles points to that, and I would expect it to exceed them in that regard.
 
Pseudo-static eDRAM, the stuff Nintendo uses, has the exact same latency as real SRAM.

Thought so.

Same here

Regarding that: if the memory is 1T-SRAM as suspected, then on the GC it had a 10 ns latency at 162 MHz, which is one cycle of sustained latency. Is it possible to go lower than one cycle where latency is concerned?

That would be physically impossible to my knowledge. There is no such thing as half a cycle. Nothing commits until the cycle finishes.
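
For reference, the raw arithmetic behind those GC numbers (my own math, not insider info): a 162 MHz period is about 6.17 ns, so a 10 ns access spans more than one clock, and "one cycle sustained" reads as pipelined throughput, one access completing per cycle, which squares with nothing committing in less than a full cycle:

```c
#include <stdio.h>

int main(void) {
    const double f_hz = 162e6;             /* GC memory clock */
    const double period_ns = 1e9 / f_hz;
    printf("cycle period at 162 MHz: %.2f ns\n", period_ns);    /* ~6.17 */
    printf("a 10 ns access = %.2f cycles\n", 10.0 / period_ns); /* ~1.62 */
    return 0;
}
```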
 
I personally let the results speak. The devs that have shown higher-quality work are the ones whose opinion will matter the most to me.

Going back to the early GC example: you have one dev saying shading is impossible on the GC and another already using shading on the GC. It's not hard for me to decide whose statement is more credible between someone who says X can't be done and someone who is doing it.

Shin'en has already proved themselves and their capability many times over. They did things on the GBA that I didn't even think were possible.
http://www.youtube.com/watch?v=_14_kTNMD2s

They are truly talented developers that put their money where their mouth is and I respect that.
 
I don't think environments or characters will suffer; I assume you're talking about polygon counts and the number of characters on screen. I think the fact that Black Ops 2's Ground War matches the current-gen consoles points to that, and I would expect it to exceed them in that regard.
My knowledge of game development is still preliminary, so bear with me, but getting high-quality assets into a game takes a lot of time. There's modeling, texture mapping, baking the maps, animation (i.e. mo-cap, skin weighting), QA testing (can't have something breaking the game), etc.

But it's all relative. For example, I don't expect them to pull off worlds like GTA V in such a short amount of time, because of the above.
 
I personally let the results speak. The devs that have shown higher-quality work are the ones whose opinion will matter the most to me.

Going back to the early GC example: you have one dev saying shading is impossible on the GC and another already using shading on the GC. It's not hard for me to decide whose statement is more credible between someone who says X can't be done and someone who is doing it.

Shin'en has already proved themselves and their capability many times over. They did things on the GBA that I didn't even think were possible.
http://www.youtube.com/watch?v=_14_kTNMD2s

They are truly talented developers that put their money where their mouth is and I respect that.

Every time I see that I'm shocked. I would love to play it sometime; I hope it sees a VC release.
 
And I won't take any developer's word about anything if there's money involved :P
On the other hand, those are probably the only developers who have a reason to actually try, and therefore the people most likely qualified to speak about those things.
 
Can you tell us why many of the PS360 games aren't better on Wii U even though it has a larger embedded memory, please?

Easy: the games weren't optimized for the Wii U's system architecture of a small, fast pool of eDRAM plus larger pools of slower main memory. For the most part they are direct ports of PS3/360 games. Games built from the ground up to take advantage of the hardware should look nicer (see Mario Kart 8, for example).
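
As a concrete sketch of what that optimization means in practice (the pool names and API here are invented for illustration, not the real Wii U SDK): buffers touched every frame go into the small, fast MEM1/eDRAM pool, bulk assets into the bigger, slower MEM2 pool, whereas a flat PS3/360 port never makes that distinction:

```c
#include <stddef.h>
#include <stdio.h>

/* Hypothetical pool-aware placement; names and sizes are illustrative. */
enum pool { MEM1_EDRAM, MEM2_DDR3 };

#define MEM1_SIZE (32u * 1024 * 1024)   /* small, fast embedded pool */
static size_t mem1_used = 0;

/* Hot, every-frame buffers go to MEM1 while it has room;
   everything else lands in the big, slow MEM2 pool. */
enum pool place_buffer(size_t bytes, int touched_every_frame) {
    if (touched_every_frame && mem1_used + bytes <= MEM1_SIZE) {
        mem1_used += bytes;
        return MEM1_EDRAM;
    }
    return MEM2_DDR3;
}

int main(void) {
    size_t fb = 1280u * 720 * 4;   /* one 720p RGBA8 color buffer */
    printf("framebuffer      -> %s\n",
           place_buffer(fb, 1) == MEM1_EDRAM ? "MEM1" : "MEM2");
    printf("16 MB asset pool -> %s\n",
           place_buffer(16u << 20, 0) == MEM1_EDRAM ? "MEM1" : "MEM2");
    return 0;
}
```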
 
Easy: the games weren't optimized for the Wii U's system architecture of a small, fast pool of eDRAM plus larger pools of slower main memory. For the most part they are direct ports of PS3/360 games. Games built from the ground up to take advantage of the hardware should look nicer (see Mario Kart 8, for example).

How is the bolded any different from the Xbox 360? The Wii U has a much larger amount of embedded memory, yet those ports aren't that much different from the 360's.

I specifically asked lherre since he's the one who is interested in commenting on the Wii U's similarities to "current gen." I'm mainly asking why the larger embedded memory in the Wii U isn't helping improve a port's performance over the 360 version.

Simple question.
 
How is the bolded any different from the Xbox 360? The Wii U has a much larger amount of embedded memory, yet those ports aren't that much different from the 360's.

I specifically asked lherre since he's the one who is interested in commenting on the Wii U's similarities to "current gen." I'm mainly asking why the larger embedded memory in the Wii U isn't helping improve a port's performance over the 360 version.

Simple question.

Because it's not magic, and it's not run by self-optimizing AI. If you take a cake out of a small box and put it in a bigger box, it will still be the same cake. Also, if that bigger box is a different shape, then you might have to trim, squish, or divide the cake even though the box is larger overall.

A developer has to spend the time, and subsequently the money, to take advantage of the hardware. Simply being more powerful is not going to make a game look or run any better. Large portions of the game code would have to be completely redone to take full advantage of different, stronger features, and that is not free or easy.
 
Because it's not magic, and it's not run by self-optimizing AI. If you take a cake out of a small box and put it in a bigger box, it will still be the same cake. Also, if that bigger box is a different shape, then you might have to trim, squish, or divide the cake even though the box is larger overall.

A developer has to spend the time, and subsequently the money, to take advantage of the hardware. Simply being more powerful is not going to make a game look or run any better.

eDRAM isn't magic.

Again, I specifically asked lherre why the Wii U's larger embedded memory doesn't allow ports to run better than the 360 versions.
 
Some more insight from Shin'en: http://hdwarriors.com/why-the-wii-u-is-probably-more-capable-than-you-think-it-is/

'Theoretical RAM bandwidth in a system doesn’t tell you too much because GPU caching will hide a lot of this latency. Bandwidth is mostly an issue for the GPU if you make scattered reads around the memory. This is never a good idea for good performance.

I can’t detail the Wii U GPU but remember it’s a GPGPU. So you are lifted from most limits you had on previous consoles. I think that if you have problems making a great looking game on Wii U then it’s not a problem of the hardware.'

The statement on bandwidth is interesting. I would have guessed it would have been more in regard to latency: that latency would be the issue with scattered reads. blu? If you're around, would you care to comment?

Can you tell us why many of the PS360 games aren't better on Wii U even though it has a larger embedded memory, please?

Well, one thing we do see is vsync in almost all Wii U games. That's likely being achieved with triple buffering, and that almost certainly uses the eDRAM.
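
Rough arithmetic on that guess, assuming plain 720p RGBA8 color buffers (real render targets and formats vary):

```c
#include <stdio.h>

int main(void) {
    const double MB = 1024.0 * 1024.0;
    double buf = 1280.0 * 720.0 * 4.0;  /* one 720p RGBA8 color buffer */
    printf("one buffer: %.2f MB\n", buf / MB);           /* ~3.52 MB  */
    printf("triple-buffered: %.2f MB of the 32 MB pool\n",
           3.0 * buf / MB);                              /* ~10.55 MB */
    return 0;
}
```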
 
Well, one thing we do see is vsync in almost all Wii U games. That's likely being achieved with triple buffering, and that almost certainly uses the eDRAM.

I wonder if vsync is a requirement set by Nintendo, like the early, failed 720p requirement Microsoft set for the 360? Three frame buffers would take a fair chunk of that eDRAM; I wonder if dropping the requirement later (if there is one) would add anything substantial to what could be done with the memory config.
 
And I won't take any developer's word about anything if there's money involved :P


So we shouldn't listen to any developer then, as by default, if they're putting any serious work into development on Wii U, then money is involved ;)


Joking aside, the Shin'en thing has been covered ad nauseam. They're a group of very skilled developers who pride themselves on rinsing the most out of the hardware they have in front of them, and they're better at it than most by all accounts. As I understand it, as Shin'en they have only developed for Nintendo platforms, but as individual developers they have a wealth of experience across the board, on all sorts of platforms and over a very long period of time.

Can you tell us why many of the PS360 games aren't better on Wii U even though it has a larger embedded memory, please?

I know you were asking lherre, but I thought I'd chip in: the performance issues seen in some of the first round of ports seemed to be related to the CPU more than anything, at least going by DF's analyses. The 360 and Wii U CPUs are different beasts with different strengths; a game built to take advantage of one might take some work to take advantage of the other.

Manfred Linzner said:
On Wii U the eDRAM is available to the GPU and CPU. So you can also use it very efficiently to speed up your application.

Oh, looks like Shin'en sort of answered my question from before (from the HD Warriors link).
 