WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

You mean 4GB?

Saw this cross-quoted in another thread. Funny, I'm having this same design debate in another thread.

I agree Sony got very lucky with GDDR5 and MS got unlucky with DDR4.

Designing in the flexibility to change what should be flexible product specs anyway with changes in supply conditions is not "luck" at all. Sony obviously knew what products might be available from their supply contractors, and had the flexibility in their design to go all the way up to 8GB if supply was available/affordable and it was determined necessary.

Did that supply chain open up in the 11th hour? By all indications, it did.

Was it "lucky?" No... No more than say Apple rolling out the retina display with the iPhone 4 was "lucky."
 
Designing in the flexibility to change what should be flexible product specs anyway with changes in supply conditions is not "luck" at all. Sony obviously knew what products might be available from their supply contractors, and had the flexibility in their design to go all the way up to 8GB if supply was available/affordable and it was determined necessary.

Did that supply chain open up in the 11th hour? By all indications, it did.

Was it "lucky?" No... No more than say Apple rolling out the retina display with the iPhone 4 was "lucky."
It was pretty lucky that 4Gbit GDDR5 was available just in time to roll into the PS4, IMO. These consoles were designed many years ago. The Xbox One, for instance, was designed around DDR4, but it fell behind schedule.

You can call it whatever you want....
 
It was pretty lucky that 4Gbit GDDR5 was available just in time to roll into the PS4, IMO. These consoles were designed many years ago. The Xbox One, for instance, was designed around DDR4, but it fell behind schedule.

You can call it whatever you want....

Hynix, Sammy, and even Micron (then relative newcomer Elpida) were talking about a production schedule for 4Gbit GDDR5 way back in late 2010. Don't know about Winbond since they had barely started then. Wouldn't call it THAT lucky.
 
Hynix, Sammy, and even Micron (then relative newcomer Elpida) were talking about a production schedule for 4Gbit GDDR5 way back in late 2010. Don't know about Winbond since they had barely started then. Wouldn't call it THAT lucky.

This type of production gets delayed all the time, in the same way that smaller nodes are quite often delayed, which then snowballs into production delays of products elsewhere.
 
Not chiming in for anything, but I don't think it was that brilliant a move on Sony's part; and I don't mean from a technical standpoint; the jury is still out on that one, and I believe they made the improvements they thought they needed to negate the possible downsides of the choice.


Well, MS and Nintendo, in particular Nintendo, also put themselves in a precarious situation with their eDRAM/SRAM solution. Don't forget:

Over the weekend, reports that semiconductor manufacturer Renesas Electronics is in dire straits hit the web. According to these reports, the company will need to close four semiconductor plants in Japan over the next few years. One of the plants slated for closure is responsible for manufacturing the Wii U's embedded DRAM.

According to reports, Nintendo was responsible for more than half the load at that particular factory at peak times, but slow sales of the Wii U have forced the plant to run at a loss. Some have speculated that this shift in the production line of the Wii U could have repercussions on availability of the console in the future, but Nintendo has announced that a new factory will pick up where Renesas Electronics left off.
 
Exactly, why invest so much in eDRAM when they could have gone with 2 gigs of GDDR5 and be done with it?

I don't understand the question. Backwards compatibility with the Wii's SRAM, for one; power consumption, which is always important to Nintendo, second; and also, what makes you think GDDR5 would be cheaper than the eDRAM?

A lot of the design goals are oriented at backwards compatibility, at that.

The Wii U even has low power consumption RAM typically found in ultrabooks and cell phones iirc.
 
Well, MS and Nintendo, in particular Nintendo, also put themselves in a precarious situation with their eDRAM/SRAM solution. Don't forget:
I wonder; AFAIK that closure had been predicted for quite a while.

Still, I feel we're derailing from my original point, which was not focused on eDRAM or GDDR5 alone (well, it kinda was) but was also trying to tackle a tendency bigger than that; for Sony the risk comes not from GDDR5 alone, but also from a GPU that is not mimicking the others, as it's aiming higher. It doesn't really strike a balance, because it's shooting for the stars, not for competitive matching.

Compromises are made to save money; Nintendo made them to no end (probably too much) on the Wii U, as did Microsoft with the XBone; they did so, though, in order to include peripherals in the spec, something that apparently backfired. Sony invested it all in having the best box, just like they did on the PS3 or as Sega did on the Saturn, minus the problematic inherent architectures. Slimming down and competing on price could end up being problematic because of that (the Sega Saturn was discontinued by the time the PlayStation became popular and cheap, as it had too much logic to be able to follow the price cuts; and the PS3 never really matched X360 prices).

Going higher is also a risk, and a risk I don't really understand on Sony's part right now, but they have played their game of chess perfectly to this point, something Microsoft and Nintendo didn't. Had they flunked it, or had things happened differently, they could be royally screwed now, as their strategy was actually the handicapped one.

If it fails, it's like the Vita: they aimed too high, so they can't price-drop it for a while; something Microsoft can do (that, or throw Kinect out), as can Nintendo to some extent (yes, they say it's break-even right now, but they can surely lose some money too; Sony can't afford that, so it's riskier business).
 
I've just been doing a bit of reading about the Wii and didn't realise that Hollywood is a GPGPU.

http://forum.wiibrew.org/read.php?15,50783,50791

http://www.1up.com/do/my1Up?publicUserId=6079479
That's because it really isn't.

You should disregard those fanfic things you just read.

Hell, I only read the first line and they speak about Darkside Chronicles having HDR and AA; I don't know if the HDR going on in there can be considered real, but the thing has zero AA going for it.
 
This depends on the engine, but to answer the question you have to go back to how 3d art/animation is created.

1) 3D models are created in a program like 3D Studio/Maya/the like; you get a 3D mesh.
2) Then, for things that are complex (like NPCs), bones are created, which essentially give you the ability to move/animate fewer control points. They're virtual, the player never sees them; it beats having to animate every vertex on a complicated 3D mesh directly.
3) So you have a smaller number of control points that you can "animate". You can decide to do forward or inverse kinematics on them. Forward is the most basic, which means you rotate at the joints to make your animation. Inverse is more complicated: you connect multiple joints in a virtual group, and you can move the end point (say the hand), and the rest of the joints in the chain will move to allow that end point to be placed where it needs to be (the elbow and wrist will bend automatically); good for walking, as you can keep feet from sliding around. This is the key to your question, though. These "angles" or "positions" in forward or inverse kinematics are "key framed". So the animator will pick key points in time where the pose is right. For an arm moving, you would have a key frame at the beginning of the animation and another at the end. The engine/animation player will calculate the in-between frames. To add more "flair" you can add as many key frames in between as you want that deviate from the automatic "tween".
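
(Purely as an illustration of the forward-kinematics part above, nothing to do with any actual engine: each joint stores an angle relative to its parent, and the joint positions just fall out of accumulating those rotations down the chain. The angles and bone lengths here are made up.)

```python
import math

# Minimal 2D forward kinematics: a chain of joints, each with an angle
# (relative to its parent) and a bone length. Setting the angles is the
# "forward" part; the joint positions are just a consequence.
def forward_kinematics(angles, lengths, origin=(0.0, 0.0)):
    x, y = origin
    total_angle = 0.0
    positions = [(x, y)]
    for angle, length in zip(angles, lengths):
        total_angle += angle                 # rotations accumulate down the chain
        x += length * math.cos(total_angle)
        y += length * math.sin(total_angle)
        positions.append((x, y))             # position of the next joint / end effector
    return positions

# Example: a two-bone "arm" (shoulder -> elbow -> hand). The animator only
# touches two numbers (the joint angles); the hand position is computed,
# never animated directly.
print(forward_kinematics([math.radians(45), math.radians(-30)], [1.0, 0.8]))
```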

What Thunder_monkey is suggesting is that Nintendo opted not to use tweening and hand-animated every frame @ 30fps, and thus the engine can't tween up to 60fps. I guess another option is that the source files are "exported" to the engine in a way that strips out any tweening and key frames every frame, but this would only be a problem if Nintendo didn't have any of the source animation files.

Tweening can be rudimentary, averaging and whatnot, but it can also be complicated, where you can have ease in/ease out and even curves to better control the motion.
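
(A toy sketch of that difference, with made-up values: a plain linear tween versus an ease-in/ease-out one over the same two keyframe values.)

```python
# Toy "tween": interpolate between two keyframe values, either linearly
# or through an ease-in/ease-out curve for smoother motion.
def lerp(v0, v1, t):
    return v0 + (v1 - v0) * t          # plain averaging-style tween

def ease_in_out(t):
    return t * t * (3.0 - 2.0 * t)     # smoothstep: slow start, slow end

# One second between two rotation keyframes (0 -> 90 degrees), sampled at a
# handful of in-between times.
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    linear = lerp(0.0, 90.0, t)              # constant-speed rotation
    eased = lerp(0.0, 90.0, ease_in_out(t))  # accelerates, then decelerates
    print(f"t={t:.2f}  linear={linear:6.2f}  eased={eased:6.2f}")
```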

My opinion is that animation isn't a factor in why it's 30fps... But I'm just looking from the outside, just doesn't make sense to me. But feel free to prove me wrong.

Thanks for the reply :)
Wasn't sure this would get answered as it's not directly related to the thread topic.

I come from the field of math, but my understanding is pretty similar to what you describe, and while it does seem like this is a procedure that could be automated for the most part - as far as having sufficient information available in the current (30 Hz) data to allow algorithms to extrapolate correctly to 60 Hz - it would still require some work from Nintendo's programmers to build the tools to make this conversion simple, if they don't have such tools yet.

edit: to make myself clear, from an algorithmic point of view, I'm assuming all the information about the motion of bones (position, speed and acceleration as a function of time, or frame #) can be extracted from the existing 30 Hz data, from which point the problem to solve is finding the proper method of interpolation for each new frame that is to be created. It's still possible a human would need to be heavily involved in this process though, as a combination of different methods may be required to achieve a good result.

as an example, consider this animation, taken from wikipedia:
[Animated GIF of a bouncing ball, taken from Wikipedia]

There are three different animations in this gif - one for the position of the ball, a second for its shape, and a third for its color. While it's possible to find and fit appropriate curves to each of these animations, it may require additional knowledge of "cartoon physics" to get the best result when combining them together, meaning a human may still have to examine and make adjustments to many procedurally created frames.
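
To put the naive, fully automatic version of that idea into code (hypothetical pose data, and plain linear blending between frames, which is exactly the part where "cartoon physics" knowledge would be needed to do better):

```python
# Naive 30fps -> 60fps upsampling: keep every original pose and insert one
# blended pose halfway between each consecutive pair. A "pose" here is just
# a dict of bone name -> angle (invented data for illustration).
def blend(pose_a, pose_b, t=0.5):
    return {bone: a + (pose_b[bone] - a) * t for bone, a in pose_a.items()}

def upsample_to_60fps(poses_30fps):
    out = []
    for current, nxt in zip(poses_30fps, poses_30fps[1:]):
        out.append(current)
        out.append(blend(current, nxt))   # the new in-between frame
    out.append(poses_30fps[-1])
    return out

poses = [{"elbow": 0.0, "knee": 10.0},
         {"elbow": 45.0, "knee": 5.0},
         {"elbow": 90.0, "knee": 0.0}]
print(len(upsample_to_60fps(poses)))  # 3 original frames -> 5 frames
```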
 
That's because it really isn't.

You should disregard those fanfic things you just read.

Hell, I only read the first line and they speak about Darkside Chronicles having HDR and AA; I don't know if the HDR going on in there can be considered real, but the thing has zero AA going for it.

Sorry, just can't believe someone would go to so much trouble to make this all up. Found another link:-

http://tapionvslink.weebly.com/uplo...lywood_is_a_gpgpuaccording_to_dan_emerson.pdf
 
[Animated GIF of a bouncing ball, taken from Wikipedia]

There are three different animations in this gif - one for the position of the ball, a second for its shape, and a third for its color. While it's possible to find and fit appropriate curves to each of these animations, it may require additional knowledge of "cartoon physics" to get the best result when combining them together, meaning a human may still have to examine and make adjustments to many procedurally created frames.
That's why there are "key frames"; seriously, that's the whole reason for them. In your example, specifically for the motion and squash animations:
1) Motion: key frame at top left, key frame at the bottom, key frame at top right, done. Ideally it would have ramping and tangents to give a proper angle. With those 3 key frames, every frame in between can be extrapolated; you didn't have to "key frame" every frame.
2) Similar: unsquashed would be a key frame at the first frame, unsquashed just before impact, squashed at the deepest point of impact, unsquashed again after impact, and unsquashed again at the end of the animation.

These will run at any FPS. Especially for things that need smooth motion, you don't want to be manually key framing every frame, since it's easier for the engine/player to do it smoothly than for an animator, and it makes for easier fine-tuning of animations (and reuse).
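
(To illustrate the "any FPS" point with a toy sketch: the key times and positions below are invented, and it only does plain linear interpolation between the three motion keys.)

```python
# Three position keyframes for the ball: top-left at t=0, bottom at t=0.5s,
# top-right at t=1.0s (times and coordinates invented for the example).
keys = [(0.0, (0.0, 1.0)), (0.5, (0.5, 0.0)), (1.0, (1.0, 1.0))]

def sample_position(keys, time):
    """Linearly interpolate the (x, y) position between bracketing keyframes."""
    for (t0, p0), (t1, p1) in zip(keys, keys[1:]):
        if t0 <= time <= t1:
            t = (time - t0) / (t1 - t0)
            return tuple(a + (b - a) * t for a, b in zip(p0, p1))
    return keys[-1][1]  # past the last key: hold the final pose

# Same animation data, two different playback rates - no extra keyframes needed.
frames_30 = [sample_position(keys, f / 30.0) for f in range(31)]
frames_60 = [sample_position(keys, f / 60.0) for f in range(61)]
```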
 
How are you sure that they are key framing every frame (I assume you meant every frame, since all animations have "key frames")? And how do you know that's why they limited it to 30fps? I'm asking because if you're getting frame drops at 30fps, it's highly unlikely that it could be pushed to 60fps without massive frame drops.
Haven't been active for a couple of days.

I don't know that's why they limited it. I honestly doubt that's why. I honestly just figured they said "Meh.. too much work for little pay off." The animation work was already serviceable. The more apparent technical issue would be the texturing, severe dithering, and the rudimentary lighting system.

I'm one of the few that does not see the beauty in Dolphin WW shots. Consistent yes... but good looking? I don't know about that. Maybe if poly counts were much higher, giving everything a smoother appearance.
 
How are you sure that they are key framing every frame (I assume you meant every frame, since all animations have "key frames")? And how do you know that's why they limited it to 30fps? I'm asking because if you're getting frame drops at 30fps, it's highly unlikely that it could be pushed to 60fps without massive frame drops.
I'm not following here. 'Keyframes at every frame' is an oxymoron. Keyframes make sense only in animations which also have interpolated frames. If there were no interpolated frames, then keyframes would lose their meaning.

Re the frame drops, that could also be unrelated/indirectly related to the game's main loop. Like, say, miiverse-related comm latencies or some such.

But regardless, I think that once devs established the targeted fps, they also decided what their frame budget would be spent on. So it's quite likely they went for things they could not reproduce at 2x the fps.
 
That's why there are "key frames"; seriously, that's the whole reason for them. In your example, specifically for the motion and squash animations:
1) Motion: key frame at top left, key frame at the bottom, key frame at top right, done. Ideally it would have ramping and tangents to give a proper angle. With those 3 key frames, every frame in between can be extrapolated; you didn't have to "key frame" every frame.
2) Similar: unsquashed would be a key frame at the first frame, unsquashed just before impact, squashed at the deepest point of impact, unsquashed again after impact, and unsquashed again at the end of the animation.

These will run at any FPS. Especially for things that need smooth motion, you don't want to be manually key framing every frame, since it's easier for the engine/player to do it smoothly than for an animator, and it makes for easier fine-tuning of animations (and reuse).

I'm not familiar with animation software so I don't doubt your knowledge about what key frames are required to achieve the results in the gif, but I can say with certainty that in terms of the math being done to produce the interpolation, you'd need to provide more information than that in order to get the result we see in the gif.

Maybe some animation programs are designed to make clever guesses, accounting for elements such as gravity and other things, which is why they require fewer data points. But as an example, from a strictly mathematical point of view, with no further knowledge you'd need at least five data points to uniquely describe something like the ball's path in the gif (assuming piecewise-polynomial interpolation with the middle point used twice, and assuming constant acceleration as is customary in Newtonian dynamics).

But even this is based on assumptions which don't apply to every possible method of interpolation (i.e. not polynomial), and certainly don't necessarily apply to non-Newtonian dynamics (like those seen in "cartoon physics"), so without knowing what method Nintendo employs it's not possible to say with certainty that further information is not required.
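
(To spell out the five-points count under exactly those assumptions, with invented numbers and numpy doing the curve fitting: two constant-acceleration segments, three samples each, sharing the impact point, so 3 + 3 - 1 = 5.)

```python
import numpy as np

# Five (time, height) samples of a bouncing ball, invented for the example.
# The middle point (the moment of impact) is shared by both segments.
points = [(0.0, 1.0), (0.25, 0.4), (0.5, 0.0),   # falling segment
          (0.75, 0.4), (1.0, 1.0)]               # rising segment

fall = points[:3]   # three points uniquely determine a quadratic...
rise = points[2:]   # ...and the impact point is reused, hence 3 + 3 - 1 = 5

fall_coeffs = np.polyfit([t for t, _ in fall], [y for _, y in fall], 2)
rise_coeffs = np.polyfit([t for t, _ in rise], [y for _, y in rise], 2)

# With the two quadratics in hand, any in-between frame (30fps, 60fps, ...)
# is just an evaluation of the appropriate polynomial.
print(np.polyval(fall_coeffs, 0.1), np.polyval(rise_coeffs, 0.9))
```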
 
Sorry, just can't believe someone would go to so much trouble to make this all up. Found another link:-

http://tapionvslink.weebly.com/uplo...lywood_is_a_gpgpuaccording_to_dan_emerson.pdf
It's still rampant speculation by people who couldn't accept the Wii was what it was. And that last link goes back as far as before the console launched, to the X1600 speculation, which was bull.

It's an improved Flipper, for which they didn't even bother adding more framebuffer memory, making AA a rarity compared to what it could have been and dithering a common feature. It won't have Radeon X1x00 stream processors added just because; if they were to do that, they would have added vertex units and compliant shader units.

And PhysX can run in software.
 
Sorry, just can't believe someone would go to so much trouble to make this all up. Found another link:-

http://tapionvslink.weebly.com/uplo...lywood_is_a_gpgpuaccording_to_dan_emerson.pdf

Nvidia CUDA. On an old fixed-function ATI design. Really.



it's fake


Only if you're using a very loose definition. Sure, even those old GPU designs could do some general processing, but not in the same sense we speak of today. As lostinblue said, there were a lot of fanfictions about Wii hardware back then, and even now, something I'm sure will be repeated (and is being repeated) with the Wii U. There were theories that it was meant to be overclocked with a firmware update but Nintendo aborted it, or that its framebuffers were used wrongly by every single developer, including Nintendo, due to the documentation, etc. etc. It got into some really crazy stuff. This is some of it.
 
Yeah, those were the days. I fondly remember when the Wii had a dual-core G5 and a dedicated physics processor. Because Elebits. Or something.
 
Yeah, those were the days. I fondly remember when the Wii had a dual-core G5 and a dedicated physics processor. Because Elebits. Or something.

Especially seeing as even the 360 and PS3 had cores that were cut down from the G5 core. Or that the G5 targeted huge TDPs while the Wii drew 18W. Those were the days.

This was repeated with the Power7 in Wii U thing, come to think of it.
 
Especially seeing as even the 360 and PS3 had cores that were cut down from the G5 core. Or that the G5 targeted huge TDPs while the Wii drew 18W. Those were the days.

This was repeated with the Power7 in Wii U thing, come to think of it.

The difference this time was that the whole Power7 idea was kind of IBM's fault.
 
Especially seeing as even the 360 and PS3 had cores that were cut down from the G5 core. Or that the G5 targeted huge TDPs while the Wii drew 18W. Those were the days.
That's not right. I can understand why someone would say that, as it's an easy way to get the point across (they're simplified PPC cores, and the G5 was the contemporary PPC used in high-spec computers at high clock speeds; Sony touted the PS3 as a supercomputer, no less); but they had very little in common, even as a basis.

CELL and Xenon are a further development of the IBM guTS/Rivina project:

IBM, like any large technology company does research. In 1997, long before GHz or 64 bit CPUs arrived on the desktop IBM developed an experimental 64 bit PowerPC which ran at 1GHz. Its snappy title was guTS (GigaHertz unit Test Site) [guTS] .

The guTS and a later successor were designed to test circuit design techniques for high frequency, not low power. However since it was only for research the architecture of the CPU was very simple, unlike other modern processors it is in-order and can only issue a single instruction at a time. The first version only implemented part of the PowerPC instruction set, a later version in 2000 implemented it all.

(...)

When a low power, high clocked general purpose core was required for the Cell, this simple experimental CPU designed without power constraints in mind turned out to be perfect. The architecture has since been considerably modified, the now dual issue, dual-threaded PPE is a descendant of the guTS.

The Xbox 360’s "Xenon" processor cores also appear to be derived from the guTS processor although they are not quite the same as the PPE. In the Cell the PPE uses the PowerPC instruction set and acts as a controller for the more specialised SPEs. The Xenon cores uses a modified version of the PowerPC instruction set with additional instructions and a beefed up 128 register VMX unit.
Source: http://www.blachford.info/computer/Cell/Cell4_v2.html

Going onto a Power4/PPC970 and simply peeling everything out and making it 2-way (which they weren't) would make no sense; it would be going backwards in order to turn left. Instead they dusted off a test system with the PPC instruction set that had been abandoned for good reason, but whose development teams hadn't given up completely, with Power6 eventually pulling a 2-way in-order execution solution too.
This was repeated with the Power7 in Wii U thing, come to think of it.
Ironically so, saying PS3 and X360 borrowed heavily from PPC970 is like saying Wii U is heavily related to Power7. ;)

But I'm sure you realized that by now; the Cell/Xenon AltiVec/VMX implementations were most certainly backported from Power5, just like the SMP implementation probably was; the Power ISA too. As for Espresso, it inherited eDRAM as a replacement for 6T-SRAM from Power7. It's the same situation/misconception.
 
Wow, a coincidence that someone has brought up Elebits (does anyone know why it was called Eledees over here in Europe, I've always wondered..?). I brought the game up and linked to this video when someone on a forum mentioned that nobody can expect decent physics from what he incorrectly assumed was 3 x Broadways duct-taped together lol.

I also linked to the video done by the same guy with over 400 objects in an open area outside too (which personally I think is more impressive despite the considerably lower framerate).

I wonder whether next gen engines will be using the CPUs for physics or use GPGPU functionality..? I would guess the latter.
 
Wow, a coincidence that someone has brought up Elebits (does anyone know why it was called Eledees over here in Europe, I've always wondered..?). I brought the game up and linked to this video when someone on a forum mentioned that nobody can expect decent physics from what he incorrectly assumed was 3 x Broadways duct-taped together lol.

I also linked to the video done by the same guy with over 400 objects in an open area outside too (which personally I think is more impressive despite the considerably lower framerate).

I wonder whether next gen engines will be using the CPUs for physics or use GPGPU functionality..? I would guess the latter.

I thought it was because "bits" means genitals in the UK and Eledees is a play on LEDs.
 
That's not right. I can understand why someone would say that, as it's an easy way to get the point across (they're simplified PPC cores, and the G5 was the contemporary PPC used in high-spec computers at high clock speeds; Sony touted the PS3 as a supercomputer, no less); but they had very little in common, even as a basis.

...

I'm aware, I just didn't want to recount the whole history (but thank you for taking the time :) ). I meant it from a performance standpoint, though: compared to contemporary G5s, the cores were cut down in that sense.

Per clock, the PPE (and the three similar cores in Xenon) output about half the performance iirc, barring advanced SIMD instructions (which admittedly they had up the wazoo; used well, they would perhaps have come closer to or surpassed G5s at the same clock). The 3.2GHz parts offered similar throughput to 1.6GHz G5 cores without any optimization, at any rate.
 
I've been thinking about something: during the Iwata Asks for Wind Waker HD, one of the developers said that the game essentially runs off of the same shaders as the Zelda HD Experience demo. I think it's very clear to whoever's played Wind Waker HD that it makes rather heavy use of screen-space ambient occlusion. So, if Wind Waker HD runs off of the same shaders as the tech demo, why did nobody notice during the tech demo that it was using SSAO?

EDIT: looking back at this photo, maybe we thought that it was baked-on AO for the backgrounds?

[Screenshot from the Zelda HD Experience demo]


If those backgrounds are actually being shaded in real time, am I the only one who's impressed?
 
I'd really love both of those tech demos as a free download from the eShop. I'm surprised Nintendo didn't have them both preinstalled on every Wii U since launch tbh. I'd love to see them both up close and personal because I haven't been able to go to any conferences.

Do they still have them these days..?
 
I'd really love both of those tech demos as a free download from the eShop. I'm surprised Nintendo didn't have them both preinstalled on every Wii U since launch tbh. I'd love to see them both up close and personal because I haven't been able to go to any conferences.

Do they still have them these days..?

Well, I'm sure they've got the demos somewhere; I'd even pay money if they put them up for download.
 
No, I mean do they still have them at conferences and stuff..?

Maybe they made changes to the hardware and the GX2 API that prevent them from running the way they were? Or maybe they just think that having demos of games to be released is more important..? Dunno. :o/
 
I'd really love both of those tech demos as a free download from the eShop. I'm surprised Nintendo didn't have them both preinstalled on every Wii U since launch tbh. I'd love to see them both up close and personal because I haven't been able to go to any conferences.

Do they still have them these days..?

I know, it's just mental that they don't do it. It'd get nerds like us excited. The conclusion I've come to is that this will be a dungeon in the new Zelda :)
 
The fact that they haven't and apparently won't, along with their choice to re-release Wind Waker instead of a less cartoony-styled game as they learn HD development better, makes me think that they really don't want us expecting Zelda Wii U to look like that.

Unfortunate. But hey, maybe these things mean nothing and they'll surprise us.
 
I meant "If the self-shadowing in the background is real-time" instead of the shadows being baked on. I know that the demo WAS actually running on early Wii U hardware.

Isn't it possible to tell from a video? I mean, you could change between day/night in real time, with the lighting changing accordingly. (Lighting effects were pretty nice in the night setting.)
 
Not really surprising or very important, but I got confirmation that the connection between Espresso and Latte is, in fact, a modified 60x bus.

Also, and that seems more interesting and quite a bit weirder, Latte can apparently do hardware PCF (Percentage Closer Filtering). The strange thing about this is that only Nvidia GPUs support that feature as far as I know, which means it's usually implemented using shaders (which is slower, but works on AMD GPUs as well) - or not at all.
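
For anyone unfamiliar with PCF, this is roughly what the shader-based fallback amounts to; a toy CPU-side sketch with a made-up shadow map, not actual GX2 or shader code: sample a small neighbourhood of the shadow map, do the depth comparison per sample, and average the pass/fail results into a soft shadow factor.

```python
# Toy percentage-closer filtering: the shadow map is a 2D grid of depths
# (as seen from the light). Instead of one hard in/out-of-shadow test, we
# compare several nearby texels and average the results.
def pcf_shadow(shadow_map, x, y, fragment_depth, radius=1):
    height, width = len(shadow_map), len(shadow_map[0])
    lit_samples, total = 0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            sx = min(max(x + dx, 0), width - 1)    # clamp to the map edges
            sy = min(max(y + dy, 0), height - 1)
            total += 1
            if fragment_depth <= shadow_map[sy][sx]:
                lit_samples += 1                   # this sample says "lit"
    return lit_samples / total                     # 0.0 fully shadowed, 1.0 fully lit

# 3x3 neighbourhood on a tiny hypothetical shadow map.
shadow_map = [[0.9, 0.9, 0.2],
              [0.9, 0.5, 0.2],
              [0.2, 0.2, 0.2]]
print(pcf_shadow(shadow_map, 1, 1, fragment_depth=0.6))
```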
 
Is W101 a 60fps title? Because falling under 60fps is generally more noticeable than falling under 30fps.
From my playtime it hits 60fps sometimes but hovers around 45fps most of the time with dips to 30. In other words it's a Platinum game.

I'm hoping that Bayo2 will be locked at 60fps, but I haven't gotten the impression from other Platinum games that a locked framerate is a high priority for them. It's never impacted my enjoyment of any of their games, honestly.
 