Wii U "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Status
Not open for further replies.
I'm no expert, but at first glance, Broadway looks to be a lot more than simply an "overclocked Gekko"

didn't expect that, personally
 
This has probably been addressed somewhere, but about v-sync... In all head-to-head comparisons that I've seen, the Wii U versions have perfect v-sync while the PS360 versions have some, sometimes rampant, screen tearing.

Do we know the technical reason for this?
 
This has probably been addressed somewhere, but about v-sync... In all head-to-head comparisons that I've seen, the Wii U versions have perfect v-sync while the PS360 versions have some, sometimes rampant, screen tearing.

Do we know the technical reason for this?

Secret Sauce...

Sorry, couldn't help myself
 
I'm no expert, but at first glance, Broadway looks to be a lot more than simply an "overclocked Gekko"

didn't expect that, personally
Well, yes. But it seems mostly rearranged; the big blocks (and most apparent changes) are the cache blocks (well, actually the L2 cache; the L1 seems to stay at relatively the same proportion, just slightly moved around?).



(Broadway > Gekko - same height for side by side comparison purposes)

I'm guessing that since they're bigger than Gekko's, and the cache is supposed to be the same size (32 KB instruction + 32 KB data L1 cache + 256 KB L2 cache), then either the L2 couldn't be die-shrunk like the rest of the CPU, or they added 512 KB of L2, or they opted for lower-density cache blocks, I dunno.
edit - lol, it was a mislabeling; now I can see some noticeable similarities
Yes, my bad, fixed.
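The "couldn't be die-shrunk like the rest of the CPU" idea above can be checked with some back-of-the-envelope arithmetic. This is a purely illustrative sketch, not measured data: the function and all the scaling factors below are assumptions, chosen only to show that if logic shrinks more aggressively than SRAM across a node change, the cache takes up a larger fraction of the die even at the same capacity.

```python
# Hypothetical sketch: how the cache's share of a die changes after a shrink
# when SRAM cells scale worse than logic. All numbers are made up.

def cache_fraction(cache_area, logic_area, cache_shrink, logic_shrink):
    """Fraction of total die area occupied by cache after a die shrink."""
    shrunk_cache = cache_area * cache_shrink
    shrunk_logic = logic_area * logic_shrink
    return shrunk_cache / (shrunk_cache + shrunk_logic)

# Before the shrink: cache is 1 unit of area, logic is 3 units -> 25% cache.
before = cache_fraction(1.0, 3.0, cache_shrink=1.0, logic_shrink=1.0)

# After: assume SRAM only shrinks to 70% while logic shrinks to 50%.
after = cache_fraction(1.0, 3.0, cache_shrink=0.7, logic_shrink=0.5)

print(round(before, 2), round(after, 2))  # cache share grows: 0.25 -> 0.32
```

So the L2 blocks looking proportionally bigger doesn't by itself require more capacity; worse SRAM scaling alone would produce the same visual impression.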
This has probably been addressed somewhere, but about v-sync... In all head-to-head comparisons that I've seen, the Wii U versions have perfect v-sync while the PS360 versions have some, sometimes rampant, screen tearing.

Do we know the technical reason for this?
A scanline renderer theory has been proposed, seeing as the DS GPU had one.
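For context on the tearing being discussed: a tear appears when the front buffer is swapped while the display is mid-scanout, so the top of the visible image comes from one rendered frame and the bottom from another; with v-sync, the swap is held until the vertical blank. The toy simulation below is an assumption-laden illustration of that mechanism only (the line count and the `scanout` model are invented for the sketch), not a claim about how any of these consoles actually implement it.

```python
# Toy model of one display refresh: which rendered frame each scanline shows.

def scanout(frames, swap_line, vsync):
    """Return the frame id visible on each scanline for one refresh.

    frames: (old_frame, new_frame) identifiers
    swap_line: scanline at which rendering finishes and the swap is requested
    vsync: if True, the swap is deferred to the vertical blank
    """
    LINES = 480  # arbitrary scanline count for the sketch
    old, new = frames
    visible = []
    for line in range(LINES):
        if not vsync and line >= swap_line:
            visible.append(new)  # swap landed mid-scanout -> visible tear
        else:
            visible.append(old)  # swap deferred to vblank -> coherent frame
    return visible

torn = scanout(("A", "B"), swap_line=200, vsync=False)
clean = scanout(("A", "B"), swap_line=200, vsync=True)
print(len(set(torn)), len(set(clean)))  # 2 distinct frames on screen vs 1
```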
 
This has probably been addressed somewhere, but about v-sync... In all head-to-head comparisons that I've seen, the Wii U versions have perfect v-sync while the PS360 versions have some, sometimes rampant, screen tearing.

Do we know the technical reason for this?

I think this is something Nintendo may have as a policy, or something they highly encourage, at least in titles that run off-TV. It doesn't point to dedicated silicon per se, and could actually be one of the factors putting a hurting on overall performance.
 
I think this is something Nintendo may have as a policy, or something they highly encourage, at least in titles that run off-TV. It doesn't point to dedicated silicon per se, and could actually be one of the factors putting a hurting on overall performance.

I thought I read somewhere that this is a GamePad requirement or is related to the GamePad in some way.
 
I thought I read somewhere that was a GamePad requirement.

Yup, I mentioned off-tv play as being a reason. And then it would be pretty crappy if you were playing the same game on your HD screen and the tearing showed up.

Basically my suggestion is if the game features off-tv play, vsync should be turned on regardless of which screen you're using.

This should be pretty easy to test, actually. Are there any games which feature off-TV play that don't have vsync enabled?
 
DarkSiders 2 has screen tearing though.

Then this would point towards it being not so much a Nintendo requirement on devs as something that Wii U developers seem to agree is a good idea.

Or there is fixed function hardware there that Vigil didn't employ for whatever reason. It's possible, but IMO less likely than the simpler explanation.

It bears noting that it could be a combination of both. Nintendo could have added in some extra support for vsync, but it still takes a toll on performance somewhat. Few things are truly free. Even Gamecube's hardware lighting was dependent on poly count and whatnot.
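The "few things are truly free" point has a concrete, well-known form worth spelling out: with v-sync on, a frame that misses the refresh deadline has to wait for the next vertical blank, so the frame rate snaps down to an integer divisor of the refresh rate rather than degrading smoothly. The numbers below are illustrative, and the function is a simplification (it ignores triple buffering and variable frame times):

```python
import math

# Sketch of vsync's performance cost: a frame that takes even slightly longer
# than one refresh period waits a whole extra refresh before it is shown.

def vsynced_fps(frame_ms, refresh_hz=60):
    """Effective frame rate when every frame waits for the next vblank."""
    period_ms = 1000.0 / refresh_hz
    refreshes_per_frame = math.ceil(frame_ms / period_ms)
    return refresh_hz / refreshes_per_frame

print(vsynced_fps(16.0))  # fits within one 16.7 ms refresh -> 60.0
print(vsynced_fps(17.0))  # just misses the deadline        -> 30.0
```

That cliff from 60 to 30 fps for a 1 ms overrun is exactly the kind of toll a mandated-vsync policy could take on a title that is already close to its frame budget.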
 
Then this would point towards it being not so much a Nintendo requirement on devs as something that Wii U developers seem to agree is a good idea.

Perhaps, though anyone who has been through the submission process during the launch of a new console would know that "requirements" are pretty liquid. The last console generation was a gong show for TRC/TCR/Lot Checks; all parties really wanted launch/launch-window titles and let many requirements slide.
 
"Weird" seems to be par for the course for this system. After all the pre-release talk about making it a great system for simple ports and nice and easy to work with, it makes me wonder what everyone was smoking... :-)

Wasn't that all coming from gaming-website writers without technical backgrounds, based on what they were trying to piece together from rumors? Talking up the system in hopes of getting traffic. When in truth, the only people who know how hard or easy it will be are the developers, and even then that's going to differ based on the quality of the development teams. What's easy for one will be a struggle for another, etc.
 
The CPU is weird.


EDIT: Very weird.

Ah so it's true then that "It's almost impossible to understand the Wii U in the abstract, without playing it. And even then you won't be sure of it, because the Wii U isn't sure of itself, and that's its greatest virtue. In an age when showy CEOs shout hubristic, trite predictions about the inevitable future of games, the Wii U offers an understated bravado that's far more courageous. With it, Nintendo admits, 'we don't know either.' We don't know what video games are anymore, or what they will become. It's a huge risk, and it's probably the most daring move Nintendo has made in its 125-year history. Domestication through polite ferocity. Feral design."


and that "Even Nintendo may not have fully realized what it has done. It has domesticated the wildness of the present moment in video games, consumer electronics, the internet, and home entertainment by caging them out in the open. It's lurid and beautiful and repugnant and real, like watching Mickey Mouse smoke a joint in the alley behind Space Mountain."?

Written by the great Ian Bogost.
 
This bothers me a little. So third-party devs hadn't gotten the complete tools and support in time to get the graphics and GPU running at an acceptable frame rate; why? Or was this purely because of the launch deadline?
Sumo mentioned it too, and said it was like other console launches they've had, where stuff tends to be in flux until really late in the cycle.
Yeah, I wonder if those comments indicate it's more than just a 3x-overclocked Gekko with added cache, OR if they ALREADY consider that "a different kind of chip"
It was already different for many reasons discussed in the previous threads.
 
Wasn't that all coming from gaming-website writers without technical backgrounds, based on what they were trying to piece together from rumors? Talking up the system in hopes of getting traffic. When in truth, the only people who know how hard or easy it will be are the developers, and even then that's going to differ based on the quality of the development teams. What's easy for one will be a struggle for another, etc.

Nah, a lot of it was coming from devs talking on- and off-record from what I remember. There was certainly no indication of any peculiarities of the hardware or of any particular difficulties with bringing PS3/360 titles over - all the talk was of spending little time doing that basic work, then spending longer adding features and improvements.

Even assuming a lot of that on-record chatter was encouraged to push the idea of the system as a clearer jump than it is, it's surprising how little talk there was off-record pre- and post-launch about the complexities of bringing titles over. The DF feature with Criterion is the first piece I can think of that didn't break down to either "yeah, the Wii U is great and we ported our game in a fortnight!" or "The CPU is shit, it's barely current-gen!" and actually suggested there are architectural complexities that mean what you see on paper isn't necessarily indicative of what it's really capable of.
 
Well, you have the haters still saying it's using 360 geometry... So what gives? Is it more like the PC version or the 360 version?

In the Digital Foundry article, the technical director of NFS said they only used the textures of the high-end PC version; presumably he'll know better than Ward.

IMO, higher res textures will be more apparent than models.
 
I wouldn't expect anything less from Nintendo. This console will be like the GameCube, in that it's gonna produce things that the numbers we find say it shouldn't. It's gonna be a fun ride... especially in the age of gaming we're in now.

Times like this I wish Factor 5 were still around to give us a ceiling to look at.
 
Digital Foundry is biased against Nintendo.
Beyond3D is biased against Nintendo.
The technical director of NFS is full of haterade by saying his game only has 360 geometry.

gg

I'm putting together a new thread, guys. Hang tight. You'll all get to see the die, as with Latte.
I'm surprised ChipWorks is being so forthcoming. It's awfully generous, although I suppose it provides something akin to free publicity.
 
OK, so what does that mean exactly? Can we tell anything? What makes it weird compared to other CPUs? Nintendo said it was one without any strange habits.
It's actually not that weird now that I think about it. There's some SRAM that looked a bit weird at first, but probably isn't.
 
Maybe it is? No idea. Very clean layout. It's almost completely symmetric.

Well for a tiny morsel of thought, I believe somewhat organic looking structures in CPUs sometimes point to a hand-tuned transistor design, while cleaner symmetric looking ones are auto-placed by computer programs.

I haven't seen it nor do I know if this is always or mostly always true, just thinking out loud, so put no weight into this.

Looking forward to the thread, hopefully it's less confusing than this one :P
 
Well for a tiny morsel of thought, I believe somewhat organic looking structures in CPUs sometimes point to a hand-tuned transistor design, while cleaner symmetric looking ones are auto-placed by computer programs.

I haven't seen it nor do I know if this is always or mostly always true, just thinking out loud, so put no weight into this.

Looking forward to the thread, hopefully it's less confusing than this one :P

Actually it's pretty much the other way around. On the A6 you can see the CPU cores arranged by hand in a nice symmetrical layout, and then the GPU cores which, laid out by computer, are just solid blocks of brown. Computers don't much care for symmetry when it comes to chip layouts (which is why we get awkwardly shaped stuff like N4 on the Latte die).
 
Ahh, my bad, that is right. Had it reversed.

Now it depends on if he was talking about the overview of the larger structures, or the smaller structures within the core itself.
 
Well for a tiny morsel of thought, I believe somewhat organic looking structures in CPUs sometimes point to a hand-tuned transistor design, while cleaner symmetric looking ones are auto-placed by computer programs.

I haven't seen it nor do I know if this is always or mostly always true, just thinking out loud, so put no weight into this.

Looking forward to the thread, hopefully it's less confusing than this one :P
This thing looks like it was laid out by hand. Dunno...
 
This thing looks like it was laid out by hand. Dunno...

Nah, those are just things like SRAM, registers and other standard components. Intel (and Apple) seem to be the only folks laying out CPUs by hand these days.

Edit: Anyway, I should save this for the CPU thread.
 
Nah, those are just things like SRAM, registers and other standard components. Intel (and Apple) seem to be the only folks laying out CPUs by hand these days.
Certainly possible. No idea. I think it looks too organized to be laid out by a computer, but I obviously don't know.
 
Certainly possible. No idea. I think it looks too organized to be laid out by a computer, but I obviously don't know.

I am no expert on VLSI design, but I'd have thought that a procedurally generated (i.e., not laid out by hand) IC would be the one that's more organized.
 