WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

I'd love for this to be true, but from what everyone has said, I think it's highly unlikely.

Still just talking about the OS improvement: Nintendo Everything and Polygon measured the speeds and found that, on average, it was 44% faster.

On the power draw: I know nothing about tech or whatever, I just remembered this from the Criterion interview and thought it might be relevant:

"Wii U is a good piece of hardware, it punches above its weight. For the power consumption it delivers in terms of raw wattage it's pretty incredible. Getting to that though, actually being able to use the tools from Nintendo to leverage that, was easily the hardest part."

I know nothing about tech or power draw, but you guys are saying the Wii U draws around 30 watts at idle and only a few watts more when playing a game like NFS? So if there was a clock jump, how much more power would it draw, if playing a game only adds a few watts?
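(For what it's worth, a rough back-of-envelope, with numbers assumed purely for illustration: dynamic power scales roughly with capacitance x voltage squared x frequency, so a clock bump at the same voltage grows power roughly linearly, and any voltage bump on top of that grows it quadratically.)

    # Rough dynamic-power scaling: P ~ C * V^2 * f.
    # The 5W CPU share of the low-30s-watt in-game draw is an assumption, not a measurement.
    def scaled_power(base_watts, freq_ratio, volt_ratio=1.0):
        return base_watts * freq_ratio * volt_ratio ** 2

    cpu_watts = 5.0
    print(scaled_power(cpu_watts, 3.24 / 1.24))       # ~13W at the same voltage
    print(scaled_power(cpu_watts, 3.24 / 1.24, 1.2))  # ~19W with a hypothetical 20% voltage bump

Even on those assumptions, the rumored jump would add far more than the few watts separating idle from gameplay.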

Weren't the rumours that Nintendo purposely underclocked the Wii U CPU prior to E3 to prevent overheating? Digital Foundry said that the Wii U is incredibly power efficient and runs cooler than most laptops. Perhaps Nintendo tested the heat tolerance of that massive heatsink and airflow system, found it could handle more, and took the handcuffs off the CPU a bit? Again, I know nothing about tech, just wondering.
 
I'll admit something. It's probably just my imagination, but ever since the update my Wii U's fan has been louder than my PC fans. Probably my imagination, or my PC's fans have gotten quieter.

I was going to say this, but I thought people would jump on me, so I'm glad I'm not alone on this.

I know you can tweak fan speed in computers, so there is a chance they increased the fan speed. I recall the console being really quiet, and after the update it's a little louder.
 
I understand portables, but can someone give me a reason why you would underclock a home console CPU?

To get the "green" consumer by telling them how little power the system draws. That's the only reason I could honestly come up with.
 
Five points to the blog site which does *not* repost this bullshit.
 
I said it at the start, but I lack technical knowledge. 3.24 / 3 = 1.08, so that's an underclock. System stability???
No, that's even less plausible. Furthermore, you never add the clocks of the individual cores to get the total clock speed. If the CPU runs at 1.24GHz, then each core runs at that speed.

If Nintendo downclocked it from 1.24GHz to 1.08GHz, then all the games released until now would suffer serious framerate problems, or even graphical bugs.
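(In code form, just to make the correction concrete, using the numbers from the posts above:)

    # Core clocks don't add up; every core runs at the full clock.
    cores = 3
    per_core_ghz = 1.24            # Wii U mode clock, each core
    misread_ghz = 3.24 / cores     # 1.08 -> the "underclock" misreading
    print(per_core_ghz, round(misread_ghz, 2))

A 3.24GHz rumor would mean 3.24GHz per core, not 1.08GHz each.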
 
Not that I believe this at all, but one of the arguments against it is heat. Well, the CPU is absolutely minute, so couldn't any heat increase be easily dealt with by the heatsink?
 
I asked this in the CPU thread as well, but this one is more active..

Could somebody refresh my memory....

I thought the CPU (and GPU, for that matter) clock rates were extracted while the Wii U was running in Wii mode. They had to do additional hackery to unlock the other two cores, if I recall correctly, but have not achieved any of that in Wii U mode as of yet. fail0verflow mentioned being unable to set/change the multiplier...

So how is everyone so cocksure that 1.24GHz is the real clock speed? Was there no chance it was running underclocked in Wii mode? Maybe there was DEFINITIVE evidence that the 1.24 number MUST be correct, but my memory is hazy..

Thx.
 
I asked this in the CPU thread as well, but this one is more active..

Could somebody refresh my memory....

I thought the CPU (and GPU, for that matter) clock rates were extracted while the Wii U was running in Wii mode. They had to do additional hackery to unlock the other two cores, if I recall correctly, but have not achieved any of that in Wii U mode as of yet. fail0verflow mentioned being unable to set/change the multiplier...

So how is everyone so cocksure that 1.24GHz is the real clock speed? Was there no chance it was running underclocked in Wii mode? Maybe there was DEFINITIVE evidence that the 1.24 number MUST be correct, but my memory is hazy..

Thx.

The clock speeds in Wii mode are Wii clock speeds. 1.24GHz is what it is outside of that.
 
Not that I believe this at all, but one of the arguments against it is heat. Well, the CPU is absolutely minute, so couldn't any heat increase be easily dealt with by the heatsink?

The heat argument only comes into play once you've got over the first hurdle of getting a CPU like that, with such a short pipeline, working at that kind of speed.

It wasn't so long ago that people were impressed that Nintendo got the CPU up to the speed it did, all things considered. Now we're really going to play along with the idea that it can pull this off as well?

This is a bad day for thread lurking...
 
I asked this in the CPU thread as well, but this one is more active..

Could somebody refresh my memory....

I thought the CPU (and GPU, for that matter) clock rates were extracted while the Wii U was running in Wii mode. They had to do additional hackery to unlock the other two cores, if I recall correctly, but have not achieved any of that in Wii U mode as of yet. fail0verflow mentioned being unable to set/change the multiplier...

So how is everyone so cocksure that 1.24GHz is the real clock speed? Was there no chance it was running underclocked in Wii mode? Maybe there was DEFINITIVE evidence that the 1.24 number MUST be correct, but my memory is hazy..

Thx.
fail0verflow has a native mode hack; the clock speeds marcan made public are Wii U mode clocks. In Wii mode, the CPU is running at 729MHz (243 * 3).
 
fail0verflow has a native mode hack; the clock speeds marcan made public are Wii U mode clocks. In Wii mode, the CPU is running at 729MHz (243 * 3).

Okay, thanks. But if it's a 5x multiplier in Wii U mode, 1.24GHz is slightly too high. Is the "base" we're calculating from higher in Wii U mode? I'm sure all this was discussed to death; sorry...
 
But what if the Wii U was originally designed to have higher clocks?
Looks like it's the other way around: the system was at some point supposed to run at 1GHz/400MHz, but Nintendo decided to bump the clock speeds early on.


Okay, thanks. But if it's a 5x multiplier in Wii U mode, 1.24GHz is slightly too high. Is the "base" we're calculating from higher in Wii U mode? I'm sure all this was discussed to death; sorry...
It's 243 * 3 (729MHz) in legacy mode and 248 * 5 (1240MHz) in native mode. Why the base clock is 5MHz higher in native mode, I have no idea.
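(Spelled out, since this also answers the "slightly too high" question above:)

    # clock = base clock * multiplier
    legacy_mhz = 243 * 3     # 729MHz in Wii mode
    native_mhz = 248 * 5     # 1240MHz in Wii U mode
    assert 243 * 5 == 1215   # the Wii-mode base with a 5x multiplier falls short
    assert native_mhz == 1240

So 1.24GHz only looks too high for a 5x multiplier if you assume the Wii-mode base; the native-mode base is 5MHz higher.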
 
The heat argument only comes into play once you've got over the first hurdle of getting a CPU like that, with such a short pipeline, working at that kind of speed.

It wasn't so long ago that people were impressed that Nintendo got the CPU up to the speed it did, all things considered. Now we're really going to play along with the idea that it can pull this off as well?

This is a bad day for thread lurking...

I was never actually particularly surprised at Espresso running at 1.24GHz, seeing as the 750CL could run at 1.1GHz and the process has had a hefty shrink. Yes, I agree that over 3GHz is ridiculous (I was just questioning whether heat was the barrier it was claimed to be), but it wouldn't surprise me if it could get close to 2GHz.
 
I don't wish to backtrack or bring this up again, but I was very much in the CG camp because it was more fun.

Lol, hahaha. =P
It's OK, I was in the camp that thought Sony was going to put the "Cell" (more the Cell architecture than the 1 PPU + 6 SPU Cell itself) on the PS4 SoC. To be honest, other than cost, it was plausible =[.


Fuck, I hope Kotaku posts this. I really do. It would be hilarious if they did.
 
Nintendo will surely announce an accelerator board at E3

dat extra CPU/FPU

LMAO! Thanks for the hearty laugh this morning!
 
I still laugh at people who thought that Joakim couldn't possibly have been entirely CG.


Some people have no concept of how powerful GPUs are today.

Konami making tech that surpasses Avatar and other CGI-intensive movies? Yeah, no.

The lighting was too spot-on, the "animation" was too real, and there were so many dead giveaways.
 
He should have gone for the much more realistic 2GHz; instead he reached for the stars with 3.0GHz. Never go full GameFAQs.

Heh, maybe the rumor reporter fudged the original rumor and reported it as a 2GHz increase instead of just 2GHz, period...
no
 
Wow. Nintendo fans are so desperate that they're making up this CPU 2-3x overclock bullshit?

Just look at the Wii U's cooling system and ask yourselves.
 
Well, not a single person on THIS site seems to believe it...

I didn't mention anything about Nintendo fanboys on THIS site. =X

Just a quick glance at the comments on Nintendo fansites makes my head spin. They have absolutely no idea, and they're desperate enough to believe ANYTHING.
 
I didn't mention anything about Nintendo fanboys on THIS site. =X

Just a quick glance at the comments on Nintendo fansites makes my head spin. They have absolutely no idea, and they're desperate enough to believe ANYTHING.

That's what's good about NeoGAF: fanboys of all stripes HERE at least usually have SOME level of rational thinking. xD

You're right, a couple of those comments sections are absurd.
 
The GPU being bumped to 800MHz is reasonably possible. Low-end video cards are notorious for being overclockable, even with passive coolers on them. The only problem I have with the idea is that 800MHz is a stretch: very few Mobility Radeons from the line Latte possibly comes from got close to that speed (at stock, of course).
That said, the cooling system on the Wii U is very well made, and because you have everything on one chip, it's a single cooling solution for everything: one big heatsink with a fan on the end.
Now, as for power draw, hmm, that's a tough one. In theory it may not be that much more with the overclock.
In the end, though, it's very likely that this whole story is bollocks. What the hell, though, it's fun to speculate!
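(Putting a very rough number on the power-draw point, with a GPU wattage assumed purely for illustration: at a fixed voltage, dynamic power grows roughly linearly with clock, so a bump from the commonly cited ~550MHz to 800MHz would be around 45% more GPU power.)

    # Back-of-envelope for the rumored GPU bump; 15W is an assumed GPU share, not a measurement.
    gpu_watts = 15.0
    bumped = gpu_watts * (800 / 550)   # ~21.8W at the same voltage
    print(f"extra draw: ~{bumped - gpu_watts:.1f}W")

So roughly 7 extra watts before any voltage increase, which would be noticeable but not outlandish on its own.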
 
The GPU being bumped to 800MHz is reasonably possible. Low-end video cards are notorious for being overclockable, even with passive coolers on them. The only problem I have with the idea is that 800MHz is a stretch: very few Mobility Radeons from the line Latte possibly comes from got close to that speed (at stock, of course).
That said, the cooling system on the Wii U is very well made, and because you have everything on one chip, it's a single cooling solution for everything: one big heatsink with a fan on the end.
Now, as for power draw, hmm, that's a tough one. In theory it may not be that much more with the overclock.
In the end, though, it's very likely that this whole story is bollocks. What the hell, though, it's fun to speculate!

Please stop. The CPU and GPU are under the same aluminum block. It's simply not a good idea to invest brain cells in thinking about the possibility.

So embarrassing....
 
I've noticed more hiccups in every game AND in the Mii Plaza. Also, the only thing that's faster now is booting up games; the rest is still slow as hell. I don't know, man...

That doesn't necessarily mean there's some truth to the rumor, or that it's not possible. But personally I doubt it; what I'm seeing probably has more to do with software than anything else. Anyone who does PC gaming would agree that updated drivers can improve hardware performance.
 
That doesn't necessarily mean there's some truth to the rumor, or that it's not possible. But personally I doubt it; what I'm seeing probably has more to do with software than anything else. Anyone who does PC gaming would agree that updated drivers can improve hardware performance.

Does the Wii U GPU even have a "driver"? I know the 3DS does, but I was under the impression the Wii U didn't.
 
Do people really think that Nintendo would go with a huge clock-speed bump post-launch? Don't you think they would be wary of such a move after MS lost a billion or more on the RROD?
 
Do people really think that Nintendo would go with a huge clock-speed bump post-launch? Don't you think they would be wary of such a move after MS lost a billion or more on the RROD?

No, nobody here believes this clock bump one bit. However, speculation about the possibility of some sort of clock bump at some point is valid. I'm of the opinion that Nintendo considered clock-bumping the Wii (the clock speeds were set in software) but probably abandoned any plans after the GPU melting problems.
 