Rumor: Wii U final specs

With the thing out, are the specs still rumors?

The people who have machines need them working to do reviews, etc. It will take someone like our Chinese friends who opened up a 3DS. The good news is those people have a Wii U; the bad news is it's a retail unit and not a factory floor one, so there's no guarantee they will open it up.
 
That's so contradictory and you know it.

There's not even an official source for the Wii specs, but at some point it becomes fairly clear that the numerous leaks are correct.

Yeah, no, it seems kinda unbalanced though. And I wouldn't think an ARM11 at 268MHz would be eating up as much power as the 3DS seems to use.

I'm not sure I would say unbalanced, but it's certainly different from your average smartphone. Even with its very simple gameplay logic, Gunman Clive was mostly CPU bottlenecked during development before I added some optimisation, despite rendering with 4x AA. Though when I say bottlenecked, it still ran at over 100fps on average at that point.
 
That's an awfully low clockrate for an ARM11

I'd imagine it's a bit more specialized/customized than what we'd find in smartphones, so what it lacks in pure speed, it may make up for in efficiency. It's also coupled with higher-performance/lower-latency RAM, so I'd wager there'd be a lot fewer wasted clock cycles too. That's not to say this makes the 3DS ARM11 an overall better-performing CPU than all those much faster ones found in other devices, but the performance gap likely isn't anywhere near as wide as a direct Hz comparison would suggest.
 
I'm not sure I would say unbalanced, but it's certainly different from your average smartphone. Even with its very simple gameplay logic, Gunman Clive was mostly CPU bottlenecked during development before I added some optimisation, despite rendering with 4x AA. Though when I say bottlenecked, it still ran at over 100fps on average at that point.

Is that 4x AA in 3D mode? Because 3D mode AA would be nice to see in games that aren't Zen Pinball.
 
Well I'm thinking the Wii U (along with the PS4 and XBox3 for that matter) is designed to offload most of its vector grunt work (i.e. physics) to the GPU, which immediately reduces the need for a full SIMD unit in each core.

The interesting thing I realised, though, after writing about the A2 core is that Gekko/Broadway actually have a similar sort of multi-function FPU, in that it can operate on either 64bit scalars, or a 2*32bit SIMD unit called a paired single. Given that this was a customisation designed just for Nintendo, it's not out of the realm of possibility that they've got a custom designed "Dual FPU", as it were, that could handle either two 64bit scalar operations or one 4*32bit vector, sort of like a slimmed down, dual-threaded, out-of-order version of the A2 Quad FPU.

I seem to remember IdeaMan mentioning a good while ago that he heard something about SIMD instructions, maybe he'll be able to shed some light.

Agreed on the offloading. And perhaps a custom FPU as you describe is what's going on there. How does that compare to VMX 128?
 
The 3DS has a really small battery for something that has to drive two separate LCD screens.

Yeah, but it's considerably larger than the batteries in any of the DS models (1300mAh vs. 1050mAh in the DSi XL), and even in sleep mode with Wi-Fi off the battery life isn't even close to touching the DS's sleep mode battery life.
 
The Game Pad battery seems to occupy only about 60% of the free space in its compartment, why on earth could that be?

wii-u-battery-replacement-1_1020_epic.jpg


Pretty surprising if they actually designed in headroom for upgrades to those paltry 3-5 hours.
 
DS BC? Given the DS had an ARM7 for GBA BC, and the GBA had a Z80 for GB BC, it seems to be the way they do things on the handheld front.
Dunno. My memory is a little fuzzy, but apparently, some 3DS applications contain both ARM11 and ARM9 binaries. But the encryption hasn't been cracked yet, so it's not possible to reverse engineer those binaries and tell what they do.
 
So with no optical output on this, is there any other way to get sound without going the HDMI route? I game on a projector, and even though I need to update my surround sound, I wasn't wanting to do it just because of the Wii U needing an HDMI connection!
 
So with no optical output on this, is there any other way to get sound without going the HDMI route? I game on a projector, and even though I need to update my surround sound, I wasn't wanting to do it just because of the Wii U needing an HDMI connection!

HDMI into receiver, HDMI from receiver to projector?
 
Agreed on the offloading. And perhaps a custom FPU as you describe is what's going on there. How does that compare to VMX 128?

As far as the A2 is concerned, it doesn't support the full VMX or VSX instruction sets, but it has its own smaller instruction set (not sure how small, though). If Nintendo were going that sort of route I'm sure the instruction set would be heavily customised (although the same could be said about a dedicated VMX/VSX style SIMD unit, as a console hardly needs decimal floating point support).

The Game Pad battery seems to occupy only about 60% of the free space in its compartment, why on earth could that be?

wii-u-battery-replacement-1_1020_epic.jpg


Pretty surprising if they actually designed in headroom for upgrades to those paltry 3-5 hours.

It's possible that they'd planned on a larger battery, but swapped it out for a smaller one to reduce costs. The good thing, though, is that even if Nintendo doesn't offer larger replacement batteries themselves, there's nothing to stop third parties from doing so.
 
The Game Pad battery seems to occupy only about 60% of the free space in its compartment, why on earth could that be?

wii-u-battery-replacement-1_1020_epic.jpg


Pretty surprising if they actually designed in headroom for upgrades to those paltry 3-5 hours.

Wow yeah.

Absolutely got to be cost saving on a battery there.

On the other hand: 3rd parties to the rescue. Come on boys, toss out a huge effing battery right after launch.
 
Wow yeah.

Absolutely got to be cost saving on a battery there.

On the other hand: 3rd parties to the rescue. Come on boys, toss out a huge effing battery right after launch.
At all. There was a lot of potential to make the hardware cheaper, but Nintendo went with more expensive, more user friendly and elegant solutions. The "Nintendo is cheap" thing is completely ridiculous.
 
Well I'm thinking the Wii U (along with the PS4 and XBox3 for that matter) is designed to offload most of its vector grunt work (i.e. physics) to the GPU, which immediately reduces the need for a full SIMD unit in each core.

The interesting thing I realised, though, after writing about the A2 core is that Gekko/Broadway actually have a similar sort of multi-function FPU, in that it can operate on either 64bit scalars, or a 2*32bit SIMD unit called a paired single. Given that this was a customisation designed just for Nintendo, it's not out of the realm of possibility that they've got a custom designed "Dual FPU", as it were, that could handle either two 64bit scalar operations or one 4*32bit vector, sort of like a slimmed down, dual-threaded, out-of-order version of the A2 Quad FPU.

I seem to remember IdeaMan mentioning a good while ago that he heard something about SIMD instructions, maybe he'll be able to shed some light.

For the SIMD, what I've heard was very laconic, and it came from a different source than the ones most of what I've said on WUST comes from, so I don't know if it's reliable. Basically, I asked someone who should be in a position to know details about the CPU some precise questions on it, and he replied with some kind of PR approach, confirming that everything will be good, improved, etc. Now, whether that was PR talk or a confirmation that the U CPU will have an extensive AltiVec/VMX unit akin to Xenon's, I'm not sure :(

On another note, a little bit of new info on the CPU: what I know for sure, from my reliable sources, is that the CPU is built in a way that some studios managed to gimp/cripple themselves and didn't properly/fully use it for a certain time, before correcting the "problem". It originated from the way they were using it, their coding, their engine, and not from an issue with the CPU itself. I think that one day, if studios are able to talk about this development process over this last year, you'll recognize what I've just said :p I can't be more specific.
 
Yeah, but it's considerably larger than the batteries in any of the DS models (1300mAh vs. 1050mAh in the DSi XL), and even in sleep mode with Wi-Fi off the battery life isn't even close to touching the DS's sleep mode battery life.

I don't know that I'd call an extra 250mAh "considerably larger" when you have a much more powerful SoC, a larger screen, and the powered LCD used for the 3D.
 
At all. There was a lot of potential to make the hardware cheaper, but Nintendo went with more expensive, more user friendly and elegant solutions. The "Nintendo is cheap" thing is completely ridiculous.

I don't think cost saving on a few points = Nintendo is cheap.

It's true that the Nintendo is cheap meme is misguided. Nintendo puts a lot of effort, R&D money, and quality components / materials into places that really matter for what they're going for. People don't get it when they harp on Nintendo being cheap by comparing their hardware to other devices that sacrifice much for the sake of a few points of raw processing and visual power.

Still wouldn't be surprised if a smaller battery is shipped with the pad, though, due to squeezing every last component down to fit the final cost criteria, as a battery can be upgraded later, unlike something built into the device.
 
For the SIMD, what I've heard was very laconic, and it came from a different source than the ones most of what I've said on WUST comes from, so I don't know if it's reliable. Basically, I asked someone who should be in a position to know details about the CPU some precise questions on it, and he replied with some kind of PR approach, confirming that everything will be good, improved, etc. Now, whether that was PR talk or a confirmation that the U CPU will have an extensive AltiVec/VMX unit akin to Xenon's, I'm not sure :(

On another note, a little bit of new info on the CPU: what I know for sure, from my reliable sources, is that the CPU is built in a way that some studios managed to gimp/cripple themselves and didn't properly/fully use it for a certain time, before correcting the "problem". It originated from the way they were using it, their coding, their engine, and not from an issue with the CPU itself. I think that one day, if studios are able to talk about this development process over this last year, you'll recognize what I've just said :p I can't be more specific.
Thanks for the update, IM. That info does go along with the Namco statement that they need to figure out how to work with the Wii U's CPU.
 
Yeah, no, it seems kinda unbalanced though. And I wouldn't think an ARM11 at 268MHz would be eating up as much power as the 3DS seems to use.
It's not that unbalanced when you consider it needs to render in 3D, which nearly doubles the workload on the GPU. But I also think the CPU should indeed be a bit faster, since the GPU seems to have ended up performing better than Nintendo planned (I remember an interview in which Nintendo personnel were surprised when they saw Capcom's RE prototypes). A single ARM11 with a higher clock would have benefited developers more (they could have the OS run on the ARM9 at a higher clockrate).
 
Might be relevant: the Scribblenauts team discuss their time with the Wii U.

Polygon said:
"What was really cool was [Nintendo] was shipping ... new kinds of dev kits — alpha, beta, different versions," Slaczka recalls excitedly. "Nintendo kept on shipping them over and we'd send back the older ones constantly ... They'd send their SDK and try to be as heads up as possible and be like 'this stuff's really early; it could change a lot.' There were times really early on with the dev kits where things would be running at like 10 frames per second. We eventually got it up to 60 frames per second. That just took a lot of time and effort and learning the Wii U's APIs and SDK and stuff like that."
http://www.polygon.com/2012/11/8/3591160/wii-u-nintendo-changed-scribblenauts
 
For the SIMD, what I've heard was very laconic, and it came from a different source than the ones most of what I've said on WUST comes from, so I don't know if it's reliable. Basically, I asked someone who should be in a position to know details about the CPU some precise questions on it, and he replied with some kind of PR approach, confirming that everything will be good, improved, etc. Now, whether that was PR talk or a confirmation that the U CPU will have an extensive AltiVec/VMX unit akin to Xenon's, I'm not sure :(

On another note, a little bit of new info on the CPU: what I know for sure, from my reliable sources, is that the CPU is built in a way that some studios managed to gimp/cripple themselves and didn't properly/fully use it for a certain time, before correcting the "problem". It originated from the way they were using it, their coding, their engine, and not from an issue with the CPU itself. I think that one day, if studios are able to talk about this development process over this last year, you'll recognize what I've just said :p I can't be more specific.

Thanks for the input. On the second point, it would seem to indicate that we're looking at something more complex than three overclocked Broadways, as the Gekko/Broadway architecture is no mystery to most devs (especially Shin'en), unless there was some kind of novel interconnect between the cores.
 
Thanks for the input. On the second point, it would seem to indicate that we're looking at something more complex than three overclocked Broadways, as the Gekko/Broadway architecture is no mystery to most devs (especially Shin'en), unless there was some kind of novel interconnect between the cores.

This is exactly how I've understood that information too (not that it was really needed, because there are other elements, like simply what studios manage to deliver with this system, that contradict the idea of a Broadway 1.01 in a three-core version).

I could add as a nice last nugget that this self-crippling was HUGE; I mean, it wasn't just a 5% reduction of what they were supposed to draw from the CPU, it was far more. So yes, the learning curve on the Wii U is significant: it's a really new configuration, an architecture that requires optimizations, study of the components, familiarization with the SDK, etc.
 
This is exactly how I've understood that information too (not that it was really needed, because there are other elements, like simply what studios manage to deliver with this system, that contradict the idea of a Broadway 1.01 in a three-core version).

I could add as a nice last nugget that this self-crippling was HUGE; I mean, it wasn't just a 5% reduction of what they were supposed to draw from the CPU, it was far more. So yes, the learning curve on the Wii U is significant: it's a really new configuration, an architecture that requires optimizations, study of the components, familiarization with the SDK, etc.
From your insiders and from dev interviews, it looks like the crippling of the CPU and the use of unoptimized code reduced the graphics output to between 1/2 and 1/6th of its first-gen capabilities. That is VERY significant. Is such a massive difference normal for devs working on new hardware?
 