Yeah... Unless that stuff runs on the coprocessor. It's a pretty fair assumption that the browser and home menu won't work when playing online
We've got far more than we had before the wii or 3ds launch
Do we even have any confirmed clockrates on the 3ds yet?
Nope! X3
I almost leaked it a few pages back...
2 x 268mhz ARM11 processors.
The GPU is also clocked the same.
There's also a 134mhz DSP in there.
2 x 268mhz ARM11 processors.
268mhz PICA200
134mhz DSP
With the thing out, are the specs still rumors?
You tease you.
That's an awfully low clockrate for an ARM11
source?
There is no source. Nintendo never released specs. But those numbers are 100% accurate.
That's so contradicting and you know it.
It's a pretty awesome clock rate for the GPU though
Yeah no, it seems kinda unbalanced though. And I wouldn't think an ARM11 at 268mhz would be eating up as much power as the 3ds seems to use.
There's not even an official source for the Wii specs, but at some point it becomes fairly clear that the numerous leaks are correct
Gotta save the battery.
I guess, I was just confirming them to be true. Those are the specs Nintendo lists in their 3ds developer overview documents.
I'm not sure I would say unbalanced, but it's certainly different from your average smartphone. Even with its very simple gameplay logic, Gunman Clive was mostly CPU bottlenecked during development before I added some optimisation, despite rendering with 4x AA. Though when I say bottlenecked, it still ran at over 100fps on average at that point.
Well I'm thinking the Wii U (along with the PS4 and XBox3 for that matter) is designed to offload most of its vector grunt work (i.e. physics) to the GPU, which immediately reduces the need for a full SIMD unit in each core.
The interesting thing I realised, though, after writing about the A2 core is that Gekko/Broadway actually have a similar sort of multi-function FPU, in that it can operate on either 64bit scalars, or a 2*32bit SIMD unit called a paired single. Given that this was a customisation designed just for Nintendo, it's not out of the realm of possibility that they've got a custom designed "Dual FPU", as it were, that could handle either two 64bit scalar operations or one 4*32bit vector, sort of like a slimmed down, dual-threaded, out-of-order version of the A2 Quad FPU.
I seem to remember IdeaMan mentioning a good while ago that he heard something about SIMD instructions, maybe he'll be able to shed some light.
The 3ds has a really small battery for something that has to drive 2 separate LCD screens.
Is that 4x AA in 3d mode? Because 3d mode AA would be nice to see in games that aren't zen pinball
There's supposedly an ARM9 as well, but nobody really knows what it does.
Dunno. My memory is a little fuzzy, but apparently, some 3DS applications contain both ARM11 and ARM9 binaries. But the encryption hasn't been cracked yet, so it's not possible to reverse engineer those binaries and tell what they do.

DS BC? Given the DS had an ARM7 for GBA BC, and the GBA had a Z80 for GB BC, it seems to be the way they do things on the handheld front.
So with no optical output on this, is there any other way to get sound without going the HDMI route? I game on a projector, and even though I need to update my surround sound, I wasn't wanting to do it just because of the Wii U needing an HDMI connection!
Agreed on the offloading. And perhaps a custom FPU as you describe is what's going on there. How does that compare to VMX 128?
The Game Pad battery seems to occupy only about 60% of the free space in its compartment, why on earth could that be?
Pretty surprising if they actually designed in headroom for upgrades to those half-paltry 3-5 hours.
Wow yeah.
Absolutely got to be cost saving on a battery there.
On the other hand: 3rd parties to the rescue. Come on boys, toss out a huge effing battery right after launch.
Yeah, but it's considerably larger than the batteries in any of the DS models (1300mAh vs 1050mAh on the DSi XL), and even in sleep mode with wi-fi off the battery life isn't even close to touching the DS sleep mode battery life.
At all. There was a lot of potential to make the hardware cheaper, but Nintendo went with more expensive, more user friendly and elegant solutions. The "Nintendo is cheap" thing is completely ridiculous.
Thanks for the update IM. That info goes along with the Namco statement that they need to figure out how to work with the Wii U's CPU.
It's not that unbalanced when you consider it needs to render in 3D, which nearly doubles the workload on the GPU. But I also think the CPU should indeed be a bit faster, since the GPU seems to have ended up performing better than Nintendo planned (I remember an interview in which Nintendo personnel were surprised when they saw Capcom's RE prototypes). A single ARM11 with a higher clock would have benefited developers better (they could have the OS run on the ARM9 with a higher clockrate).
Might be relevant: the Scribblenauts team discuss their time with the Wii U.
http://www.polygon.com/2012/11/8/3591160/wii-u-nintendo-changed-scribblenauts

Polygon said: "What was really cool was [Nintendo] was shipping ... new kinds of dev kits alpha, beta, different versions," Slaczka recalls excitedly. "Nintendo kept on shipping them over and we'd send back the older ones constantly ... They'd send their SDK and try to be as heads up as possible and be like 'this stuff's really early; it could change a lot.' There were times really early on with the dev kits where things would be running at like 10 frames per second. We eventually got it up to 60 frames per second. That just took a lot of time and effort and learning the Wii U's APIs and SDK and stuff like that."
For the SIMD, what I've heard was very laconic, and it came from a different source than the ones most of what I've said on WUST comes from, so I don't know if it's reliable. Basically, I asked someone who should be in a position to know details on the CPU some precise questions about it, and he replied to me with some kind of a PR approach, confirming that everything will be good, improved, etc. Now, whether this was PR talk or a confirmation that the UCPU will have an extensive AltiVec/VMX unit akin to Xenon, I'm not sure.

On another note, a little bit of new info on the CPU: what I know for sure, from my reliable sources, is that the CPU is built in a way that some studios managed to gimp/cripple themselves and didn't properly/fully use it for a certain time, before correcting the "problem". It originated from the way they were using it, their coding, their engine, and not from an issue with the CPU itself. I think that one day, if studios are able to talk about this development process during this last year, you'll recognize what I've just said. I can't be more specific.
Thanks for the input. On the second point, it would seem to indicate that we're looking at something more complex than three overclocked Broadways, as the Gekko/Broadway architecture is no mystery to most devs (especially Shin'en), unless there was some kind of novel interconnect between the cores.
From your insiders and from dev interviews, it looks like the crippling of the CPU and using unoptimized code reduced the graphic output to between 1/2 and 1/6th of its first-gen capabilities. That is VERY significant. Is such a massive difference normal for devs working on new hardware?

This is exactly how I've understood this information too (if it was even needed, because there are other elements, like simply what studios can manage to deliver with this system, that contradict the idea of a Broadway 1.01 in a three-core version).
I could add as a nice last nugget that this self-crippling was HUGE; I mean, it wasn't just a reduction of 5% of what they were supposed to draw from the CPU, it was far more. So yes, the learning curve on Wii U is important: it's a real new configuration, an architecture that requires optimizations, the study of the components, familiarization with the SDK, etc.