Wii U Speculation Thread of Brains Beware: Wii U Re-Unveiling At E3 2012

Prediction update time. I think we're all getting to the point where we agree on the specs. This is just a writeup of what I think might explain some of them.

CPU: Based on lherre's comments, the devkit CPU is both of near-final design and some weird asymmetric unit. I'm speculating here that it will be built around one single full-fledged POWER7 core. This core will still be heavily customized compared to the server chip, but will share a similar number of execution units to deliver similarly high performance to the original POWER7. This single core will be accompanied by simpler cores, which I think will be highly simplified POWER7-derived units. These cores will support the instruction set, but will be much simpler to reduce die size and power usage. All cores will support the VMX128 instruction set like the Xbox 360 CPU. All parts of the CPU will barely be recognizable as sharing a common base with the POWER7 server chip. One customization will be that the L3 cache is scrapped in favour of a larger L1 cache and a larger L2 cache using eDRAM. Customizations will allow a small die size and a high clock speed; rumours indicate the devkit is running at 3.2+ GHz. I think the server POWER7 is a very bad indication of what its power usage will be at that speed, however.
Prediction: 1 core w/ 4-way SMT and ~2 MB L2 eDRAM cache, 2 cores w/ 2-way SMT and ~1 MB L2 eDRAM cache. Cores share a common base with the server POWER7 and will be clocked at ~3 GHz. ~30W TDP on a 45nm process, though 32nm shouldn't be out of the question (despite IBM's June press release).

GPU: The RV770LE sets a considerable number of constraints. The final GPU will feature 640 SPUs (if the final GPU were going to have 800 SPUs, Nintendo would have used an RV770 Pro) at a 600 MHz clock speed. The final unit will probably be very similar in feature set to an RV740. That's probably only because the raw numbers make sense for a small, efficient GPU; the GPU itself will feature a completely different design with several magic Nintendo tweaks we won't know the details of until some time into the Wii U's life. 32 MB of eDRAM, serving the same role as the Xbox 360's, is pretty much a given.
Prediction: 640 SPUs (probably VLIW5, though VLIW4 isn't ruled out yet I think), 32 TMUs, 16 ROPs, ~600 MHz clock speed, 32 MB eDRAM intelligent framebuffer. Completely new design, although similar in feature set to an RV740/RV770LE. ~35W TDP on a 40nm process.
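For what it's worth, a quick back-of-envelope on the shader throughput of that speculated configuration (assuming the usual counting of one multiply-add, i.e. 2 FLOPs, per ALU per cycle):

```python
# Rough shader throughput for the speculated config
# (assumptions: 640 ALUs, 2 FLOPs per ALU per cycle, 600 MHz clock).
alus = 640
flops_per_cycle = 2      # one multiply-add counts as 2 floating-point ops
clock_ghz = 0.6
print(alus * flops_per_cycle * clock_ghz, "GFLOPS")  # -> 768.0 GFLOPS
```

That's in the same ballpark as the RV770LE sitting in the devkits, which fits the "small, efficient GPU" framing above.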

Memory: Assuming the devkit uses GDDR3 with the GPU on a 256-bit bus to that memory, we can only say that that's probably not final. The 256-bit bus does tell us that the devkit uses a number of memory chips that is a power of 2. This suggests that there won't be 1.5GB of memory, which also makes less sense for other reasons. If they end up using a 128-bit bus for the GPU (which is very likely), GDDR5 is an obvious choice, but I won't rule out XDR2 just yet because it may allow fewer memory chips and higher performance at lower power usage.
Prediction: 1GB unified. Could be anything.
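To make the chip-count reasoning above concrete, here's a minimal sketch assuming standard x32 GDDR3 devices (32 data pins per chip) and the common 512 Mbit / 1 Gbit densities:

```python
# Why a 256-bit bus points to a power-of-two chip count
# (assumption: x32 GDDR3 chips, i.e. 32 data pins per device).
bus_width_bits = 256
pins_per_chip = 32
chips = bus_width_bits // pins_per_chip          # -> 8 chips
for density_mbit in (512, 1024):                 # common GDDR3 densities
    total_mb = chips * density_mbit // 8
    print(f"{chips} x {density_mbit} Mbit = {total_mb} MB")
# 8 x 512 Mbit = 512 MB
# 8 x 1024 Mbit = 1024 MB
```

Getting to 1.5GB on that bus would need mixed densities or a chip count that doesn't divide the bus evenly, which is why it looks unlikely from this angle.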

Other stuff: One mystery is that lherre has indicated the final unit will have more than 1GB, however slight the difference might be. 1.5GB unified is a weird number that is problematic for other parts of the system design, and 2GB is probably too much. I think Nintendo will use a big chunk (at least 32MB, maybe even up to 256MB) of IBM eDRAM, 1T-SRAM or a similar type of low-latency memory in the Wii U to achieve that. The reason I think they will do so is that the main memory will be high latency, even if it's GDDR3. A second reason is that this allows the Wii U design to be approached as a sort of superpowered Wii: a powerful single-core out-of-order Power-based CPU (with additional simpler cores), a small but quickly accessible chunk of primary memory, a large chunk of video memory, and a GPU with super-fast memory on board. I don't know whether this will be very advantageous, but I think it could definitely help speed up development for simpler games and avoid the headaches developers had with the 360 and PS3.

Another thing we know nothing about is tessellation. I'm thinking it will definitely be there, but I'm wondering how capable it will be. I'm guessing that, in principle, the Wii U GPU could be at least as capable as a Barts or Southern Islands chip of similar specs.

bgassassin's got me convinced about the BC stuff. There'll probably be some ARM9/ARM11 chip doing various system stuff. I'm guessing total system TDP with CPU, GPU, motherboard, I/O chip, disc drive and other functionality will be around 80W.
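And just to show where that ~80W comes from, it's basically the per-component guesses above plus some slack (the ~15W for everything else is purely my own placeholder):

```python
# Rough system power budget from the predictions above
# (the 15 W "everything else" figure is an assumed placeholder, not a rumour).
cpu_tdp_w = 30   # speculated CPU TDP
gpu_tdp_w = 35   # speculated GPU TDP
rest_w = 15      # motherboard, I/O chip, disc drive, etc. (assumption)
print(cpu_tdp_w + gpu_tdp_w + rest_w, "W")  # -> 80 W
```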
 
I would hope for VSX instead of VMX or VMX128. VSX has very impressive single and double precision performance characteristics.
Weren't there people saying that double precision isn't relevant at all for gaming? Besides, VMX128 would mean it shares the same instructions as the 360 and maybe even the 720. VSX instructions are completely different, I think I read somewhere.
 
I think the most we will get is DS-type stuff. I think the potential for innovation will come from multiplayer. What I wonder is: will all games support tablet-only play? If so, will the gameplay change when you only play on the tablet?



According to the Darksiders devs, tablet-only play took minutes to implement.
I think that, for single player, it'll work out just fine (except in some cases like the Rhythm Game at E3).
But for the majority of games, it should be OK. I'm betting most tablet experiences will be pretty.... unnecessary.

Also, in case I'm banned before I get home from work for some unforeseen reason:
I love you all. Hit me up on Steam.
:D
 
It's three, three Drakes in one!


Does anyone else want to see them do more with the sensor bar, instead of its current form of just 2 LEDs on a stick? The console obviously still needs it, so it has to exist in some form. I know the singular Wii U tablet has a camera and mic, but supposedly not everybody will be using one. I'd like it if they put maybe a higher quality camera (than on the tablet), and a Wii Speak-style mic into the sensor bar, just build it into an all-in-one chat solution for the console.
 
Weren't there people saying that double precision isn't relevant at all for gaming? Besides, VMX128 would mean it shares the same instructions as the 360 and maybe even the 720. VSX instructions are completely different, I think I read somewhere.
Those differences only really matter if someone codes in assembly; otherwise only the compiler really cares. Also, while double precision doesn't really matter, VSX is the fastest double and single precision SIMD unit on the market right now as far as I know. It could be heavily tweaked similarly to VMX128, as VSX also has integer and decimal FP features that are quite useless here. They could probably remove those and bump the number of registers instead.
 
It's three, three Drakes in one!


Does anyone else want to see them do more with the sensor bar, instead of its current form of just 2 LEDs on a stick? The console obviously still needs it, so it has to exist in some form. I know the singular Wii U tablet has a camera and mic, but supposedly not everybody will be using one. I'd like it if they put maybe a higher quality camera (than on the tablet), and a Wii Speak-style mic into the sensor bar, just build it into an all-in-one chat solution for the console.

The camera area on the prototype padlet actually made me think there may eventually be some sort of sensor built into it -- perhaps to facilitate TV-free Wii gaming.

I'd like there to be some kind of surprise new functionality though, whether it's sensor-based or not. While Nintendo are introducing a new controller in the Wii-U controller, not updating the base motion controller for a console whose predecessor defined an entire motion-control generation seems distinctly, well... Sony / DualShock.

I'd definitely be in favour of outward-facing cam(s) on the Wii-U padlet, because it would enable AR, switching viewpoints in video chats, using it as a camera, etc.
 
I'm the exact opposite. I hope Nintendo support this heavily, as it would give me much more opportunity to play at home when the TV is being used.

It seems to clash with using the controller screen for 'innovation' though. I'm curious how Nintendo will deal with that.

Probably just 'dumb down' the control options for when it's on a controller.
 
The camera area on the prototype padlet actually made me think there may eventually be some sort of sensor built into it -- perhaps to facilitate TV-free Wii gaming.

Huh? It's already known there's a sensor bar built into the Wii U tablet for just that purpose. (Not to mention other tablet/remote interaction.)
 
Huh? It's already known there's a sensor bar built into the Wii U tablet for just that purpose. (Not to mention other tablet/remote interaction.)

I didn't realise that was confirmed.. I wonder what happens if you're using a remote with the Wii-U and the remote can see both the sensor in the Wii-U padlet and the sensor bar above or below your TV? Remote confusion?
 
I didn't realise that was confirmed.. I wonder what happens if you're using a remote with the Wii-U and the remote can see both the sensor in the Wii-U padlet and the sensor bar above or below your TV? Remote confusion?

Yeah. How do you think the golf game worked in the demo reel? The pad has its own sensor bar.
 
We won't get a 28nm SoC. The CPU is confirmed to be 45nm.

The SoC would theoretically include the GPU cores, eDRAM pool, DSP (hopefully for some kick ass 7.1), and ARM processor to handle I/O, Wifi, and other mundane tasks. The IBM CPU would be its own chip.


I ended up forgetting those don't have a GDDR3 variant. I don't see us being able to properly translate what's in the dev kit into what the final GPU will look like. What I mean by that is your point about it not being a 4870 due to the bus size. That's why I keep my final specs general. Also, the fact that the GPU will have Eyefinity would eliminate all R700s.

Oh, of course. Don't misunderstand - when I say it won't be a 4870, I just mean I don't think they'd use that in a dev kit or as some sort of base when it has GDDR5 on a 256-bit bus. It would just be too much of a performance difference to make any sense.

Also, remember way back before E3, IGN claimed that the 4850 would be the closest consumer card to Wii U's GPU. Now, their little build-a-bear project was inane, but they must have gotten that information from somewhere. The 4850 hits the rumored 1 TFlop right on the head.
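The arithmetic behind that does check out, at least on the usual 2-FLOPs-per-ALU-per-cycle counting:

```python
# HD 4850 theoretical throughput: 800 ALUs, 625 MHz core clock,
# 2 FLOPs per ALU per cycle (one multiply-add).
print(800 * 2 * 0.625, "GFLOPS")  # -> 1000.0 GFLOPS, i.e. 1 TFLOP
```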
 
Those differences only really matter if someone codes in assembly; otherwise only the compiler really cares. Also, while double precision doesn't really matter, VSX is the fastest double and single precision SIMD unit on the market right now as far as I know. It could be heavily tweaked similarly to VMX128, as VSX also has integer and decimal FP features that are quite useless here. They could probably remove those and bump the number of registers instead.
Maybe I should have said "the CPU will have fast vector processing instructions", because VMX128 is quite specific and 2005 tech. I'm thinking they'll definitely need it because both 360 and PS3 are vector and floating point beasts.

IBM aggressively cut down on redundant VMX128 features (it's not 100% compatible with AltiVec) in the Xbox 360 to decrease chip size, so I think they'll be doing the same with the Wii U. The VSX units on POWER7 appear to be very large, so there's not going to be much left.
bgassassin said:
PS360 can run BF3. Wii U just probably wouldn't run it at max with 1080p and 60FPS.
Hmm:
[Benchmark chart: Battlefield 3 on AMD GPUs, 1920 resolution, high settings]

Look at the Radeon HD 5750. We seem to have established that the Wii U GPU will likely have a configuration of 640 SPUs, 32 TMUs, and 16 ROPs @ 500-700 MHz. The Radeon HD 5750 has a configuration of 720 SPUs, 36 TMUs and 16 ROPs @ 700 MHz, and it's running Battlefield 3 at 1080p with 30+ fps. This was using beta drivers as well. If you consider that EA optimized the console versions enough to have the game run on the old GeForce 7xxx tech used in the PS3, I'm thinking it's certain the Wii U can run Battlefield 3 at 1080p30, and maybe even 1080p60, at this 'high quality' fidelity, provided the GPU is like what we've been deducing in this thread.
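A rough side-by-side of the raw ALU throughput behind that comparison (same 2-FLOPs-per-ALU-per-cycle assumption as before; it deliberately ignores TMU/ROP/bandwidth differences):

```python
# Speculated Wii U GPU vs Radeon HD 5750, theoretical shader throughput only.
wiiu_gflops = 640 * 2 * 0.6     # speculated: 640 ALUs @ 600 MHz
hd5750_gflops = 720 * 2 * 0.7   # HD 5750: 720 ALUs @ 700 MHz
print(f"{wiiu_gflops:.0f} vs {hd5750_gflops:.0f} GFLOPS "
      f"({wiiu_gflops / hd5750_gflops:.0%} of the 5750)")
# -> 768 vs 1008 GFLOPS (76% of the 5750)
```

So the speculated config has roughly three quarters of the 5750's raw shader throughput, which is why 1080p30 looks plausible while 1080p60 would lean heavily on console-level optimization.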
 
If PS4/Xbox 720 can run BF3 at PC max settings at 1080p/60fps and Wii U does 720p/60fps, then I'll be pleased with that. It's close enough that Wii U won't miss out on any games from third-party devs. Or so I hope....

A scenario I can see that shouldn't draw any complaints. Though some will still find something to nitpick.

Prediction update time.

Yeah, I think we're all settling on a similar idea of what the final will look like. The only thing I would say in response is that I still have no reason to believe they would use XDR2. And based on the density of IBM's eDRAM, 256MB would be a ~512 mm² die of memory alone. Back when I was thinking of larger amounts of 1T-SRAM, I wasn't giving enough acknowledgement to the density.
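Just to make that density point explicit, taking the ~512 mm² figure at face value (it's an estimate, not a confirmed number):

```python
# Implied area per MB if 256 MB of IBM eDRAM really is ~512 mm^2
# (both numbers are the estimate above taken at face value).
area_mm2 = 512
capacity_mb = 256
per_mb = area_mm2 / capacity_mb
print(per_mb, "mm^2 per MB")            # -> 2.0
print(32 * per_mb, "mm^2 for 32 MB")    # -> 64.0
```

At that rate a 32 MB pool is a big but plausible slice of a GPU die, while 256 MB clearly isn't.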

Also, in case I'm banned before I get home from work for some unforeseen reason:
I love you all. Hit me up on Steam.
:D

LOL.

It's three, three Drakes in one!


Does anyone else want to see them do more with the sensor bar, instead of its current form of just 2 LEDs on a stick? The console obviously still needs it, so it has to exist in some form. I know the singular Wii U tablet has a camera and mic, but supposedly not everybody will be using one. I'd like it if they put maybe a higher quality camera (than on the tablet), and a Wii Speak-style mic into the sensor bar, just build it into an all-in-one chat solution for the console.

And we shall call it... Kinect. :P

The SoC would theoretically include the GPU cores, eDRAM pool, DSP (hopefully for some kick ass 7.1), and ARM processor to handle I/O, Wifi, and other mundane tasks. The IBM CPU would be its own chip.

That's why MCM would be a better label. SoC implies the GPU and CPU are on one chip.

Oh, of course. Don't misunderstand - when I say it won't be a 4870, I just mean I don't think they'd use that in a dev kit or as some sort of base when it has GDDR5 on a 256-bit bus. It would just be too much of a performance difference to make any sense.

Also, remember way back before E3, IGN claimed that the 4850 would be the closest consumer card to Wii U's GPU. Now, their little build-a-bear project was inane, but they must have gotten that information from somewhere. The 4850 hits the rumored 1 TFlop right on the head.

Ok, though I would suggest a 128-bit bus plus eDRAM, with the bandwidth-intensive components moved to the eDRAM die, might be equal to that. And yeah, the IGN article was floating around in my head as well when typing that stuff the other day. That's what has kept me from locking into 640 or 800 ALUs and instead sticking to that range.
 
And based on the density of IBM's eDRAM, 256MB would be a ~512 mm² die of memory alone. Back when I was thinking of larger amounts of 1T-SRAM, I wasn't giving enough acknowledgement to the density.
For the same amount of chip area as in the GameCube, Nintendo can fit 326MB of regular 1T-SRAM on 28nm in there (isn't technology great?). So I'm thinking it should be possible to go with 128 or 192MB.

I feel like such a geek tonight.
 
Spec predictions...

Yup, we all seem to be coming closer together in our expectations. Although, I do have a few notes/preferences.

CPU: I can see a 3.6 GHz final clock. We haven't heard of anything amiss on IBM's end. Seems like they got their shit together with this chip. I am going to stick with the rumors of all cores being 2-way SMT. I think they'll be identical except for the different amounts of L2 cache. And I think they'll keep VSX instead of VMX128. It seems to be an integral part of POWER7, and, as wsippel has shared with us, it shouldn't be much of an issue porting the code over from 360/PS3 regardless.

I'm very confident on the 3MB L2 cache. This is exactly 3 times the amount in the 360, and IBM's eDRAM process enables exactly 3 times more cache on a die than the SRAM the 360 was designed with. They may have switched that to eDRAM in the latest 360s, as they said they would; I don't know.

GPU: I do think it will have a lot in common with the R700 family. VLIW5... the architecture of the chip will remain the same. I do think it will have improved tessellation (although perhaps not quite as good as Southern Islands), since Nintendo even added that into the 3DS. And Eyefinity support, of course. I agree that they are probably targeting a 600 MHz clock and 640 SPUs, although at 28nm (which I believe they are gunning for; AMD called it a "modern" chip and I'll take that optimistically) 800 SPUs might be doable.

Memory: I think they're planning on 1 GB GDDR5. Nintendo tends to go for seemingly small pools of fast RAM, although I'm hopeful they decide to throw in an even 2 GB due to pressure from developers wanting easier ports of PS4/720 games.

Other Stuff: I'm not buying the 256 MB eDRAM idea. That sounds insanely pricey. Plus, with a nice fat L2 cache and the 32 MB eDRAM on the GPU, we're already looking at a powerful and efficient system. The Wii was designed with low latency as a priority, but that does not necessarily mean they will repeat that with Wii U, if only because there is now much faster and cheaper RAM. Remember, Nintendo actually did not increase the amount of 1T-SRAM from the GCN to the Wii. They just slapped a stick of GDDR3 on there instead. Nintendo are probably trying to keep the cost of that component roughly the same. 32 MB isn't that much more than 27 MB...
 
GPU: I do think it will have a lot in common with the R700 family. VLIW5... the architecture of the chip will remain the same. I do think it will have improved tessellation (although perhaps not quite as good as Southern Islands), since Nintendo even added that into the 3DS.

Are there any 3DS games that actually make use of tessellation?

It seems the screen resolution is too low for it to even be noticeable.
 
Are there any 3DS games that actually make use of tessellation?

It seems the screen resolution is too low for it to even be noticeable.

It's billed as Geo Shader/polygon subdivision on the 3DS' PICA200 GPU. I don't know of any specific examples, but I'm sure there's a benefit to it being included regardless of the screen being small.
 
Are there any 3DS games that actually make use of tessellation?

It seems the screen resolution is too low for it to even be noticeable.
Since the PICA200 is targeted at embedded systems, DMP intended tessellation (and procedural textures) as a means to reduce VRAM consumption and bandwidth, not so much to actually improve graphics. Surprisingly, it seems no game uses either feature, so maybe the 3DS GPU lacks the silicon (Maestro is highly modular after all), the SDK doesn't support them (yet), or they plain and simply suck balls.
 
It's billed as Geo Shader/polygon subdivision on the 3DS' PICA200 GPU. I don't know of any specific examples, but I'm sure there's a benefit to it being included regardless of the screen being small.

a benefit one can't detect, perceive, or otherwise see?

I guess there is some infinitely tiny benefit to just knowing it's there. Like going hunting without any ammo: true, it's completely useless, but just holding that (unloaded) rifle is, I'm sure, somewhat comforting in the ~10 seconds before the bear eviscerates you.
 
Since the PICA200 is targeted at embedded systems, DMP intended tessellation (and procedural textures) as a means to reduce VRAM consumption and bandwidth, not so much to actually improve graphics. Surprisingly, it seems no game uses either feature, so maybe the 3DS GPU lacks the silicon (Maestro is highly modular after all), the SDK doesn't support them (yet), or they plain and simply suck balls.

Based on most comments about most GPUs, I'd guess the latter.
 
Based on most comments about most GPUs, I'd guess the latter.
But if they suck, wouldn't Nintendo have known and designed a SoC without those features? Or did you mean the SDK? Now that wouldn't surprise me at all, after all the latest rumors about features in more recent SDKs...
 
Yeah. How do you think the golf game worked in the demo reel? The pad has its own sensor bar.

But all we saw of that demo was a woman swinging. We never saw a golf club head over the ball where she moved it slowly into position. It just zooms across the screen as she swipes, something that's best served with motion detection.

I'm not convinced it's of any value and wouldn't be surprised to see it disappear.
 
But if they suck, wouldn't Nintendo have known and designed a SoC without those features? Or did you mean the SDK? Now that wouldn't surprise me at all, after all the latest rumors about features in more recent SDKs...

Sometimes you just don't know, or else MS wouldn't have settled on 10MB of eDRAM.
 
VB: You did show a Zelda demo on the Wii U. Will there be a Zelda game on the new console?

RF-A: We showed what Link (the main character of Zelda) might look like in a 1080p environment, and it got people pretty excited.

VB: Have you made any comments about whether the performance will be better than the PS 3 or the Xbox 360?

RF-A: Again, the point is not about a comparison versus our competitors. What we’ve said is it will be 1080p. Check the box on the best graphics capability.

Ok, for better or worse, Reggie set the bar for the WiiU at 1080p.
And we are all trying to figure out what they plan to put in the box to achieve that.
But we may never know.

Next year, Nintendo will have to put up or shut up about this 1080p box being checked.
I doubt third parties will push the console that far, so I assume Reggie was referring to a 1st party Nintendo game with his claim.

So my question is, what launch game (especially in light of a Xbox3 reveal and/or launch) do you think Nintendo will bring out to show off at 1080p? And what first party game would you be most impressed by, running at 1080p, for you to state the WiiU is a beast of a machine?

I expect maybe Pikmin 3 to be shown off at 1080p,
but the console would impress me with a Metroid Prime or Star Fox game.
 
Ok, for better or worse, Reggie set the bar for the WiiU at 1080p.
And we are all trying to figure out what they plan to put in the box to achieve that.
But we may never know.

Next year, Nintendo will have to put up or shut up about this 1080p box being checked.
I doubt third parties will push the console that far, so I assume Reggie was referring to a 1st party Nintendo game with his claim.

So my question is, what launch game (especially in light of a Xbox3 reveal and/or launch) do you think Nintendo will bring out to show off at 1080p? And what first party game would you be most impressed by, running at 1080p, for you to state the WiiU is a beast of a machine?

I expect maybe Pikmin 3 to be shown off at 1080p,
but the console would impress me with a Metroid Prime or Star Fox game.

Reggie had no idea what he was talking about. I wouldn't count on any true 1080p Wii U games being released.
 
So my question is, what launch game (especially in light of a Xbox3 reveal and/or launch) do you think Nintendo will bring out to show off at 1080p? And what first party game would you be most impressed by, running at 1080p, for you to state the WiiU is a beast of a machine?

I expect maybe Pikmin 3 to be shown off at 1080p,
but the console would impress me with a Metroid Prime or Star Fox game.

For me, it'd be a Metroid as a first party. I really enjoy the franchise and think that a Metroid in all its modern GFX glory would blow everyone's socks off. Even the core crowd who typically haven't been Nintendo gamers might be impressed, and I think this is the crowd Nintendo wants to work on.

Other than that, I think the only pants-pooping announcement would really be some sort of 3rd-party exclusive, or a demo of an upcoming multi-platform 3rd-party game pushed to its limits on Wii U to really showcase the difference between current gen and Wii U.
 
Awww man, can't you just play along?

Nope. I'm BurkPoint.

If one is shown, I assume this post will be edited.

I said that I wouldn't count on it, not that there won't be one. If I'm wrong, I'm wrong. I don't edit away things that I'm wrong about. Those edits you attacked me over were due to it being quoted every 3-4 posts.

Gif I linked to 6 months ago was turned into... an interesting porn gif...

I would not call that "interesting".
 
The Zelda demo was 720p without any form of AA. Based on that, Reggie claims that the WiiU can do 1080p. Does not compute.

Does it compute when you add in the fact that the scene was able to be rendered onto two screens at the same time from different angles?
 
Reggie had no idea what he was talking about. I wouldn't count on any true 1080p Wii U games being released.

Who said it had to be True 1080p?

All he did was "check the box" for 1080p. He never specified whether it was going to be upscaled or not (rather, he was never asked to specify). So as long as games come out at 1080p, whether fake or not, he didn't lie.
 