Rumor: Wii U final specs

It's clear it doesn't, but do you think Nintendo will sell the controller at break-even? No, they'll "inflate" the price to make some profit. They'll produce just enough of these for the people who want to buy one, but since you can't have more than two per console they have to balance production and logistics costs and make a moderate profit on it too.

Exactly. So if they sell it for $150 or whatever, that doesn't mean Nintendo would have had $150 extra to spend on the hardware.
 
Maybe he'd like to see a Nintendo game that has lots of power behind it? IIRC, Nintendo games were great before they had a Wii Remote or anything else "gimmicky."
Maybe he realizes that 3rd party devs no longer want to invest much time or money on "exclusive" features or different input methods. Hell, PC games hardly have proper controller support sometimes!

If that's how he feels then he should skip every Nintendo system that'll have a 'gimmick'.

As for me, I liked the gimmick. I liked the games I played using the gimmick. I also enjoyed PS2 games. And if I get a chance, I think I'll find something likable about the new gimmick too.
 
If that's how he feels then he should skip every Nintendo system that'll have a 'gimmick'.

As for me, I liked the gimmick. I liked the games I played using the gimmick. I also enjoyed PS2 games. And if I get a chance, I think I'll find something likable about the new gimmick too.


It's only a gimmick because Nintendo is doing it. :^p


Seriously though, I mentioned this earlier in this thread (I think) but a screen in the controller seems like a natural evolution and not even a new one.


This is from an earlier post of mine:
The Wii U controller is more or less an evolution of the traditional dual-analog style controller. It's not even a new concept. The Dreamcast put a screen in the middle of your controller 14 years ago. Nintendo is just taking advantage of current technology to make better use of the idea. Had the technology been there, you can guarantee Sega would have done the exact same thing with the screen.

I don't really get why some people seem perturbed by a screen in the middle of the controller. This has been an evolution I've been dreaming about since I was a little kid. It just seems natural to me that this is where controllers were headed.
 
A few pages back, but I thought I might reply to these:

Where is this talk about manufacturing problems? There's nothing to suggest anything of the sort in the Iwata Asks. They just talk about the difficulty of isolating hardware bugs and testing for defects with an MCM design. I would expect most hardware designers run into a few problems along the way regardless of the die size.

Yes, but the Wii has an MCM, and they've been working with the same hardware providers effectively since the Gamecube. It seems a bit odd to bring it up as a particular point with the Wii U.

I assume the GPU LSI uses Renesas' UX8 process (40nm) and UX8GD eDRAM. UX8GD supports up to 256Mbit, which happens to be exactly the amount the Wii U is supposed to use, and, according to Renesas, targets game consoles. A single cell of UX8 eDRAM is 0.06 square micron, half the size of the previous generation UX7LSeD eDRAM for 55nm. Even though the tech was announced back in 2007, it's not yet officially available.
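As a rough sanity check on those figures, here's the cell-area arithmetic (my own back-of-the-envelope, using only the numbers quoted above, not anything further from Renesas):

    # Area of 256 Mbit (32 MB) of UX8GD eDRAM at the quoted 0.06 um^2/bit
    # cell size. Cell area only -- real macros add sense amps, redundancy
    # and routing overhead on top of this.
    bits = 256 * 1024 * 1024        # 256 Mbit
    cell_area_um2 = 0.06            # um^2 per bit cell (UX8)
    area_mm2 = bits * cell_area_um2 / 1e6
    print(round(area_mm2, 1))       # ~16.1 mm^2 before overhead

So the rumored 32MB would only take a modest slice of the kind of ~150mm² die speculated about later in the thread.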

This is a possibility. It's worth noting that Renesas is shifting to fabless operation below 40nm, and has joined the GlobalFoundries/IBM alliance for fabbing at 32nm/28nm and below. I don't know what that might mean for Renesas's eDRAM if it is a 32nm or even 28nm chip, although I suppose a relatively straightforward die shrink is possible. Alternatively, it's possible that the reason the UX8GD isn't officially available is simply that they reworked it for GloFo's 32nm process.
 
I have to say my Wii U mood has taken a negative turn in the last few hours.

You've been positive? :P

I want to make sure I'm correct about this, so BG or someone who is better with hardware correct me if I'm wrong.

The purpose of the MCM is to make it so that instead of splitting the power budget between the CPU and GPU, they both get the same amount of power.

I remember when Iwata announced that the highest power draw would be 75 watts (and the average was 45), everyone guessed that the GPU would receive about 15 watts and the CPU about 10 watts. Now that we know that they are both on an MCM, does this mean that they receive 25 watts instead?

Average is 40. I have felt that the CPU is less than 10w and the GPU is greater than 15w.

Yes, but the Wii has an MCM, and they've been working with the same hardware providers effectively since the Gamecube. It seems a bit odd to bring it up as a particular point with the Wii U.

I think what they are saying with Wii U is that the MCM in it has both the CPU and GPU as opposed to just the GPU in Wii. They did mention that Wii used separate dies for the CPU and GPU.
 
Yeah, it could've been like the PS4 and XBox3 and your PC. But why not just buy a PS4 or XBox3 or PC? Why do people insist that every single console maker has to spend every penny cramming in as much theoretical performance as possible at the expense of all else? Why do you want four different devices doing exactly the same thing, playing exactly the same games in exactly the same way? Am I the only one who's glad that Nintendo is actually trying to do something unique, trying to provide a different approach in an industry full of risk-averse cookie-cutter games that are designed more than anything else to look like they're fun to play, with actually being fun to play a seeming irrelevance? Am I the only one who's happy that they're providing a different option in an industry where even Resident Evil has turned into a fucking Gears of Uncharted clone?

They could do the same much better at 1080p and 60FPS with 4xMSAA.
 
You didn't reply to my first and most relevant comment, so I assume you agree with it.

Going down from half of a 55 inch screen to 6.2" is bad, as is going from 640x720 to 854x480.

If you sit a foot or so in front of your TV, sure. But the average viewing distance is 9 feet, and in that case they will be around the same fidelity for most people. And I have yet to hear anyone complain about playing on the GamePad. Also, this setup has the benefit of retaining the correct aspect ratio, which split screen cannot, and it can alleviate screen watching.
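To put a rough number on that "same fidelity at 9 feet" claim, here's a quick pixels-per-degree comparison. The 9-foot figure comes from the post above; the 16:9 screen shapes and the roughly 1.5-foot GamePad viewing distance are my own assumptions:

    import math

    def pixels_per_degree(px_across, width_in, distance_in):
        # Horizontal pixels per degree of visual angle at a given distance.
        angle_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
        return px_across / angle_deg

    # Half of a 55" 16:9 TV (vertical split screen, 640 px wide) at 9 feet,
    # versus a 6.2" 854x480 GamePad held at about 1.5 feet.
    tv_half_width_in = (55 * 16 / 337 ** 0.5) / 2    # ~24.0 in
    gamepad_width_in = 6.2 * 16 / 337 ** 0.5         # ~5.4 in

    print(round(pixels_per_degree(640, tv_half_width_in, 9 * 12), 1))    # ~50.5
    print(round(pixels_per_degree(854, gamepad_width_in, 1.5 * 12), 1))  # ~50.0

Under those assumptions the two come out almost identical, which is the point being made; sit closer to either screen and the comparison shifts.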

Nintendo has said more than once that you can only use the Wii U Gamepad in the same room as the console. So no playing from the bedroom or bathroom for you.

Hmmm, I don't believe they have said this, since not every room is the same size. Their recommendation is 24 feet at a minimum, and Nintendo points out they are always very conservative with their recommendations, so most likely it will be further than that in most cases. Yeah, you won't be able to take it everywhere, but you will always have a lot more options than being tied to the TV. Especially if someone is already using it.

And makes the gaming experience even more detached than it already is. Because you constantly have to switch between the interactive small screen right in front of you and the big screen on the wall.

OK, perhaps, but I am sure you can learn the screen, like people learn a keyboard, and not all of the hands-on impressions of games have had this issue. The GamePad is still easily better than the alternative of having to use a clunky thumbstick to select units or commands, or having to go to the pause menu, select something with the thumbsticks, and then go back to the game. The touchscreen makes it instantly accessible and much quicker.

Why would I then watch it in the first place?

It's not hard to imagine. Maybe your kids are watching something and you want to watch something when they are done, or you're not sure what you want to watch, so you start something just to have something on for everyone in the room to preview while you continue browsing shows, or you want to browse what to watch next. There are many reasons.

Because there is such a huge variety of movies, many times when I am using Netflix we start a movie, only to stop it and browse something else, over and over. This setup would hugely help with that.


And many other things I am missing.

And for something I missed: for games like Pikmin 3 and BLOPS 2, you can use the Wii Remote as your main controller and have the GamePad as a supplementary screen/touchpad, to easily navigate a map, have hotkeys, and clean up the HUD.
 
I think what they are saying with Wii U is that the MCM in it has both the CPU and GPU as opposed to just the GPU in Wii. They did mention that Wii used separate dies for the CPU and GPU.

Yep, but the Wii's MCM still has multiple dies on it (GPU+DSP+ARM, if I'm not mistaken). Actually, who manufactured the DSP and ARM components in the Wii? If it was NEC then that might make sense (ie. a fully NEC built MCM vs. a Renesas & IBM MCM).
 
Yep, but the Wii's MCM still has multiple dies on it (GPU+DSP+ARM, if I'm not mistaken). Actually, who manufactured the DSP and ARM components in the Wii? If it was NEC then that might make sense (ie. a fully NEC built MCM vs. a Renesas & IBM MCM).
It could be just a matter of difference in complexities between the MCM of the Wii and the one in the WiiU. WiiU's MCM is more in line with 360's Valhalla (which is still an MCM, despite people calling it a SoC).

Didn't Renesas merge with NEC?
I don't think so ;p
 
Yep, but the Wii's MCM still has multiple dies on it (GPU+DSP+ARM, if I'm not mistaken). Actually, who manufactured the DSP and ARM components in the Wii? If it was NEC then that might make sense (ie. a fully NEC built MCM vs. a Renesas & IBM MCM).

Didn't Renesas merge with NEC?
 
You've been positive? :P

Well I still assumed the Wii U would be well more powerful than PS3/360 (maybe like, 50% on the GPU). Now sometimes I even doubt that again.

But we still have lots of historical hinting at a more powerful GPU, and I fail to see how any 150mm² 40nm GPU, even with eDRAM, shouldn't top current gen.

I wonder how the system would be looking had they dedicated, like, 100mm² to the CPU instead of 30. Silicon is silicon I guess.

Right now, with launch games, I guess a good summation is: better GPU, lesser CPU, equals about the same overall compared to current gen.

Even though we had hints, seeing in the flesh that the CPU is tiny has to be discouraging.
 
It could be just a matter of difference in complexities between the MCM of the Wii and the one in the WiiU. WiiU's MCM is more in line with 360's Valhalla (which is still an MCM, despite people calling it a SoC).

I think the big difference is that the CPU is included in the MCM. With the Wii it was just the GPU.

This time we fully embraced the idea of using an MCM for our gaming console. An MCM is where the aforementioned Multi-core CPU chip and the GPU chip are built into a single component.

That is apparently the big deal for Nintendo.
 
Well I still assumed the Wii U would be well more powerful than PS3/360 (maybe like, 50% on the GPU). Now sometimes I even doubt that again.
I know you pull those figures out of non-sunshiny places, but for the sake of playing along: if U-GPU was Xenos verbatim with 32MB of eDRAM (actually embedded, and not ROP'ed), you'd already have a fair bit more capable GPU. Tiling was far from being free on Xenos.
 
With the info that we have, can we expect a CPU more powerful than Cell or Xenon?

Somebody called me "not serious" when I said that the Wii U CPU could be weaker than the PS360 CPU.
 
It could be just a matter of difference in complexities between the MCM of the Wii and the one in the WiiU. WiiU's MCM is more in line with 360's Valhalla (which is still an MCM, despite people calling it a SoC).

Technically speaking, couldn't the Wii U's MCM be considered a SoC? It's got a CPU, GPU, a bit of memory and possibly an I/O chip.

Didn't Renesas merge with NEC?

Renesas merged with/bought out NEC's microelectronics division. I referred to NEC as that's what they were when they were making the Wii.

With the info that we have, can we expect a CPU more powerful than Cell or Xenon?

Somebody called me "not serious" when I said that the Wii U CPU could be weaker than the PS360 CPU.

It won't be as powerful in raw floating point performance, but to be honest I wouldn't expect the PS4's or XBox3's CPUs to be as strong in raw floating point as Xenon and Cell either. The fact that it's got out-of-order execution and a much bigger cache should benefit things like AI quite a bit, though, and being able to offload physics to the GPU, audio to the DSP and I/O to a dedicated ARM chip should help out significantly.
 
With the info that we have, can we expect a CPU more powerful than Cell or Xenon?

Somebody called me "not serious" when I said that the Wii U CPU could be weaker than the PS360 CPU.
The CPU's design seems to have different strengths and weaknesses compared to Cell or Xenon. The current-gen CPUs are weak at doing general tasks, for example, while the U-CPU may be a lot stronger at that.
 
It's a guess.

No developer has come forward with their performance findings and the console hasn't been reverse engineered.

That's what I thought, and the whole 2x kinda gets hit because the RAM is 4x the 360's.

The Wii U has modern tech; Nintendo would have to pay more money to make the Wii U weaker than, on par with, or only slightly better than the PS3/360.
 
That's what I thought, and the whole 2x kinda gets hit because the RAM is 4x the 360's.

The Wii U has modern tech; Nintendo would have to pay more money to make the Wii U weaker than, on par with, or only slightly better than the PS3/360.

Twice as much RAM allocated to games, and titles like NSMB Wii U running at a whopping 720p.
 
Twice as much RAM allocated to games, and titles like NSMB Wii U running at a whopping 720p.

2GB is still 2GB; even if you take the OS RAM away from the 360's total, the difference is still there for games. And NSMBU running in 720p means it's weak? That's news to me.
 
2GB is still 2GB; even if you take the OS RAM away from the 360's total, the difference is still there for games. And NSMBU running in 720p means it's weak? That's news to me.

So if you had 2 GB with 1.75 GB dedicated to the OS, that would still be impressive to you? I think not. What matters is what is available to games, and as it stands, it's twice the Xbox 360. Hey, it's an increase. Not that it matters much, as not a single game looks substantially better than current gen and some are inexcusably simple. NSMBU is a first-party title, is simplistic in nature and is only running at 720p.
 
So if you had 2 GB with 1.75 GB dedicated to the OS, that would still be impressive to you? I think not. What matters is what is available to games, and as it stands, it's twice the Xbox 360. Hey, it's an increase. Not that it matters much, as not a single game looks substantially better than current gen and some are inexcusably simple. NSMBU is a first-party title, is simplistic in nature and is only running at 720p.

How much RAM does the 360's OS use?

Call of Duty: Black Ops 2 is "full HD" and that's a 3D game, which is more impressive, and Trine 2 is 1080p and has better graphics than NSMBU.
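For what it's worth, the numbers being compared here come out roughly like this, assuming the commonly cited reservations of 1 GB for the Wii U OS and about 32 MB for the 360's system reserve (those figures are my assumption, not something confirmed in this thread):

    # RAM available to games under the assumed OS reservations.
    wiiu_game_ram_mb = 2048 - 1024   # Wii U: 2 GB total, 1 GB assumed reserved for the OS
    x360_game_ram_mb = 512 - 32      # 360: 512 MB total, ~32 MB assumed system reserve
    print(round(wiiu_game_ram_mb / x360_game_ram_mb, 2))   # ~2.13x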
 
Technically speaking, couldn't the Wii U's MCM be considered a SoC? It's got a CPU, GPU, a bit of memory and possibly an I/O chip.
Well, an MCM is a packaging technique - as long as you have multiple chips/dies on the same substrate it's an MCM (if it's not a 2.5D stack, that is). A SoC is usually 'almost everything' you'd need to make a complete system on a single die, RAM notwithstanding. Valhalla is not a SoC, just like AMD APUs are not SoCs. But Hollywood, Xenos, Valhalla and U-MCM are all MCM packages.

Re Wii's Hollywood, if one really wanted to I guess they could call it a SoC, but let's not forget Starlet was essentially a coprocessor - Hollywood's big die was filled with coprocessors.
 
Wait, that 2D Mario game is NOT 1080p? How come? O.o

Because it wasn't necessary.

The game wouldn't gain anything going from 720p60fps to 1080p60fps unless you own a massive TV, which a very small % of the market does. Even then, it's arguable whether it's worth the effort.

Damn people, this is the first run of games on a system that was only finalized a few months ago.

I still can't believe people were expecting so much from 3rd party ports and a company (Nintendo) who has never pushed a console 100% on day one.

Next Xmas these arguments about the Wii U's power will be seen as stupid and way too early, when real next-gen games are shown on it and first-party stuff is shown that actually pushes the new technology. That's when all the naysayers will be like... oh yeah, it's clearly ahead of anything current and will be fine alongside the PS4/neXtBox.
 
I wouldn't count on BLOPS 2 being 1080p either. The statement has yet to be clarified by them, and even though "Full HD" usually means 1080p, I don't believe they have outright said "1080p". They may just be using the full HD term loosely as a buzzword, and I think Activision has been misleading about native resolution statements in the past.

Especially with all the confusion, speculation and contention over the actual resolution, I would think Activision would have cleared this up by now if it truly was 1080p. But at the same time, they have not chosen to correct journalists from big sites that have reported it as 1080p.
 
Sure, but no one uses tiling now for a variety of reasons (screen space AA techniques, etc.).
Yes, but one of the big deals about those ROPs was that they could provide free MSAA, if only the target could fit in that eDRAM. It was a huge irony, and the burden fell on the devs.
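To make the "if only the target could fit" point concrete, here's a rough render-target size check. The 4 bytes of colour plus 4 bytes of depth/stencil per sample is my assumption for a typical target, not something stated in the thread:

    # Approximate size of a 1280x720 render target at various MSAA levels,
    # assuming 4 bytes of colour + 4 bytes of depth/stencil per sample.
    def target_mb(width, height, samples):
        bytes_per_sample = 4 + 4
        return width * height * samples * bytes_per_sample / (1024 * 1024)

    print(round(target_mb(1280, 720, 1), 1))  # ~7.0 MB  -> fits Xenos' 10 MB
    print(round(target_mb(1280, 720, 4), 1))  # ~28.1 MB -> needs tiling on 10 MB,
                                              #             but would fit in 32 MB

Which is why 720p with 4xMSAA meant tiling on the 360, while a hypothetical 32MB pool would swallow it whole.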
 