Rumor: Wii U final specs

Zelda tech demos have a proven history of looking like shit compared to the actual LoZ games released on the system.
Nintendo's conservative when making them ... which makes that all the more impressive to me.

That's always because of efficient, intelligent uses of the hardware. GCN tech demo Link had a good 10,000-15,000 polygons more than the final TP model. Even though the final model looked a lot better.

I expect the final game to have deficiencies in comparison to the tech demo. Even if the game on the whole looks better.

I don't see anything special in the Zelda tech demo either ... (by "anything special" I mean nothing that we couldn't see on current gen machines).
It was a pretty showcase that used current era tech well.

It wasn't the second coming.
 
Since the GameCube, have we ever known anything in detail about Nintendo hardware?

True. But back then I couldn't have cared less. Now I'm at a stage where I can't just throw money away on a product that could end up gathering dust. Knowing everything about it before it comes out helps me decide whether or not I should be putting money aside for it at launch. Obviously the games matter more, but still.
 
Zelda tech demos have a proven history of looking like shit compared to the actual LoZ games released on the system.
Nintendo's conservative when making them ... which makes that all the more impressive to me.
Yes, the Zelda games end up looking better than the Spaceworld tech demo because of polished art design. But they don't actually achieve the lighting, physics and asset quality present in the tech demo, so people shouldn't focus on that.

The Zelda HD demo, however, looks perfectly doable technically on current gen hardware, so I wouldn't worry about that.


I understand 1 gig might feel real good to devs since they've been working with only half a gig all gen, but still, I don't think devs would use the term "enormous" or "a lot more" if we're talking just 1 gig.
Most of those comments were made in the context of discussing the porting of a PS360 game so I guess it makes sense.
 
Nintendo trotted out the Zelda tech demo, and at the same time said that they are not competing with MS/Sony and are not focusing on hardware graphics power. Sending two different messages is not good. Why bother showing the tech demo with Zelda the way they did?
 
There has been a suggestion from some that 3 clocked-up and improved Broadways is in fact very good and is superior to Xenon. I don't know, as I'm not technically knowledgeable enough on the subject, but if evidence for this theory does appear, would everyone in this thread who has criticised and mocked the CPU apologise?
 
Nintendo trotted out the Zelda tech demo, and at the same time said that they are not competing with MS/Sony and are not focusing on hardware graphics power. Sending two different messages is not good. Why bother showing the tech demo with Zelda the way they did?
Because their fans wanted to know what it would look like, and it would get them excited about a possible console purchase. Isn't it common practice for Nintendo to show tech demos of their franchises anyway? Perhaps I'm just not understanding what it is you are saying. It is 4:44am here lol
 
Nintendo trotted out the Zelda tech demo, and at the same time said that they are not competing with MS/Sony and are not focusing on hardware graphics power. Sending two different messages is not good. Why bother showing the tech demo with Zelda the way they did?
I think the intended message was 'Our hardware is good enough for a visually impressive Zelda game'.
 
Don't know why everyone is taking this as fact. They did after all say, "Our anonymous sources."

Don't get me wrong, those specs are OK (but JUST OK), but I really think Nintendo would put more RAM than THAT into the system.

I understand that some people like EC have said this is what they heard. But for the longest time, many people on this site have speculated HEAVILY about 2 GB of RAM, with 512MB of it dedicated to the OS/multi-tasking. Now all of a sudden this pops up, and they've switched their claim to "Oh yeah, it's really 1 gig."

The idea that it had 2GB was always a highly wishful interpretation of earlier rumors that there was 1.5GB of RAM and 512MB reserved for the OS. People hoped that you could add those together to get a higher amount, even though 1.5GB total RAM on a 96-bit bus always seemed like a far likelier scenario. The current rumor supports that interpretation very strongly.
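To spell out the back-of-the-envelope math behind that (my own illustrative numbers, assuming common 16-bit-wide DDR3 parts; none of this comes from the rumor itself):

```python
# A 96-bit bus built from 16-bit-wide DRAM chips needs a multiple of 6 chips,
# so the natural capacities land on 768MB / 1.5GB / 3GB -- not 2GB.
CHIP_WIDTH_BITS = 16                       # assumed per-chip interface width
BUS_WIDTH_BITS = 96                        # bus width from the earlier rumor
chips = BUS_WIDTH_BITS // CHIP_WIDTH_BITS  # -> 6 chips per rank

for density_mbit in (1024, 2048, 4096):    # 1, 2 and 4 Gbit parts
    total_mb = chips * density_mbit // 8
    print(f"{chips} x {density_mbit} Mbit chips = {total_mb} MB")
# 6 x 1024 Mbit chips = 768 MB
# 6 x 2048 Mbit chips = 1536 MB   <- the 1.5GB scenario
# 6 x 4096 Mbit chips = 3072 MB
```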

And then, we have multiple developers commenting on the impressive amount of RAM the Wii U has.

Michel Ancel from Ubisoft, for instance:
- "Wii U has an enormous amount of memory"
- "almost no limitations in terms of memory"

Sumo: "The Wii U has way more memory, so we can take advantage of that with less compression on elements and textures"


I understand 1 gig might feel real good to devs since they've been working with only half a gig all gen, but still, I don't think devs would use the term "enormous" or "a lot more" if we're talking just 1 gig.

They're looking at the system as a current gen machine. For something with GPU and CPU capabilities in the same class as the PS3 and 360, having 3 times the total memory is a huge differentiation. Even with the OS reserving 512MB of that you get more than twice the usable amount because the 360 and PS3 give up a percentage of their RAM to the OS as well.
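Rough numbers to illustrate (the OS reserves here are approximate guesses, not confirmed figures):

```python
# Usable-RAM comparison under this rumor vs. a ballpark current-gen OS reserve.
WIIU_TOTAL_MB, WIIU_OS_MB = 1536, 512        # per this rumor
PS360_TOTAL_MB, PS360_OS_MB = 512, 50        # rough current-gen OS footprint

wiiu_usable = WIIU_TOTAL_MB - WIIU_OS_MB     # 1024 MB left for games
ps360_usable = PS360_TOTAL_MB - PS360_OS_MB  # ~462 MB left for games

print(wiiu_usable / ps360_usable)            # ~2.2x usable, ~3x total
```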

The CPU talk is also confusing to me. It may make sense to a lot of you, and I know about the whole "Espresso" guy on the Beyond forums, but keep in mind, just because something makes sense doesn't mean it's true. It could just be a well-thought-up rumor crafted for maximum credibility.

We have multiple developers talking about CPU performance as if it's a problem. We have technical examinations of multiplatform games like Arkham City that seem to demonstrate a deficiency in CPU performance. We have had insider rumors for a while suggesting that the WiiU's CPU cores were not significantly different from those used by the Wii and Gamecube. The only argument for something more powerful than that comes from people clinging to the vaguest, self-serving PR statements IBM has made, which hold very little water. The idea that something the size of the WiiU uses an actual Power7 like Watson used is literally preposterous.
 
That's always because of efficient, intelligent uses of the hardware. GCN tech demo Link had a good 10,000-15,000 polygons more than the final TP model. Even though the final model looked a lot better.

I expect the final game to have deficiencies in comparison to the tech demo. Even if the game on the whole looks better.

Yeah, and it's also due to the fact that they don't have much time to think about and prepare the art direction for a tech demo, which is not the case for a final game, where the pre-production work is essential to reaching an excellent result.
 
It's not about limiting yourself at all, it's knowing who your target audience is. There is nothing unprofitable about the main sector Sony or MS is going after. Again, profit is dependent on the manufacturer and what they put in their products. It's Sony's own fault that they weren't able to cash in, say, like the 360 has.

I'd argue that for third parties, the sector is *extremely* high-risk. There's profit there - big profit - but also significant chance of loss.

I wouldn't blame a third party for thinking long and hard about maybe taking a different path.
 
I don't see anything special in the Zelda tech demo either ... (by "anything special" I mean nothing that we couldn't see on current gen machines).
It's nothing we couldn't see on current machines, but it's something rarely seen so skillfully put together in all its aspects (strictly speaking only of visual techniques here). And it's just a tech demo, not a product that's been years in development.

That said, software advancements over the span of the generation have a lot to do with how things look on a given hw eventually, so we can safely expect things on the WiiU that will never, ever be seen on the ps360, just because by that time nobody will be developing for the ps360.

The Japanese garden demo, though, is something else. While also likely 'theoretically possible' on the ps360, I have not seen footage with such dynamic light complexity and convincing GI approximation on the ps360. For me that was the first harbinger of the new gen.
 
People who are claiming Wiiu specs don't matter at all are ridiculous. If you're buying hardware you want to know if the price is fair.
Actually, no. That's rarely a criterion for mass-market sales. That is, you do want to know if the price is fair, but *not* if the price is fair for the *hardware* - what people generally purchase on is whether the price is fair for the *software* that the hardware enables.

32MB is the eDRAM pool. 1GB is the game pool, and for some reason they don't bother to mention the reserved 512MB pool.

I presume - assuming that the extra 512MB is indeed true - that it was omitted because it's not information necessary for the audience it's pitched at. If this doc does indeed come from dev sources, they don't *need* to know about the OS pool, just what's actually available for them to use.
 
I don't get why people say the OS will have 512MB reserved for it. It's a beefed up current gen machine, and current gen OSes on PS3/360 use under 50MB, right?
 
It's nothing we couldn't see on current machines, but it's something rarely seen so skillfully put together in all its aspects (strictly speaking only of visual techniques here). And it's just a tech demo, not a product that's been years in development.

That said, software advancements over the span of the generation have a lot to do with how things look on a given hw eventually, so we can safely expect things on the WiiU that will never, ever be seen on the ps360, just because by that time nobody will be developing for the ps360.

The Japanese garden demo, though, is something else. While also likely 'theoretically possible' on the ps360, I have not seen footage with such dynamic light complexity and convincing GI approximation on the ps360. For me that was the first harbinger of the new gen.
Some of which can be seen in more recent builds of ZombiU.
 
The idea that something the size of the WiiU uses an actual Power7 like Watson used is literally preposterous.

Power7 is an architecture; it comes in multiple flavors with different core counts, clock rates and therefore TDP.

http://www-03.ibm.com/press/us/en/pressrelease/34683.wss

The all-new, Power-based microprocessor will pack some of IBM's most advanced technology into an energy-saving silicon package that will power Nintendo's brand new entertainment experience

The custom-designed chips will be made at IBM's state-of-the-art 300mm semiconductor development and manufacturing facility in East Fishkill, N.Y.
 
That CPU sounds so very underpowered. It reminds me a lot of the SNES: great graphics hardware but an underpowered CPU that was slower than even the Mega Drive's (hence the need for DSP chips built into cartridges even for early games).

I wonder who designs Nintendo hardware these days? The Gamecube was an awesome design, extremely well balanced in almost every aspect. The only bottleneck was low RAM but that is common with every console.
 
That CPU sounds so very underpowered.

Are you clairvoyant enough to see that without knowing the clock rate and what enhancements this rumor refers to?

This question is not just for you btw, it seems 90% of neogaf reads things that are not there regarding Wii U's specs.
 
At the absolute best it was only ever going to be a mid-gen leap.

At the best.

Turns out it's maybe a 1/3-gen leap at most. And if the Upad is rendering a full 3D scene in tandem with the main screen, that can limit the overall output.

The best-looking main-TV-screen games will have limited Upad rendering. And even that isn't going to free up too many resources.

To be fair, the second screen is much more likely to be a GPU hit than anything significantly affecting the CPU, and people seem to be broadly in approval of the GPU. At least, I've seen fewer complaints!
 
That CPU sounds so very underpowered. It reminds me a lot of the SNES: great graphics hardware but an underpowered CPU that was slower than even the Mega Drive's (hence the need for DSP chips built into cartridges even for early games).

I wonder who designs Nintendo hardware these days? The Gamecube was an awesome design, extremely well balanced in almost every aspect. The only bottleneck was low RAM but that is common with every console.

Clearly not someone from a western country... Nintendo really needs to take western input seriously...
 
Are you clairvoyant enough to see that without knowing the clock rate and what enhancements this rumor refers to?

This question is not just for you btw, it seems 90% of neogaf reads things that are not there regarding Wii U's specs.

Based on all the speculation that has come out thus far. Even Eurogamer said the CPU was underpowered from what devs were telling them.
 
Are you clairvoyant enough to see that without knowing the clock rate and what enhancements this rumor refers to?

This question is not just for you btw, it seems 90% of neogaf reads things that are not there regarding Wii U's specs.

Devs have said that it is a bit underpowered; that's the best info we have to go on.
 
To be fair, the second screen is much more likely to be a GPU hit than anything significantly affecting the CPU, and people seem to be broadly in approval of the GPU. At least, I've seen fewer complaints!

I doubt many games will render an actual 3D scene for the pad, it's not like people can keep an eye on both screens simultaneously to make that worth it. Most games will probably have a menu / map or other simple things that could be rendered with a 486 CPU from the early 90s...
 
Clearly not someone from western country....Nintendo really needs western input taken seriously...

The same teams that developed the GameCube developed the Wii and Wii U: NTD, which is in Seattle (and has several engineers who worked on PS3 and 360), and IRD in Kyoto, Japan.
 
But see, the curious thing is, why even show them off? Why ever show any footage that didn't at least match the previous gen? Puzzling to say the least. And Watch Dogs looking like RE 4 on PS2 would be pretty good; IIRC the PS2 game stood up quite well to the Gamecube version.

I suppose that depends on how highly you rate the PS2, because the game itself was quite severely downgraded: polygon counts were cut on characters, there were fewer enemies on screen, textures were reduced and the lighting was basically just removed completely.
 
I doubt many games will render an actual 3D scene for the pad, it's not like people can keep an eye on both screens simultaneously to make that worth it. Most games will probably have a menu / map or other simple things that could be rendered with a 486 CPU from the 90s...

GPUs are for rendering, not CPUs.
 
I don't get why people say the OS will have 512MB reserved for it. It's a beefed up current gen machine, and current gen OSes on PS3/360 use under 50MB, right?

Upon real-world testing of the OS and optimization, a large portion of the 512MB RAM pool could easily be freed up for development with a simple update. In fact it probably will be; it is very doubtful that the OS will take up more than a small portion of the pool, but Nintendo will of course want it available initially to establish exactly how much it needs before freeing it up.
 
I really, really want to know how exactly the CPU is enhanced. The chipset just sounds extremely lopsided to have something like 12x the RAM (or more), and let's call it 10x the GPU (or maybe a lot more)... but then only about 3x the CPU plus some.

I understand why they would want to do it, but damn. There must be something significant to the "enhanced." There must be, right?

Broadway was 729MHz; the WiiU's CPU will be somewhere between 1.4GHz and 2GHz. So no, not just 3x Broadway even before the enhancements.
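Quick illustrative math (the clock range is the rumored one; the framing is mine and ignores IPC/architecture differences):

```python
BROADWAY_MHZ = 729                          # Wii's single Broadway core
for wiiu_mhz in (1400, 2000):               # rumored WiiU clock range
    ratio = 3 * wiiu_mhz / BROADWAY_MHZ     # 3 cores vs. 1, clock-for-clock
    print(f"{wiiu_mhz} MHz x 3 cores ~= {ratio:.1f}x Broadway's raw cycles")
# ~5.8x at 1.4GHz, ~8.2x at 2GHz, before any per-core enhancements
```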
 
GPUs are for rendering, not CPUs.

CPUs can also render; my point was that even a weak CPU from the early 90s could render what most games will put on the Wii U pad. Sure, they can use the GPU to do it too, but for menus/maps the impact is almost negligible whether they use the CPU or the GPU.
 
I love that people are uniformly bashing the CPU before even knowing the nature or extent of the enhancements - i.e., that it's not just 3 Wii CPUs duct-taped together - or even knowing the clock speeds! So presumptuous...

All of the complaints that I've seen have come from developers trying to slap ports on the machine, quite possibly with minimal effort. The CPU's architecture is different from the PS3's or the 360's; you can't just cut and paste and expect an optimized, perfect product, which is probably the extent of the care put into some of these quick cash-in ports.

Have we heard any complaints about it from developers actually making software for the Wii-U from the ground up? Or even from developers who are putting work into optimizing their ports to take advantage of the Wii-U? Vigil? Ubisoft? Gearbox? Anyone? I don't think I have, though it could certainly be out there...

This report also doesn't tell the whole story, as Arkam, who is confirmed as being in the know, has flatly stated that the Wii-U at least supports some features beyond SM4. The DSP should also offload quite a bit of work from the CPU.
 
I don't get why people say the OS will have 512MB reserved for it. It's a beefed up current gen machine, and current gen OSes on PS3/360 use under 50MB, right?

Ignoring the whole "beefed up current gen" silliness, what does that have to do with how complex the OS can be? The Xbox 360's and PS3's OSes could be more complex if they added more features; they wouldn't need more power to do so.
 
Well, the way I look at it, if the Wii was 2 GCs, the Wii was only a moderately overclocked GC (not even double the clocks), still with a single core.

Taking that logic, a similarly overclocked Wii U CPU (single core) would be 2 Wiis duck taped together, which is 4 GCs... then x3

So, the Wii U is 12 GC's, or 6 Wii's duck taped together!

Mystery solved...

and some sprinkles...

Where do I buy duck tape?
 
I don't get why people say the Os will have 512mb reserved for it. It's a beefed up current gen machine, current gen os on ps3/360 use under 50mb right?
The annoying thing about OSes lately is that they tend to, well, expand. I suspect it's less "The OS uses 512MB" and more "The OS is guaranteed to never take more than 512MB", meaning that developers can be confident that the full 1GB will be available to them.
 
I love that people are uniformly bashing the CPU before even knowing the nature or extent of the enhancements - i.e., that it's not just 3 Wii CPUs duct-taped together - or even knowing the clock speeds! So presumptuous...

All of the complaints that I've seen have come from developers trying to slap ports on the machine, quite possibly with minimal effort. The CPU's architecture is different from the PS3's or the 360's; you can't just cut and paste and expect an optimized, perfect product, which is probably the extent of the care put into some of these quick cash-in ports.

Have we heard any complaints about it from developers actually making software for the Wii-U from the ground up? Or even from developers who are putting work into optimizing their ports to take advantage of the Wii-U? Vigil? Ubisoft? Gearbox? Anyone? I don't think I have, though it could certainly be out there...

This report also doesn't tell the whole story, as Arkam, who is confirmed as being in the know, has flatly stated that the Wii-U at least supports some features beyond SM4.

When the source of the info suggests that the info was presented inaccurately/not fully, and people still ignore it, there's not much you can do.
 
I doubt many games will render an actual 3D scene for the pad, it's not like people can keep an eye on both screens simultaneously to make that worth it. Most games will probably have a menu / map or other simple things that could be rendered with a 486 CPU from the early 90s...

No, you misunderstand me: Even a 3D scene is not much of a *CPU* hit. It's likely to be all on the GPU unless the second scene contains its own distinct game logic, which would be... odd.

Edit: Ah, you clarified, and I see your point now. Although it's worth bearing in mind that if the pad wasn't present, you'd have to assume the CPU would still be doing the same work putting that information on the main screen - in other words, the fact that it's on a different *screen* is not increasing the CPU's load.
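A minimal sketch of that point (a hypothetical frame structure, not any real Wii U SDK API): the CPU-side simulation runs once per frame no matter how many screens the result is drawn to, so a pad view of the same world is mostly just a second GPU pass from another camera.

```python
class Game:
    def update(self, dt):
        # AI, physics, game logic -- CPU work, done once per frame
        ...

    def render(self, camera, target):
        # submit draw calls for one view of the world -- mostly GPU work
        ...

def frame(game, dt, tv_camera, pad_camera):
    game.update(dt)                        # shared CPU cost
    game.render(tv_camera, target="tv")    # main screen
    game.render(pad_camera, target="pad")  # same world, different camera
```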
 
All of the complaints that I've seen have come from developers trying to slap ports on the machine, quite possibly with minimal effort. The CPU's architecture is different from the PS3's or the 360's; you can't just cut and paste and expect an optimized, perfect product, which is probably the extent of the care put into some of these quick cash-in ports.
True, but in prior generational transitions where developers have made 'quick cash-in ports'.... is it not the case that they'd be superior due to the sheer increase in power?
 
No, you misunderstand me: Even a 3D scene is not much of a *CPU* hit. It's likely to be all on the GPU unless the second scene contains its own distinct game logic, which would be... odd.

I did understand you; I was actually strengthening your point by saying that not only will the CPU not take much of a hit from the pad, but the GPU probably won't either.

edit - saw your edit, lol ;)
 
No, you misunderstand me: Even a 3D scene is not much of a *CPU* hit. It's likely to be all on the GPU unless the second scene contains its own distinct game logic, which would be... odd.

How would it be odd? I can envision dozens of scenarios where a completely independent scene can be rendered on the screen with totally different geometry, AI, animations, physics etc.
 
True, but in prior generational transitions where developers have made 'quick cash-in ports'.... is it not the case that they'd be superior due to the sheer increase in power?

Yes, with consoles that were 10x as powerful as the previous generation. We've all known from the start that the WiiU wasn't going to be 10x the 360 or PS3. Why is this now surprising people?
 
(Zelda HD tech demo gifs)

HTH's

It's all very well saying the PS360 can do these things, but the fact of the matter is they are never going to have a Zelda game on them.

This is the first HD zelda game people have seen.
This is why it's so impressive.

Besides, and I've said this time and time again, those Zelda gifs ARE impressive compared to current gen software. With even more time and effort, we could get a glorious looking game.

*EDIT* And by that I don't mean it surpasses this gen - it does in some areas, but my point is that it's a fantastic tech demo.
 
Yes, with consoles that were 10x as powerful as the previous generation. We've all known from the start that the WiiU wasn't going to be 10x the 360 or PS3. Why is this now surprising people?

Maybe because they don't realize that 10x 360 / PS3 is still a pretty beefy PC these days, which the Wii U will of course not be on par with.
 
How would it be odd? I can envision dozens of scenarios where a completely independent scene can be rendered on the screen with totally different geometry, AI, animations, physics etc.

If it's a separate view on the same environment as the main screen, all the (CPU-heavy) updates that were used for the main screen can simply be rendered again from a different camera for the pad.

If it's an entirely independent environment, you'd expect that the CPU budget for the game was structured around assuming that you were rendering two distinct environments each frame.

My point is more that the fact that it's on a distinct *screen* is not a hit in itself, and if a game requires what you describe then it should plan around it in the first place.
 
True, but in prior generational transitions where developers have made 'quick cash-in ports'.... is it not the case that they'd be superior due to the sheer increase in power?

In those cases they are more than just quick ports, because the hardware was obviously a much larger jump than in this case. In a traditional generational jump, say PS2 to PS3, the PS3 games obviously weren't ports, but versions of the same game designed for the PS3. They were shoddy because they were rushed for launch; these are shoddy because they are trying to cut and paste. Two completely different scenarios.

If these companies were interested in designing versions of these games from the ground up to take advantage of the Wii-U's power increase, then the Wii-U versions of the games would likely be better looking too, but alas, they are not.

The Wii-U's CPU uses a pretty different architecture, if I understand correctly. It is OOE (out-of-order execution), and there's a DSP to take additional load off the CPU, as well as the capability to offload some of the work onto the reasonably powerful GPU and its eDRAM. Are the developers who are complaining, the ones guilty of quick lazy port jobs, taking advantage of these features that separate the Wii-U's CPU from the older systems? Doubtful.

Which is likely why we haven't heard the complaints from studios designing original content, or studios optimizing their ports for Wii U, such as Vigil, Ubisoft, or Gearbox, etc...
 
Besides, and I've said this time and time again, those Zelda gifs ARE impressive compared to current gen software. With even more time and effort, we could get a glorious looking game.

*EDIT* And by that I don't mean it surpasses this gen - it does in some areas, but my point is that it's a fantastic tech demo.

It was supposedly tossed together very quickly too, without much dev time at all. The lighting is what strikes me, and has been the one aspect of most known Wii-U games that has consistently impressed the most.

The Bird Demo was very impressive too; I'm not sure I've seen the like of it on current gen, even in tech demos. Individual effects here and there, maybe, but all at once?

http://www.youtube.com/watch?v=GcapRBQoMWk
 
The same teams that developed the GameCube developed the Wii and Wii U: NTD, which is in Seattle (and has several engineers who worked on PS3 and 360), and IRD in Kyoto, Japan.
I might be wrong, but I believe the GameCube was mostly designed in the west by a company called ArtX (now a part of AMD). They didn't just make the GPU; they also did the motherboard and had input on the overall system.
 