Rumor: Wii U final specs

The Zelda tech demo doesn't look technically impressive at all. I'd be beyond shocked if it couldn't be done by a competent dev on PS3 or 360.

High end PS3/360 games don't look technically impressive?

Come on son. That Zelda demo looked great, whether it's possible on 360 or PS3 is irrelevant.
 
When a dev says a game/demo can't run on other hardware, what underlying cause would there be to make it so? That's your answer and I think I'm done (for the hundredth time now).

Well that's a shame, I was honestly curious what you picked out that's beyond current gen.
 
I don't expect anything released in the first year to match anything done in the Uncharted Series or Last of Us. It will be a while before the Wii U actually shows off anything unquestionably superior to the last gen.
 
High end PS3/360 games don't look technically impressive?

Come on son. That Zelda demo looked great, whether it's possible on 360 or PS3 is irrelevant.

This is and always will be my stance.

It doesn't matter the hardware if they can achieve beauty. I'm mindful that there's better out there, but there always is.
 
When a dev says a game/demo can't run on other hardware, what underlying cause would there be to make it so? That's your answer and I think I'm done (for the hundredth time now).

Devs say that all the time. Maybe it's factually correct: if they used more than 512 MB of RAM it couldn't run on the 360, but that doesn't mean anyone would actually notice if they shrank a few textures or added a bit more compression. It might use a few GPU features that aren't available on the 360, but it might have been possible to move some of it to the CPU or precalculate a few things. There's really no way to know how close you could come without spending several man-months trying to port it to another console.
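To put rough numbers on the texture point (every size and format here is my own illustration, not anything a dev has confirmed): shrinking or block-compressing a texture buys enormous headroom on a 512 MB console.

```python
# Back-of-the-envelope texture sizes for a hypothetical 2048x2048 texture.
# Shows why "shrink a few textures or add a bit more compression" can close
# a big memory gap without most players noticing.

def texture_bytes(width, height, bits_per_pixel):
    """Size of one mip level in bytes."""
    return width * height * bits_per_pixel // 8

full_rgba8 = texture_bytes(2048, 2048, 32)  # uncompressed RGBA8: 16 MiB
full_dxt1  = texture_bytes(2048, 2048, 4)   # DXT1 (4 bpp):        2 MiB
half_dxt1  = texture_bytes(1024, 1024, 4)   # half-res DXT1:     0.5 MiB

print(full_rgba8 // 2**20, full_dxt1 // 2**20, half_dxt1)
```

So one uncompressed 2048-square texture compressed and halved in resolution drops from 16 MiB to half a MiB, a 32x saving per texture in this toy example.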
 
I don't expect anything released in the first year to match anything done in the Uncharted Series or Last of Us. It will be a while before the Wii U actually shows off anything unquestionably superior to the last gen.

Did Wii U's Assassin's Creed 3 get canceled or something? It's not that different from Uncharted/The Last of Us.
 
I heard a snippet somewhere that the CPU isn't even 1 GHz; this wouldn't surprise me. Anyway, as a whole package, this is basically, well, the Wii all over again: it's an Xbox 360 on steroids. This will disappoint graphics horses, but it will be way more than enough for families and Nintendo fans. I just finished Mario Galaxy and started Galaxy 2. What they push out of the Wii is miraculous. I simply cannot imagine what the next game could look like. Though as always, I expect disappointment when it comes to anything Nintendo related. So when they exceed expectations it's that much sweeter.

I read that it was around the same level as the Vita CPU, maybe lower. But I also read that it's clocked low yet still about the same level as the Xbox 360 CPU.


But just like when I posted about the CPU being three Wii CPU cores clocked higher, people just wrote it off.
 
Why do people believe that "Espresso" (I like the name, I'll stick with it) is based off of three "enhanced Broadways" when IBM confirmed LAST YEAR that it was based off of POWER7? DO I NEED TO BRING UP THAT DAMNED TWEET AGAIN?!?
 
I heard a snippet somewhere that the CPU isn't even 1 GHz; this wouldn't surprise me. Anyway, as a whole package, this is basically, well, the Wii all over again: it's an Xbox 360 on steroids. This will disappoint graphics horses, but it will be way more than enough for families and Nintendo fans. I just finished Mario Galaxy and started Galaxy 2. What they push out of the Wii is miraculous. I simply cannot imagine what the next game could look like. Though as always, I expect disappointment when it comes to anything Nintendo related. So when they exceed expectations it's that much sweeter.

Even the Wii processor is 729 MHz, so... that seems unlikely...
 
Some, yes. But the same problems are still present if you are porting a game over from the Xbox 360. So if companies don't want to invest in retooling games, you will get crappy (downgraded) ports.


That said, I am nothing but excited for the console and can't wait to see the first round of games made from the ground up on the Wii U. That is when we will see what this little beast can do!

Thanks for the insight. Are the porting issues related to the CPU differences?

Why do people believe that "Espresso" (I like the name, I'll stick with it) is based off of three "enhanced Broadways" when IBM confirmed LAST YEAR that it was based off of POWER7? DO I NEED TO BRING UP THAT DAMNED TWEET AGAIN?!?

It's not based on Power7.
 
Wouldn't surprise me if the initial hardware was literally three Broadways taped together and had the same speed. I would love to know their current clock speed.

Honestly, taking a design like that and slapping more of it together competently would be much harder than just stripping out what you don't view as necessary from something newer, achieving a similar footprint in the process.

I'm skeptical that it is literally three Broadways because of that fact alone. Though Nintendo is strange enough to try it.
 
Honestly, taking a design like that and slapping more of it together competently would be much harder than just stripping out what you don't view as necessary from something newer, achieving a similar footprint in the process.

I'm skeptical that it is literally three Broadways because of that fact alone. Though Nintendo is strange enough to try it.

Oh, by now it's probably pretty tweaked. I don't see it literally being three Broadways currently. But whatever it is, it's underwhelming.
 
The fact that we are even debating this proves that Nintendo has not done enough to make this console clearly more powerful than the 360 or PS3.

It's gonna be a Wii situation all over again as 3rd parties abandon the Wii U in favour of MS/Sony/PC.
 
The fact that we are even debating this proves that Nintendo has not done enough to make this console clearly more powerful than the 360 or PS3.

It's gonna be a Wii situation all over again as 3rd parties abandon the Wii U in favour of MS/Sony/PC.

Hey, they can't abandon Wii U unless they're there to begin with.
 
The fact that we are even debating this proves that Nintendo has not done enough to make this console clearly more powerful than the 360 or PS3.

It's gonna be a Wii situation all over again as 3rd parties abandon the Wii U in favour of MS/Sony/PC.

Not necessarily. Nintendo's launch games are of the more stylised type, and 3rd parties are just doing straight ports. Even if you have a high-end PC, if the port from 360 was essentially "compiled on another platform" it wouldn't look any better. I will say it for the third (fourth? fifth?) time: when the 720 and PS4 come out, I can imagine Wii U ports suddenly starting to look better, because they will be down-ports from better versions as opposed to straight ports of the current (lesser) versions. If we haven't seen a showcase before then, we might very well ask, "Is this the same machine?"

Of course, it's entirely possible that all the developers who have invested in adding Wii U to their pipeline, and who are continuing to make 360/PS3 ports of their 720/PS4 games, might just go "Just do the standard straight port".
 
Someone translate this into DBZ levels of power plz


Assuming that Wii U is still 2-3X more powerful than 360/PS3, and PS4/720 is 10X more powerful than the 360.

PS3/360 = SSJ1 Trunks - 14,000,000/SSJ Vegeta - 16,500,000
Wii U = Android Sixteen - 32,000,000 to Imperfect Cell - 47,000,000
PS4/720 = Perfect Cell - 145,000,000

If PS4/720 ends up being more than 10X as powerful as the 360, then I'm assuming 11.25-14X more power at most, so it could be Full Power Cell - 180,000,000 to Super Perfect Cell - 225,000,000.

No way would the PS4/720 be 20X as powerful as the 360 (which would be SSJ2 Gohan - 300,000,000).

looooool .-.
 
I heard a snippet somewhere that the CPU isn't even 1 GHz; this wouldn't surprise me. Anyway, as a whole package, this is basically, well, the Wii all over again: it's an Xbox 360 on steroids. This will disappoint graphics horses, but it will be way more than enough for families and Nintendo fans. I just finished Mario Galaxy and started Galaxy 2. What they push out of the Wii is miraculous. I simply cannot imagine what the next game could look like. Though as always, I expect disappointment when it comes to anything Nintendo related. So when they exceed expectations it's that much sweeter.

Because the Wii U doesn't have enough whorespower?
 
Don't come at me with pitchforks in unison, GAF, but wasn't there a recent re-evaluation of the Wii's Broadway CPU after IBM released a 600-something page document about it recently? Am I delusional, or did I read that the CPU in the Wii had a surprising amount of clout for its low clock speed and single-core design?

Not that GAF should be seriously treated as a credible source under any circumstance, but I remember reading a post on here stating that Xenon was only about 20% faster (in real-world performance terms) than Broadway running highly optimized code.

Three Broadway cores with greater cache, a few additional features, and higher clock speeds should logically surpass 360 CPU performance figures by a comfortable margin. I'm also aware, though, that developers weren't exactly enamored with the CPU this gen, probably because it operated differently. Maybe this accounts for some of the disenchantment over the CPU and some pretty underwhelming third-party ports?
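Just to sketch the arithmetic behind that (and to be clear, every input here is speculation: the ~20% Xenon-per-core figure is the unverified GAF claim above, and the Wii U clock is a pure guess):

```python
# Toy scaling model for the "three enhanced Broadways" theory.
# ALL numbers are assumptions for illustration, nothing here is confirmed:
#   - Broadway in the Wii runs at 0.729 GHz (known).
#   - Guessed 1.24 GHz clock for the Wii U cores (hypothetical).
#   - The quoted claim that a Xenon core is ~1.2x a Broadway on optimized code.

broadway_clock = 0.729        # GHz, Wii
wiiu_clock = 1.24             # GHz, pure guess
xenon_vs_broadway = 1.2       # per-core ratio, per the unverified claim

# If per-clock behavior were unchanged, each Wii U core would be:
wiiu_core = wiiu_clock / broadway_clock   # ~1.7x a Wii Broadway

wiiu_total = 3 * wiiu_core                # three upgraded Broadway cores
xenon_total = 3 * xenon_vs_broadway       # three Xenon cores

print(round(wiiu_total / xenon_total, 2))  # ~1.4x Xenon in this toy model
```

Under those (shaky) assumptions the tri-core Broadway setup does come out ahead of Xenon, which is the post's point; change any input and the conclusion moves with it.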
 
I don't expect anything released in the first year to match anything done in the Uncharted Series or Last of Us. It will be a while before the Wii U actually shows off anything unquestionably superior to the last gen.

Yeah unfortunately it will probably be after MS and Sony begin showing off their new systems, which will probably be much better than what the Wii U is boasting. Wii U will likely be halfway between PS360 and PS4/whatever in the best case scenario, less than that in the likely scenario.
 
The previously leaked specs seem similar but with more info:

Main Application Processor

PowerPC architecture.
Three cores (fully coherent).
3MB aggregate L2 Cache size.
core 0: 512 KB
core 1: 2048 KB
core 2: 512 KB
Write gatherer per core.
Locked (L1d) cache DMA per core.

Main Memory

Up to 3GB of main memory (CAT-DEVs only). Note: retail machine will have half devkit memory
Please note that the quantity of memory available from the Cafe SDK and Operating System may vary.

Graphics and Video

Modern unified shader architecture.
32MB high-bandwidth eDRAM, supports 720p 4x MSAA or 1080p rendering in a single pass.
HDMI and component video outputs.

Features

Unified shader architecture executes vertex, geometry, and pixel shaders
Multi-sample anti-aliasing (2, 4, or 8 samples per pixel)
Read from multi-sample surfaces in the shader
128-bit floating point HDR texture filtering
High resolution texture support (up to 8192 x 8192)
Indexed cube map arrays

8 render targets
Independent blend modes per render target
Pixel coverage sample masking
Hierarchical Z/stencil buffer
Early Z test and Fast Z Clear
Lossless Z & stencil compression
2x/4x/8x/16x high quality adaptive anisotropic filtering modes
sRGB filtering (gamma/degamma)
Tessellation unit
Stream out support
Compute shader support

GX2 is a 3D graphics API for the Nintendo Wii U system (also known as Cafe). The API is designed to be as efficient as GX(1) from the Nintendo GameCube and Wii systems. Current features are modeled after OpenGL and the AMD r7xx series of graphics processors. Wii U’s graphics processor is referred to as GPU7.

Sound and Audio

Dedicated 120MHz audio DSP.
Support for 6 channel discrete uncompressed audio (via HDMI).
2 channel audio for the Cafe DRC controller.
Monaural audio for the Cafe Remote controller.

Networking

802.11 b/g/n Wifi.

Peripherals

2 x USB 2.0 host controllers x 2 ports each.
SDCard Slot.

Built-in Storage

512MB SLC NAND for System.
8GB MLC NAND for Applications.

Host PC Bridge

Dedicated Cafe-to-host PC bridge hardware.
Allows File System emulation by host PC.
Provides interface for debugger and logging to host PC.

The old GPU info matches what was revealed today.
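The "720p 4x MSAA or 1080p in a single pass" line is actually easy to sanity-check against the 32 MB eDRAM figure. A rough framebuffer-size calculation (assuming 32-bit color plus 32-bit depth/stencil per sample, which is my assumption, not stated in the leak):

```python
# Does a 720p 4x MSAA framebuffer fit in 32 MB of eDRAM?
# Assumes 4 bytes of color and 4 bytes of depth/stencil per sample.

def framebuffer_bytes(width, height, samples, bytes_color=4, bytes_depth=4):
    return width * height * samples * (bytes_color + bytes_depth)

fb_720p_4x  = framebuffer_bytes(1280, 720, 4)   # ~28.1 MiB
fb_1080p_1x = framebuffer_bytes(1920, 1080, 1)  # ~15.8 MiB
edram = 32 * 2**20

print(fb_720p_4x <= edram, fb_1080p_1x <= edram)  # prints "True True"
```

So under those assumptions both claimed modes squeeze into the eDRAM without tiling, unlike the 360's 10 MB, which had to tile for 720p with MSAA.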
 
Honestly, taking a design like that and slapping more of it together competently would be much harder than just stripping out what you don't view as necessary from something newer, achieving a similar footprint in the process.

I'm skeptical that it is literally three Broadways because of that fact alone. Though Nintendo is strange enough to try it.

The maximum clock of the Wii's Broadway is 1.1 GHz. The Wii U's Broadway will certainly be upgraded to be capable of more than that; it can't just be three of them unmodified.
 
Assuming that Wii U is still 2-3X more powerful than 360/PS3, and PS4/720 is 10X more powerful than the 360.

PS3/360 = SSJ1 Trunks - 14,000,000/SSJ Vegeta - 16,500,000
Wii U = Android Sixteen - 32,000,000 to Imperfect Cell - 47,000,000
PS4/720 = Perfect Cell - 145,000,000

If PS4/720 ends up being more than 10X as powerful as the 360, then I'm assuming 11.25-14X more power at most, so it could be Full Power Cell - 180,000,000 to Super Perfect Cell - 225,000,000.

No way would the PS4/720 be 20X as powerful as the 360 (which would be SSJ2 Gohan - 300,000,000).

looooool .-.

Wii U's superiority will be most apparent when you keep resolution and frame rate on par with current systems and use the extra grunt for more details and better textures. Once you start increasing resolution or frame rate, which for most plebs borders on unnoticeable, the graphic advantages start to fade away. And god forbid you try to run a concurrent scene on the pad.
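A quick pixel-count check backs this up (the 2-3x GPU figure is this thread's speculation, not a confirmed spec):

```python
# Bumping 720p to 1080p alone costs 2.25x the pixel work, which eats most
# of a hypothetical 2-3x GPU advantage before any extra detail is drawn.

pixels_720p = 1280 * 720
pixels_1080p = 1920 * 1080

res_cost = pixels_1080p / pixels_720p
print(res_cost)  # prints 2.25

# Headroom left over for "more details and better textures" if the GPU
# is 2x or 3x the old one and resolution goes up anyway:
for gpu_factor in (2.0, 3.0):
    print(gpu_factor, round(gpu_factor / res_cost, 2))
```

Which is exactly the post's point: hold resolution at 720p and the whole 2-3x budget goes to detail; chase 1080p and almost nothing is left, before even rendering a second scene on the pad.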
 
https://mobile.twitter.com/IBMWatson/status/240241146213842944

Seriously you guys need to stop hanging on every useless PR line.

The Wii U can share design characteristics with Power7 and that tweet of yours could still apply. That doesn't mean there's a Power7 in the system or that the CPU is as powerful as a Power7.

I'm not familiar with the technical specifications, but couldn't this apply to the CPU description as having Broadway cores as well?

The fact that we are even debating this proves that Nintendo has not done enough to make this console clearly more powerful than the 360 or PS3.

This is my main concern. I know the hardware is fine for Nintendo's goal, and I know I'm content with the way games look now. However, it's disheartening that the comparisons to current generation consoles are even being made. With usual console launches, a big power increase should be expected. I'm not talking about third party support, though.
 
The fact that we are even debating this proves that Nintendo has not done enough to make this console clearly more powerful than the 360 or PS3.

It's gonna be a Wii situation all over again as 3rd parties abandon the Wii U in favour of MS/Sony/PC.


The world has changed a lot since last gen, so I wouldn't be too worried about the Wii U getting abandoned for the PS4/Xbox Next & PC, because there will be so many devs making games for PS3, Xbox 360, Vita, tablets & smartphones that the Wii U will not get left behind.
 
The world has changed a lot since last gen, so I wouldn't be too worried about the Wii U getting abandoned for the PS4/Xbox Next & PC, because there will be so many devs making games for PS3, Xbox 360, Vita, tablets & smartphones that the Wii U will not get left behind.

I'm confused. So which of the bolded is it?
 
The previously leaked specs seem similar but with more info:

"Main Application Processor

PowerPC architecture.
Three cores (fully coherent).
3MB aggregate L2 Cache size.
core 0: 512 KB
core 1: 2048 KB
core 2: 512 KB
Write gatherer per core.
Locked (L1d) cache DMA per core.

Main Memory

Up to 3GB of main memory (CAT-DEVs only). Note: retail machine will have half devkit memory
Please note that the quantity of memory available from the Cafe SDK and Operating System may vary.

Graphics and Video

Modern unified shader architecture.
32MB high-bandwidth eDRAM, supports 720p 4x MSAA or 1080p rendering in a single pass.
HDMI and component video outputs.

Features

Unified shader architecture executes vertex, geometry, and pixel shaders
Multi-sample anti-aliasing (2, 4, or 8 samples per pixel)
Read from multi-sample surfaces in the shader
128-bit floating point HDR texture filtering
High resolution texture support (up to 8192 x 8192)
Indexed cube map arrays

8 render targets
Independent blend modes per render target
Pixel coverage sample masking
Hierarchical Z/stencil buffer
Early Z test and Fast Z Clear
Lossless Z & stencil compression
2x/4x/8x/16x high quality adaptive anisotropic filtering modes
sRGB filtering (gamma/degamma)
Tessellation unit
Stream out support
Compute shader support

GX2 is a 3D graphics API for the Nintendo Wii U system (also known as Cafe). The API is designed to be as efficient as GX(1) from the Nintendo GameCube and Wii systems. Current features are modeled after OpenGL and the AMD r7xx series of graphics processors. Wii U’s graphics processor is referred to as GPU7.

Sound and Audio

Dedicated 120MHz audio DSP.
Support for 6 channel discrete uncompressed audio (via HDMI).
2 channel audio for the Cafe DRC controller.
Monaural audio for the Cafe Remote controller.

Networking

802.11 b/g/n Wifi.

Peripherals

2 x USB 2.0 host controllers x 2 ports each.
SDCard Slot.

Built-in Storage

512MB SLC NAND for System.
8GB MLC NAND for Applications.

Host PC Bridge

Dedicated Cafe-to-host PC bridge hardware.
Allows File System emulation by host PC.
Provides interface for debugger and logging to host PC."

pretty much
 
Pretty disappointing specs, as seems to be the new normal from Nintendo.

Since Nintendo has to be so fucking cheap with everything and pass all the costs on to its users, if I were serious about this system, just to get the same functionality as a PS3 I would be required to buy:

Ethernet adapter: My house is older with plaster walls. There is what would be akin to chicken wire under that plaster causing signal interference. I barely get a phone signal inside the house due to this. I have run cat5e cable throughout the house. So Ethernet is my best and really only option.

Portable hard drive/USB key: I have bought far more than 8 GB of PSN titles, so 8 GB isn't going to be enough. I don't have a device on hand that would be useful, so an additional purchase is required.

HDMI audio receiver: since there will be no optical support, it is HDMI LPCM 6.1 only. I have an older-model receiver that is still in perfect functioning condition. This is the first device I have seen that doesn't support some sort of TOSLINK or S/PDIF output. My current consoles, PC, and cable box all support at least some multichannel alternative; even bargain-basement Walmart Blu-ray players support them. It will be interesting to see if PS4/Durango support optical out or HDMI only. I bet the PS4 supports it.

This system is looking shittier and shittier, and I'm not even talking graphics, I am talking ease of use. All of this could have been built into the system at an insanely cheap price. Even the welfare Walmart Blu-ray player for $50 can afford an Ethernet port and a TOSLINK output. I think I'll wait till the thing drops to under $150-200, or until Mario/Zelda/Metroid/SSB/Mario Kart are all out, before I even consider this thing.
 
The world has changed a lot since last gen, so I wouldn't be too worried about the Wii U getting abandoned for the PS4/Xbox Next & PC, because there will be so many devs making games for PS3, Xbox 360, Vita, tablets & smartphones that the Wii U will not get left behind.

lol that's a pretty unintentionally insulting post. "Don't worry, Wii U will still have bargain bin games."
 
Wii U's superiority will be most apparent when you keep resolution and frame rate on par with current systems and use the extra grunt for more details and better textures. Once you start increasing resolution or frame rate, which for most plebs borders on unnoticeable, the graphic advantages start to fade away. And god forbid you try to run a concurrent scene on the pad.

Yeah, I know. 2-3X the power in GPU and RAM isn't a huge difference, but it's notable.

Really reminds me of last gen vs. Wii, tbqh.
 
Don't come at me with pitchforks in unison, GAF, but wasn't there a recent re-evaluation of the Wii's Broadway CPU after IBM released a 600-something page document about it recently? Am I delusional, or did I read that the CPU in the Wii had a surprising amount of clout for its low clock speed and single-core design?

Not that GAF should be seriously treated as a credible source under any circumstance, but I remember reading a post on here stating that Xenon was only about 20% faster (in real-world performance terms) than Broadway running highly optimized code.

Three Broadway cores with greater cache, a few additional features, and higher clock speeds should logically surpass 360 CPU performance figures by a comfortable margin. I'm also aware, though, that developers weren't exactly enamored with the CPU this gen, probably because it operated differently. Maybe this accounts for some of the disenchantment over the CPU and some pretty underwhelming third-party ports?

I refuse to believe that amount of power was possible in 1999. It's not like the GameCube/Wii are render farms, so the tech isn't passive.
 
Wii U's superiority will be most apparent when you keep resolution and frame rate on par with current systems and use the extra grunt for more details and better textures. Once you start increasing resolution or frame rate, which for most plebs borders on unnoticeable, the graphic advantages start to fade away. And god forbid you try to run a concurrent scene on the pad.

I will yet again point someone to the E3 Japanese Garden floor demo.

That looks like full-scene rendering on the TV and GamePad to me.
 
Yeah, I know. 2-3X the power in GPU and RAM isn't a huge difference, but it's notable.

Really reminds me of last gen vs. Wii, tbqh.

If handled smartly, the Wii U will eventually dish out some fine graphics. But I get the feeling it will be left out in the cold and down ports from PS4 and Durango will not be flattering.
 
I'm not reading through all the posts, but to reply to the first page...

Xbox 360 = SSJ Goku (post 3 year android prep time)(Pre HBTC)
PS3 = SSJ Vegeta (Pre HBTC)
Wii U = Piccolo fused with Kami
PS4/Durango = Ascended/Ultra Super Saiyan Vegeta

Serious Business.
 
Power7 is really powerful (no pun intended); the benchmarks we have got for the Wii U show it only being about half the speed per core of an AMD Stars core @ 2.4 GHz.

A Power7 would have to be running at a very low clock speed to lose to a Stars core.

That, and the fact that if the Wii U CPU is based off Broadway, then it cannot also be Power7.
 
HDMI audio receiver since there will be no optical support, it is HDMI LPCM 6.1 only.

Ugh, tell me about it. I don't understand why this is such a hard thing to do, considering that many audio devices still interface with it. For a company that prides itself on not being on the cutting edge of technology, it's strange that they would actually bottleneck consumers with HDMI audio only.
 
Totally agree. A "next gen" console should be something that blows you away immediately with things that you couldn't imagine on previous consoles.

A HD Wii is nice, but other than the prospect of playing more awesome Nintendo games, it doesn't really do that for me.

You guys do know the difference between reality and what you "want", right? There are plenty of "shoulds" in this world that you can dwell on and only end up upset. I'd rather enjoy myself.
 
pretty much

With the CPU being an evolution of the Gecko/Broadway CPUs and the GPU's 3D graphics API (GX2) being an extension of the GX1 API, it's just like I said earlier: Nintendo made it so that anybody who developed on the GameCube or Wii will feel in many ways right at home with this system. That's a brilliant design imo, from what I can understand.
 