Rumor: Wii U final specs

Crytek is primarily a middleware provider. They ported the engine over, so they do in fact develop for Wii U.

I don't think they ported the engine themselves; I think it was only the licensee that is using it for its game.

As far as I know (as of June, so it could change of course, but I doubt it) there is no official Crytek support for Wii U.
 
The Halo thread is doing Nintendo a disservice. Now we are probably all expecting the next Star Fox or Metroid on Wii U to look like that.
 
The Halo thread is doing Nintendo a disservice. Now we are probably all expecting the next Star Fox or Metroid on Wii U to look like that.

Theoretically if the Wii U has a better GPU (it seems) it should be able to look better.

Of course, Halo 4 is year 7 of the 360.

Which probably goes to highlight another of Nintendo's problems. Fully mature, fully tapped PS360 boxes with great libraries and console-pushing games, that might not look any lesser than the Wii U's best first efforts.

Kind of reminds me of back when the Atari Jaguar first came out, and often the latest SNES games looked better lol (for a variety of reasons of course).
 
Theoretically if the Wii U has a better GPU (it seems) it should be able to look better.

Of course, Halo 4 is year 7 of the 360.

Which probably goes to highlight another of Nintendo's problems. Fully mature, fully tapped PS360 boxes with great libraries and console-pushing games, that might not look any lesser than the Wii U's best first efforts.

Kind of reminds me of back when the Atari Jaguar first came out, and often the latest SNES games looked better lol (for a variety of reasons of course).

Fortunately Nintendo have Retro to push the system. Microsoft have been without a Retro or a Naughty Dog to really push their hardware ever since Bizarre Creations went off to Activision. 343 is finally stepping into that role. MS exclusives haven't blown away third-party titles since PGR4. It's an area where they've really paled in comparison to Sony's exclusive devs.

At least we know Nintendo have Retro and EAD Tokyo to push the hardware and that they're already likely working on games with impressive graphics, even if we haven't seen them yet.
 
Which probably goes to highlight another of Nintendo's problems. Fully mature, fully tapped PS360 boxes with great libraries and console-pushing games, that might not look any lesser than the Wii U's best first efforts.

This has never been a problem for Nintendo. You're forgetting that the massive market they have doesn't give a toss about graphics in the sense that we do. A new Mario game looking amazing and charming sticks in the average person's mind much better than buttloads of polygons.
 
So Crytek Metroid still not confirmed yet?

It's actually Crytek Animal Crossing. You can pull the most realistic-looking weeds and dig holes so detailed you can see the nutrients flowing through the soil. There's also new tree-bumping technology with realistic motion when you give them a shake.

Can't wait to catch a Coelacanth. I'll never donate it to the museum. I'll just stare at it in my mortgaged house.
 
Theoretically if the Wii U has a better GPU (it seems) it should be able to look better.

Of course, Halo 4 is year 7 of the 360.

Which probably goes to highlight another of Nintendo's problems. Fully mature, fully tapped PS360 boxes with great libraries and console-pushing games, that might not look any lesser than the Wii U's best first efforts.

Kind of reminds me of back when the Atari Jaguar first came out, and often the latest SNES games looked better lol (for a variety of reasons of course).

Yup, but it's always the same for any platform holder for any new console. The 360 was slated for a fair few titles looking 'last gen' at the beginning of its life, as was the PS3. It always happens but whenever a new console is out everyone forgets that just about every console in every generation is pretty much the same: rushed ports, rushed exclusives and only a few standout titles looking truly 'next gen'. And of those few standout titles you still get fanboys saying that those graphics can be done on Y machine from the previous gen.

ZombiU is a prime example: the lighting, shadows, radiosity, anisotropic filtering and AA are beyond anything the PS3 and 360 are capable of, and the vast majority of development has been done on underpowered/unfinished dev kits.

And you'll still have fanboys being in denial when the PS3 and 360 SKUs of Black Ops 2 are in sub-HD as per usual and the U SKU is in HD (although I'm pretty sure it won't be in 1080p native like some people are suggesting).
 
And you'll still have fanboys being in denial when the PS3 and 360 SKUs of Black Ops 2 are in sub-HD as per usual and the U SKU is in HD (although I'm pretty sure it won't be in 1080p native like some people are suggesting).

There have already been people saying the Wii U version of BLOPs looks worse than the other versions.
 
There have already been people saying the Wii U version of BLOPs looks worse than the other versions.

I'm not sure whether what you say is true, but just because it's running at a higher resolution doesn't mean the game is going to look better overall. Resolution is only one aspect of a game's graphical makeup. The COD engine has been optimized for PS3/360 over the last 5 years, and I wouldn't expect them to port it over to Wii U and completely optimize it in under a year. The Wii U has a completely different architecture. We could easily see pared-down alpha, reduced effects, or a less steady framerate.
 
There have already been people saying the Wii U version of BLOPs looks worse than the other versions.

Even though it's able to run another player's game on the GamePad simultaneously? Seems to me it probably takes a graphical hit while in that mode and some fanboy has chosen to pick up on it. If the graphics were already worse than on PS3/360, then it seems unlikely that they would have added that mode in.
 
Metroid--- -------Halo 4 or ME3 quality graphics at minimum, ray tracing and refractions on Samus's suit
Legend of Zelda--The Witcher 2 caliber graphics or better, except in the series trademark cartoony art style.
Mario Kart 8------ModNation Racers or LBP Karting level of detail with smooth framerates, a better sense of speed and support for 6 players (4 people with wiimotes share the TV while the other 2 use Gamepads).


Is this expecting too much or selling Nintendo short?
 
Huh? It is a frame buffer pool, Halo 3 used two frame buffers to produce its HDR. I don't think it is legit to say the eDRAM had special features for this.

Well, Xenos has a special HDR FP10 mode which requires substantially less performance from the ROPs than FP16, making its use much more feasible in terms of performance. Since the ROPs sit on the eDRAM, in the end it is only thanks to the huge bandwidth of the eDRAM that all these framebuffer-intensive effects can be done, and thus by having 32MB of eDRAM the Wii U will be able to do so much more in this regard.
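As a back-of-the-envelope sketch of why the FP10 format and eDRAM size matter (the exact packing, depth format and tiling overheads are assumptions here, not confirmed specs):

```python
# Rough 1280x720 framebuffer size math (illustrative assumptions):
# FP10 packs R10G10B10A2 into 32 bits = 4 bytes/pixel;
# FP16 RGBA needs 64 bits = 8 bytes/pixel;
# depth/stencil assumed as 24-bit depth + 8-bit stencil = 4 bytes/pixel.
W, H = 1280, 720
MB = 1024 * 1024

color_fp10 = W * H * 4 / MB   # ~3.5 MB color buffer
color_fp16 = W * H * 8 / MB   # ~7.0 MB color buffer
depth_24_8 = W * H * 4 / MB   # ~3.5 MB depth/stencil

print(f"FP10 color + depth: {color_fp10 + depth_24_8:.1f} MB")  # ~7.0 MB
print(f"FP16 color + depth: {color_fp16 + depth_24_8:.1f} MB")  # ~10.5 MB
```

Under these assumptions an FP16 720p target already overflows Xenos's 10MB of eDRAM (forcing tiling), while a 32MB pool would hold it with room to spare.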
 
Not sure if anyone has mentioned this or not, so I'll post here.

It seems IBM has confirmed, according to this tweet, that the CPU in the Wii U is not based on IBM's Power7 chip, as previously believed, but rather is a Power-based CPU.

Now I have no earthly idea whether that is good or bad, but at least we have some solid IBM info that might help or hurt in the Wii U spec wars.

The all-new, Power-based microprocessor will pack some of IBM's most advanced technology into an energy-saving silicon package that will power Nintendo's brand new entertainment experience for consumers worldwide. IBM's unique embedded DRAM, for example, is capable of feeding the multi-core processor large chunks of data to make for a smooth entertainment experience.

http://www-03.ibm.com/press/us/en/pressrelease/34683.wss
 
Not sure if anyone has mentioned this or not, so I'll post here.

It seems IBM has confirmed, according to this tweet, that the CPU in the Wii U is not based on IBM's Power7 chip, as previously believed, but rather is a Power-based CPU.

Now I have no earthly idea whether that is good or bad, but at least we have some solid IBM info that might help or hurt in the Wii U spec wars.



http://www-03.ibm.com/press/us/en/pressrelease/34683.wss

Been posted.
 
Not sure if anyone has mentioned this or not, so I'll post here.

It seems IBM has confirmed, according to this tweet, that the CPU in the Wii U is not based on IBM's Power7 chip, as previously believed, but rather is a Power-based CPU.

Now I have no earthly idea whether that is good or bad, but at least we have some solid IBM info that might help or hurt in the Wii U spec wars.



http://www-03.ibm.com/press/us/en/pressrelease/34683.wss

Umm... this was posted over a year ago. In fact the article is from June 2011.
 
Metroid--- -------Halo 4 or ME3 quality graphics at minimum, ray tracing and refractions on Samus's suit
Legend of Zelda--The Witcher 2 caliber graphics or better, except in the series trademark cartoony art style.
Mario Kart 8------ModNation Racers or LBP Karting level of detail with smooth framerates, a better sense of speed and support for 6 players (4 people with wiimotes share the TV while the other 2 use Gamepads).


Is this expecting too much or selling Nintendo short?

I have a feeling it will be a 2D Metroid. I don't think Retro is making a first-person Metroid, or any Metroid for that matter; I am still guessing it will be a new IP.
 
I have a feeling it will be a 2D Metroid. I don't think Retro is making a first-person Metroid, or any Metroid for that matter; I am still guessing it will be a new IP.

That would totally suck.

I want a 3D sequel to Metroid Fusion. Make it just as desolate (no computer buddy!), and introduce enemies that can only be seen directly by holding up your Wii Pad (it's your special visor).
 
Well, Xenos has a special HDR FP10 mode which requires substantially less performance from the ROPs than FP16, making its use much more feasible in terms of performance. Since the ROPs sit on the eDRAM, in the end it is only thanks to the huge bandwidth of the eDRAM that all these framebuffer-intensive effects can be done, and thus by having 32MB of eDRAM the Wii U will be able to do so much more in this regard.
I don't think Xenos' ROPs could blend fp formats other than fp10 at all, IIRC. Or use them with MSAA, for that matter.
 
So, I've been thinking about RAM in the context of the other components in the Wii U, and I'm thinking that Samsung's 20nm DDR3 chips might be what Nintendo's going with. They've been manufacturing them in 4Gb densities since late last year, and it's made on a much smaller process than any other manufacturer uses, which means much lower energy consumption and heat. Samsung advertise that 4GB in a laptop consumes just 0.97W at 800MHz (1600MT/s), and the RAM is rated up to 1067MHz (2133MT/s), although obviously with higher power usage. This means, given a 128 bit bus, Nintendo could run it at 800MHz to give 25.6GB/s bandwidth in under half a watt, with potential to go as far as 34.1GB/s at 1067MHz at no more than a couple of watts. Seems perfect for Nintendo's need of low power without sacrificing performance.

However, I can't find out if they're available with 32-bit interfaces or only 16-bit, which would be necessary if they want a 128-bit interface with only 4 chips.
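The bandwidth figures above follow directly from the bus width and transfer rate, and the chip-count question is the same arithmetic (a quick sketch; all values are the post's own assumptions, not confirmed specs):

```python
# DDR3 bandwidth = (bus width in bytes) x (transfers per second).
# DDR transfers twice per clock, so 800 MHz -> 1600 MT/s.
def ddr3_bandwidth_gb_s(bus_bits, clock_mhz):
    transfers_per_s = clock_mhz * 2 * 1e6  # double data rate
    return bus_bits / 8 * transfers_per_s / 1e9

print(ddr3_bandwidth_gb_s(128, 800))   # 25.6 GB/s
print(ddr3_bandwidth_gb_s(128, 1067))  # ~34.1 GB/s

# Chips needed for a 128-bit bus at two possible per-chip widths:
print(128 // 32)  # 4 chips with 32-bit interfaces
print(128 // 16)  # 8 chips with 16-bit interfaces
```

Hence the interface-width question: x32 parts would get to 128 bits with only 4 chips, while x16 parts would need 8.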

As a side note, here's a thought on clock speeds. Assuming a 120MHz DSP:
CPU: 1.92GHz
GPU: 480MHz
RAM: 960MHz
It'd be possible with the aforementioned RAM, it would fit Nintendo's apparent desire for clocks which are clean multiples, and (assuming 640 SPUs) it would just exceed the 600GFlops which people seem to be focussing on.
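The "clean multiples" idea and the resulting throughput can be checked in a couple of lines (the 120MHz base, 640 SPUs and 2 ops/cycle are all this post's assumptions):

```python
# Check that each hypothesized clock is a clean multiple of the 120 MHz DSP.
BASE = 120  # MHz, assumed DSP clock
clocks = {"CPU": 1920, "GPU": 480, "RAM": 960}
for name, mhz in clocks.items():
    assert mhz % BASE == 0, f"{name} is not a multiple of {BASE} MHz"

# GFLOPS = SPUs x 2 ops/cycle (multiply-add) x GPU clock.
spus = 640
gflops = spus * 2 * clocks["GPU"] * 1e6 / 1e9
print(gflops)  # ~614.4, just over the oft-quoted 600 GFLOPS
```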

Edit:
Does IBM even make anything that's not Power right now?

It's as empty of a statement as empty statements get.

Just being pedantic, but they also do a line of CISC chips called the z/Architecture for mainframes.
 
So, I've been thinking about RAM in the context of the other components in the Wii U, and I'm thinking that Samsung's 20nm DDR3 chips might be what Nintendo's going with. They've been manufacturing them in 4Gb densities since late last year, and it's made on a much smaller process than any other manufacturer uses, which means much lower energy consumption and heat. Samsung advertise that 4GB in a laptop consumes just 0.97W at 800MHz (1600MT/s), and the RAM is rated up to 1067MHz (2133MT/s), although obviously with higher power usage. This means, given a 128 bit bus, Nintendo could run it at 800MHz to give 25.6GB/s bandwidth in under half a watt, with potential to go as far as 34.1GB/s at 1067MHz at no more than a couple of watts. Seems perfect for Nintendo's need of low power without sacrificing performance.

However, I can't find out if they're available with 32-bit interfaces or only 16-bit, which would be necessary if they want a 128-bit interface with only 4 chips.

As a side note, here's a thought on cock speeds. Assuming a 120MHz DSP:
CPU: 1.92GHz
GPU: 480MHz
RAM: 960MHz
It'd be possible with the aforementioned RAM, it would fit Nintendo's apparent desire for clocks which are clean multiples, and (assuming 640 SPUs) it would just exceed the 600GFlops which people seem to be focussing on.

0_o
 
It could do MSAA fine. It couldn't do blending with FP16 though.
Right, my bad - it couldn't downsample non-blendable formats. Had to look that up through posts on b3d, though, as I don't have Xenos specs here, and the descendants are a tad different from Xenos where it comes to render targets.
 
So, I've been thinking about RAM in the context of the other components in the Wii U, and I'm thinking that Samsung's 20nm DDR3 chips might be what Nintendo's going with. They've been manufacturing them in 4Gb densities since late last year, and it's made on a much smaller process than any other manufacturer uses, which means much lower energy consumption and heat. Samsung advertise that 4GB in a laptop consumes just 0.97W at 800MHz (1600MT/s), and the RAM is rated up to 1067MHz (2133MT/s), although obviously with higher power usage. This means, given a 128 bit bus, Nintendo could run it at 800MHz to give 25.6GB/s bandwidth in under half a watt, with potential to go as far as 34.1GB/s at 1067MHz at no more than a couple of watts. Seems perfect for Nintendo's need of low power without sacrificing performance.

However, I can't find out if they're available with 32-bit interfaces or only 16-bit, which would be necessary if they want a 128-bit interface with only 4 chips.

As a side note, here's a thought on clock speeds. Assuming a 120MHz DSP:
CPU: 1.92GHz
GPU: 480MHz
RAM: 960MHz
It'd be possible with the aforementioned RAM, it would fit Nintendo's apparent desire for clocks which are clean multiples, and (assuming 640 SPUs) it would just exceed the 600GFlops which people seem to be focussing on.

Edit:


Just being pedantic, but they also do a line of CISC chips called the z/Architecture for mainframes.

Still trying to take my hypotheses I see. >_>


XD
 
Still trying to take my hypotheses I see. >_>

Well, I suppose I'll grant you prior art on that one, although once you assume a 120MHz base the options are pretty limited for each component. Anyway, my main point was really about the Samsung 20nm DDR3, which could reach these sorts of speeds in a very limited power and heat budget.
 
Metroid--- -------Halo 4 or ME3 quality graphics at minimum, ray tracing and refractions on Samus's suit
Legend of Zelda--The Witcher 2 caliber graphics or better, except in the series trademark cartoony art style.
Mario Kart 8------ModNation Racers or LBP Karting level of detail with smooth framerates, a better sense of speed and support for 6 players (4 people with wiimotes share the TV while the other 2 use Gamepads).


Is this expecting too much or selling Nintendo short?
Ray tracing [at the moment] is too demanding for any game wanting more than 1fps with respectable detail.

And I expect Nintendo to do far better when they have 4x the RAM at their disposal (with 2x available for games).
 
Well, I suppose I'll grant you prior art on that one, although once you assume a 120MHz base the options are pretty limited for each component. Anyway, my main point was really about the Samsung 20nm DDR3, which could reach these sorts of speeds in a very limited power and heat budget.

Just messing with you because of Fourth Storm's post. But yeah, assuming clock multiples, the options are limited. Though it would be interesting if they did take the GPU to 600MHz (which I don't expect); I could see them going with 900MHz memory and the CPU at 1800MHz.

And considering the TDP for the 30nm version would have been great, I can only imagine the benefit of the 20nm version... unless you have a link.
 
Just messing with you because of Fourth Storm's post. But yeah, assuming clock multiples, the options are limited. Though it would be interesting if they did take the GPU to 600MHz (which I don't expect); I could see them going with 900MHz memory and the CPU at 1800MHz.

And considering the TDP for the 30nm version would have been great, I can only imagine the benefit of the 20nm version... unless you have a link.

Yeah, the 640 SPUs at 600MHz was my prediction from way back when, and would make most sense at 1800/600/900. That said, 480MHz is probably more realistic given the system wattage we're dealing with now. I also just like the x4/x8/x16 multipliers for being nice and neat, although it's probably going beyond "clean multiples" and into "obsessive compulsive".

Regarding the Samsung 20nm, there are a couple of their PDFs here and here, mostly dealing with server use, which is their main customer base for it at the moment.
 
For some reason I'm expecting something like 2.4-2.6GHz or something like that for the CPU's raw clock speed. I'm not basing that off of much though.
 
Yeah, the 640 SPUs at 600MHz was my prediction from way back when, and would make most sense at 1800/600/900. That said, 480MHz is probably more realistic given the system wattage we're dealing with now. I also just like the x4/x8/x16 multipliers for being nice and neat, although it's probably going beyond "clean multiples" and into "obsessive compulsive".

Regarding the Samsung 20nm, there are a couple of their PDFs here and here, mostly dealing with server use, which is their main customer base for it at the moment.

OCD is right up Nintendo's alley for something like that. Thanks for the links. In the 30nm PDF (bottom of page 3) it mentions that 4GB of memory using 2Gb chips (speed not mentioned) draws 1.22W. So I'd assume we're easily looking at less than 1W for what the Wii U would use.

For some reason I'm expecting something like 2.4-2.6GHz or something like that for the CPU's raw clock speed. I'm not basing that off of much though.

2.4GHz would be the choice based on the multiples.
 
OCD is right up Nintendo's alley for something like that. Thanks for the links. In the 30nm PDF (bottom of page 3) it mentions that 4GB of memory using 2Gb chips (speed not mentioned) draws 1.22W. So I'd assume we're easily looking at less than 1W for what the Wii U would use.

On this page they give the 0.97W for a 20nm 4GB module that I used before. In both cases it's at 1.35V. Voltage matters more for power consumption than clock speed, although they do point out that speeds >800MHz aren't achievable at 1.35V. In any case for 2GB I don't see consumption increasing much beyond a watt even if you're breaking the 1GHz barrier.

Of course, given how small the numbers involved are, it's entirely possible that the 30nm chips would fit the bill and be chosen instead for budgetary reasons.
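The voltage-versus-clock point can be made concrete with the standard first-order CMOS dynamic-power relation, P ≈ C·V²·f (a rough model only; it ignores leakage and I/O power, and the 1.50V figure below is purely illustrative):

```python
# Relative dynamic power versus a 1.35 V / 800 MHz baseline: P ~ C * V^2 * f.
def rel_power(v, f, v0=1.35, f0=800):
    return (v / v0) ** 2 * (f / f0)

print(rel_power(1.35, 1067))  # ~1.33x: frequency alone scales linearly
print(rel_power(1.50, 1067))  # ~1.65x: a voltage bump costs quadratically
```

This is why voltage matters more than clock: going past 800MHz at the same 1.35V only adds ~33%, but needing extra voltage to get there multiplies that again by the square of the voltage ratio.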
 
On this page they give the 0.97W for a 20nm 4GB module that I used before. In both cases it's at 1.35V. Voltage matters more for power consumption than clock speed, although they do point out that speeds >800MHz aren't achievable at 1.35V. In any case for 2GB I don't see consumption increasing much beyond a watt even if you're breaking the 1GHz barrier.

Of course, given how small the numbers involved are, it's entirely possible that the 30nm chips would fit the bill and be chosen instead for budgetary reasons.

All good, more juice for the GPU! :)
 