Rumor: Wii U final specs

The Wii had slightly over double the RAM of the GC, yet I can't really think of any games that provided significantly better quality textures and such.

The same will most likely be the case for the Wii U. :/

Well I don't know about textures specifically, but games like Monster Hunter and Xenoblade look amazing, and far superior to anything that came out on the GC.

If every Wii U game looked as good as Xenoblade I'd be pretty happy.

Edit: except for the face textures, those are the stuff of nightmares
 
Sort of kind of not the same thing, sort of.


It's all wishy-washy until someone gets their hands on the thing and takes it apart.

It's basically a Blu-ray drive and discs, modified. Nintendo should offer the software/licence/codec as a paid downloadable app. Someone told me once that you can get away with cheaper drives which don't require constant disc spinning, à la movie discs, soooo that might be an eventual factor too.
 
dude you are spreading so much baloney.

Also, didn't somebody already get a look at the Wii U PSU? What were the results? I forget...

Aha found it http://www.neogaf.com/forum/showpost.php?p=39306355&postcount=4531

Max draw is 75 watts.

Speculation is all baloney until confirmed, so why so concerned? Also, those are not final units; they could have increased them for all we know. Still, I'm more inclined to believe it is 75 watts with a TDP of 45 watts, and in that case GDDR5 would end up being unnecessary: if the Wii U GPU were clocked under 500MHz, let's say maybe 486MHz, then DDR3 would probably have enough bandwidth. I was speculating on RAM anyway, which led me to putting up a guess of what the CPU and GPU could be.
 
Here's something I didn't think I'd be saying:

Hopefully the Wii U's RAM is all DDR3.

To explain, as far as I'm concerned there are two possible configurations for the RAM in the Wii U:

1GB GDDR5 (games) + 1GB DDR3 (system)
or
2GB DDR3 (shared)

While the first one sounds better, because of the split pools it severely limits Nintendo's ability to free up more memory for games in future, and gives devs a headache if they do. With a common pool they can gradually increase the RAM available to games via firmware updates, in a way that's easy for developers to make use of. It would also have lower latency than the GDDR5, and almost the same bandwidth, as they could go with a single wide bus instead of two narrow ones (although this depends on how much bandwidth the OS is taking up). The console could well have 1.5GB+ available to games within a couple of years.
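As a back-of-envelope illustration of the bus-width point above (all transfer rates and bus widths here are hypothetical round numbers, not confirmed Wii U figures):

```python
# Peak theoretical bandwidth (GB/s) = transfer rate (MT/s) * bus width (bytes) / 1000.
# Everything below is an illustrative guess, not a known spec.

def bandwidth_gbs(transfer_mts, bus_bits):
    """Peak theoretical bandwidth in GB/s for a given memory bus."""
    return transfer_mts * (bus_bits / 8) / 1000

# Hypothetical split pools: 64-bit GDDR5 plus a separate 64-bit DDR3 bus
gddr5_games = bandwidth_gbs(4000, 64)   # 32.0 GB/s for the game pool
ddr3_system = bandwidth_gbs(1600, 64)   # 12.8 GB/s for the OS pool

# Hypothetical unified pool: one wide 128-bit DDR3 bus
ddr3_shared = bandwidth_gbs(1600, 128)  # 25.6 GB/s, shared

print(gddr5_games, ddr3_system, ddr3_shared)
```

With numbers like these, the single wide DDR3 bus lands within sight of the GDDR5 pool's bandwidth, which is the "almost the same bandwidth" claim in the post.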
 
They could have done that while still having a beefy system.
Using what, fairy dust? PS3 and 360 are still loud and bulky after numerous revisions. Enlighten me as to how Nintendo would release a more powerful yet smaller, cooler, quieter, and more reliable console with a smaller footprint for a mass market price.
 
Speculation is all baloney until confirmed, so why so concerned? Also, those are not final units; they could have increased them for all we know. Still, I'm more inclined to believe it is 75 watts with a TDP of 45 watts, and in that case GDDR5 would end up being unnecessary: if the Wii U GPU were clocked under 500MHz, let's say maybe 486MHz, then DDR3 would probably have enough bandwidth. I was speculating on RAM anyway, which led me to putting up a guess of what the CPU and GPU could be.

I suppose it's OK if we let you have that one, since you're making up all your other numbers anyway.
 
They could have done that while still having a beefy system.
Not for ~$250 with a $100+ controller packed in, they couldn't.

There are more things to consider in a real-world market scenario than what hardcore gamers want from a box. I'm not enamored with the specs, but the price is darn near right.
 
Here's something I didn't think I'd be saying:

Hopefully the Wii U's RAM is all DDR3.

To explain, as far as I'm concerned there are two possible configurations for the RAM in the Wii U:

1GB GDDR5 (games) + 1GB DDR3 (system)
or
2GB DDR3 (shared)

While the first one sounds better, because of the split pools it severely limits Nintendo's ability to free up more memory for games in future, and gives devs a headache if they do. With a common pool they can gradually increase the RAM available to games via firmware updates, in a way that's easy for developers to make use of. It would also have lower latency than the GDDR5, and almost the same bandwidth, as they could go with a single wide bus instead of two narrow ones (although this depends on how much bandwidth the OS is taking up). The console could well have 1.5GB+ available to games within a couple of years.

It's highly unlikely it's physically split. It's highly unlikely the GPU and CPU are even separate chips.
 
Here's something I didn't think I'd be saying:

Hopefully the Wii U's RAM is all DDR3.

To explain, as far as I'm concerned there are two possible configurations for the RAM in the Wii U:

1GB GDDR5 (games) + 1GB DDR3 (system)
or
2GB DDR3 (shared)

While the first one sounds better, because of the split pools it severely limits Nintendo's ability to free up more memory for games in future, and gives devs a headache if they do. With a common pool they can gradually increase the RAM available to games via firmware updates, in a way that's easy for developers to make use of. It would also have lower latency than the GDDR5, and almost the same bandwidth, as they could go with a single wide bus instead of two narrow ones (although this depends on how much bandwidth the OS is taking up). The console could well have 1.5GB+ available to games within a couple of years.

This may sound dumb, but can't you have different kinds of RAM and still use them as a unified pool, depending on how you link them?
 
Here's something I didn't think I'd be saying:

Hopefully the Wii U's RAM is all DDR3.

To explain, as far as I'm concerned there are two possible configurations for the RAM in the Wii U:

1GB GDDR5 (games) + 1GB DDR3 (system)
or
2GB DDR3 (shared)

While the first one sounds better, because of the split pools it severely limits Nintendo's ability to free up more memory for games in future, and gives devs a headache if they do. With a common pool they can gradually increase the RAM available to games via firmware updates, in a way that's easy for developers to make use of. It would also have lower latency than the GDDR5, and almost the same bandwidth, as they could go with a single wide bus instead of two narrow ones (although this depends on how much bandwidth the OS is taking up). The console could well have 1.5GB+ available to games within a couple of years.
A single DDR3 pool is better from multiple perspectives. I'm (and have been since WUST1) betting on a single pool.
 
Using what, fairy dust? PS3 and 360 are still loud and bulky after numerous revisions. Enlighten me as to how Nintendo would release a more powerful yet smaller, cooler, quieter, and more reliable console with a smaller footprint for a mass market price.

I...I just wanted to be your friend, GrotesqueBeauty. :(
 
dude you are spreading so much baloney.

Mister "32MB for OS? Not surprised!", ladies and gentlemen.


Anyway, a question for the tech savvy (that excludes you, specialguy): what is the "cost" of using GPGPU functions on the GPU? Are they "free"? Or does it mean less power for actual graphics if a game heavily relies on GPGPU?
 
A single DDR3 pool is better from multiple perspectives. I'm (and have been since WUST1) betting on a single pool.

I've been betting on the same; it just occurred to me that if they were going with GDDR5 at all now, they'd do so as a split pool, given the cost of 2GB of the stuff. Although I'd long expected DDR3, I'm now actually at the point where I'd find it preferable.
 
Mister "32MB for OS? Not surprised!", ladies and gentlemen.


Anyway, a question for the tech savvy (that excludes you, specialguy): what is the "cost" of using GPGPU functions on the GPU? Are they "free"? Or does it mean less power for actual graphics if a game heavily relies on GPGPU?

Sadly I am not savvy but I'm willing to bet a flop used here is a flop which cannot be used there.
 
Mister "32MB for OS? Not surprised!", ladies and gentlemen.


Anyway, a question for the tech savvy (that excludes you, specialguy): what is the "cost" of using GPGPU functions on the GPU? Are they "free"? Or does it mean less power for actual graphics if a game heavily relies on GPGPU?

From what I understand, GPGPU is just using the shader cores on the graphics card to do non-graphics stuff. So yes, if you use it for GPGPU purposes, you'll have fewer cores/less time for shaders etc.
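A toy illustration of that trade-off, with entirely made-up numbers: any slice of the GPU's frame time spent on compute work is a slice unavailable for rendering.

```python
# Toy frame-time budget: GPU time spent on GPGPU work (physics, etc.)
# is GPU time not spent on rendering. All numbers are made up.
frame_budget_ms = 1000 / 60      # ~16.67 ms per frame at 60 fps
gpgpu_ms = 3.0                   # hypothetical simulation pass on the GPU
rendering_ms = frame_budget_ms - gpgpu_ms
print(round(rendering_ms, 2))    # 13.67 ms left for actual graphics
```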
 
Mister "32MB for OS? Not surprised!", ladies and gentlemen.


Anyway, a question for the tech savvy (that excludes you, specialguy): what is the "cost" of using GPGPU functions on the GPU? Are they "free"? Or does it mean less power for actual graphics if a game heavily relies on GPGPU?

The games are never going to use GPGPU. I've never seen a game anywhere use GPGPU (maybe crappy PhysX? which is only visual effects anyway; ugh).

A GPU is for graphics in a game console. If it's doing something else you're doing it wrong. If you want to use GPGPU to encode a video on your PC faster, cool (and they're surprisingly ineffective even at that, offering only quite modest speedups over the CPU given their brute force). Other than that, no.

For Wii U games to look their best they'll just have to work around the (apparently) crap CPU. Luckily, this should be doable in time imo. You'd much rather have a crap CPU than a crap GPU for games.

Mister "32MB for OS? Not surprised!", ladies and gentlemen.

How is guessing a number wrong a bad reflection on me? Hell, it almost makes Nintendo look worse that they had to reserve such a massive amount for the OS, not better. The only silver lining is if they can reclaim some of that for games.
 
Mister "32MB for OS? Not surprised!", ladies and gentlemen.


Anyway, a question for the tech savvy (that excludes you, specialguy): what is the "cost" of using GPGPU functions on the GPU? Are they "free"? Or does it mean less power for actual graphics if a game heavily relies on GPGPU?

You're absolutely divvying up resources.

In fact there might be extra overhead in mixing thread types, depending on the GPU... although I have no idea what the current state of task balancing on GPUs is right now (mixing graphics and GPGPU threads), or how efficient it is.


The games are never going to use GPGPU. I've never seen a game anywhere use GPGPU (maybe crappy PhysX? which is only visual effects anyway; ugh). Give it up.

A GPU is for graphics in a game console. If it's doing something else you're doing it wrong. If you want to use GPGPU to encode a video on your PC faster, cool (and they're surprisingly ineffective even at that). Other than that, no.

Never say never. I think it'll happen if the other systems go (relatively) 'small' on the CPU, and you'll see more middleware allowing use of the GPU for processing. The question for the Wii U is whether it can be performance-competitive then. A shift to GPGPU might not actually benefit the Wii U if the GPU is where it has the bigger performance deficit vs. PS4/720: if 'the game' starts to get tied to GPU power rather than CPU power, performance barriers to porting could become even worse if the other systems have notably larger ex-graphics GPU budgets. Scaling against GPU resources would no longer be 'simply' about scaling graphics down to make a port work.
 
The games are never going to use GPGPU. I've never seen a game anywhere use GPGPU. Give it up.

A GPU is for graphics in a game console. If it's doing something else you're doing it wrong. If you want to use GPGPU to encode a video on your PC faster, cool. Other than that, no.

I find it odd that Iwata would go out of his way to mention the GPGPU if it were completely irrelevant and they didn't have a use for it, especially considering Nintendo isn't exactly spec crazy right now. But that's just me.
 
The games are never going to use GPGPU. I've never seen a game anywhere use GPGPU (maybe crappy PhysX? which is only visual effects anyway; ugh). Give it up.

A GPU is for graphics in a game console. If it's doing something else you're doing it wrong. If you want to use GPGPU to encode a video on your PC faster, cool (and they're surprisingly ineffective even at that). Other than that, no.

And therein lies the magic of consoles. If given a tool, it will probably be used.
 
The games are never going to use GPGPU. I've never seen a game anywhere use GPGPU (maybe crappy PhysX? which is only visual effects anyway; ugh). Give it up.

A GPU is for graphics in a game console. If it's doing something else you're doing it wrong. If you want to use GPGPU to encode a video on your PC faster, cool (and they're surprisingly ineffective even at that). Other than that, no.

Not sure if this matters, but GPGPU is used in a few 3D apps, including Mari, and in Photoshop functions that use 3D rendering. Mostly used to speed up texture rendering, I think.
 
For comparison of console vs cheap desktop ram:

When the GameCube came out, most people were on computers with 100MHz RAM. The GC had 24MB of 324MHz RAM (the Xbox had 64MB of RAM clocked at 200MHz).

When the 360 came out, most people were running around 300MHz RAM. The 360's RAM is around 700MHz.

There's a tradeoff when making a console. You have less RAM than desktops, but that RAM is also much faster and lower latency than desktop RAM for the general market. Add in the fact that you're custom ordering it and it's not like MS, Sony, or Nintendo can just call up NewEgg for 8GB sticks for all consoles.
 
wouldn't DDR3 reduce the effectiveness of the already conservative GPU?
As I'm not sure if I've posted my view on the subject in this thread yet (and I can't be bothered to search), I'll do that now.

The U-GPU already has access to a split-mem architecture: 32MB of eDRAM, and 2GB of main RAM. The first pool likely provides bandwidth in the hundreds of GB/s; the second, a couple of tens of GB/s (i.e. from the mid 20s to the low 30s GB/s). Now, in contrast to the 360, the eDRAM is very likely not a 'mere' framebuffer. That means that relatively small but often-used render targets might not need to be resolved to main RAM - they can sit in eDRAM during their entire lifespan (alternatively, some write-only targets could sit entirely in main RAM, Xenos memexport style). Long story short, on average, main RAM bandwidth will go essentially toward static texture assets and resolved large targets (e.g. deferred shading g-buffers), while all read-modify-write framebuffer bandwidth will be covered by eDRAM. As shown in practice on various platforms (the 360 being a good example), such a split can be very beneficial for balanced performance.
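As a rough sanity check of the idea that often-used render targets could live entirely in eDRAM, here's a quick size calculation (720p and 32-bit formats are assumptions for illustration only):

```python
# Quick check: do typical 720p render targets fit in 32MB of eDRAM?
# Resolution and pixel formats are illustrative assumptions.
def target_mb(width, height, bytes_per_pixel):
    """Size of one render target in mebibytes."""
    return width * height * bytes_per_pixel / (1024 * 1024)

color = target_mb(1280, 720, 4)   # 32-bit color buffer: ~3.5 MB
depth = target_mb(1280, 720, 4)   # 32-bit depth/stencil: ~3.5 MB
print(color + depth)              # ~7 MB: comfortably inside 32 MB,
                                  # leaving room for extra targets
```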
 
How is guessing a number wrong a bad reflection on me? Hell, it almost makes Nintendo look worse that they had to reserve such a massive amount for the OS, not better. The only silver lining is if they can reclaim some of that for games.

Because you always have an answer ready even if you don't have the faintest idea what you're talking about. Also your hindsight is 20/20.

32MB? CALLED IT!
What? 1GB? Incompetent fucks!

Actually, it surprises me you didn't get a funny avatar quote to go along. I mean, so many possibilities.
 
I find it odd that Iwata would go out of his way to mention the GPGPU if it were completely irrelevant and they didn't have a use for it, especially considering Nintendo isn't exactly spec crazy right now. But that's just me.

Maybe whoever wrote his copy thinks GPGPU is just another term for unified shader architecture. Maybe it's just there to blunt concerns about the CPU's power. Point is, we know what GPGPU looks like on DX10.1 level hardware, and it's not particularly useful. It's certainly not a panacea for an underachieving CPU.
 
Because you always have an answer ready even if you don't have the faintest idea what you're talking about. Also your hindsight is 20/20.

32MB? CALLED IT!
What? 1GB? Incompetent fucks!

Actually, it surprises me you didn't get a funny avatar quote to go along. I mean, so many possibilities.
Is this really necessary?
 
Honestly, in spite of all the debate over the horsepower of the system I am glad Nintendo is making it fairly compact and energy efficient. Hopefully it'll run cool and be quiet as a result. Those sorts of things tend to get overlooked in these hardware discussions, but as a gamer they make a big difference in my day to day use of a system. As much as I like my slim PS3's software library, I've been underwhelmed by the loud drive, cracking noises as the case changes temperature, and overall amount of heat it puts off. Even though the Wii is one of Nintendo's lesser hardware designs imo, it still beats the hell out of the competition when it comes to being quiet and unobtrusive.
With that power draw, the lack of a disc tray, no hard drive and a small cooling system? This thing ought to be as quiet as a Nintendo ninja.
 
Iwata was specifically talking about the amount of power the system can draw though, not what the brick is rated for. I thought he made that pretty clear.

Yes, but there are 4 USB ports which will have to supply a maximum of 2.5 watts each, so that would leave 65 watts for the rest of the unit. 5 watts would go to the slimline blue-laser drive at 5x, and another 10 watts to all the wireless, audio, NAND, I/O and fans. This leaves about 50 watts for GPU+RAM and CPU+RAM. I doubt the CPU will be clocked higher than 2.187GHz; it could even be 1.458GHz. Even at 1.458GHz, the "enhanced Broadway cores", I would speculate, might still be faster than the Xenon, but let's hope it is clocked at 2.187GHz. For the GPU I was looking at either 486MHz or 607.5MHz, but 35 watts does not leave too much room to work with for the GPU, nor 10 watts for the CPU, as the RAM would take maybe 10 watts. A 607.5MHz clock is still possible at this much power, though; we need more info.

But this is going off my interpretation of Iwata's mention of the 75 watt power draw, so it could be way off. He also mentioned typical usage would be 45 watts, which is less; with the CPU and GPU at 75% load and one GamePad drawing wireless power, it might use 45 watts.
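For what it's worth, the budget sketched above tallies up like this; every figure is the poster's guess, not a confirmed spec:

```python
# Tallying the guessed power budget from the post above.
# Every figure is speculation, not a confirmed Wii U spec.
max_draw = 75            # rumored maximum draw in watts
usb = 4 * 2.5            # four USB ports at up to 2.5 W each
drive = 5                # slimline optical drive
misc = 10                # wireless, audio, NAND, I/O, fans
remainder = max_draw - usb - drive - misc
print(remainder)         # 50 W left for CPU, GPU and RAM combined
```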

Here's something I didn't think I'd be saying:

Hopefully the Wii U's RAM is all DDR3.

To explain, as far as I'm concerned there are two possible configurations for the RAM in the Wii U:

1GB GDDR5 (games) + 1GB DDR3 (system)
or
2GB DDR3 (shared)

While the first one sounds better, because of the split pools it severely limits Nintendo's ability to free up more memory for games in future, and gives devs a headache if they do. With a common pool they can gradually increase the RAM available to games via firmware updates, in a way that's easy for developers to make use of. It would also have lower latency than the GDDR5, and almost the same bandwidth, as they could go with a single wide bus instead of two narrow ones (although this depends on how much bandwidth the OS is taking up). The console could well have 1.5GB+ available to games within a couple of years.


A single DDR3 pool is better from multiple perspectives. I'm (and have been since WUST1) betting on a single pool.

Very good points, thanks for that. I would basically rule out GDDR5, as DDR3 looks to be the smartest choice considering the other factors involved, and it might narrow down the estimated speeds at which the GPU will operate.
 
Honestly, in spite of all the debate over the horsepower of the system I am glad Nintendo is making it fairly compact and energy efficient. Hopefully it'll run cool and be quiet as a result. Those sorts of things tend to get overlooked in these hardware discussions, but as a gamer they make a big difference in my day to day use of a system. As much as I like my slim PS3's software library, I've been underwhelmed by the loud drive, cracking noises as the case changes temperature, and overall amount of heat it puts off. Even though the Wii is one of Nintendo's lesser hardware designs imo, it still beats the hell out of the competition when it comes to being quiet and unobtrusive.

I always chuckle when my 360's case makes cracking noises.
 
Yes, but there are 4 USB ports which will have to supply a maximum of 2.5 watts each, so that would leave 65 watts for the rest of the unit. 5 watts would go to the slimline blue-laser drive at 5x, and another 10 watts to all the wireless, audio, NAND, I/O and fans. This leaves about 50 watts for GPU+RAM and CPU+RAM. I doubt the CPU will be clocked higher than 2.187GHz; it could even be 1.458GHz. Even at 1.458GHz, the "enhanced Broadway cores", I would speculate, might still be faster than the Xenon, but let's hope it is clocked at 2.187GHz. For the GPU I was looking at either 486MHz or 607.5MHz, but 35 watts does not leave too much room to work with for the GPU, nor 10 watts for the CPU, as the RAM would take maybe 10 watts. A 607.5MHz clock is still possible at this much power, though; we need more info.






Very good points, thanks for that. I would basically rule out GDDR5, as DDR3 looks to be the smartest choice considering the other factors involved, and it might narrow down the estimated speeds at which the GPU will operate.

Isn't 75 watts the peak? Doesn't it use more like 40 watts in use?
 
My opinion on this has been pretty consistent throughout this gen, Nintendo undershot what would have been an acceptable power level and Sony and MS overshot it.

If you consider how long it took MS to get the price of the 360 down to a mass-market level with stable hardware, they could have easily released a console somewhere between the Wii and PS3 in terms of power for $299 in 2005, and then have been poised to release something along the power level of the Wii U in 2010 for the same price.

How different would MS's fortunes and the overall health of the market have been under that scenario?
Sure, that is possible, but do you think games would be any cheaper to produce now than before? The biggest reason studios close down is that their games are too expensive to make and sell too little to make up for the development budgets. If PS3/Xbox 360 levels of hardware had been released in 2012/2013, I don't think the games would be a lot cheaper to produce. Then we'd still be in the same situation.
 
I think for the price the Wii U is an amazing deal. I just don't get people who bitch about the specs but at the same time would not accept a higher price. If the unit were designed around a BOM just $50 higher, man, we would have a really slick piece of kit in our hands.
 
75w peak and 45 typical.

I'm guessing this doesn't give us many clues about the exact components inside. I assume most would be from spinning the optical drive?

Well, it certainly tells us what is NOT inside as we have at least some idea what wattage various parts are.
 
I think for the price the Wii U is an amazing deal. I just don't get people who bitch about the specs but at the same time would not accept a higher price. If the unit were designed around a BOM just $50 higher, man, we would have a really slick piece of kit in our hands.

The GamePad on its own, not so much, but I was expecting it to be over $100, since the DS3 on the PS3 is so expensive.
 
Pretty sure DS3 and 360 controllers are marked up like crazy though.

As are Wii remotes though I guess.


Oh yeah, I am sure of that; that's why I was sure the GamePad would also be marked up like those controllers and priced higher, as it is the "next gen" in controller terms.
 
Touched upon this in the other thread: does the high retail price of the controller (relative to the system) give any clue about the BOM for the actual console?
Or are we assuming an insane markup?
 
Touched upon this in the other thread: does the high retail price of the controller (relative to the system) give any clue about the BOM for the actual console?
Or are we assuming an insane markup?

The best I can say is that the GamePad is probably vastly more expensive to produce than your typical gamepad, which usually sells for what, 40-50 bucks at launch? So yeah, there is a markup, but it is probably quite a bit more expensive to produce, and every dollar it costs is another dollar that cannot go into the system itself, since it is included. Again, the price of the unit also dictates the technology... Nintendo probably made some concessions to get the Wii U to a lower price point after the 3DS debacle.
 
I'd imagine the philosophy is that when your friends come over, they'd bring their own (if needed).

The high price probably indicates it's more of a replacement than a utility.
 