Rumor: Wii U final specs

I don't know what's normal these days, but around 40 W when playing games that push the hardware hard sounds pretty good. In comparison, the PS3 Slim is around 90 W (a 2009 console, admittedly, but still).
 
I don't know what's normal these days, but around 40 W when playing games that push the hardware hard sounds pretty good. In comparison, the PS3 Slim is around 90 W (a 2009 console, admittedly, but still).

That's one of the perks of a GPGPU: it can get more computation done per watt than a CPU would.

I think that's how it goes (someone with more knowledge of this can drop in to explain).
 
If the two screens are showing different views of the same scene then textures would be shared. If it's entirely different scenes then it could be entirely different textures. All of the games we've seen material from so far are of the first type.

Edit: Wait, P-100 has this indoors mini-game on the pad while the main screen is showing the standard view. That could potentially use fairly different assets for the two views.

Nvm, read a later post about COD.
 
So, playing catch up.

One side is saying that the Wii U can actually use up to 75W?

The other side is saying the PSU has been shown as 75W ergo that's not possible?

Does anyone still have these photos that show the latter?



AMD released a press release a long while ago iirc.


One side consists of people who have always been wrong about the Wii U and constantly downplay the system; the other side has the word of Iwata and insiders...
 
So, playing catch up.

One side is saying that the Wii U can actually use up to 75W?

The other side is saying the PSU has been shown as 75W ergo that's not possible?

Does anyone still have these photos that show the latter?



AMD released a press release a long while ago iirc.

I don't get the argument pertaining to this whole thing. As a PC gamer, I know that when I buy a power supply (assuming it's a good brand), the rating on the PSU is its maximum output to the PC, not what it pulls from the wall. I'm surprised this is actually up for debate somehow, but that's just me...
 
So, playing catch up.

One side is saying that the Wii U can actually use up to 75W?

The other side is saying the PSU has been shown as 75W ergo that's not possible?

Does anyone still have these photos that show the latter?



AMD released a press release a long while ago iirc.

There were never any photos as far as I remember (I asked earlier if I'd missed any and nobody replied, but if any exist please post them). I only remember one person a few months ago claiming a certain rating for a pre-retail Wii U. Seems reasonable enough, but no, there's no eyewitness evidence for it as far as I know.
 
I find it absolutely insane the controller only lasts 3-5 hours on a charge that takes 2.5 hours. You can't plug it into the system to charge it either, so you're going to have an AC tether while gaming. That's ghetto to me. It wouldn't be quite as cheesy if you could at least charge it via USB or something...
 
I find it absolutely insane the controller only lasts 3-5 hours on a charge that takes 2.5 hours. You can't plug it into the system to charge it either, so you're going to have an AC tether while gaming. That's ghetto to me. It wouldn't be quite as cheesy if you could at least charge it via USB or something...

Nintendo wants to sell a GamePad Plus later on.
 
There were never any photos. We just had one person a few months ago claiming a certain rating for a pre-retail Wii U, as far as I remember. Seems reasonable enough, but no, there's no eyewitness evidence for it.

There was a photo (a blurry one, at that) of a Wii U booth with the PSU visible in the bottom right corner. I can't find it now :(
But people deduced the 75 W figure from that.
 
There were never any photos. We just had one person a few months ago claiming a certain rating for a pre-retail Wii U, as far as I remember. Seems reasonable enough, but no, there's no eyewitness evidence for it.

Oh. I thought there was photographic evidence. But apparently on further inspection it was a post from The Boat:

Gahiggidy I'm sorry but I won't post any photos of the power brick, it's nothing interesting though. It's bigger than Wii's I think, the input is 230V, 50 Hz, 0.9 A and the output is 15V, 5.0A. Obviously :P


EDIT:
There was a photo (a blurry one, at that) of a Wii U booth with the PSU visible in the bottom right corner. I can't find it now :(
But people deduced the 75 W figure from that.
Oh OK, I thought there was a photo too.

Anyway, assuming this wasn't a lie (and I don't really see a reason to lie), does the 75 W rating imply that 75 W can actually be used, as some are saying, or will there be losses to heat?

If there's a loss, what would be the upper and lower bounds for efficiency (I see some floating around 65-70% while others are floating ~90%), and/or what's typical?
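
For reference, here's the raw label math from The Boat's post, plus what the wall draw would look like at the efficiency figures being floated (those percentages are just the thread's guesses, not measurements):

```python
# Label math for the brick The Boat described.
output_w = 15 * 5.0        # 15 V x 5.0 A = 75 W available to the console
input_va = 230 * 0.9       # 230 V x 0.9 A = 207 VA, a worst-case input rating
print(f"Rated output: {output_w:.0f} W, rated input: {input_va:.0f} VA")

# Wall draw for a given DC load depends on the brick's (unknown) efficiency.
dc_load = 40.0             # W, roughly Iwata's "typical" figure
for eff in (0.65, 0.70, 0.90):
    print(f"At {eff:.0%} efficiency: {dc_load:.0f} W DC -> {dc_load / eff:.1f} W at the wall")
```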
 
Look through my post history to find the guy who read off the PSU specs. It was from a media showing in Spain or Portugal, as far as I recall.

Nvm.
 
Oh. I thought there was photographic evidence. But apparently on further inspection it was a post from The Boat:

Gahiggidy I'm sorry but I won't post any photos of the power brick, it's nothing interesting though. It's bigger than Wii's I think, the input is 230V, 50 Hz, 0.9 A and the output is 15V, 5.0A. Obviously :P


EDIT:
Oh OK, I thought there was a photo too.

Anyway, assuming this wasn't a lie (and I don't really see a reason to lie), does the 75 W rating imply that 75 W can actually be used, as some are saying, or will there be losses to heat?


Any energy loss happens between the input and the output (as heat, etc.). If the rated output is 75 W, then 75 W is what it delivers; the loss has already been taken on the input side.


Typical loss used to be around 40% (60% efficient). PC PSUs are now well over 80% efficient, though. No idea what Nintendo has done, but they're touting 'energy efficiency', so...
 
Any energy loss happens between the input and the output (as heat, etc.). If the rated output is 75 W, then 75 W is what it delivers; the loss has already been taken on the input side.

Ok, following on from this: realistically, do consumer electronics ever draw the full output of their PSU rating? Do they commonly approach it?

E.g. from what I can find, the PS3 CECH-30xx outputs 12 V × 13 A = 156 W (can someone correct this if wrong?). Does it ever actually use this much?
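
The label arithmetic checks out, for what it's worth:

```python
# Sanity check on the CECH-30xx label from the post: 12 V at 13 A.
print(f"Rated DC output: {12 * 13} W")  # 156 W, so the figure is right
```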
 
Oh. I thought there was photographic evidence. But apparently on further inspection it was a post from The Boat:

Gahiggidy I'm sorry but I won't post any photos of the power brick, it's nothing interesting though. It's bigger than Wii's I think, the input is 230V, 50 Hz, 0.9 A and the output is 15V, 5.0A. Obviously :P


EDIT:
Oh OK, I thought there was a photo too.

Anyway, assuming this wasn't a lie (and I don't really see a reason to lie), does the 75 W rating imply that 75 W can actually be used, as some are saying, or will there be losses to heat?

If there's a loss, what are the upper and lower bounds for efficiency? I see some floating around 65-70% while others are floating ~90%.

The Wii has a 52 W rated PSU and uses a max of ~25 W at the wall. If Nintendo are using a 75 W rated PSU, then history suggests the Wii U will use around 40 W at the wall.

That is how my simple mind is comprehending this.
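
A rough sketch of that extrapolation; note it compares wall draw to an output rating, so treat it loosely:

```python
# Wii numbers from the post: 52 W rated PSU, ~25 W measured at the wall.
wii_ratio = 25.0 / 52.0                  # ~0.48
wiiu_estimate = 75.0 * wii_ratio         # apply the same ratio to a 75 W rating
print(f"Implied Wii U wall draw: ~{wiiu_estimate:.0f} W")  # ~36 W, i.e. "around 40 W"
```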
 
Oh. I thought there was photographic evidence. But apparently on further inspection it was a post from The Boat:

Gahiggidy I'm sorry but I won't post any photos of the power brick, it's nothing interesting though. It's bigger than Wii's I think, the input is 230V, 50 Hz, 0.9 A and the output is 15V, 5.0A. Obviously :P


EDIT:
Oh OK, I thought there was a photo too.

Anyway, assuming this wasn't a lie (and I don't really see a reason to lie), does the 75 W rating imply that 75 W can actually be used, as some are saying, or will there be losses to heat?

If there's a loss, what would be the upper and lower bounds for efficiency (I see some floating around 65-70% while others are floating ~90%), and/or what's typical?

Well, using the same kind of rating, the original 360 PSU was 203 W, while the console used 178 W, which is 88% of the PSU's rating. So 80% wouldn't seem unreasonable at all for max power draw.
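
Worked out, with that slightly more conservative 80% applied to the Wii U's reported rating:

```python
# 360 numbers from the post: 203 W rated PSU, ~178 W max console draw.
x360_ratio = 178.0 / 203.0
print(f"360 used {x360_ratio:.0%} of its PSU rating")    # ~88%

# Applying a rounder 80% to the Wii U's 75 W output rating:
print(f"Wii U max draw estimate: ~{75.0 * 0.80:.0f} W")  # ~60 W
```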
 
Why are we still discussing this?

The gaffer who saw the PSU with their own eyes months ago reported the PSU was rated at 75W. That's output rating. PSUs are never rated in any other way. What it draws from the wall can only be measured by a watt-meter (or be calculated by those who designed the part).

Iwata said 'the console draws 75W at peak, 40W (or 45, don't remember anymore) typical'. That's what the device draws from its power source.

The supposition that somehow the above two mean that the console draws 75W from the wall, the nearest power plant, or the black hole in the centre of the galaxy, is something that only a certain GAF contingent can come up with.

That was me. I'm remembered! I MATTER!
*cough*
 
Ok, following on from this: realistically, do consumer electronics ever draw the full output of their PSU rating? Do they commonly approach it?

E.g. from what I can find, the PS3 CECH-30xx outputs 12 V × 13 A = 156 W (can someone correct this if wrong?). Does it ever actually use this much?

Not sure what model that is, but the original model used like 180w maxed out.

EDIT: can't find figures on this model for max load :(

The Wii has a 52 W rated PSU and uses a max of ~25 W at the wall. If Nintendo are using a 75 W rated PSU, then history suggests the Wii U will use around 40 W at the wall.

That is how my simple mind is comprehending this.


Read the last few posts, friend. They're enlightening :) It appears the PSU outputs 75 W max; that's not its max input. That means 75 W is available to the console. We won't know its efficiency until someone hooks it up to a watt meter and sees how much juice the PSU is sucking in.

So, it would seem the '75 W' rating is its output: it can deliver up to 75 W to the console.
 
The Wii has a 52 W rated PSU and uses a max of ~25 W at the wall. If Nintendo are using a 75 W rated PSU, then history suggests the Wii U will use around 40 W at the wall.

That is how my simple mind is comprehending this.

Where is the extra 27 W in the Wii coming from? I think you mean that the Wii draws only ~25 W from the PSU, even though it can supply a max of 52 W.
 
Not sure what model that is, but the original model used like 180w maxed out.
CECH-3000 was last year's revision, I believe.

I could be (probably am) reading this wrong, but I'm finding the original model had an output of 12 V × 32 A = 384 W?

Well, using the same kind of rating, the original 360 PSU was 203 W, while the console used 178 W, which is 88% of the PSU's rating. So 80% wouldn't seem unreasonable at all for max power draw.
Okay, given this estimate.

We're looking at 60 W max power draw, which presumably would be inclusive of the USB draw, to which I've seen people attribute 10 W?

That would mean a 50 W draw under load.

How much of this power "budget" would be reasonable to allocate to the GPU? (Rough arithmetic after the options below.)
30-35 W
25-30 W
20-25 W
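
For what it's worth, the 10 W USB figure matches four USB 2.0 ports at 2.5 W each; here's the arithmetic behind those options (the 80% figure is just the rule of thumb from above):

```python
# Working the post's numbers.
max_draw = 75.0 * 0.80               # ~60 W via the 80% rule of thumb above
usb_reserve = 4 * 5.0 * 0.5          # 4 USB 2.0 ports x (5 V x 0.5 A) = 10 W
game_load = max_draw - usb_reserve   # ~50 W left for the console under load

for gpu_w in (20, 25, 30, 35):
    print(f"GPU at {gpu_w} W = {gpu_w / game_load:.0%} of the {game_load:.0f} W budget")
```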
 
Didn't they do that with PS3 and Xbox 360?

Nope.

The 360 had a max ~205 W total draw and the PS3 had a max ~380 W total draw.

There is no way either of them had a GPU that normally used over 100 W, and they won't on their next platforms either.

(See Wikipedia's hardware page for each system for the breakdowns if you're curious.)
 
CECH-3000 was last year's revision, I believe.

I could be (probably am) reading this wrong, but I'm finding the original model had an output of 12 V × 32 A = 384 W?

Yep.

Okay, given this estimate.

We're looking at 60 W max power draw, which presumably would be inclusive of the USB draw, to which I've seen people attribute 10 W?

That would mean a 50 W draw under load.

How much of this power "budget" would be reasonable to allocate to the GPU?
30-35 W
25-30 W
20-25 W

Nope. We're looking at 75 W max. Efficiency is the output divided by the input. We don't know the input, but we know the output is 75 W. The input could be, say, 120 W, which would mean it's about 60% efficient (for example).
 
Nope. We're looking at 75 W max. Efficiency is the output divided by the input. We don't know the input, but we know the output is 75 W. The input could be, say, 120 W, which would mean it's about 60% efficient (for example).
I know what you're saying, but that's not really what I'm asking. (I.e. through that link the AC input for the OG PS3 was 100-240 V, 6 A, vs. the DC output of 384 W?) The fault of mixing terminologies, I assume.

Rather, I'm referring to what would be reasonable to expect in terms of power usage under load from the console, e.g. 178 W from the 203 W output, or 180 W of the 384 W output.

I think Donnie's answer seemed a reasonable estimate to work from.
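
Those two historical ratios are actually quite far apart, which is part of why the estimates vary so much:

```python
# Max measured draw vs. rated PSU output, using the figures in this thread.
consoles = {"Xbox 360 (launch)": (178.0, 203.0), "PS3 (launch)": (180.0, 384.0)}
for name, (draw, rating) in consoles.items():
    print(f"{name}: {draw:.0f} W of {rating:.0f} W rated = {draw / rating:.0%}")
# ~88% for the 360 but only ~47% for the PS3, so the "80% rule" is loose.
```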
 
Where is the extra 27 W in the Wii coming from? I think you mean that the Wii draws only ~25 W from the PSU, even though it can supply a max of 52 W.

Don't get technical! The PSU can provide up to 52 W, but the Wii only demands ~25 W.

My logic, and what I have read, says that Nintendo (and Sony) do this because a PSU is most efficient at around 50% load.

But I really don't understand enough about electronics to be too confident about the above.
 
Yep.



Nope. We're looking at 75 W max. Efficiency is the output divided by the input. We don't know the input, but we know the output is 75 W. The input could be, say, 120 W, which would mean it's about 60% efficient (for example).

Isn't the input rated 207 W (230 V × 0.9 A) according to the info Boat posted? It still seems, looking at past consoles, that 80-90% of the output rating is the most you should expect the console to actually use.
 
I find it absolutely insane the controller only lasts 3-5 hours on a charge that takes 2.5 hours. You can't plug it into the system to charge it either, so you're going to have an AC tether while gaming. That's ghetto to me. It wouldn't be quite as cheesy if you could at least charge it via USB or something...

It's possible the pad needs more than what USB can deliver, which is why the AC adapter is needed.

Also, maybe Nintendo didn't want you sucking more juice through the console for charging, generating more heat, etc.
 
So to answer your question, that means that 60-65% is the total amount of the rated maximum that is available to the equipment being powered.

Wrong. PSU efficiency has nothing to do with "total power". If a power brick is 80% efficient, that just means the wall outlet supplies about 25% more than the brick's 75 W output (75 / 0.8 ≈ 94 W), with the difference lost as heat. The Wii U itself can still use the full 75 watts.
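
In numbers, assuming 80% efficiency purely for illustration:

```python
# An 80%-efficient brick delivering its full 75 W rating.
console_load = 75.0
efficiency = 0.80
wall_draw = console_load / efficiency
print(f"Wall draw: {wall_draw:.1f} W")  # ~93.8 W; ~18.8 W lost as heat in the brick
# The console still gets the full 75 W; the loss is entirely on the wall side.
```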
 
Isn't GFLOPS/watt a really shitty way to measure power efficiency when you think about it? It kind of makes the assumption that nothing but the shader ALUs draw any power whatsoever. One would think that an anisotropic texture lookup draws more power than a single multiply-add operation, not to mention the rasteriser, alpha blending, and memory writes.

Also, I heard Xenos uses vector5 SIMDs; do modern GPUs still do that (or use even bigger vectors)? How often do you really need the 5th element? Seems like a bit of a waste of flops.
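
On the vec5 point, a toy utilization calc shows why wide vector ALUs can squander their paper FLOPS when an instruction can't fill every lane (this assumes the vec4+scalar layout Xenos is usually described as having):

```python
# Toy ALU-utilization calc for a 5-wide (vec4 + scalar) ALU.
lanes = 5
for used in (1, 3, 4, 5):
    print(f"{used}/{lanes} lanes busy -> {used / lanes:.0%} of peak FLOPS")
# A scalar op on a 5-wide unit hits 20% of paper throughput, which is
# one reason later GPU designs moved toward scalar/SIMT-style ALUs.
```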
 