Rumor: Wii U final specs


Nearly all of that appears to be displacement maps and tessellation, so who knows.

Honestly I can barely tell the difference between the DX9 and 11 versions. Basically flat floors with repetitive brick tiling have displacement maps...

I've said this before and I'll say it again: graphics tech advancements have really slowed down in terms of features. The PS1 didn't have a z-buffer; the PS2 didn't have programmable shaders. Nothing on that level is ever going to be introduced again until the rendering pipeline completely changes paradigms.
 
Is it really 40 watts while displaying graphically intensive games? Or can it actually reach 75? I'm hearing 2 different things here.

Well, if you believe a 10W GPU can produce graphics like the Wii U does, then believe the former.

Since that's really unlikely, I'm inclined to believe the latter.
 
Thank you for the small summary. So, at this point, we just need to know how many of these recent effects the GPU can push per second, right? And if we want to see shit flying all over the place, there must be a good CPU in the WiiU, right?
That's the only thing required to have Star Wars 1313, correct?
 
So now that the GPU is confirmed, what does that really mean for its graphics potential beyond current gen?

The only things confirmed about the GPU are that it has 3.2 times more eDRAM to play with than Xenos and that it has a feature set more modern than DX10.1, unless I've missed something..?
 
So.. the Wii U disc drive reads 2.5 times faster than the PS3's, no?
Could be CAV "up to 5x" vs PS3's CLV "always 2x", but yeah. 5x ~BD-ROM (single layer) vs 2x BD-ROM.


Edit: to clarify the terms:
http://en.wikipedia.org/wiki/Constant_linear_velocity
A CAV drive's bandwidth drops to ~43% of its "rated" maximum at the innermost track of the disc. RPM stays constant.
A CLV drive maintains its bandwidth across the entire disc. RPM increases when moving the read head towards the spindle and decreases when moving toward the edge.
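To make the CAV/CLV distinction concrete, here's a minimal sketch. It assumes approximate Blu-ray-like data-area radii (inner ~24 mm, outer ~58 mm), which gives an inner-track fraction of ~41%, in the same ballpark as the ~43% figure above; the exact number depends on the real radii of the data zone.

```python
# Sketch of CAV vs CLV throughput across a disc.
# Radii below are assumed, approximate values for a BD-like data area.

INNER_MM = 24.0
OUTER_MM = 58.0

def cav_fraction(radius_mm):
    """At constant RPM, linear velocity (and thus data rate) scales with
    radius, so a CAV drive only hits its rated max at the outer edge."""
    return radius_mm / OUTER_MM

def cav_rate(radius_mm, rated_mb_s):
    return rated_mb_s * cav_fraction(radius_mm)

def clv_rate(radius_mm, rated_mb_s):
    # CLV keeps linear velocity constant by varying RPM,
    # so throughput is flat across the whole disc.
    return rated_mb_s

# An "up to 5x" CAV drive: 22.5 MB/s at the rim, far less at the hub.
print(round(cav_fraction(INNER_MM), 2))    # ~0.41 of rated max at inner track
print(round(cav_rate(INNER_MM, 22.5), 1))  # ~9.3 MB/s at the hub
print(clv_rate(INNER_MM, 9.0))             # a 2x CLV drive: 9.0 MB/s everywhere
```

So a 5x CAV drive and a 2x CLV drive are nearly tied at the innermost track; the CAV drive's advantage grows towards the edge of the disc.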
 
This is a good catch. Suddenly the huge distance between the two power figures makes a lot more sense.
Of course. A top rating for a device is normally with 'everything on'. Average/nominal is during 'normal' use. I'd bet the average in this case does not include bus-powered USB devices.
 
It's also nice to read that the PS3 is 9 MB/s and the WiiU 22.5 MB/s... which will probably result in much faster loading times?

The majority of BD drives currently have 10x-12x read speeds, so 22.5 MB/s isn't that fast. Well.. much faster than what the PS3 has today with its 2x drive.
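For reference, the multipliers in this thread all come from the same base rate: 1x BD-ROM is 36 Mbit/s, i.e. 4.5 MB/s. A quick sketch:

```python
# BD-ROM speed multipliers. 1x is defined as 36 Mbit/s = 4.5 MB/s
# (ignoring protocol overhead).

BD_1X_MB_S = 4.5

def bd_speed(multiplier):
    return multiplier * BD_1X_MB_S

print(bd_speed(2))   # PS3's 2x drive: 9.0 MB/s
print(bd_speed(5))   # Wii U's "up to 5x": 22.5 MB/s
print(bd_speed(12))  # a 12x PC drive at the rim: 54.0 MB/s
```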

 
Iwata said that the max power draw is 75w. I took that to mean that the console will actually be using that much power at times and that it isn't just what the power brick is rated for.

What about the so-called power supply/brick efficiency? No power supply can ever use 100% of its power rating, unless I'm mistaken?
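One way to untangle the rating-vs-efficiency question: the wattage on a brick is its rated DC *output*, and efficiency only affects what the console pulls from the wall on top of that. A rough sketch, with 85% as an assumed, purely illustrative efficiency figure:

```python
# The wattage printed on a power brick is its rated DC output capacity.
# What gets drawn from the wall is output / efficiency.
# The 0.85 efficiency below is an assumption for illustration.

def wall_draw(dc_output_w, efficiency=0.85):
    return dc_output_w / efficiency

print(round(wall_draw(45.0), 1))  # typical 45 W load -> ~52.9 W at the wall
print(round(wall_draw(75.0), 1))  # worst-case 75 W load -> ~88.2 W at the wall
```

So the console can genuinely use the full 75W of DC output; the efficiency loss just shows up as extra draw on the AC side.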
 
The majority of BD drives currently have 10x-12x read speeds, so 22.5 MB/s isn't that fast. Well.. much faster than what the PS3 has today with its 2x drive.
12x is the MAXIMUM you can get today, but it doesn't really represent the majority of drives sold today. I'd be surprised if the PS4 has anything faster than an 8x drive. High-speed drives are pretty expensive.
 
Why would Nintendo want to max out the PSU? Aren't they most efficient at around 50-60% load? If the PSU is rated at 75W, then 45W seems correct for max consumption for the console?
 
Why would Nintendo want to max out the PSU? Aren't they most efficient at around 50-60% load? If the PSU is rated at 75W, then 45W seems correct for max consumption for the console?
45W is not the max consumption; 75W is. Hence the PSU being able to deliver that much. It doesn't matter if the PSU isn't at peak efficiency during that time, as that's not the normal power draw of the console.

So then we are looking at a GPU around 25w
If you expect CPU+RAM to do 15-20W then yes. Otherwise no.
 
45W is not the max consumption; 75W is. Hence the PSU being able to deliver that much. It doesn't matter if the PSU isn't at peak efficiency during that time, as that's not the normal power draw of the console.


If you expect CPU+RAM to do 15-20W then yes. Otherwise no.
Has any console ever gotten near the max rating of the power supply whilst running a game?
 
Has any console ever gotten near the max rating of the power supply whilst running a game?
Define 'whilst running a game'. The PSU is supposed to handle situations when the console is running a game + the radios are on + the USB bus is feeding a device, etc.
 
Has any console ever gotten near the max rating of the power supply whilst running a game?

Unlikely, but you could say the same about PC components (CPUs, GPUs, RAM, etc.) not reaching their advertised TDP while running games. The reason they would have chosen a 75W power supply is that that's the theoretical maximum usage of all components in the console, even if the chances of reaching that high are effectively negligible. Published TDP values for components are likewise going to be theoretical maxima.
 
Has any console ever gotten near the max rating of the power supply whilst running a game?
Depends on what rating you look at. I have a notebook PSU sitting right in front of me that has an input rating of, oh, let's say 160W*. But it's specced to output no more than 65W. And no, that gap is not efficiency. It's because input power ratings generally do not mean what people think they mean.

In other news, yes, fairly close.

*
it's actually specifying 1.6A over a fairly wide-ranging 100~240V AC, so I could just as well call it 380W -- much like what some people think the PS3 "consumes" (which it doesn't, obviously)
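The arithmetic behind that footnote is worth spelling out: multiplying a worst-case rated input current by whatever voltage you pick gives a ceiling on apparent power, not a consumption figure.

```python
# Why "1.6 A, 100-240 V" on a brick is not a consumption figure:
# rated current x line voltage is a worst-case apparent-power ceiling.

def apparent_power_va(rated_amps, volts):
    return rated_amps * volts

print(apparent_power_va(1.6, 100))  # 160 VA: the "160 W input rating" reading
print(apparent_power_va(1.6, 240))  # 384 VA: the equally meaningless "~380 W"
```

Same label, wildly different "wattage" depending on which end of the voltage range you multiply by, which is exactly why input ratings shouldn't be read as power draw.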
 
So then we are looking at a GPU around 25w
!!!BACK-OF-THE-ENVELOPE ALERT!!!

Assuming R7xx architecture (because of "DX 10.1" reference; originally 55nm) shrunk to 40nm at 30% power savings, 25W puts us at 35W of 55nm chip equivalent, which is roughly the halfway point between Radeon HD 4550 and Radeon HD 4650. IOW 240 stream processors, 24 texture units, 600MHz.

This means nothing. It's a computation based on a speculative consumption figure and preexisting products. Don't blame me if it's not pleasant to you somehow.
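The shrink math in that post can be written out explicitly. Both inputs are assumptions from the post itself (a speculative 25W budget and a ~30% power saving from the 55nm-to-40nm transition), so the output inherits all their uncertainty:

```python
# Back-calculating a 55 nm "equivalent" TDP from a speculated 40 nm budget.
# Both the 25 W figure and the 30% shrink saving are the post's assumptions.

SHRINK_SAVINGS = 0.30  # assumed power saved going 55 nm -> 40 nm

def equivalent_55nm_watts(watts_40nm):
    return watts_40nm / (1.0 - SHRINK_SAVINGS)

print(round(equivalent_55nm_watts(25.0), 1))  # ~35.7 W at 55 nm
```

~36W is what lands between the Radeon HD 4550 and HD 4650 in the comparison above.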
 
Depends on what rating you look at. I have a notebook PSU sitting right in front of me that has an input rating of, oh, let's say 160W*. But it's specced to output no more than 65W. And no, that gap is not efficiency. It's because input power ratings generally do not mean what people think they mean.

In other news, yes, fairly close.

*
it's actually specifying 1.6A over a fairly wide-range 100~240V AC, so I could as well call it 380W -- much like what some people think the PS3 "consumes" (which it doesn't, obviously)
I would be VERY cautious when computing power for variable signals by multiplying current and voltage written on the box...
 
!!!BACK-OF-THE-ENVELOPE ALERT!!!

Assuming R7xx architecture (because of "DX 10.1" reference; originally 55nm) shrunk to 40nm at 30% power savings, 25W puts us at 35W of 55nm chip equivalent, which is roughly the halfway point between Radeon HD 4550 and Radeon HD 4650. IOW 240 stream processors, 24 texture units, 600MHz.

This means nothing. It's a computation based on a speculative consumption figure and preexisting products. Don't blame me if it's not pleasant to you somehow.




Why shrink something when you already have the RV740, which is 40nm?
 
No, it won't play CD/DVD/BD or any media discs. Nintendo doesn't want to license them.


It essentially has a 4.5x BD drive. I hope they allow for dual-layer discs in time, kinda like how the Wii essentially uses single-layer DVDs written backwards (with some games later being dual-layer, like SSBB).
Wonder what changes Nintendo made to dodge licenses this time.

For the moment, only 25GB discs were delivered. For what it's worth, it took between 30 and 50 minutes to burn an average E3 build with the special writers studios got not long before the show.
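Those burn times imply a rough write speed, assuming a full single-layer 25GB (decimal) disc; an average build is likely smaller, so treat these as lower bounds on nothing and upper bounds on little:

```python
# Rough write rate implied by "25 GB in 30-50 minutes".
# Assumes a full decimal-GB single-layer disc; real builds may be smaller.

def write_rate_mb_s(gigabytes, minutes):
    return gigabytes * 1000.0 / (minutes * 60.0)

print(round(write_rate_mb_s(25, 30), 1))  # ~13.9 MB/s if it took 30 min
print(round(write_rate_mb_s(25, 50), 1))  # ~8.3 MB/s if it took 50 min
```

That's in the ~2x-3x BD-writer range, which sounds plausible for early special-purpose hardware.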
 
I would be VERY cautious when computing power for variable signals by multiplying current and voltage written on the box...
I would be very cautious of anyone misrepresenting PSU input ratings as consumption figures in any context. I know what they mean, but it's such a pain in the ass to explain that it's better to work towards widespread dismissal.

PSU output ratings, which you can really only get with external power bricks OTOH, are solid, useful information. Any decent engineer (and on top of that, business realities of sourcing off-the-shelf PSUs) will enforce a certain headroom margin on the PSU capacity, but it still is an actual capacity measure, and most importantly it's always an upper bound.
 
Why shrink something while you already have RV740 which is 40nm ?
Because it violates the "DX10.1" thing. I'm a bit fuzzy on the Radeon codenames ... RV740 is which one of the HD5xxx family?

edit: found it. HD 4770. 40nm but still "DX10.1". Honestly didn't know such a thing existed, sorry.
90W part at 750MHz, 640SP, 32TMUs. Seems I extrapolated the 55=>40nm transition quite well :D
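A crude sanity check of that extrapolation: scale the quoted 90W part linearly by shader count and clock down to the speculated 240 SP / 600 MHz configuration. Linear scaling is a rough assumption (it ignores fixed costs like memory interfaces and I/O), and the 90W figure is the post's, but it lands close to the 25W guess:

```python
# Crude linear scaling of GPU power by shader count and clock.
# Linear scaling is an assumption; fixed power costs are ignored.

def scaled_watts(base_w, base_sp, base_mhz, target_sp, target_mhz):
    return base_w * (target_sp / base_sp) * (target_mhz / base_mhz)

# 90 W @ 640 SP / 750 MHz scaled to 240 SP / 600 MHz:
print(round(scaled_watts(90.0, 640, 750, 240, 600), 1))  # ~27 W
```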
 
So the GPU in the WiiU is a modified HD Radeon 4770 ? This is the only GPU going by the codename RV740, right ?

Is the whole RV740 thingy confirmed, please ?
 
For the moment, only 25GB discs were delivered. For what it's worth, it took between 30 and 50 minutes to burn an average E3 build with the special writers studios got not long before the show.
I just hope they can deliver 50GB discs over time and it's not a limit set in stone. The Wii was only supposed to use 4.7GB discs at first, but that didn't stop them from making higher-capacity discs later on (though I recall some launch Wiis had trouble reading dual-layer discs).
 
12x is the MAXIMUM you can get today but it doesn't really represent the majority of drives sold today, I'd be surprised if the PS4 has anything faster than a 8x drive. High speed drives are pretty expensive.

Not even Best Buy sells anything under 8, and most of them are either 8x or 12x. Anyway, like I said in one of my previous posts, I expect up to 8x.
 
Could someone put this recent power draw discussion into layman's terms? It seems like this could be a decent indicator of what we're looking at for overall performance.
 
Because it violates the "DX10.1" thing. I'm a bit fuzzy on the Radeon codenames ... RV740 is which one of the HD5xxx family?

edit: found it. HD 4770. 40nm but still "DX10.1". Honestly didn't know such a thing existed, sorry.
90W part at 750MHz, 640SP, 32TMUs. Seems I extrapolated the 55=>40nm transition quite well :D

What is the source of this info?
 
!!!BACK-OF-THE-ENVELOPE ALERT!!!

Assuming R7xx architecture (because of "DX 10.1" reference; originally 55nm) shrunk to 40nm at 30% power savings, 25W puts us at 35W of 55nm chip equivalent, which is roughly the halfway point between Radeon HD 4550 and Radeon HD 4650. IOW 240 stream processors, 24 texture units, 600MHz.

This means nothing. It's a computation based on a speculative consumption figure and preexisting products. Don't blame me if it's not pleasant to you somehow.

Has 40nm been confirmed anywhere ? I could've sworn it was gonna be 32nm
 
I just hope they can deliver 50GB discs over time and it's not a limit set in stone. The Wii was only supposed to use 4.7GB discs at first, but that didn't stop them from making higher-capacity discs later on (though I recall some launch Wiis had trouble reading dual-layer discs).

The trouble some had was related to the drives having dirt in them rather than anything technical.
 
So the GPU in the WiiU is a modified HD Radeon 4770 ? This is the only GPU going by the codename RV740, right ?

Is the whole RV740 thingy confirmed, please ?

Nothing is confirmed, it's all speculation. Speculation about what power the GPU will draw, about how fast it will be, about how many SPU's it'll have, even about what die size it'll have. The only thing confirmed is that it's an AMD GPU that has GPGPU features.
 
Not even Best Buy sells anything under 8, and most of them are either 8x or 12x. Anyway, like I said in one of my previous posts, I expect up to 8x.
In that case I agree. I just felt like pointing this out because I thought you were implying that the speed of the WiiU drive was unjustifiably slow (which I feel is wrong, since I think we can expect next-gen drives to be in the 4-8x range depending on prices when the specs of the other two players are finalised; closer to 8x seems more likely though, maybe 6x for Sony).
The trouble some had was related the drives having dirt in them rather than anything technical
I thought it was a combination of dirt and technical reasons, as I never heard of the issue happening on later drives. But I may be wrong.
 
Nothing is confirmed, it's all speculation. Speculation about what power the GPU will draw, about how fast it will be, about how many SPU's it'll have, even about what die size it'll have. The only thing confirmed is that it's an AMD GPU that has GPGPU features.
Any possibility we learn all that info if someone opens up the console once it's launched?
 
1.21 Jiggawhats.

Makes sense, cause the Wii U's tech is from the past. amidoinitrite?

What I heard was as posted at nintendo everything.com

- Uses up to 75 watts of electricity
- Typical power usage is 45 watts

Yeah, I'm confused. I thought 75W was its max draw at times, but typically 45W. What "typically" means is anyone's guess. Is that while sitting at the menu, watching a movie, Miiversing, or full-on gaming?

The power supply rating is given per the output of the PSU, FYI.

I thought some dude took a photo of the PSU a while ago and it was 90W? I'm possibly remembering wrong though.
 