Rumor: Wii U final specs

Not to be rude, but it's a bunch of trolls jumping on the bandwagon whenever anything negative is said about the console. We all knew it wasn't going to be some sort of 5x power-leap beast...

I will reserve judgement on the console until one of the big teardown sites gives out a full list of ALL the system specs, rather than just looking at the RAM and running with the 'Wii U is doomed' party line.

I've never seen so many rabid fanboys want something to fail so hard before in my life, it's pretty sad.

A thread is made about something positive on Wii U and it gets two pages; a thread is made about the tiniest little thing that's wrong with the system and it's on the front page for days.

Haters gonna hate.

Is it as bad when Sony or MS bring out consoles?
It's also a bunch of Nintendo fanboys jumping to the defense of the system over every small thing.

It probably will be bad with the next Sony and MS consoles too, but it doesn't help that the Wii disappointed some people, and so far the Wii U has some big things that need correcting ASAP; Nintendo absolutely needs to be called out for that. It's definitely possible the next round of games will be impressive, especially from Nintendo.
 
Talked with BG briefly the other day via email and he had this to say about the RAM bandwidth. I realise he can't be part of the discussion but I thought I'd pass it on (He's OK with that).

Going back as far as the first WUST, we talked about Nintendo possibly choosing DDR3 over GDDR5 due to latency. The memory clock isn't really surprising, since it's not dramatically less than what Thraktor and I talked about in this exchange: http://67.227.255.239/forum/showthread.php?p=42579861#post42579861 I will say, though, that if Nintendo did stick with clock multiples (the DSP was originally listed at 120MHz), the memory could very well be 720MHz for all we know if they underclocked it. It wouldn't make sense to overclock a part when faster speeds are available to underclock.

What I found interesting is that ifixit.com has a teardown (Step 12) and their unit had Micron memory. Looking at the specs of the available choices, I believe Nintendo chose latency (1.25ns @ CL = 11) over bandwidth, and at the same time was limited to a 64-bit bus because 32-bit-wide DDR3 was not ready or available for production. Wii U's timing is unfortunate, since Micron is taking the same memory module and sampling a "TwinDie" version, doubling the density and bandwidth, so the Wii U could possibly have had 4GB with twice the bandwidth.

All that said, I think some are giving too much weight/concern to the memory bandwidth. Nintendo has always been picky about balance, and bandwidth is not the only factor in memory speed. I decided to put forth the effort to look for/at the latency figures. Based on the formula I found (and assuming I did it correctly), this is what we're looking at:

Wii U DDR3 - 13.75ns (based on 800MHz)
Xbox 360 GDDR3 - 14.29ns (low end), 21.43ns (high end)
PS3 XDR - 35ns (taken from a PS3 wiki)
PS3 GDDR3 - 15.38ns or 16.92ns

I think there was a dev who talked about the latency for Wii U compared to the others, so this should give some actual numbers to that. The reason there are multiple numbers for the GDDR3 is that Samsung's data sheets gave multiple CAS latency options for each possible clock speed, so I calculated all of them since I didn't know which CL MS and Sony chose for their respective GDDR3 memories. There were three other possible outcomes for the 360, so I stuck with the high and low. Going back to Nintendo's choice, they seem to have gone with the lowest latency using Micron's info:

800MHz - 13.75ns
900MHz - 14.44ns
933MHz - 13.93ns
1000MHz - 14ns
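
(For anyone who wants to check the arithmetic, here is a minimal sketch in Python of the calculation BG describes; the CL 11 @ 800MHz pairing is the one quoted above, and the function just encodes latency in ns = CAS latency / memory clock.)

# CAS-latency arithmetic as described above:
# latency (ns) = CAS latency (cycles) / memory clock (MHz) * 1000
def cas_latency_ns(cl_cycles, clock_mhz):
    return cl_cycles * 1000.0 / clock_mhz

# Wii U DDR3 figure quoted above: CL 11 at an 800MHz memory clock (1.25ns cycle time)
print(cas_latency_ns(11, 800))   # -> 13.75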

So, I didn't do this to justify Nintendo's decision; as you know, I've said in the past that the Wii U falls short of what I think a next-gen console (from a power perspective) should look like. As with other speculation, this was to look at Nintendo's thinking.
 
I don't plan on going near the 720; I would never even read threads about it, never mind post in them or troll. That does not seem to be the case for the majority of GAF.

This is the internet, though, I suppose.

Look at someone like Van Owen in this thread; would anyone be allowed to troll that hard in a PS4/720 thread close to launch?
 
Did Nintendo ever confirm that it was an R700-series GPU? As far as I'm aware, all they ever confirmed was that it was a Radeon HD GPU.

Nintendo themselves said DirectX 10.1 GPU. R700 is the highest DX10.1-designed chipset before AMD made the DX11-designed R8xx/Evergreen. Several sources corroborated the R700 before Nintendo themselves stated DX10.1.
 
Can someone explain what the benefit of low-latency RAM would be in the particular situation here? Would it serve to balance out the lower-clocked CPU?
 
Talked with BG briefly the other day via email and he had this to say about the RAM bandwidth. I realise he can't be part of the discussion but I thought I'd pass it on (He's OK with that).

Yup, that sounds about right. From all reports, Nintendo seem to have prioritized latency over bandwidth. This will have benefits for some sorts of code, but hinder others. I think the CPU is well fed for what it is (not a floating-point cruncher, and don't forget the fat L2s). And it seems that some thought was put into reducing the amount of bandwidth taken up by textures and the framebuffer. Devs will have to learn how to balance that within the eDRAM to fully utilize the memory subsystem. If the specs we have are true, then the initial results we've gotten from devs should be seen as encouraging evidence that the system as a whole adds up to more than just the sum of its specs. It can only go up from here, I say, even if the ceiling is far below "true next gen" performance.

Can someone explain what the benefit of low-latency RAM would be in the particular situation here? Would it serve to balance out the lower-clocked CPU?

To an extent. It means fewer wasted clock cycles.
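
(To put a rough number on that: a minimal sketch using the 13.75ns DDR3 latency quoted earlier and the 1.2GHz CPU clock that is only speculation in this thread, showing how many CPU cycles a full trip out to main memory costs.)

# Cycles a core sits idle waiting on a main-memory access, roughly.
# 13.75ns is the Wii U DDR3 latency quoted earlier in the thread;
# 1.2GHz is the speculated (not confirmed) CPU clock from this thread.
def stall_cycles(latency_ns, cpu_clock_ghz):
    return latency_ns * cpu_clock_ghz

print(stall_cycles(13.75, 1.2))   # -> 16.5 cycles lost per uncached access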
 
Nintendo are now using DirectX!

Not sure if you're joking, but MS does indeed issue criteria for hardware support of features.
Can someone explain what the benefit of low-latency RAM would be in the particular situation here? Would it serve to balance out the lower-clocked CPU?

The latency difference isn't big enough to help with anything here. Latency only becomes the deciding factor when the bandwidth and speed are otherwise the same.
 
Not sure if you're joking, but MS does indeed issue criteria for hardware support of features.

The HD4xxx series were essentially DX11 compliant, which is why they supported 10.1 when most nVidia cards at the time didn't. I don't see why a (presumably) heavily customised variant in a console environment shouldn't be comparable to a DX11 card as far as feature-sets go. Ability to implement that feature-set, on the other hand...
 
So apparently DICE has posted a tweet saying what we have all been figuring: the GPU is fine (great, even), but the CPU is pretty bleh... (not unlike the case for the 3DS)
 
So apparently DICE has posted a tweet saying what we have all been figuring: the GPU is fine (great, even), but the CPU is pretty bleh... (not unlike the case for the 3DS)

Where? I read a remark from a designer saying he had heard similar things (to the Metro "horrible" guy) from "around the industry" or something, not that he claimed the GPU was great. News to me.
 
So apparently DICE has posted a tweet saying what we have all been figuring: the GPU is fine (great, even), but the CPU is pretty bleh... (not unlike the case for the 3DS)

The DICE bloke's tweets sound like he knows nothing and is just perpetuating the "3x Broadway must be bad" talk, despite 3x Broadway possibly being quite good.
 
Yup, that sounds about right. From all reports, Nintendo seem to have prioritized latency over bandwidth. This will have benefits for some sorts of code, but hinder others. I think the CPU is well fed for what it is (not a floating-point cruncher, and don't forget the fat L2s). And it seems that some thought was put into reducing the amount of bandwidth taken up by textures and the framebuffer. Devs will have to learn how to balance that within the eDRAM to fully utilize the memory subsystem. If the specs we have are true, then the initial results we've gotten from devs should be seen as encouraging evidence that the system as a whole adds up to more than just the sum of its specs. It can only go up from here, I say, even if the ceiling is far below "true next gen" performance.



To an extent. It means fewer wasted clock cycles.
That does make sense. If I recall correctly, this is not the first time Nintendo prioritized reducing latency over pure bandwidth.
 
You probably know better, as do most of the GAF users who posted in here.

Considering he has admitted he is not a 'tech' guy, I would take quite a few GAF posters' word over his.

Remember, DICE are used to the PC being the lead platform, and are rumoured to be one of the companies pushing for the PS4/720 to have at least 4GB of RAM after Sony/MS initially wanted to put just 2GB in.

They are used to very high-end hardware and want the next-gen consoles to be as powerful as possible. The Wii U would look weak to them but pretty powerful to a lot of other developers; not everyone is trying to build next-gen games using Frostbite 2 on Ultra.
 
The HD4xxx series were essentially DX11 compliant, which is why they supported 10.1 when most nVidia cards at the time didn't. I don't see why a (presumably) heavily customised variant in a console environment shouldn't be comparable to a DX11 card as far as feature-sets go. Ability to implement that feature-set, on the other hand...

Well, I know that, but it wasn't fully DX11 compatible. Thus, AMD and Microsoft deemed it a DX10.1 GPU. The very next chipset AMD released they titled their "first DX11" GPU.

The fact that Nintendo specifically said DX10.1, and that the R700 was pretty much the last AMD series conforming to DX10.1 rather than DX11, is what makes me think the chip is indeed based on the R700 line.
 
Well, I know that, but it wasn't fully DX11 compatible. Thus, AMD and Microsoft deemed it a DX10.1 GPU. The very next chipset AMD released they titled their "first DX11" GPU.

The fact that Nintendo specifically said DX10.1, and that the R700 was pretty much the last AMD series conforming to DX10.1 rather than DX11, is what makes me think the chip is indeed based on the R700 line.

Did Nintendo actually say DX10.1, though, or are you going off the leaks in this OP?
 
Exactly, and even if he did know more, he couldn't give specifics since the hardware is still under NDA. A slow, 'horrible' next-gen CPU to DICE could be something similar to Cell/Xenon's performance.

 
Well, I know that, but it wasn't fully DX11 compatible. Thus, AMD and Microsoft deemed it a DX10.1 GPU. The very next chipset AMD released they titled their "first DX11" GPU.

The fact that Nintendo specifically said DX10.1, and that the R700 was pretty much the last AMD series conforming to DX10.1 rather than DX11, is what makes me think the chip is indeed based on the R700 line.

I'm pretty damn certain that Nintendo would never actually cite a DirectX number, and if you aren't actually using DirectX, DirectX numbers are fairly meaningless.
 
There was an article a couple of months ago where a journalist had interviewed some developers (anonymously, of course), who said the GPU delivered "DX10-level performance" with "DX11-level features". Take that for what you will.
 
I'm pretty damn certain that Nintendo would never actually cite a DirectX number, and if you aren't actually using DirectX, DirectX numbers are fairly meaningless.

All Nintendo have officially said about the tech specs of the actual console is that it uses GPGPU tech, has 2GB of RAM (1GB for games, 1GB for the OS), that the CPU is an IBM Power-based multi-core processor, and that the GPU is an AMD Radeon-based high-definition GPU.

Nothing else. EVERYTHING else is speculation, including the 'leaked' RAM speed. I really don't think googling a part number tells the full story; Nintendo could quite easily have asked the companies to clock it at a different speed for their own needs.
 
All Nintendo have officially said about the tech specs of the actual console is that it uses GPGPU tech, has 2GB of RAM (1GB for games, 1GB for the OS), that the CPU is an IBM Power-based multi-core processor, and that the GPU is an AMD Radeon-based high-definition GPU.

Nothing else. EVERYTHING else is speculation, including the 'leaked' RAM speed. I really don't think googling a part number tells the full story; Nintendo could quite easily have asked the companies to clock it at a different speed for their own needs.

The RAM chips on the released motherboard pictures have their product codes and speed-bin codes printed on them, so their specs and speeds are known. They are going to be clocked at 800MHz at most. This is true for all the RAM chips, from three different manufacturers (Micron, Hynix and Samsung), across three different teardowns.

Just because you do not know how to take that information and process it does not mean others do not.
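
(For anyone unfamiliar with the speed-bin shorthand, a quick sketch of how the standard JEDEC DDR3 grades map to memory clock. The bins themselves are standard; which bins the Wii U chips actually carry is exactly the teardown detail being discussed.)

# Standard JEDEC DDR3 speed bins: data rate (MT/s) -> memory clock (MHz).
# DDR3 transfers twice per clock, so the clock is half the data rate.
for data_rate in (1066, 1333, 1600):
    print("DDR3-%d -> %dMHz memory clock" % (data_rate, data_rate // 2))
# A chip binned as DDR3-1600 tops out at an 800MHz memory clock,
# which is where the "800MHz at most" figure above comes from.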
 
There's nothing terribly spectacular about the "GPGPU" statement because technically EVERY AMD and Nvidia GPU starting from the Geforce 8/Radeon HD 2xxx series can handle general purpose computations. Even low-end integrated graphics.
 
There was an article a couple of months ago where a journalist had interviewed some developers (anonymously, of course), who said the GPU delivered "DX10-level performance" with "DX11-level features". Take that for what you will.

That's a pretty dumb statement because "DX10-level performance" could range from something like a Geforce 8200 all the way to an 8800 GTX.
 
There was an article a couple of months ago where a journalist had interviewed some developers (anonymously, of course), who said the GPU delivered "DX10-level performance" with "DX11-level features". Take that for what you will.

Gaming Blend interview with David Helgason, the CEO of Unity

Gaming Blend: While the Wii U doesn't specifically use DirectX functionality, will the Unity Engine for the Wii U allow for DirectX 11 equivalent functionality in regards to shaders, soft and self shadowing as well as potential scalability for shader 5.0 (or higher)?

Helgason: Yeah. We'll do a – we'll make it potentially possible to do.
 
The RAM chips on the released motherboard pictures have their product codes and speed-bin codes printed on them, so their specs and speeds are known. They are going to be clocked at 800MHz at most. This is true for all the RAM chips, from three different manufacturers (Micron, Hynix and Samsung), across three different teardowns.

Just because you do not know how to take that information and process it does not mean others do not.

Well, as BG pointed out from yonder, it could actually be clocked lower than 800MHz to be in line with clock multiples. But yeah, it's not going to be an overclocked chip, because that just wouldn't make sense.
 
Gaming Blend interview with David Helgason, the CEO of Unity

No, there was another one. I'll see what I can find.

One source encouraged us to think of Unreal Engine 3 as requiring the performance and capabilities of DirectX 9, but advised that demos running UE3 with enhanced specs, a la Samaritan and 1313, require DX9 performance but DX11 shader capabilities. They consider the Wii U close to that DX9-performance/DX11-capabilities combo but possibly hampered by its CPU, which they believe Nintendo is requiring to run at lower speeds in order to keep its chips from getting too hot, therefore allowing the machine to run as quietly as the Wii, and with relatively low power consumption.

http://kotaku.com/5920931/the-wii-us-power-problem
 
There's nothing terribly spectacular about the "GPGPU" statement because technically EVERY AMD and Nvidia GPU starting from the Geforce 8/Radeon HD 2xxx series can handle general purpose computations. Even low-end integrated graphics.

I never said there was, just that the term 'GPGPU' had come from the mouth of Iwata himself.
 
The RAM chips on the released motherboard pictures have their product codes and speed-bin codes printed on them, so their specs and speeds are known. They are going to be clocked at 800MHz at most. This is true for all the RAM chips, from three different manufacturers (Micron, Hynix and Samsung), across three different teardowns.

Just because you do not know how to take that information and process it does not mean others do not.

My bad, I didn't actually know there were three teardowns going on, only one.

Have they found anything else 'definite' about the system yet?
 
My bad, I didn't actually know there were three teardowns going on, only one.

Have they found anything else 'definite' about the system yet?

As pestul just posted, a 33W draw whilst playing NSMBU; it was also revealed that the tech behind the streaming to the GamePad is a Broadcom Miracast chip.
 
There's nothing terribly spectacular about the "GPGPU" statement because technically EVERY AMD and Nvidia GPU starting from the Geforce 8/Radeon HD 2xxx series can handle general purpose computations. Even low-end integrated graphics.

It's not about being able to do GPGPU, it's about the performance. An Nvidia 600-series GPU will run circles around an AMD HD 2000-series GPU in GPGPU workloads. Even the stream processors between AMD and Nvidia have different performance levels per SP. I am not exactly defending the GPGPU argument, but what if I said, "There's nothing terribly spectacular about the 'CPU' statement because technically EVERY AMD and Intel CPU starting from the Pentium/K6-2 series can handle computations. Even low-end integrated ARMs."
 
What was the most Iwata said the console would draw power-wise, 45 watts?

Surely if we know the power requirements of the RAM, we could make an educated guess at the speed of the CPU and GPU, with a FLOPS estimate based on the GPU's power draw?

Help, someone who knows how much power each chip draws and who is good at math :p
 
What was the most Iwata said the console would draw power-wise, 45 watts?

Surely if we know the power requirements of the RAM, we could make an educated guess at the speed of the CPU and GPU, with a FLOPS estimate based on the GPU's power draw?

Help, someone who knows how much power each item draws and who is good at math :p

Not sure if we'll be able to single out each component's power draw, but the system still eats up about half as much as the latest 360S.
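
(Purely to illustrate the kind of educated guess being asked for, a minimal back-of-envelope sketch. The 33W figure is the measured draw reported above while playing NSMBU; every other number in it is a made-up placeholder, not a known spec.)

# Back-of-envelope: guess GPU GFLOPS from the total system draw.
# 33.0W is the measured draw reported above; the non-GPU budget and the
# GFLOPS-per-watt efficiency below are purely hypothetical placeholders.
def guess_gpu_gflops(total_watts, non_gpu_watts, gflops_per_watt):
    gpu_watts = total_watts - non_gpu_watts
    return gpu_watts * gflops_per_watt

# Hypothetical split: ~15W for CPU, RAM, drive, WiFi and PSU losses,
# and ~15 GFLOPS per watt for a 40nm-class embedded Radeon.
print(guess_gpu_gflops(33.0, 15.0, 15.0))   # -> 270.0 (GFLOPS, very rough)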
 
Not sure if we'll be able to single out each component's power draw, but the system still eats up about half as much as the latest 360S.

Can't 2011 components do more with a smaller power draw than 2005 components, though?

I think people who are automatically saying low power draw = weak sauce are being a bit negative.

How many watts did the GameCube draw? Look at how small yet powerful that thing was compared to the other current-gen systems at the time.
 
Can't 2011 components do more with a smaller power draw than 2005 components, though?

I think people who are automatically saying low power draw = weak sauce are being a bit negative.

How many watts did the GameCube draw? Look at how small yet powerful that thing was compared to the other current-gen systems at the time.

Has anyone measured the power draw on, say, Darksiders or Batman or CoD?
 
Well, as BG pointed out from yonder, it could actually be clocked lower than 800MHz to be in line with clock multiples. But yeah, it's not going to be an overclocked chip, because that just wouldn't make sense.


800 for the RAM
1200 for the CPU
400 for the GPU
200 for the DSP

A little over a 1.5x increase in clock speed from the Wii,
which was itself a 1.5x increase from the GameCube.
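
(A quick sketch checking that ratio: the Wii and GameCube clocks below are the well-documented ones, while the Wii U numbers are just the speculated clocks listed above, not confirmed specs.)

# Speculated Wii U clocks from the list above vs. known Wii / GameCube clocks (MHz).
wiiu = {"CPU": 1200, "GPU": 400}   # speculation from this thread
wii = {"CPU": 729, "GPU": 243}     # Broadway / Hollywood
gc = {"CPU": 486, "GPU": 162}      # Gekko / Flipper

for part in ("CPU", "GPU"):
    print(part, round(wiiu[part] / float(wii[part]), 2), "x Wii,",
          round(wii[part] / float(gc[part]), 2), "x GameCube")
# -> about 1.65x over the Wii for both, and the Wii was 1.5x over the GameCube.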
 
Can't 2011 components do more with a smaller power draw than 2005 components, though?

I think people who are automatically saying low power draw = weak sauce are being a bit negative.

How many watts did the GameCube draw? Look at how small yet powerful that thing was compared to the other current-gen systems at the time.

The 360S actually consumes less than half as much power as the very first 360 model.
The GameCube consumed around 23 watts, and its chips were made on a 180nm process.
 