Rumor: Wii U final specs

Well, as far as I know, volts × amps = watts. But if that calculation is different for the input than it is for the output, then I'd welcome some clarification from anyone who knows better.
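
For what it's worth, here's the arithmetic as I understand it, as a rough sketch in Python. If the output side is rated at something like 15V / 5A, that lines up with the 75W figure quoted later in the thread; the AC-side numbers here are made up purely for illustration:

# Rough power arithmetic for an external PSU brick (illustrative numbers).
output_volts, output_amps = 15.0, 5.0   # DC side: what it can feed the console (assumed)
input_volts, input_amps = 230.0, 0.6    # AC side: hypothetical rating, not from the label

output_watts = output_volts * output_amps  # 75.0 W deliverable to the console
input_watts = input_volts * input_amps     # 138.0 W it is rated to pull from the wall
print(output_watts, input_watts)           # the gap covers conversion losses plus headroom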

Quite a large disparity between input/output then? Hmmmm.

Can't wait for someone to test all this :D
 
Do people really think MS and Sony are going to blow 160W of their power budget on the GPU?

We're definitely not going to see an 8000 series GPU in the PS4 or 720 unless you're willing to spend a ridiculous amount of money to pay for it. Not going to happen, particularly when we're in the middle of the worst worldwide recession in donkey's years.

Both Sony and Microsoft know that they need to keep the retail price down to less than 400 dollars. We may even see both platform holders using the lesser amount of RAM, 2GB and 6GB respectively, from the target specs given to developers. Nobody wants a console to fail as badly in terms of sales as the PS3 did for the first two years of its life.
 
CECH3000 was last year's revision I believe.

I could be (probably am) reading this wrong, but I'm finding the original model had an output of 12V × 32A = 384W?

Okay, given this estimate:

We're looking at 60W max power draw - which presumably would be inclusive of the USB draw, to which I've seen people attribute 10W?

That would mean a 50W draw under load.

How much of this power "budget" would be reasonable to allocate to the GPU?
30-35W
25-30W
20-25W

That's not how it works.

It's this simple:

The Wii U's PSU can supply the console a maximum of 75 watts. Note: The 75W is NOT what the PSU draws from the power point, but what it can feed the Wii U.

But like any PSU, the Wii U's PSU likely doesn't like being placed under 100% load. 80% would likely be the sweet spot where the PSU can run 24/7, 365 days a year, without fault.

Thus 56W is likely a very realistic constant draw for the console.
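
A quick sanity check on that arithmetic, as a rough sketch (note that 80% of 75W is 60W; the 56W figure works out to roughly 75% load):

psu_rating_w = 75.0                 # max the PSU can feed the console, per the above
sustained_80 = psu_rating_w * 0.80  # 60.0 W at an 80% "sweet spot"
sustained_75 = psu_rating_w * 0.75  # 56.25 W, roughly the figure quoted above
print(sustained_80, sustained_75)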
 
Ok, the reason you don't want to push a PSU consistently at its power rating is because heat over time damages components. That said, the higher a PSU's efficiency, the closer to that max it can run without damage, due to less heat production.

If the efficiency is 60%, 40% is lost as heat...

If the efficiency is 80%, only 20% is lost as heat.

Also, the more watts it's rated for, the more heat it produces per percentage lost, simply because the percentage lost contains more watts in total. A low-wattage PSU can function much closer to its theoretical maximum for that reason.
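
To put rough numbers on that (illustrative only; the Wii U brick's actual efficiency hasn't been published):

def psu_heat(output_watts, efficiency):
    """Return watts drawn from the wall and watts lost as heat for a given DC output."""
    input_watts = output_watts / efficiency
    return input_watts, input_watts - output_watts

print(psu_heat(60, 0.60))  # 60% efficient: 100 W from the wall, 40 W of heat
print(psu_heat(60, 0.80))  # 80% efficient: 75 W from the wall, 15 W of heat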
 
http://www.cinemablend.com/games/Wi...ble-DirectX-11-Equivalent-Graphics-47126.html

Did anyone happen to catch this recently? Wii-U is supporting the new Unity Engine and the creator is saying some pretty good things about the Wii-U.

David Helgason - Unity Technologies' CEO
"All that talk about the Wii U being “weak”, “underpowered”, and not capable of outputting graphics like the Xbox 360 or PS3, is apparently hogwash. Unity Technologies' CEO David Helgason squashes the rumors by acknowledging just how far the Wii U's tech can scale and what developers will be capable of utilizing with Unity alone."

He goes on to talk about how the Wii-U WILL use some of the DX11-equivalent features when working with the Unity Engine.

For those who haven't seen it, this is their new demo on YouTube, made on the PC of course.
http://www.youtube.com/watch?v=Qd1pGJ5Hqt0
 
http://www.cinemablend.com/games/Wi...ble-DirectX-11-Equivalent-Graphics-47126.html

Did anyone happen to catch this recently? Wii-U is supporting the new Unity Engine and the creator is saying some pretty good things about the Wii-U.

David Helgason - Unity Technologies' CEO


He goes on to talk about how the Wii-U WILL use some of the DX11-equivalent features when working with the Unity Engine.

For those who haven't seen it, this is their new demo on YouTube, made on the PC of course.
http://www.youtube.com/watch?v=Qd1pGJ5Hqt0

Yeah this was posted, but the thread didn't last long.

http://www.neogaf.com/forum/showthread.php?t=492249
 
^ Already has a topic. The title doesn't match the article. Beaten by Meelow.

Do you understand what GFLOPS are and how they're rated? There seems to be a misunderstanding of what they can change to boost this number. What are you saying they are changing to boost the GFLOPS so high?

You shouldn't be questioning people. That said, it was based on the dev kit GPU supposedly being 576GF at 450MHz. It was speculation based on Nintendo continuing to use clock multiples like GC and Wii. We know the DSP was 120MHz, and 640 ALUs at 480MHz would be ~614GF.
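
For anyone following the arithmetic, those figures fall out of the usual 2 FLOPs per ALU per clock (one multiply-add); the 640-ALU count is the speculative part. A quick sketch:

def gflops(alus, clock_mhz):
    # peak single-precision throughput: ALUs x 2 FLOPs/clock (multiply-add) x clock
    return alus * 2 * clock_mhz / 1000.0

print(gflops(640, 450))  # 576.0 -- the dev kit figure mentioned above
print(gflops(640, 480))  # 614.4 -- the ~614GF figure at a 480MHz retail clock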
 
^ Already has a topic. The title doesn't match the article. Beaten by Meelow.



You shouldn't be questioning people. That said, it was based on the dev kit GPU supposedly being 576GF at 450MHz. It was speculation based on Nintendo continuing to use clock multiples like GC and Wii. We know the DSP was 120MHz, and 640 ALUs at 480MHz would be ~614GF.

Hey Bg, what do you think of that 8xxx GPU rumor?
 
That's not how it works.

It's this simple:

The Wii U's PSU can supply the console a maximum of 75 watts. Note: The 75W is NOT what the PSU draws from the power point, but what it can feed the Wii U.

But like any PSU, the Wii U's PSU likely doesn't like being placed under 100% load. 80% would likely be the sweet spot where the PSU can run 24/7, 365 days a year, without fault.

Thus 56W is likely a very realistic constant draw for the console.

Right, this guy has it down.
But I'd go further down than that. 56W with everything working at the same time, disc spinning, CPU & GPU full load, USBs loaded, Wifi, etc.

So, the typical draw is back to 40-45W with a game. Most of that is from a constant load on the CPU, GPU, RAM, and Wifi sending signal.
Everything else will be a small part of that, intermittent disc spin, intermittent game saves to internal flash, 1 or 2 USB devices but idling most of the time, and so on.
 
So all the reasonable people saying no way it's a Power7 proved right again.

Ho hum.

Well actually we still don't know the CPU, it could be Power7, it might not be, who knows.

P.S. I'm not defending anything, but there were people that said "no way the Wii U uses a GPGPU" and they were proven wrong.

We'll have to see at launch if the CPU is Power7 or something else.

I do find it weird that IBM "corrected" themselves a year and 3 months after the original post.
 
^ Already has a topic. The title doesn't match the article. Beaten by Meelow.



You shouldn't be questioning people. That said, it was based on the dev kit GPU supposedly being 576GF at 450MHz. It was speculation based on Nintendo continuing to use clock multiples like GC and Wii. We know the DSP was 120MHz, and 640 ALUs at 480MHz would be ~614GF.

Can I ask where you got the FLOP info on the Dev kit GPU?
Or are you speculating it has 640ALUs and calculating it based on reasonable clocks?
 
That's not how it works.

It's this simple:

The Wii U's PSU can supply the console a maximum of 75 watts. Note: The 75W is NOT what the PSU draws from the power point, but what it can feed the Wii U.

But like any PSU, the Wii U's PSU likely doesn't like being placed under 100% load. 80% would likely be the sweet spot where the PSU can run 24/7, 365 days a year, without fault.

Thus 56W is likely a very realistic constant draw for the console.
Yeah, this was the conclusion I was working from, or thereabouts.
Right, this guy has it down.
But I'd go further down than that. 56W with everything working at the same time, disc spinning, CPU & GPU full load, USBs loaded, Wifi, etc.

So, the typical draw is back to 40-45W with a game. Most of that is from a constant load on the CPU, GPU, RAM, and Wifi sending signal.
Everything else will be a small part of that, intermittent disc spin, intermittent game saves to internal flash, 1 or 2 USB devices but idling most of the time, and so on.
So what are some estimates for WiFi power usage and 2GB of RAM (for either DDR3 or GDDR5)?

What does that leave for the GPU?
 
Well actually we still don't know the CPU, it could be Power7, it might not be, who knows.

P.S. I'm not defending anything, but there were people that said "no way the Wii U uses a GPGPU" and they were proven wrong.

We'll have to see at launch if the CPU is Power7 or something else.

I do find it weird that IBM "corrected" themselves a year and 3 months after the original post.

It's not a Power7. At best, it's a derivative of it, but it was never going to be a Power7.
 
I'm just glad AMD managed to sweep up so many console CPU/GPUs... lately I've been worried about them since they are more or less being pushed out of the PC CPU industry.
 
I'm just glad AMD managed to sweep up so many console CPU/GPUs... lately I've been worried about them since they are more or less being pushed out of the PC CPU industry.

It's their own fault on the PC front. Their latest lineup of CPUs was so underwhelming, it's not even funny.
 
It's their own fault on the PC front. Their latest lineup of CPUs was so underwhelming, it's not even funny.

It's not their own fault on GPUs though. Their GPUs are great, and still nobody buys them because everybody is an Nvidia fanboy.

Sadly, winning all the consoles, while not a bad thing, won't help AMD much either. It's not a very lucrative field. They get a minor one-off payment and that's about it, basically, I think.

If winning the consoles was lucrative, AMD stock wouldn't be at historic lows right now.

I could see a danger of AMD going out of business or being bought up by a Qualcomm that doesn't care about PC GPUs sometime in the next gen. I'm sure the console makers don't care very much and have all the patents and agreements in place where it doesn't matter.

Biggest benefit to AMD, IMO, will be all the PC ports of console games next gen being based more on AMD architecture and theoretically running better on AMD PC cards. Theoretically, anyway. If it was that big a deal, though, I'm sure Nvidia wouldn't have allowed it to happen. Besides, within a couple of years both IHVs' GPUs will probably look quite different than what goes into the next-gen consoles anyway, lessening any benefit.
 
Yeah, this was the conclusion I was working from, or thereabouts.

So what are some estimates for WiFi power usage and 2GB of RAM (for either DDR3 or GDDR5)?

What does that leave for the GPU?

I'm guessing something similar to the wattages of AMD's E4690 and E6760:
~25-35W for the GPU + RAM,
and ~10-15W for the CPU.
WiFi I'm not sure, about 1W or less.
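
Adding those guesses up against the earlier ~56W ceiling (all of these per-component figures are speculation, not measurements; the intermittent line is my own rough placeholder):

budget_w = 56  # sustained ceiling estimated earlier in the thread
estimates_w = {
    "GPU + RAM": (25, 35),
    "CPU": (10, 15),
    "WiFi": (0.5, 1),
    "disc/USB/flash (intermittent)": (4, 8),  # placeholder, not sourced
}
low = sum(lo for lo, hi in estimates_w.values())
high = sum(hi for lo, hi in estimates_w.values())
print(f"~{low}-{high}W against a {budget_w}W ceiling")  # ~39.5-59W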
 
It's their own fault on the PC front. Their latest lineup of CPUs was so underwhelming, it's not even funny.
It's the money gap they have against Intel that's not even funny. Once upon a time they were able to compete because Intel made bad choices, but not now; they can't invest as much in R&D or in securing experimental tech like tri-gate transistors.

They're doing well enough, but they're just being played at this point; Intel even caps what their top CPUs can do, just so that if AMD manages to equal them, they only have to tweak some things and/or enable more cores.
 
It's the money gap they have against Intel that's not even funny. Once upon a time they were able to compete because Intel made bad choices, but not now; they can't invest as much in R&D or in securing experimental tech like tri-gate transistors.

While this is true, I think there's definitely a place for two CPU makers. AMD doesn't have to perform the best to survive. Hell, for most of their life they've rarely had the lead on Intel in CPU performance, yet they've been in business for a long time. There's a place for a low-cost, decent-performing Intel alternative. Hell, some people even claim Intel purposefully allows AMD to live to prevent antitrust actions.

Bulldozer was just a dog, though. Note I mentioned "decent" performance, and BD doesn't have that.

Anyway drifting OT...
 
Right, this guy has it down.
But I'd go further down than that. 56W with everything working at the same time, disc spinning, CPU & GPU full load, USBs loaded, Wifi, etc.

So, the typical draw is back to 40-45W with a game. Most of that is from a constant load on the CPU, GPU, RAM, and Wifi sending signal.
Everything else will be a small part of that, intermittent disc spin, intermittent game saves to internal flash, 1 or 2 USB devices but idling most of the time, and so on.

Exactly, it also aligns perfectly with what Iwata said about 40W typical in-game consumption.

Exclude USB, the ROM drive, WiFi, and tablet streaming, and the core Wii U components like the GPU and CPU will draw somewhere around 40W. Add in all the aforementioned (USB, ROM drive, etc.) and the Wii U could consume a peak of 75W. Realistic high-load power consumption is likely somewhere between 55-65W, depending on how efficient the PSU really is.

IMHO it's both possible and realistic that Nintendo would be going for a high-efficiency power supply. Nintendo love small form factors and reliability, and seem to be really pushing for power efficiency with the Wii U.

An 80%-or-higher-efficiency PSU is going to mean less wasted power, so better green credentials and lower power consumption for end users. It's also going to mean the PSU brick can be small and enclosed, as it won't be generating significant heat. A 65% PSU is going to generate more heat than one that is 80%. A more efficient PSU is also more reliable, as the components are more efficient and run cooler.

Also, people need to stop worrying about power consumption and how it relates to the Wii U's technical power. The Wii U by all accounts has a dedicated DSP, an I/O controller, and I believe even an ARM CPU for some OS functions. Not to mention IBM's modern Power-based CPUs are incredibly efficient.

The reason consoles like the Xbox 360 and PS3 have been such high draw is not just because they had very high-end parts for their time, but also because they went down a path where single components like the CPU did an awful lot of functions. The Xbox 360's CPU does I/O, sound, general processing, FPU tasks, SIMD tasks, etc. The Xbox 360's CPU is nowhere near as efficient as a dedicated DSP at doing sound, or as good as a dedicated I/O controller at handling I/O tasks. The Xbox 360 performs many tasks that a processor of significantly less wattage and clock speed could do, and do better. Nintendo have simply identified which tasks could be done better by a separate dedicated chip, and put it in their console.

A classic example would be the Creative X-Fi DSP. Do you think the Xbox 360's Xenon processor can do sound better than an X-Fi DSP? No, it can't. That's despite the Xenon processor being almost 10x higher clocked, featuring SMT, and having 3 cores. The Creative X-Fi DSP is designed for sound; that's why it's so freaking good at it. The X-Fi DSP would not only shit on the Xenon processor at sound, it does it at wattage so low it can be powered off an old PCI bus. So would a dedicated I/O controller of a few hundred megahertz shit on the Xbox 360's CPU at handling I/O requests, and again use a lot less power.

With the Wii U using lots of dedicated chips for specific tasks, that boosts efficiency and reduces power consumption. Sure, the Wii U won't be a powerhouse, but it does seem like Nintendo have invested significant engineering into ensuring the Wii U's hardware is as power efficient as possible, and that consumers are getting maximum performance per watt. If BG's suggestion of the Wii U's GPU being around 6570 performance / 600 GFLOPS is true, the Wii U would actually be quite an engineering achievement. Not because it's a powerhouse of a console, but because it delivers brilliant performance per watt.
 
Hey Bg, what do you think of that 8xxx GPU rumor?

It would be interesting to see if they switch to something like that if those specs are true. It would definitely increase the gap between PS4 and Wii U.

Can I ask where you got the FLOP info on the Dev kit GPU?
Or are you speculating it has 640ALUs and calculating it based on reasonable clocks?

Wsippel first got details a year ago.
 
Sadly, winning all the consoles, while not a bad thing, won't help AMD much either. It's not a very lucrative field. They get a minor one-off payment and that's about it, basically, I think.
Depends on the deal and if they manufacture the chip or not.

In the GameCube's case, I believe it was paid for and Nintendo controlled production to the point that they manufactured it at NEC factories. As for the X360 and PS3, I reckon these are manufactured by the original supplier (not necessarily the brand, but companies like TSMC), but later packaging could go through them, or they could be entitled to royalties nonetheless. I know Microsoft paid a fixed royalty (that never went down) on the original Xbox, which was poorly negotiated, and that they still pay royalties to Nvidia for every Xbox 360 due to the Xbox emulator on it needing some of their libraries.

I'm sure AMD getting to be a major supplier in consoles is a big deal, and a big money injection.
 
It's not their own fault on GPUs though. Their GPUs are great, and still nobody buys them because everybody is an Nvidia fanboy.

Sadly, winning all the consoles, while not a bad thing, won't help AMD much either. It's not a very lucrative field. They get a minor one-off payment and that's about it, basically, I think.

If winning the consoles was lucrative, AMD stock wouldn't be at historic lows right now.

I could see a danger of AMD going out of business or being bought up by a Qualcomm that doesn't care about PC GPUs sometime in the next gen. I'm sure the console makers don't care very much and have all the patents and agreements in place where it doesn't matter.

Biggest benefit to AMD, IMO, will be all the PC ports of console games next gen being based more on AMD architecture and theoretically running better on AMD PC cards. Theoretically, anyway. If it was that big a deal, though, I'm sure Nvidia wouldn't have allowed it to happen. Besides, within a couple of years both IHVs' GPUs will probably look quite different than what goes into the next-gen consoles anyway, lessening any benefit.

I'm not so sure. I don't think Nintendo is just licensing the tech; they are probably also renting out their foundries (probably their biggest asset right now).

As said elsewhere it's not AMD's fault they can't keep up with CPUs... They just do not have the funding. APUs are the way AMD sees to compete in the future simply because their on-board graphics will always smoke Intel's offerings... even if Intel's own offerings have improved quite a bit.
 
While this is true, I think there's definitely a place for two CPU makers. AMD doesn't have to perform the best to survive. Hell, for most of their life they've rarely had the lead on Intel in CPU performance, yet they've been in business for a long time. There's a place for a low-cost, decent-performing Intel alternative.
Indeed, the biggest market is the low-end one; always was, always will be.

But the open CPU market seems to value the tech being the best too much. It's like if you're buying an Atom/Celeron (let's call it a Fiat), you're still getting the lineage of an i7/Xeon (a Ferrari); people want the best manufacturer, not always the best bang for the buck. And then Intel has a big name in the market; having a big name doesn't mean they'll always have the edge, but AMD hasn't managed to crack that for a few years now.
Hell, some people even claim Intel purposefully allows AMD to live to prevent antitrust actions.
Perhaps they do; they certainly don't see them as a pebble in their shoe at this point.
Bulldozer was just a dog, though. Note I mentioned "decent" performance, and BD doesn't have that.
They're trying though.

Lots of things come out wrong in their first generation, but AMD's Fusion APUs are definitely not as bad as Atoms, for instance. They seem to be going for the low end, really.
 
How the hell do you make an error like that for a year? So stupid.
Do the people that deal with the Twitter accounts even have direct access to such info?

The people dealing with Twitter, Facebook, and all that crap are usually PR or the marketing team.

They might not even know the lineage now, or it might be out of bounds for them, so they're trying not to overstep it by claiming either that it isn't or that it is.
 
It would be interesting to see if they switch to something like that if those specs are true. It would definitely increase the gap between PS4 and Wii U.
Well, I believe it is a real possibility. But the performance achieved with the SI part they are aiming for should be around what has been stated before. After all, Sony is not one to understate the performance of their future hardware. Quite the contrary, I would say.
 
Do the people that deal with the Twitter accounts even have direct access to such info?

The people dealing with Twitter, Facebook, and all that crap are usually PR or the marketing team.

They might not even know the lineage now, or it might be out of bounds for them, so they're trying not to overstep it by claiming either that it isn't or that it is.
I generally assume the bold for companies - regardless of who the account is named as, unless it's a verified personal account.

As to your last sentence - they first claimed it was a "POWER7 chip" and are only now dialing it back to the vague "Power Architecture base" which tells us nothing that wasn't already known from the first press release.
 
Indeed, the biggest market is the low-end one; always was, always will be.

But the open CPU market seems to value the tech being the best too much. It's like if you're buying an Atom/Celeron (let's call it a Fiat), you're still getting the lineage of an i7/Xeon (a Ferrari); people want the best manufacturer, not always the best bang for the buck. And then Intel has a big name in the market; having a big name doesn't mean they'll always have the edge, but AMD hasn't managed to crack that for a few years now.
Perhaps they do; they certainly don't see them as a pebble in their shoe at this point.
They're trying though.

Lots of things come out wrong in their first generation, but AMD's Fusion APUs are definitely not as bad as Atoms, for instance. They seem to be going for the low end, really.

AMD has openly admitted they are done trying to compete on high-end CPUs. The white flag's been waved, so unless Intel fucks up and gets lazy again, Intel is the king.

I'm surprised IBM hasn't issued any earnings statements indicating the massive revenue stream loss that they are apparently going to endure from losing 2 of the 3 console makers.
 
I always wondered how the teams operate inside AMD when they are working in house for all three competitors.

Is it completely separate, like Samsung, whose component division has almost zero connection with the hardware department? From what I heard, they still have to bid like any other company for certain parts in Samsung phones.

Or is it one big building where a key card and rank can get you access and info on all three projects, where engineers on each team regularly interact, making corporate espionage much easier?

Not much to do with the current topic, I was just curious.
 
...
Also, people need to stop worrying about power consumption and how it relates to the Wii U's technical power. The Wii U by all accounts has a dedicated DSP, an I/O controller, and I believe even an ARM CPU for some OS functions. Not to mention IBM's modern Power-based CPUs are incredibly efficient...

I'm gonna have to disagree there. I think form factor and power consumption can help us speculate about what the performance is like under the hood. I know that alone won't be enough; it'll give us at least an idea, just nowhere near a clear picture. We need more info, like what process technology the GPU is being produced at.
 
Do you think that was just for PS4 or also Xbox 720?

Could be both though PS4 would definitely be the most likely of the two based on their original target specs.

Well, I believe it is a real possibility. But the performance achieved with the SI part they are aiming for should be around what has been stated before. After all, Sony is not one to understate the performance of their future hardware. Quite the contrary, I would say.

If those specs are true, those would be some serious efficiency improvements. Based on what a CU contains, the 8870 has 28 CUs when looking at the 1.1GHz clock, since that comes out to ~3.94TF, and 24 CUs for the 8850 @ 975MHz. I have felt PS4 would probably end up using Sea Islands, but my biggest question would be the prices listed.
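
Working backwards from those rumored numbers, assuming GCN-style CUs with 64 ALUs each (which is what counting FLOPS per CU implies), a quick sketch:

def cu_tflops(cus, clock_ghz, alus_per_cu=64):
    # peak FP32: CUs x 64 ALUs x 2 FLOPs/clock (multiply-add) x clock
    return cus * alus_per_cu * 2 * clock_ghz / 1000.0

print(cu_tflops(28, 1.100))  # ~3.94 TF -- the rumored 8870 figure at 1.1GHz
print(cu_tflops(24, 0.975))  # ~3.00 TF for the rumored 8850 at 975MHz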
 
Could be both though PS4 would definitely be the most likely of the two based on their original target specs.



If those specs are true, those would be some serious efficiency improvements. Based on what a CU contains, the 8870 has 28 CUs when looking at the 1.1GHz clock, since that comes out to ~3.94TF, and 24 CUs for the 8850 @ 975MHz. I have felt PS4 would probably end up using Sea Islands, but my biggest question would be the prices listed.
A Smaller SI for cheaper? :)
 
Could be both though PS4 would definitely be the most likely of the two based on their original target specs.

Ahh, if the PS4 does use the 8xxx GPU do you think that could hurt Wii U third party support or possibly help it? (sorry I'm asking all these questions lol)

Personally, I find the article not 100% legit, because it didn't sound like it was confirmed but more like speculation on the writer's part.
 
A Smaller SI for cheaper? :)

Compared to the 7950?

Ahh, if the PS4 does use the 8xxx GPU do you think that could hurt Wii U third party support or possibly help it? (sorry I'm asking all these questions lol)

Personally, I find the article not 100% legit, because it didn't sound like it was confirmed but more like speculation on the writer's part.

The jump was rather big in a direct comparison to the 7800 line (not saying it's impossible) and the 8870 based on those specs is an improved 7950. So I would be surprised to see it released at a price that is $170 cheaper than what the 7950 debuted at.

As for ports in that scenario, the market is still going to be the primary determinant IMO. And if PS4 and Xbox 3 are essentially "port buddies" then that should free up dev resources for a Wii U version of a game.
 
Compared to the 7950?



The jump was rather big in a direct comparison to the 7800 line (not saying it's impossible) and the 8870 based on those specs is an improved 7950. So I would be surprised to see it released at a price that is $170 cheaper than what the 7950 debuted at.

As for ports in that scenario, the market is still going to be the primary determinant IMO. And if PS4 and Xbox 3 are essentially "port buddies" then that should free up dev resources for a Wii U version of a game.

Thanks, I am just scared that devs will be like "Wii U isn't powerful enough to run this" like they did with the Wii... Should I be scared of that? lol.
 
Thanks, I am just scared that devs will be like "Wii U isn't powerful enough to run this" like they did with the Wii... Should I be scared of that? lol.

Not sure why you would be scared. But there would definitely be some for varying reasons saying that. Some may not have the resources to devote to a Wii U downport since it definitely wouldn't be a 1:1 port. Some may just be picky about how their game looks, which would be their right to do so.
 
Not sure why you would be scared. But there would definitely be some for varying reasons saying that. Some may not have the resources to devote to a Wii U downport since it definitely wouldn't be a 1:1 port. Some may just be picky about how their game looks, which would be their right to do so.
Or some may just want to make games which require more than 2005-level CPU performance for gameplay reasons, and thus be pretty much unportable.
 