Rumor: Wii U final specs

IBM eDRAM is designed for IBM processes, Cu45-HP and Cu32. GPUs typically use different processes (40nm or 28nm CMOS), which is what Renesas designed their eDRAM for. Renesas has a lot of experience with eDRAM; I really wouldn't worry about it.

Well yes, historically GPUs have been on 55 nm, 40 nm, 28 nm, etc. But is there any technological reason for a custom design to stick to those nodes? There are Trinity systems at 32nm, I believe. I was also misled by GF and IBM sharing the same 32/28 nm HKMG tech. IBM's eDRAM is made to be placed on-chip, but I guess I can see there still being some challenges coupling it with a non-IBM core design.

It sounds like much of the team had experience with the Wii and GCN previously, so much of the Wii U design was a natural extension of the previous two. I know Renesas and NEC are now one; I wonder if some of the old NEC people lent any ideas.
 
Am I drunk or would 32MB UX8GD eDRAM really only require 16mm^2!?

(256 * 1024^2 bits) * (0.06 µm² per cell) / (1000^2 µm² per mm²)

Something wrong with my formula?
Nope. If it's indeed 0.06 µm² per cell, then that amounts to ~16 mm².
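For what it's worth, that back-of-the-envelope checks out in a couple of lines (the 0.06 µm²/bit cell size is just the rumored UX8GD figure from above, not a confirmed spec):

```python
# 32 MB (256 Mbit) of eDRAM at the rumored 0.06 um^2-per-bit cell size
bits = 256 * 1024 ** 2                   # 256 Mbit
cell_um2 = 0.06                          # rumored UX8GD cell area, um^2 per bit
area_mm2 = bits * cell_um2 / 1000 ** 2   # 1 mm^2 = 1000^2 um^2
print(round(area_mm2, 1))                # -> 16.1
```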
 
But there's this:

http://forum.beyond3d.com/showpost.php?p=1672537&postcount=2845

A 28nm GPU of those dimensions would likely pack a whole lot of transistors. Doesn't seem realistic, imo.

500-700M transistors (not even counting the eDRAM) jibes so much better with what we've seen out of the machine to date than 1-1.4B would that it's not even funny.

What about this,
http://www.globalfoundries.com/technology/28nm.aspx

28nm transistors offer up to 60% higher performance than 40nm at comparable leakage with up to 50% lower energy per switch and 50% lower static power.

And this,
http://onlinebigtrends.com/tag/renesas-electronics-corp/

The key for foundries is to perfect that technology, and shrink the space between the transistors on a chip that make it work, so more of them can be fitted on. Currently, the thinnest TSMC and other top makers are able to produce at is the 28 nanometer (nm) level – meaning the gap between the transistors is several thousand times thinner than a human hair, and tens of millions of transistors can fit on the head of a pin.
 
Judging by how well console hardware is built and tested on average, I would say Nintendo's standards actually do seem kind of exacting, lol.

Yeah, I wasn't being ironic. Nintendo have a great reputation for solid hardware, and I believe their engineers are some of the best in the world when it comes to making robust, simple and cheap hardware. Looking at the Wii U design, there's absolutely nothing exotic or superfluous in there at all, unlike the PS3 and 360, where the internals are all over-engineered messes, with hundreds of custom components all crammed in there.

Because the Japanese are known for their laziness and poor work ethic. Especially those at Nintendo. All those guys probably play video games all day and get nothing done. Especially Iwata. He's clocked over 2,000 hours on Dota 2 alone.

I never suggested otherwise.

None of what they spoke of is "mundane". This is how product design works, I'm afraid. Small details can require a lot of work, and being so exacting in these apparently "mundane" tiny details can often yield the greatest rewards. Name me one great product design and I'll name a seemingly insignificant detail they spent a ton of time on.

I said "seemingly mundane". And sure, the details are important, but I think there are more important things to talk about regarding the Wii U before going into that sort of minutiae...

And I've no doubt you can tell me about the subtle design features of any product I care to mention, because pretty much all companies talk about those sorts of things these days. But I'd be more interested if you could identify where these products were lacking. When that guy was talking in this interview about how they put USB ports on the front of the console, changed the front flap to open inwards, and moved the sync button to the outside, it did make me wonder how much people actually notice these things. Were consumers frustrated by the lack of front USB ports, concealed sync button and the opening direction of the front flap in the Wii? Did people appreciate the front USB ports, unconcealed sync button and flap opening direction on the 360? It's not clear exactly where common sense ends and engineering genius begins...
 
Interesting read, thanks to Iwata Asks. Any transistor count estimates for the CPU?

The GPU is almost certainly not 40nm; 28nm is likely, considering the problems with manufacturing. We'll have to see what sort of performance the GPU has, though I'm pretty sure it's in the sub-TFLOPS range, or Epic would be targeting the console with UE4. By the same token, that hints at the XB3 being around 1 TFLOPS for the GPU (obviously on the plus side). This is an especially strong hint if you take into account that Epic was targeting multiple TFLOPS for UE4 and dropped it to just 1 TF after actually seeing all three consoles.

Sorry to go off topic, but I'm still much more interested in the CPU atm. We know the GPU's performance range is fairly big compared to current-gen consoles, but it will end up shy of other next-gen consoles.
 
If that's really true (skeptical), that's crazy. eDRAM should be a no-brainer in every next-gen console.
I was reading somewhere that using eDRAM would make little to no sense if they go with GDDR5 for video memory, at least. The bandwidth simply wouldn't be an issue anymore, so there would be no need to try to overcome it with the inclusion of eDRAM.
 
I was reading somewhere that using eDRAM would make little to no sense if they go with GDDR5 for video memory, at least. The bandwidth simply wouldn't be an issue anymore, so there would be no need to try to overcome it with the inclusion of eDRAM.

probably read it from me lol. i've been bringing it up repeatedly because some people seem to be pushing for gddr5 in the wii u. in which case it doesn't have 32mb edram. it makes no sense to have both.

ms and nintendo are likely using ddr3.

that's why durango can pack 8gb, it's cheap. gddr5 is expensive and also difficult to have over 4gb in any case.

so to me, packing a little edram in might give you a big ram advantage.

but i'm not a programmer so i can't say which approach is better.

even with e/d/sram, ddr3 may be a hindrance.
 
I was reading somewhere that using eDRAM would make little to no sense if they go with GDDR5 for video memory, at least. The bandwidth simply wouldn't be an issue anymore, so there would be no need to try to overcome it with the inclusion of eDRAM.
While GDDR5 helps a lot when it comes to BW, it makes the matter worse when it comes to latency. eDRAM helps both with BW and latency.


I really, really tried to understand all you guys wrote here, but it all just doesn't make sense to me :S Is it possible, with all the info from "Iwata Asks", to estimate the horsepower of the Wii U compared to the 360 and PS3?
Haha. No.

But it wipes the floor with those two in terms of efficiency.
 
I really, really tried to understand all you guys wrote here, but it all just doesn't make sense to me :S Is it possible, with all the info from "Iwata Asks", to estimate the horsepower of the Wii U compared to the 360 and PS3?
No. It's just neat inside design & manufacturing info.
 
Am I drunk or would 32MB UX8GD eDRAM really only require 16mm^2!?

(256 * 1024^2 bits) * (0.06 µm² per cell) / (1000^2 µm² per mm²)

Something wrong with my formula?

No. That looks right.

I think the 0.06 number is probably just the memory cell and doesn't include the overhead associated with actually being able to read and write to that memory. For some variations of 1T-SRAM, that could double the actual size.

Even if it did in this case, that's still only around ~32 mm² for 32MB, which is not bad at all. If that's true, then the GPU is almost certainly 40nm. With a 28nm GPU, you'd basically be looking at something like Cape Verde + 32MB eDRAM in the size of the pictured GPU.
unless it were Cape Verde running at 300 MHz
 
Interesting read, thanks to Iwata Asks. Any transistor count estimates for the CPU?

The GPU is almost certainly not 40nm; 28nm is likely, considering the problems with manufacturing. We'll have to see what sort of performance the GPU has, though I'm pretty sure it's in the sub-TFLOPS range, or Epic would be targeting the console with UE4. By the same token, that hints at the XB3 being around 1 TFLOPS for the GPU (obviously on the plus side). This is an especially strong hint if you take into account that Epic was targeting multiple TFLOPS for UE4 and dropped it to just 1 TF after actually seeing all three consoles.

Sorry to go off topic, but I'm still much more interested in the CPU atm. We know the GPU's performance range is fairly big compared to current-gen consoles, but it will end up shy of other next-gen consoles.

But they didn't say they had problems with yield, did they? I thought they were talking about the development of the console before manufacture.
 
No. That looks right.

I think the 0.06 number is probably just the memory cell and doesn't include the overhead associated with actually being able to read and write to that memory. For some variations of 1T-SRAM, that could double the actual size.

Even if it did in this case, that's still only around ~32 mm² for 32MB, which is not bad at all. If that's true, then the GPU is almost certainly 40nm. With a 28nm GPU, you'd basically be looking at something like Cape Verde + 32MB eDRAM in the size of the pictured GPU.

from b3d/entropy

http://forum.beyond3d.com/showpost.php?p=1672608&postcount=2868

http://forum.beyond3d.com/showpost.php?p=1672622&postcount=2873


The 40nm eDRAM density is slightly higher than IBM's on 45nm SOI (0.067 µm²). On IBM's process, that works out to 0.24 mm² finished 1 Mbit macros, or 32 MByte in 61 mm², so the same amount on UX8 might be 55 mm² or so, leaving on the order of 100 mm² for the rest of the GPU if initial size estimates are correct.


For whatever reason, bgassassin seems adamant that the GPU is made using finer lithography. I have no idea what he bases that on.

That calculation just takes the size of the basic element and multiplies it by 256 million. For instance, the cell needs to be connected..... The way it works out is that they design a macro that takes care of signalling, power supply and refresh, yada yada yada, which then serves as the basic building block. In IBM's case, the area per effective bit in the finished macro is 3.5 times higher than the basic structure. That's a huge difference, implying that the macro layout is actually more important than the basic cell size. I made a big assumption that this would be identical between companies and processes when I back-of-the-enveloped the corresponding size of 32MB 40nm eDRAM by Renesas. Use it for entertainment purposes only.

His numbers are close to what my estimates were, though, so I'm kinda proud of that. Or, I guess, the simplest of calculations yield reasonable results here lol.

I think his first post nails it. You're probably left with ~100mm^2 of a 40nm GPU left over, which equates to ~50mm^2 of a 28nm GPU. That explains why the games don't look spectacular, and why they didn't just use a Cape Verde (123mm^2 @ 28nm), etc. It's still going to beat an X360 even in my book.
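Entropy's macro-overhead reasoning can be retraced numerically. The 0.067 µm² cell and 0.24 mm² 1 Mbit macro figures come from his posts; scaling the result to Renesas' rumored 0.06 µm² cell is his (explicitly entertainment-only) assumption, kept here as such:

```python
# IBM 45nm SOI eDRAM: 0.067 um^2/bit basic cell; finished 1 Mbit macro = 0.24 mm^2
cell_um2 = 0.067
macro_mm2 = 0.24
macro_bits = 1024 ** 2

eff_um2_per_bit = macro_mm2 * 1000 ** 2 / macro_bits  # ~0.23 um^2/bit in a finished macro
overhead = eff_um2_per_bit / cell_um2                  # ~3.4x the basic cell (his "3.5 times")
ibm_32mb_mm2 = 256 * macro_mm2                         # 256 macros for 32 MB -> ~61 mm^2
renesas_32mb_mm2 = ibm_32mb_mm2 * 0.06 / 0.067         # scaled to the 0.06 um^2 cell -> ~55 mm^2
shrink = (40 / 28) ** 2                                # ~2x area ratio behind "100mm^2 @ 40nm ~ 50mm^2 @ 28nm"

print(round(overhead, 1), round(ibm_32mb_mm2, 1), round(renesas_32mb_mm2), round(shrink, 2))
```

The arithmetic lands on ~3.4x overhead, ~61 mm² on IBM's process and ~55 mm² on the assumed Renesas cell, matching the figures quoted above.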
 
More powerful machines mean better ports. Haven't seen strong evidence of that yet. The Wii U exclusives so far don't look above 360 level either.

BOPS2 at 1080p vs. 720p on the others? Sonic All-Star Racing 2 with more effects and five players? The Trine director putting in content that the PS360 can't run?

A console that is a little more powerful will not get good ports in its first months of life. Look, for example, at the ports of PS2 games to the 3DS.

There is more juice in the Wii U, enough that a developer who doesn't know much about the machine can port their game from the PS360 with better graphics.
 
That we have no solid information on Renesas eDRAM beyond 40nm has me leaning towards that process now. Hmmm. If that's the case, I wonder if the 600 MHz e4690 would be a better "base" to build from. It is based on R700 with a 320:32:8 core config. Seeing as I don't believe ROPs have been a bottleneck this gen, we've heard 1.5x general performance rumors, and there is still that nice group of 32 TMUs to feed off that on-chip eDRAM, this config might actually make some sense. The whole e4690 MCM with RAM was 25 watts at 55 nm...

Plus, if Matt is to be believed, there would be fewer general-purpose registers than on the e6760, but it's possible Nintendo beefed up the ones available. Maybe even incorporated eDRAM there...
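If that 320:32:8 R700 config were used, the peak shader throughput is easy to work out. Note the 600 MHz clock is the stock e4690 figure and purely an assumption for a Wii U part:

```python
alus = 320           # shader ALUs in the e4690's 320:32:8 config
flops_per_clock = 2  # one multiply-add (2 FLOPs) per ALU per cycle
clock_mhz = 600      # stock e4690 clock; an assumption for the Wii U
gflops = alus * flops_per_clock * clock_mhz / 1000
print(gflops)        # -> 384.0
```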
 
BOPS2 at 1080p vs. 720p on the others? Sonic All-Star Racing 2 with more effects and five players? The Trine director putting in content that the PS360 can't run?

A console that is a little more powerful will not get good ports in its first months of life. Look, for example, at the ports of PS2 games to the 3DS.

There is more juice in the Wii U, enough that a developer who doesn't know much about the machine can port their game from the PS360 with better graphics.

Call of Duty on the PS3/360 isn't even 720p; it's like 571p or something like that.
 
BOPS2 at 1080p vs. 720p on the others? Sonic All-Star Racing 2 with more effects and five players? The Trine director putting in content that the PS360 can't run?

A console that is a little more powerful will not get good ports in its first months of life. Look, for example, at the ports of PS2 games to the 3DS.

There is more juice in the Wii U, enough that a developer who doesn't know much about the machine can port their game from the PS360 with better graphics.

Can you link me to the source that states that Black Ops 2 runs at 1080p?

And no, a Treyarch PR guy calling it "full HD" is not a source
 
from b3d/entropy

http://forum.beyond3d.com/showpost.php?p=1672608&postcount=2868

http://forum.beyond3d.com/showpost.php?p=1672622&postcount=2873

His numbers are close to what my estimates were, though, so I'm kinda proud of that. Or, I guess, the simplest of calculations yield reasonable results here lol.

I think his first post nails it. You're probably left with ~100mm^2 of a 40nm GPU left over, which equates to ~50mm^2 of a 28nm GPU. That explains why the games don't look spectacular, and why they didn't just use a Cape Verde (123mm^2 @ 28nm), etc. It's still going to beat an X360 even in my book.

I have no doubt about that, I'm more interested in how it stacks up against next gen.
 
from b3d/entropy

http://forum.beyond3d.com/showpost.php?p=1672608&postcount=2868

http://forum.beyond3d.com/showpost.php?p=1672622&postcount=2873

His numbers are close to what my estimates were, though, so I'm kinda proud of that. Or, I guess, the simplest of calculations yield reasonable results here lol.

I think his first post nails it. You're probably left with ~100mm^2 of a 40nm GPU left over, which equates to ~50mm^2 of a 28nm GPU. That explains why the games don't look spectacular, and why they didn't just use a Cape Verde (123mm^2 @ 28nm), etc. It's still going to beat an X360 even in my book.

I can't believe I'm saying this, but I think I agree with you. I'm hopping on the 384 GFLOPS bandwagon.
 
from b3d/entropy

http://forum.beyond3d.com/showpost.php?p=1672608&postcount=2868

http://forum.beyond3d.com/showpost.php?p=1672622&postcount=2873

His numbers are close to what my estimates were, though, so I'm kinda proud of that. Or, I guess, the simplest of calculations yield reasonable results here lol.

I think his first post nails it. You're probably left with ~100mm^2 of a 40nm GPU left over, which equates to ~50mm^2 of a 28nm GPU. That explains why the games don't look spectacular, and why they didn't just use a Cape Verde (123mm^2 @ 28nm), etc. It's still going to beat an X360 even in my book.

So what does all this mean?

No 580GFLOP machine? :(
The Wii U being 2x more powerful than current-gen is confirmed, then.

When was this confirmed? BG keeps saying it's 580GFLOP.
 
I have no doubt about that, I'm more interested in how it stacks up against next gen.
It probably doesn't. This looks like a machine designed to match or slightly exceed current consoles while having a much lower power usage. Everything rumored about the next PS and Xbox seems to indicate that they're shooting for similarly high power usage to what they had at launch with previous consoles.

Gemüsepizza;43093786 said:
2x more powerful? I thought the Xbox 360 was at 250 GFLOPS.
And the PS3 GPU was rated at something like 500 GFLOPS, IIRC. The number itself is meaningless, as it depends on how it's counted and what the GPU does with it.
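As an illustration of how much the counting convention moves these numbers around, here is a sketch under the usual assumptions for Xenos (48 ALUs, each vec4+scalar, one multiply-add per lane per clock, 500 MHz); Sony's launch-era figures for RSX famously also counted fixed-function texture and filtering ops, which aren't modeled here:

```python
def shader_gflops(alus, lanes, flops_per_lane_per_clock, clock_mhz):
    """Peak programmable-shader GFLOPS = ALUs x lane width x FLOPs/lane/clock x clock."""
    return alus * lanes * flops_per_lane_per_clock * clock_mhz / 1000

# Xbox 360 Xenos under the MADD-only counting convention
print(shader_gflops(48, 5, 2, 500))  # -> 240.0
```

Change what you count (add texture ops, count the scalar lane differently) and the same chip lands on a very different headline number, which is the point being made above.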
 
No 580GFLOP machine? :(
The Wii U being 2x more powerful than current-gen is confirmed, then.

It may be a truth we have to accept. And the eDRAM being on-die is still huge. Of course this is just my non-expert assessment, but 28nm seems like a long shot. The config I mentioned a couple posts back makes more sense balance-wise, taking into account the CPU and DDR3 bandwidth. And in my opinion, having 32 TMUs makes sense for Wii BC as long as they are hooked up to 32 macros of eDRAM, which they presumably are.
 
It probably doesn't. This looks like a machine designed to match or slightly exceed current consoles while having a much lower power usage. Everything rumored about the next PS and Xbox seems to indicate that they're shooting for similarly high power usage to what they had at launch with previous consoles.


And the PS3 GPU was rated at something like 500 GFLOPS, IIRC. The number itself is meaningless, as it depends on how it's counted and what the GPU does with it.

This was the number Tim Sweeney used during an Unreal Engine talk. I know that Sony/MS tend to inflate their numbers for marketing purposes, but he seemed to know what he was talking about.
 
The guy speaking in the Wii U conference said it, and other people after that said it as well.

Are you actively trying to be as obtuse as possible? We all know it's going to be the exact same version as on the HD twins until they release a statement otherwise.
You don't think they would proudly plaster that little tidbit of info everywhere if it were actually true?

By all accounts, it looks the same and it plays the same. Why even perpetuate this nonsense? It'll just string people along on false promises.
 
We are now on the sub 400 GFLOP/s possibility? My oh my...

That's obviously the worst case. I am sure people can make better statements when we get the exact production process, but as Nintendo won't tell, we will probably have to wait until someone gets their hands on the chip to look inside.


Still an improvement over the less-than-720p with only one screen on the PS360.

I am not so sure it is an improvement in a fast-paced first-person shooter like Call of Duty, when you have to take your eyes off the screen.
 
Gemüsepizza;43095102 said:
I am not so sure it is an improvement in a fast-paced first-person shooter like Call of Duty, when you have to take your eyes off the screen.

But wouldn't it actually be better with two screens because there are moments in the games already where you lose all visuals around your character such as when driving the RC car or bringing up the class selection? Now you can do those things AND still keep an eye out around you.
 
Gemüsepizza;43095102 said:
That's obviously the worst case. I am sure people can make better statements when we get the exact production process, but as Nintendo won't tell, we probably will have to wait until someone gets his hands on the chip to look inside.

Yes, but it's a real possibility, something that I sincerely didn't want to see (didn't even think of) in a new home console, even from Nintendo... but I guess it was my fault for expecting something.
That guy said it, but that doesn't mean it's confirmed; BG is saying 580 GFLOPS.
In fact, I didn't say "sub 400 GFLOP/s period" but "sub 400 GFLOP/s possibility".
 
Yes, but it's a real possibility, something that I sincerely didn't want to see (didn't even think of) in a new home console, even from Nintendo... but I guess it was my fault for expecting something.

In fact, I didn't say "sub 400 GFLOP/s period" but "sub 400 GFLOP/s possibility".

Yeah, I'm just saying people shouldn't believe everything on the internet without facts. I mean, I remember when a lot of people on GAF said "The Wii U won't have a GPGPU, it's just a Nintendo fanboy dream" and then look what happened. That's why I don't believe it until the facts come out.
 