Wii U Speculation Thread of Brains Beware: Wii U Re-Unveiling At E3 2012

Power7 is an entirely SMP design (massive, at that). That bit of info smells of an OS-dedicated core (wasteful, but still not entirely unheard of).
You sure? I'd assume that core 0 handles the OS and main game logic. Also, Nintendo loves their dedicated silicon, so I'd expect a rather powerful programmable I/O processor doing all the low-level stuff (in fact, that one's pretty much confirmed by some former AMD guys). Video encoding would probably use dedicated logic as well, and knowing Nintendo, I wouldn't be surprised to see a dedicated audio DSP again (which would make a far bigger difference than most people would think, as you certainly know). Most of that stuff wouldn't even need to be in current devkits, as long as developers don't try to code too close to the metal. And I think I heard a while ago that Nintendo at the very least discourages that anyway, if they even allow it at all. If you have to go through their APIs, they can change a lot of things at any time, and developers don't really need exact specs in many cases. As long as available performance goes up, everything should be just fine.
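To illustrate that last point about APIs (purely a sketch, nothing to do with Nintendo's actual SDK; every name here is made up): if games only talk to the audio hardware through an opaque interface, the platform holder can swap the backend, say a software mixer in early devkits for a dedicated DSP in final hardware, without any game code changing.

Code:
/* Illustrative only: hypothetical audio API, not any real console SDK. */
#include <stdio.h>

/* Opaque interface the game codes against. */
typedef struct {
    const char *name;
    void (*submit_samples)(const float *samples, int count);
} AudioBackend;

/* Backend A: plain software mixing on a CPU core. */
static void sw_submit(const float *samples, int count) {
    (void)samples;
    printf("software mixer: mixed %d samples on the CPU\n", count);
}

/* Backend B: hand the buffer to a (hypothetical) dedicated audio DSP. */
static void dsp_submit(const float *samples, int count) {
    (void)samples;
    printf("audio DSP: queued %d samples for offload\n", count);
}

static const AudioBackend backends[] = {
    { "software", sw_submit },
    { "dsp",      dsp_submit },
};

int main(void) {
    float buf[256] = {0};
    /* The platform picks the implementation; the game never changes. */
    const AudioBackend *audio = &backends[1];
    printf("%s backend active\n", audio->name);
    audio->submit_samples(buf, 256);
    return 0;
}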
 
Do you know or remember what gave the PS2 that second wind?
That second wind, going by the chart, was the 24th quarter, or somewhere around 2006. 2005-2006 had a lot of the best PS2 games: God of War, Shadow of the Colossus, Bully, Black, Final Fantasy XII, Kingdom Hearts II, Okami, Gran Turismo 4. As well, when the PS3 came out in 2006, the PS2 got a price drop, and a lot of people wanting a PS3 for Christmas got a PS2 due to the high price and hard to find cheaper units (Sony didn't make very many 20GB systems at all).

(PSTwo came out in 2004)
 
Back in April:

IGN spoke to sources close to the company, who revealed that the new console, currently known only as Project Cafe, will be built around AMD's R700 GPU architecture.

The sources also said that rumours about the console's graphical power were true, with Project Cafe set to outperform even the PlayStation 3's NVIDIA 7800GTX-based processor.

Project Cafe will output at 1080p, with stereoscopic 3D an unconfirmed possibility, according to the sources.

The CPU for Project Cafe is said to be a custom-made triple-core IBM PowerPC chipset which is fundamentally similar to the Xbox 360's CPU, albeit with faster clock speeds.

Foxconn is said to be manufacturing the console, which will be roughly similar in size to the Xbox 360 and may be designed similarly to the classic NES. Like Nintendo, Foxconn has yet to comment on any aspect of the leak.

http://www.bit-tech.net/news/gaming/2011/04/25/project-cafe-system-specs-leaked/1

If we believe it's a triple-core IBM CPU (not confirmed by IBM), then we must believe it's based on a PowerPC chipset, built around the AMD R700, right?

How valid do you all think this rumor was? Is there anything stated to invalidate what was described?
 
I just hope after the holiday season dies down, Nintendo throws us peasants a bone.
We can only hope. And even if they do, will it be anything relating to specs? :/

Actually, I'd take anything, even a photo of the screws used to hold it together.
 
Back in April:



http://www.bit-tech.net/news/gaming/2011/04/25/project-cafe-system-specs-leaked/1

If we believe it's a triple-core IBM CPU (not confirmed by IBM), then we must believe it's based on a PowerPC chipset, built around the AMD R700, right?

How valid do you all think this rumor was? Is there anything stated to invalidate what was described?

An off-the-shelf R700 in early dev kits has been the prevalent rumor almost since day 1. Quite a few sources have reported the same thing, iirc.
 
That second wind, going by the chart, was the 24th quarter, or somewhere around 2006. 2005-2006 had a lot of the best PS2 games: God of War, Shadow of the Colossus, Bully, Black, Final Fantasy XII, Kingdom Hearts II, Okami, Gran Turismo 4. As well, when the PS3 came out in 2006, the PS2 got a price drop, and a lot of people wanting a PS3 for Christmas got a PS2 due to the high price and hard to find cheaper units (Sony didn't make very many 20GB systems at all).

(PSTwo came out in 2004)
March 2000 original launch puts the Slim launch of late 2004 in quarter 19--the low point. I think it was a combination of clearing out the old model and not-great initial supplies of the new model that caused the apparent dip/rebound. If memory serves, PS2 even missed hitting a million in December in the US that year.

EDIT: Also FWIW, I toyed with adding some other systems in with the PS2/Wii comparison, but there's no decent quarterly data for PS1's first few years, the DS has a HUGE arc that dominates both, and nothing else really compares.
 
I just hope after the holiday season dies down, Nintendo throws us peasants a bone.

Game announcements or hell even teasers that are just announcements of announcements would be all I'd need. THAT should show how info starved I am.

That being said, I just realized something. Would this info about 1 core being entirely dedicated to OS shenanigans imply that the system would only use 2 cores/4 threads total for gaming applications? If so, is this a handicap or no? I know POWER7 is a beast even though the specifics are all way above my head (I'm better with understanding GPUs at a "better than casual" level), but something about only having that much to work with sits weirdly with me.
 
I gotcha DC. I'm used to those responses being a "corrective" type of response when a person doesn't mention that they agree or something.

Bgassassin, can you explain why you think this and why you think Nintendo would do it? Just cost/heat or does it offer some magic mojo enhancing capabilities?

Asymmetrical cache. Guess it just sounds weird that the three cores would be the same except for one having more cache and doing more work, so to speak. But since we're trying to guess something like point E with info probably from point A, and given Nintendo's slightly eccentric taste in hardware, it's tough to get a proper grasp on what's going on in their heads.

That second wind, going by the chart, was the 24th quarter, or somewhere around 2006. 2005-2006 had a lot of the best PS2 games: God of War, Shadow of the Colossus, Bully, Black, Final Fantasy XII, Kingdom Hearts II, Okami, Gran Turismo 4. As well, when the PS3 came out in 2006, the PS2 got a price drop, and a lot of people wanting a PS3 for Christmas got a PS2 due to the high price and hard to find cheaper units (Sony didn't make very many 20GB systems at all).

(PSTwo came out in 2004)

I can believe that. Thanks.
 
I just did some calculations and I think with 12 VLIW4 clusters (768 shader units) the chip's die should be around 90mm² @28nm + 33mm² for 32MB 1T-SRAM (~15mm² if they use 1T-SRAM-Q)

I think this is small enough (about the size of the Apple A5 SoC) for mass production even on a not yet mature process (which will get better over time)
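For what it's worth, here's that back-of-the-envelope math in runnable form, using only the figures quoted above (the ~122 mm² figure for the 45nm Apple A5 is the commonly cited one; everything else is the estimate from the post, not a measured value):

Code:
/* Back-of-the-envelope die-area check using the figures from the post above.
 * All inputs are rough estimates, not measured values. */
#include <stdio.h>

int main(void) {
    const double gpu_logic_mm2 = 90.0;  /* 12 VLIW4 clusters (768 shaders) @ 28 nm, per the post */
    const double sram_mm2      = 33.0;  /* 32 MB 1T-SRAM, per the post      */
    const double sram_q_mm2    = 15.0;  /* 32 MB 1T-SRAM-Q (denser variant) */
    const double apple_a5_mm2  = 122.0; /* 45 nm Apple A5, commonly cited figure */

    printf("GPU + 1T-SRAM:   %.0f mm^2\n", gpu_logic_mm2 + sram_mm2);
    printf("GPU + 1T-SRAM-Q: %.0f mm^2\n", gpu_logic_mm2 + sram_q_mm2);
    printf("Apple A5 (45nm): %.0f mm^2 for comparison\n", apple_a5_mm2);
    return 0;
}

Either way it lands in roughly the same ballpark as the A5, which is the point being made above.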
 
Power7 is an entirely SMP design (massive, at that). That bit of info smells of an OS-dedicated core (wasteful, but still not entirely unheard of).

Interesting. Maybe Nintendo are gonna beef up the OS a bit. I'd like to be pleasantly surprised as we all were when we first glimpsed the Wii OS and its channels. Still holding out on the idea of a channel where I can just observe my friends play their individual games.

So, would the one core that handles the OS and acts as an admin be less beefy than the other two? Then the two stronger cores would be reserved for pure number crunching? This is quite the twist...
 
Besides the Nintendo faithful, just how many "traditional" gamers do you think only owned a Wii?
My calculations came out somewhere between 0 and 0.5.

Interesting. Maybe Nintendo are gonna beef up the OS a bit. I'd like to be pleasantly surprised as we all were when we first glimpsed the Wii OS and its channels. Still holding out on the idea of a channel where I can just observe my friends play their individual games.

So, would the one core that handles the OS and acts as an admin be less beefy than the other two? Then the two stronger cores would be reserved for pure number crunching? This is quite the twist...
I remember that rumour/statement/whatever it was about watching others play on the channels. I always thought that would be awesome - albeit unlikely.

Asymmetrical cache. Guess it just sounds weird that the three cores would be the same except for one having more cache and doing more work, so to speak. But since we're trying to guess something like point E with info probably from point A, and given Nintendo's slightly eccentric taste in hardware, it's tough to get a proper grasp on what's going on in their heads.
Sure, but I guess if they expect it to run double duty as an OS core, they might need to do that.
 
Back in April:



http://www.bit-tech.net/news/gaming/2011/04/25/project-cafe-system-specs-leaked/1

If we believe it's a triple-core IBM CPU (not confirmed by IBM), then we must believe it's based on a PowerPC chipset, built around the AMD R700, right?

How valid do you all think this rumor was? Is there anything stated to invalidate what was described?

Seems like there was a speck of truth in there that got lost amongst all the static. We know it doesn't look anything like an NES and is much smaller than a 360. We also know it uses POWER7 technology in some capacity, which is not very much like Xenon. About the only thing they seemingly got right was tri-core. So, in a nutshell, I don't think the bulk of that rumor was valid.
 
Interesting. Maybe Nintendo are gonna beef up the OS a bit. I'd like to be pleasantly surprised as we all were when we first glimpsed the Wii OS and its channels. Still holding out on the idea of a channel where I can just observe my friends play their individual games.

So, would the one core that handles the OS and acts as an admin be less beefy than the other two? Then the two stronger cores would be reserved for pure number crunching? This is quite the twist...

Could this also have anything to do with Nintendo's online plan?
Creating a decentralized "peer-to-peer" network using the Wii U?
 
So, would the one core that handles the OS and acts as an admin be less beefy than the other two? Then the two stronger cores would be reserved for pure number crunching? This is quite the twist...
That was my original reaction, but what some other guys have been suggesting makes more sense: this OS-running core could be a 'vanilla' core just with "a tad" more cache, and that would make it suitable for being the OS core as well as also running some game code. The key is in the extra cache - one of the main performance hurdles when running entirely different processes (e.g. an OS and some game) on the same physical core is how such processes tend to screw each other's cache states, also known as cache thrashing. Giving this "hetero-software" core some extra cache (and using a bit of cache partitioning - something a Power7 core might do natively) could solve the cache thrashing problem, and get this core to perform roughly on par with the 'dedicated game' cores.
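A toy way to see the effect being described (generic C, nothing Power7-specific; the 2 MB "cache budget" is just an assumed number): pointer-chase through a working set that fits that budget, then through one twice as large, as if another process had claimed half the cache. The second run should be noticeably slower, and that latency cliff is exactly what extra cache plus partitioning is meant to avoid. On a machine with a larger last-level cache, bump both sizes up to see the same gap.

Code:
/* Toy illustration of cache thrashing, not a model of any console CPU. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Follow a shuffled ring of indices: every hop is a dependent load. */
static double chase(const size_t *next, size_t steps) {
    clock_t t0 = clock();
    size_t i = 0;
    for (size_t s = 0; s < steps; s++)
        i = next[i];
    clock_t t1 = clock();
    if (i == (size_t)-1) puts("");   /* keep 'i' live so the loop isn't optimized away */
    return (double)(t1 - t0) / CLOCKS_PER_SEC;
}

/* Build one big cycle visiting all n slots in random order. */
static size_t *make_ring(size_t n) {
    size_t *idx  = malloc(n * sizeof *idx);
    size_t *next = malloc(n * sizeof *next);
    for (size_t i = 0; i < n; i++) idx[i] = i;
    for (size_t i = n - 1; i > 0; i--) {          /* Fisher-Yates shuffle */
        size_t j = (size_t)rand() % (i + 1);
        size_t t = idx[i]; idx[i] = idx[j]; idx[j] = t;
    }
    for (size_t i = 0; i < n; i++)                /* link shuffled order into a ring */
        next[idx[i]] = idx[(i + 1) % n];
    free(idx);
    return next;
}

int main(void) {
    const size_t steps  = 20 * 1000 * 1000;
    const size_t fits   = (2u << 20) / sizeof(size_t);  /* ~2 MB: assumed cache budget */
    const size_t spills = 2 * fits;                     /* ~4 MB: overflows that budget */

    size_t *a = make_ring(fits);
    size_t *b = make_ring(spills);
    printf("working set fits cache:   %.3f s\n", chase(a, steps));
    printf("working set spills cache: %.3f s\n", chase(b, steps));
    free(a); free(b);
    return 0;
}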
 
Interesting. Maybe Nintendo are gonna beef up the OS a bit. I'd like to be pleasantly surprised as we all were when we first glimpsed the Wii OS and its channels. Still holding out on the idea of a channel where I can just observe my friends play their individual games.

So, would the one core that handles the OS and acts as an admin be less beefy than the other two? Then the two stronger cores would be reserved for pure number crunching? This is quite the twist...

I agree, I hope they build on the Channel idea, although I think they won't.
 
That's true. And I believe the GPU won't be 28nm, but then it won't be new and expensive for the lifetime of the console.

Is it worth eating the costs on the first 5-10 million units sold in the first year until the process matures, so that your console has more longevity? They're probably aiming to sell 75+ million WiiUs over the next 6-8 years, so over the life of the console it might make sense to take an upfront hit.

It would, but this is Nintendo we're talking about. No way they'll take the hit. Besides, if they were, they'd probably go with a 7000-series part. If that were the case, they'd run into an issue in that devs would not be able to use the code they're currently using and would have to port everything again from scratch. We're definitely looking at a 40nm part at best.
 
I think the amount of decline is sometimes exaggerated--it seems like a bigger difference since the PS3 and X360 have had the really unusual event of very late increases.
Thanks for the effort but I wasn't talking only about hardware sales or sales as a whole.

Let's talk about THE market leader.

The PS2 just kept getting all the attention, the big hardware sales (always the top selling console in each territory), the amazing software sales (same), the massive third party support and all those exclusives. It was a juggernaut.

Hell, even if we only take the sales it got during the current generation into account, the PS2 would be neck and neck with both the PS3 and the 360 in hardware sales. Think about that.

Previous market leaders didn't reach those levels but they were still big.

But in the case of the Wii things seem different; at first it got tons of attention and sales thanks to how different and social-gaming-oriented it was, but at some point all the attention, all the hype and the support just disappeared; it evaporated.

Nowadays the Wii isn't the top selling console anymore, the PS3 is the top selling console worldwide despite its far higher price point, MS is getting all the attention in the US thanks to Kinect, and the third party support is lower than ever (and it was already bad to begin with).

That has never happened to the market leader before; it's unprecedented. It's like the system is going downhill with no brakes.
 
The extra cache on the main core unlocks the 4th D.

Kutaragi involvement confirmed!

I just did some calculations and I think with 12 VLIW4 clusters (768 shader units) the chip's die should be around 90mm² @28nm + 33mm² for 32MB 1T-SRAM (~15mm² if they use 1T-SRAM-Q)

I think this is small enough (about the size of the Apple A5 SoC) for mass production even on a not yet mature process (which will get better over time)

Nice calculation. I would just add that AMD had other plans for the switch to VLIW4, but since 28nm wasn't ready when Cayman launched they left some stuff out (they didn't say what) to avoid making the die even bigger. So if those things that were left out can work for a console I could see them making a return... unless Nintendo throws a complete curveball and has a GPU with 8-12 compute units. :P

That was my original reaction, but what some other guys have been suggesting makes more sense: this OS-running core could be a 'vanilla' core just with "a tad" more cache, and that would make it suitable for being the OS core as well as also running some game code. The key is in the extra cache - one of the main performance hurdles when running entirely different processes (e.g. an OS and some game) on the same physical core is how such processes tend to screw each other's cache states, also known as cache thrashing. Giving this "hetero-software" core some extra cache (and using a bit of cache partitioning - something a Power7 core might do natively) could solve the cache thrashing problem, and get this core to perform roughly on par with the 'dedicated game' cores.

So coding extraordinaire, does a POWER7 core with two threads dedicated to OS and two to gaming sound plausible? And by plausible I mean splitting the threads in that manner.

And the latter part of your post made me think of an article linked to in a GC webpage I mentioned awhile back. Surprisingly the link is still alive.

http://www.eetimes.com/electronics-news/4166704/GameCube-clears-path-for-game-developers

To improve the internal data flow, IBM tried to eliminate "cache trashing," or wasting cache space on transient data. The 256-Kbit Level-2 cache can be locked down so that it retains only the data that needs to be reused. There's also an internal direct memory access that moves data from the cache while allowing the device to process a different set of data. This mechanism helps mitigate the incremental latency associated with compressing and decompressing the data.
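The pattern that article describes, compute on one block while the next one streams in, looks roughly like the sketch below. It's only illustrative: dma_fetch() here is a plain memcpy stand-in for a hardware DMA engine that would really run asynchronously, and none of this is actual GameCube or Wii U SDK code.

Code:
/* Sketch of double buffering between main memory and a "locked cache":
 * while the CPU works on one block, the next block is streamed in. */
#include <stdio.h>
#include <string.h>

#define BLOCK 1024

/* Stand-in for an asynchronous DMA engine filling a locked-cache buffer. */
static void dma_fetch(float *dst, const float *src, size_t count) {
    memcpy(dst, src, count * sizeof *dst);
}

static float process(const float *block, size_t count) {
    float sum = 0.0f;
    for (size_t i = 0; i < count; i++)
        sum += block[i] * block[i];
    return sum;
}

int main(void) {
    static float main_memory[8 * BLOCK];   /* data living in main RAM         */
    static float locked[2][BLOCK];         /* two buffers in "locked cache"   */
    for (size_t i = 0; i < 8 * BLOCK; i++)
        main_memory[i] = (float)i * 0.001f;

    float total = 0.0f;
    int cur = 0;
    dma_fetch(locked[cur], main_memory, BLOCK);            /* prime buffer 0   */
    for (size_t b = 0; b < 8; b++) {
        if (b + 1 < 8)                                     /* prefetch next block */
            dma_fetch(locked[cur ^ 1], main_memory + (b + 1) * BLOCK, BLOCK);
        total += process(locked[cur], BLOCK);              /* compute current one */
        cur ^= 1;                                          /* swap buffers        */
    }
    printf("sum of squares: %f\n", total);
    return 0;
}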

Ah. Okay. I should have known! lol

No L3 cache?

I've gotten multiple indications that it's about 3MB of L2 cache (in the first kit) so I'm done with 16MB for now. And from my "reeducation" these past few months it would seem that L3 wouldn't be that necessary for a console.
 
Alright, with this latest info, I think we can begin to sift through the rumors and paint a picture of what Wii U is gonna pack.

Apparently, I missed the extent of this announcement back in June.
http://www-03.ibm.com/press/us/en/pressrelease/34683.wss
IBM plans to produce millions of chips for Nintendo featuring IBM Silicon on Insulator (SOI) technology at 45 nanometers (45 billionths of a meter). The custom-designed chips will be made at IBM's state-of-the-art 300mm semiconductor development and manufacturing facility in East Fishkill, N.Y.

Here's some specs I've drawn up.

CPU:
-3.5 GHz tri-core POWER7
-1 POWER7 core w/ 4-way SMT and 512 KB L2
-2 streamlined cores with 2-way SMT and 256 KB L2
-16 MB embedded DRAM

As for the GPU, more explanation is probably needed. I figure the case size may grow slightly and gain some more vents, but it's definitely in the ballpark of what Nintendo are gunning for. Rumors also keep popping up of dev kits using 770LEs and having 640 SPUs at 500 MHz. Now, we also heard of heating problems early on. I figure Nintendo chose that Radeon card for dev kits because it is the closest AMD product to the specs they are gunning for EXCEPT the process. It seems risky but I'm gonna go out on a limb and say that Nintendo are pursuing a 28nm GPU. It seems the only realistic way of getting the chip to run cool enough.

I also think that the rumors of Wii U having a SoC were in reference to the GPU, and much like Hollywood, it will include an audio DSP, an ARM core of necessary strength, and its own chunk of 1T-SRAM. I find it odd that there has been no announcement on any of that thus far, but Nintendo doesn't seem to have focused much on the graphics for most of those Wii U "experiences", so maybe we'll get more on that next year as real games are ready to be shown. Anyway...

GPU Package:
-28nm SoC
-600 MHz GPU w/ 640 SPUs
-integrated audio DSP
-"Starlet 2"
-16 MB embedded 1T-SRAM for frame buffer

Oh yes, and 1 GB GDDR3 unified memory architecture. Hopefully they up that a bit.
 
Maybe interesting:

http://www.theverge.com/hd/2011/11/23/2581978/lenovo-q180-htpc-worlds-smallest-desktop

This HTPC is only half the size (volume) of the Wii and uses a 2.13 GHz Atom (10W TDP) and a Radeon HD 6450 (27W TDP). These components alone use twice the power of the whole Wii.
The Wii U's volume is about 3x that of the Lenovo, so I expect it to be at least in the 50-60W TDP league.


It would, but this is Nintendo we're talking about. No way they'll take the hit. Besides, if they were, they'd probably go with a 7000-series part. If that were the case, they'd run into an issue in that devs would not be able to use the code they're currently using and would have to port everything again from scratch. We're definitely looking at a 40nm part at best.

Why, because of VLIW4? I am in no way a programmer but I think you are exaggerating here.
 
Maybe interesting:

http://www.theverge.com/hd/2011/11/23/2581978/lenovo-q180-htpc-worlds-smallest-desktop

This HTPC is only half the size (volume) of the Wii and uses a 2.13 GHz Atom (10W TDP) and a Radeon HD 6450 (27W TDP). These components alone use twice the power of the whole Wii.
The Wii U's volume is about 3x that of the Lenovo, so I expect it to be at least in the 50-60W TDP league.

Of course, you're not going to be playing Battlefield 3 on a rig like this... and prices begin at $349 online.

For the basic model.
 
GPU Package:
-28nm SoC
-600 MHz GPU w/ 640 SPUs
-integrated audio DSP
-"Starlet 2"
-16 MB embedded 1T-SRAM for frame buffer
Although the CPU architecture is weird, I think blu's theory makes a lot of sense. The GPU is really puzzling me though. lherre pointed out that the Wii U should be released within the year, so you'd think the GPU would be near final. That's why I'm wondering why there still seems to be an RV770LE included in the devkits.

Although it's a good GPU that would send the 360 and PS3 crying to their moms, I can't wrap my head around why Nintendo would even consider putting it in the final unit. First of all, it's an outdated chip on a 55 nm process. To fit it into the power requirements, as well as fit in the brain_stew-confirmed EDRAM, you would presume they customize it and shrink it down. Why they would choose to customize a chip that has faulty shader units, and therefore ship a more complex chip than necessary, is beyond me. What's more is that Nintendo doesn't have to choose a 2008 chip. 2008 chips are actually designed in 2006 - the Southern Islands architecture AMD is releasing in the coming months actually should have been finished at least halfway through last year. Even if Nintendo is being conservative, they still could have picked the Northern Islands (HD6xxx) architecture as the foundation for their chip. That one was on the drawing boards in 2009.

So basically, it would make absolutely no sense to use the RV770LE. But if it's still used in the devkits, what's going on?
Maybe interesting:

http://www.theverge.com/hd/2011/11/23/2581978/lenovo-q180-htpc-worlds-smallest-desktop

This HTPC is only half the size (volume) of the Wii and uses a 2.13 GHz Atom (10W TDP) and a Radeon HD 6450 (27W TDP). These components alone use twice the power of the whole Wii.
The Wii U's volume is about 3x that of the Lenovo, so I expect it to be at least in the 50-60W TDP league.
The ability to dissipate heat actually increases not with volume, but with surface area, which should be even greater (?). So yeah, I think the Wii U has enough room for decent components. EDIT: The HD 6450 has a 27W TDP only in the form of a PCI-e card. This chip should come in a more compact and embedded form factor. The power draw of the chip itself is probably only about half of that figure.
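Quick geometric aside on the surface-area point: under uniform scaling of the same shape, 3x the volume only buys about 3^(2/3) ≈ 2.1x the surface area (the Wii U and the Lenovo obviously aren't the same shape, so treat this purely as a rough check; the box dimensions below are arbitrary and only the ratios matter).

Code:
/* Quick check of how surface area scales when volume triples under
 * uniform scaling. Dimensions are arbitrary; only ratios matter. */
#include <stdio.h>
#include <math.h>

int main(void) {
    const double w = 20.0, d = 20.0, h = 5.0;   /* arbitrary flat box, in cm   */
    const double scale = cbrt(3.0);             /* uniform scale for 3x volume */

    double area1 = 2.0 * (w * d + w * h + d * h);
    double area2 = area1 * scale * scale;       /* area scales with the square */

    printf("volume ratio:       3.00x\n");
    printf("surface area ratio: %.2fx\n", area2 / area1);  /* ~2.08x */
    return 0;
}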
Why, because of VLIW4? I am in no way a programmer but I think you are exaggerating here.
Yeah. If the APIs stay the same, pretty much all of the code can be reused. From what I understand VLIW4 is an efficiency improvement, and should only make a difference in performance, not functionality or programmability. Only close-to-metal things, which I guess should be discouraged in the devkit phase, will maybe need to be rewritten. It is important though that any new GPU should be able to deliver as much as the old one in every department.
 
POWER7 has never made any sense; it's far too big and complex for a console CPU, with a lot of "wasted" silicon for a game console. Double-precision floating point is useless in a console.
 
POWER7 has never made any sense; it's far too big and complex for a console CPU, with a lot of "wasted" silicon for a game console. Double-precision floating point is useless in a console.
A user at Beyond3D pointed out that a single POWER7 core is about as big as an original Xenon core, by simply measuring an IBM-published die shot of the octocore version at 45nm. That's before core simplifications are made. It's not too 'big'.

Besides, IBM isn't making a custom CPU for just Nintendo, but for at least one other manufacturer as well. Even if something less than a POWER7 would suffice for Nintendo, I doubt that would be the case for MS or Sony. I'd say that a simple gaming derivative of the POWER7, without all the useless complex core features and enterprise chip features, is very likely to be used as the building block for Wii U, Xbox 720 and/or PS4.

Or are you pointing this out as a fact?
 
I'm going to force some Wii U news out of someone by saying that if there isn't any new info in the next week, I'm going to burn a picture of Iwata.
 
So the latest rumours suggest a 4-core design with 1 weird core that is probably going to be used by the OS?
No, definitely a 3-core, with one 'master' core and two other cores. The 'master' core has more cache than the others and that may suggest either that it's somewhat more powerful in other ways as well (like maybe having more hardware threads or a higher clock speed), or that it uses the extra cache to more efficiently switch to running the OS.
 
No, definitely a 3-core, with one 'master' core and two other cores. The 'master' core has more cache than the others and that may suggest either that it's somewhat more powerful in other ways as well (like maybe having more hardware threads or a higher clock speed), or that it uses the extra cache to more efficiently switch to running the OS.

Can I just ask what these 'recent rumours' were?
As far as I'm concerned, we've learnt nothing since E3.
lherre's posts have been too vague imo, and I question his reliability... then again, he's all we've had for a long time, so.
 
Maybe interesting:

http://www.theverge.com/hd/2011/11/23/2581978/lenovo-q180-htpc-worlds-smallest-desktop

This HTPC is only half the size (volume) of the Wii and uses a 2.13 GHz Atom (10W TDP) and a Radeon HD 6450 (27W TDP). These components alone use twice the power of the whole Wii.
The Wii U's volume is about 3x that of the Lenovo, so I expect it to be at least in the 50-60W TDP league.




Why, because of VLIW4? I am in no way a programmer but I think you are exaggerating here.

50-60W would make it barely current-gen (unless you mean just the GPU).

Meh, maybe I'm wrong. It would still require some significant changes, however.

But, I think I know a sneaky way to get some info from our friends. ;)
 
Why three cores though?
Doesn't POWER7 come in 4, 6, or 8 cores?
I assume they don't have any big problems with yields, or perhaps I'm wrong.
Or maybe it's a matter of die size?
It's not an off-the-shelf part (and it won't be off-the-shelf cores, either). It's completely irrelevant what Power7 CPUs IBM sells, as Nintendo will get a custom chip either way, so they can have as many cores as they need.

By the way, as far as I understand, all Power7 CPUs sold have eight cores, but a number of cores is deactivated. You can have them activated later by buying a license key from IBM and installing this on the CPU itself, which then unlocks additional cores. Yeah, I'm not making that up - that's how it used to be with POWER CPUs. The chip is so low volume that it's cheaper for IBM to produce just a single version.
 
Although the CPU architecture is weird, I think blu's theory makes a lot of sense. The GPU is really puzzling me though. lherre pointed out that the Wii U should be released within the year, so you'd think the GPU would be near final. That's why I'm wondering why there still seems to be an RV770LE included in the devkits.

Although it's a good GPU that would send the 360 and PS3 crying to their moms, I can't wrap my head around why Nintendo would even consider putting it in the final unit. First of all, it's an outdated chip on a 55 nm process. To fit it into the power requirements, as well as fit in the brain_stew-confirmed EDRAM, you would presume they customize it and shrink it down. Why they would choose to customize a chip that has faulty shader units, and therefore ship a more complex chip than necessary, is beyond me. What's more is that Nintendo doesn't have to choose a 2008 chip. 2008 chips are actually designed in 2006 - the Southern Islands architecture AMD is releasing in the coming months actually should have been finished at least halfway through last year. Even if Nintendo is being conservative, they still could have picked the Northern Islands (HD6xxx) architecture as the foundation for their chip. That one was on the drawing boards in 2009.

So basically, it would make absolutely no sense to use the RV770LE. But if it's still used in the devkits, what's going on?

Nintendo might just be trying to stick to "proven" technology, so they picked something old like they always have.
 
Why three cores though?
Doesn't POWER7 come in 4, 6, or 8 cores?
I assume they don't have any big problems with yields, or perhaps I'm wrong.
Or maybe it's a matter of die size?
If the CPU is indeed 'based on POWER7', that means it's a new CPU built around the POWER7 core (or a modified version of it). Although code execution on the Wii U CPU would be similar to a POWER7 CPU, the CPU package will be of a completely different design, and can therefore use any number of cores Nintendo and IBM want. The reason we can be quite certain it's a completely different design is that the POWER7 CPU has many features that are designed for enterprise servers instead of gaming consoles, and that it would draw too much power for a small console as well.

Yields somehow don't seem to be a problem with IBM, but any large chip will be expensive to make, more difficult to build a motherboard around, and more prone to production errors. Nintendo will therefore likely minimize the chip's area as far as they can. The original 360 CPU was 176 mm^2, and that one had some concessions in its design to minimize die size. I think that area is probably the upper limit of what we're going to see with the Wii U CPU.
BurntPork said:
Nintendo might just be trying to stick to "proven" technology, so they picked something old like they always have.
Oh come on. The N64 was completely built from unproven hardware. The GameCube also had GPU and RAM designs that were not seen anywhere before (and a state-of-the-art CPU). The 3DS uses high-end unproven FCRAM, a weird GPU and a type of screen nobody had seen before. Using "proven" technology for graphics is such a BS argument, not even Nintendo would use that. The Wii doesn't represent every console Nintendo has ever made.
 
Chips like those are extremely low-yield. Nintendo would only be able to make 2-4 million every year at most.
You're wrong. This chip (Juniper) is smaller (166mm^2) than the original Xbox 360 GPU (182mm^2 w/o EDRAM), and the 40nm production node is getting quite reliable and is continually improving.

The official TDP of the chip is 39W though. Still, that's a non-customized chip that doesn't use AMD's latest tech either.
 
I don't think there would be any price benefit to going with any of AMD's legacy products.

Too bad the 7xxx series is too new. The low TDP would have been an easy sell.
 
You're wrong. This chip (Juniper) is smaller (166mm^2) than the original Xbox 360 GPU (182mm^2 w/o EDRAM), and the 40nm production node is getting quite reliable and is continually improving.

The official TDP of the chip is 39W though. Still, that's a non-customized chip that doesn't use AMD's latest tech either.

No, I mean that the mobile version of Juniper is super-high quality. Most chips Nintendo would try to produce would end up more like the desktop version. Hardly any would have that low power consumption, which is why the chip is so much more expensive than Juniper.
 