How big is the power gap between Wii U and PS3/360?

The PowerPC 750 (a.k.a., the G3)

The PowerPC 750, known to Apple users as the G3, is a design based heavily on the 603/603e. Its four-stage pipeline is the same as that of the 603/603e, and many of the features of its front-end and back-end will be familiar from the previous article's discussion of the older processor. Nonetheless, the 750 sports a few very powerful improvements over the 603e that made it faster than even the 604e.
PowerPC 750 summary table

Introduction date: November 10, 1997
Process: 0.25 micron
Transistor Count: 6.35 million
Die size: 67mm²
Clock speed at introduction: 233-266MHz
Cache sizes: 32KB instruction + 32KB data L1, 512KB backside L2
First appeared in: Power Macintosh G3/233

The 750's significant improvement in performance over the 603/603e is the result of a number of factors, not the least of which are the improvements that IBM made to the 750's integer and floating-point capabilities.

A quick glance at the 750's layout will reveal that its execution core is wider than that of the 603. More specifically, where the 603 has a single integer unit, the 750 has two: a simple integer unit (SIU) and a complex integer unit (CIU). The 750's complex integer unit handles all integer instructions, while the simple integer unit handles all integer instructions except multiply and divide. Most of the integer instructions that execute in the SIU are single-cycle instructions.

Like the 603 (and the 604), the 750's floating-point unit can execute all single-precision floating-point operations, including multiply, with a latency of three cycles. Unlike the 603, though, the 750 doesn't have to insert a pipeline bubble after every third instruction in its pipeline. Double-precision floating-point operations, with the exception of operations involving multiplication, also take three cycles on the 750. Double-precision multiply and multiply-add operations take four cycles, because the 750 doesn't have a full double-precision FPU.
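
To make the latency-versus-throughput point concrete, here is a small, generic C illustration (nothing 750-specific, just the usual pipelining argument): on a fully pipelined FPU with 3-cycle latency, a chain of dependent adds pays the full latency on every step, while several independent accumulators can keep the pipeline busy.

Code:
/* Generic illustration of FPU latency vs. throughput -- not 750 code.
 * On a fully pipelined FPU with 3-cycle latency, dependent_sum() is
 * limited to roughly one add per 3 cycles (each add waits on the last),
 * while independent_sum() keeps several results in flight and can
 * approach one add per cycle. */
#include <stdio.h>

#define N 1000000

float dependent_sum(const float *a)
{
    float s = 0.0f;
    for (int i = 0; i < N; i++)
        s += a[i];                      /* every add depends on the previous one */
    return s;
}

float independent_sum(const float *a)
{
    float s0 = 0.0f, s1 = 0.0f, s2 = 0.0f, s3 = 0.0f;
    for (int i = 0; i < N; i += 4) {    /* four independent chains hide the latency */
        s0 += a[i];
        s1 += a[i + 1];
        s2 += a[i + 2];
        s3 += a[i + 3];
    }
    return s0 + s1 + s2 + s3;
}

int main(void)
{
    static float a[N];
    for (int i = 0; i < N; i++)
        a[i] = 1.0f;
    printf("%f %f\n", dependent_sum(a), independent_sum(a));
    return 0;
}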

The 750's load-store unit and system register unit perform the functions described above for the 603, so they don't merit further comment.
The 750's front end and instruction window

The 750 fetches up to four instructions per cycle into its six-entry instruction queue (cf. the 603's six-entry IQ), and it dispatches up to two non-branch instructions per cycle from the IQ's two bottom entries. The dispatch logic follows the four dispatch rules described above when deciding when an instruction is eligible to dispatch, and each dispatched instruction is assigned an entry in the 750's six-entry reorder buffer (compare the 603's five-entry ROB).


Figure POWERPC.4: The PowerPC 750

As on the 603 and 604, newly-dispatched instructions enter the reservation station of the execution unit to which they have been dispatched, where they wait for their operands to become available so that they can issue. The 750's reservation station configuration is similar to that of the 603, in that with the exception of the two-entry reservation station attached to the 750's LSU, all of the execution units have a single-entry reservation station. And like the 603, the 750's branch unit has no reservation station.

Because the 750's instruction window is so small, it has half the rename registers of the 604. Nonetheless, the 750's six general-purpose and six floating-point rename registers still put it ahead of the 603 (which has five GPR and four FPR rename registers). Like the 603, the 750 has one rename register each for the CR, LR, and CTR.

You would think that the 750's smaller reservation stations and shorter ROB would put it at a disadvantage with respect to the 604, which has a larger instruction window. But the 750's pipeline is shorter than that of the 604, so it needs fewer buffers to track fewer in-flight instructions. Even more importantly, though, the 750 has one very clever trick up its sleeve that it uses to keep its pipeline full.
Branch prediction on the 750

In the previous article's discussion of branch prediction, we talked about how dynamic branch prediction schemes use a branch history table (BHT) in combination with a branch target buffer (BTB) to speculate on the outcome of branch instructions and to redirect the processor's front end to a different point in the code stream based on this speculation. The BHT stores information on the past behavior (i.e., taken or not taken) of the most recently executed branch instructions, so that the processor can determine whether or not it should take these branches if it encounters them again. The target addresses of recently taken branches are stored in the BTB, so that when the branch prediction hardware decides to speculatively take a branch it will have immediate access to that branch's target address without having to recalculate it. The target address of the speculatively taken branch is loaded from the BTB into the instruction register, so that on the next fetch cycle the processor can begin fetching and speculatively executing instructions from the target address.

The 750 improves on this scheme in a very clever way. Instead of storing only the target addresses of recently taken branches in a BTB, the 750's 64-entry branch target instruction cache (BTIC) stores the instruction that's located at the branch's target address. When the 750's branch prediction unit examines the 512-entry BHT and decides to speculatively take a branch, it doesn't have to go out to code storage to fetch the first instruction from that branch's target address. Instead, the BPU loads the branch's target instruction directly from the BTIC into the instruction queue, which means that the processor doesn't have to wait around for the fetch logic to go out and fetch the target instruction from code storage. This scheme saves valuable cycles, and it helps keep performance-killing bubbles out of the 750's pipeline.
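
As a rough sketch of what these structures hold (a toy model in C, not the 750's actual hardware organization; only the table sizes follow the figures above):

Code:
/* Toy model of the predictor structures described above -- not the 750's
 * real hardware layout. The BHT holds 2-bit taken/not-taken counters; the
 * BTIC holds, for recently taken branches, the target address AND the
 * instruction that lives at that address. */
#include <stdint.h>
#include <stdio.h>

#define BHT_ENTRIES  512
#define BTIC_ENTRIES 64

typedef struct {
    uint8_t counter[BHT_ENTRIES];       /* 0-1 = predict not taken, 2-3 = predict taken */
} bht_t;

typedef struct {
    int      valid[BTIC_ENTRIES];
    uint32_t branch_pc[BTIC_ENTRIES];
    uint32_t target_pc[BTIC_ENTRIES];
    uint32_t target_insn[BTIC_ENTRIES]; /* the instruction AT the target, not just its address */
} btic_t;

int predict_taken(const bht_t *bht, uint32_t pc)
{
    return bht->counter[(pc >> 2) % BHT_ENTRIES] >= 2;
}

/* On a hit, the target instruction can be fed straight into the
 * instruction queue instead of waiting for a fetch from code storage. */
int btic_lookup(const btic_t *btic, uint32_t pc,
                uint32_t *target_pc, uint32_t *target_insn)
{
    unsigned idx = (pc >> 2) % BTIC_ENTRIES;
    if (btic->valid[idx] && btic->branch_pc[idx] == pc) {
        *target_pc   = btic->target_pc[idx];
        *target_insn = btic->target_insn[idx];
        return 1;
    }
    return 0;                            /* miss: fall back to a normal fetch */
}

int main(void)
{
    bht_t  bht  = {{0}};
    btic_t btic = {{0}};

    /* Pretend we've already seen a taken branch at 0x1000 whose target is 0x2000. */
    uint32_t pc  = 0x1000;
    unsigned idx = (pc >> 2) % BTIC_ENTRIES;
    bht.counter[(pc >> 2) % BHT_ENTRIES] = 3;    /* strongly taken */
    btic.valid[idx]       = 1;
    btic.branch_pc[idx]   = pc;
    btic.target_pc[idx]   = 0x2000;
    btic.target_insn[idx] = 0x38600001;          /* "li r3,1" as a stand-in target instruction */

    uint32_t tgt, insn;
    if (predict_taken(&bht, pc) && btic_lookup(&btic, pc, &tgt, &insn))
        printf("taken: resume fetch at 0x%x, first instruction 0x%x already in the IQ\n",
               (unsigned)tgt, (unsigned)insn);
    return 0;
}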
PowerPC 750 conclusions

In spite of its short pipeline and small instruction window, the 750 packed quite a punch. It managed to outperform the 604, and it was so successful that a 604-derivative was scrapped in favor of just building on the 750. The 750 and its immediate successors, all of which went under the name of "G3," eventually found widespread use both in the embedded arena and across Apple's entire product line, from its portables to its workstations.

The G3 lacked one important feature that separated it from the x86 competition, though: vector computing capabilities. While comparable PC processors supported SIMD in the form of Intel's and AMD's vector extensions to the x86 instruction set, the G3 was stuck in the world of scalar computing. So when Motorola decided to develop the G3 into an even more capable embedded and media workstation chip, this lack was the first thing they addressed.



http://arstechnica.com/features/2004/10/ppc-2/

It's a bit like AMD's Bulldozer architecture, come to think of it: twice as many integer units per core as floating-point units.
 
I will say this: I know little about technical terms, but the Wii U has not felt to me like it's underpowered, even compared to the PS4 and XBO.

I feel that this mainly has to do with diminishing returns, but also with the polish Nintendo achieves with its titles. As a matter of fact, I feel this is why Wii U games look and play superior to last gen to me.

That said, I feel that PS4 and PS3 are not too far apart either. When I play God of War 3 vs. something like Infamous SS, in neither instance do I feel like presentation is lacking. It's weird, but at least for me, diminishing returns is a very real thing. (I do hate how long system updates take on PS3, and I like the PS4 and Wii U much better as a kit, though.)
 
Still don't believe me? Perhaps I should go further?
Final boss fight with Zeus has you destroying Gaia's heart
Do you want me to keep listing portions of the game? It really wasn't very good, I'd rather not.

If you are referring to this post of mine "lumping" them together when someone else did, there's a reason why I put a tilde in front of 30. Hint: it isn't for Uncharted or TLOU.
http://www.neogaf.com/forum/showpost.php?p=116970122&postcount=543
I don't know or care if you've played the game or not, most of the details you're providing could be from a Let's Play or a FAQ and there's really no way for me to verify that. It doesn't really matter. Lumping all of those games together as if they all have the same performance is silly when Uncharted has a pretty solid framerate, Last of Us frequently dips below 30FPS, and God of War runs at an unlocked framerate and averages closer to 40FPS than 30.
 
That said, I feel that PS4 and PS3 are not too far apart either. When I play God of War 3 vs. something like Infamous SS, in neither instance do I feel like presentation is lacking. It's weird, but at least for me, diminishing returns is a very real thing. (I do hate how long system updates take on PS3, and I like the PS4 and Wii U much better as a kit, though.)

The last gen lasted nearly a decade. Launch games on it were a step above the previous gen, but nothing compared to mid to late cycle 7G games. At the time you could still say Halo 2 had a great presentation on Xbox, but now there's no denying how much more the 360 could do.
 
When used well there is a gap, but it's a meager one.

The difference between a GCN and Wii. You're not likely to see that volume of grass in a PS360 title without a serious performance hit. This doesn't make WiiU much more powerful, but Zelda does show there is an advantage in WiiU's favor.

Little else.
 
I don't know or care if you've played the game or not, most of the details you're providing could be from a Let's Play or a FAQ and there's really no way for me to verify that. It doesn't really matter. Lumping all of those games together as if they all have the same performance is silly when Uncharted has a pretty solid framerate, Last of Us frequently dips below 30FPS, and God of War runs at an unlocked framerate and averages closer to 40FPS than 30.
The game was boring enough to play, let alone watch. 3 is by far my least favorite in the series; I wrote off the franchise after that god-awful ending.

Again, I did not initially lump them together; the poster I quoted did, and the variations are the reason I included a tilde. The game does dip into the 30s regardless, as is evident in the video I posted.
 
The last gen lasted nearly a decade. Launch games on it were a step above the previous gen, but nothing compared to mid to late cycle 7G games. At the time you could still say Halo 2 had a great presentation on Xbox, but now there's no denying how much more the 360 could do.

I'm sure it will improve, but the difference from last gen to the one before was noticeable. Not happening for me this time, and as I said, presentation-heavy games like GoW3, Uncharted and such just look good any way you slice it and by any metric.
 
When used well there is a gap, but it's a meager one.

The difference between a GCN and Wii. You're not likely to see that volume of grass in a PS360 title without a serious performance hit. This doesn't make WiiU much more powerful, but Zelda does show there is an advantage in WiiU's favor.

Little else.

I wonder if the grass swaying is running on GPGPU. That sort of thing is what modern GPUs are great at while CPUs would clunk on.
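
For what it's worth, the reason that kind of job maps so well to GPGPU is that each blade's sway depends only on shared wind parameters and its own state, so the whole update is trivially parallel. A toy sketch in plain C standing in for a compute shader (all the field names are invented):

Code:
/* Toy per-blade grass sway update. The loop body touches only shared wind
 * parameters and one blade's own state, so it is trivially parallel --
 * on a GPU this would simply be one thread (or shader invocation) per blade,
 * while a CPU has to grind through the blades serially. Invented fields. */
#include <math.h>
#include <stdio.h>
#include <stddef.h>

typedef struct {
    float x, z;       /* blade position on the ground plane */
    float phase;      /* per-blade phase offset             */
    float bend;       /* output: current lean angle         */
} grass_blade;

void update_grass(grass_blade *blades, size_t count,
                  float time, float wind_strength, float wind_freq)
{
    for (size_t i = 0; i < count; i++) {            /* on a GPU: one thread per blade */
        grass_blade *b = &blades[i];
        float wave = sinf(wind_freq * time + b->phase + 0.1f * (b->x + b->z));
        b->bend = wind_strength * wave;
    }
}

int main(void)
{
    grass_blade field[4] = {
        { 0.0f, 0.0f, 0.0f, 0.0f }, { 1.0f, 0.0f, 1.3f, 0.0f },
        { 0.0f, 1.0f, 2.6f, 0.0f }, { 1.0f, 1.0f, 3.9f, 0.0f },
    };
    update_grass(field, 4, 2.0f, 0.4f, 1.5f);
    printf("blade 0 lean: %f\n", field[0].bend);
    return 0;
}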
 
I forget where I read this, but the GPU was rumored to have been only 15% weaker than the X1's. Which visually kind of makes sense, just looking at these games.

The CPU is more powerful than the X360's, but runs at a lower clock rate.
 
I forget where I read this, but the GPU was rumored to have been only 15% weaker than the X1's. Which visually kind of makes sense, just looking at these games.

Was it Bullshit Magazine? No way. It's 176 GFLOPS vs 1.2 TFLOPS on a newer GCN architecture. Even if you believe the unlikely 320-shader theory, that's 352 vs 1200, and GCN (HD 7000 series) vs VLIW5 (HD 4000 series). Plus the huge bandwidth difference, half the ROPs, etc. It's just crazy to say 15% difference.
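
For anyone who wants the napkin math behind those numbers, here it is (assuming the usual 2 FLOPs per shader ALU per clock for a multiply-add; the shader counts and clocks are the commonly cited figures, not official specs):

Code:
/* Peak-FLOPS napkin math: ALUs x 2 FLOPs (multiply-add) x clock.
 * Shader counts and clocks are the commonly cited figures, not official. */
#include <stdio.h>

double peak_gflops(int shader_alus, double clock_ghz)
{
    return shader_alus * 2.0 * clock_ghz;
}

int main(void)
{
    printf("Wii U, 160 ALUs @ 0.55 GHz:  %.0f GFLOPS\n", peak_gflops(160, 0.55));  /* 176 */
    printf("Wii U, 320-ALU theory:       %.0f GFLOPS\n", peak_gflops(320, 0.55));  /* 352 */
    printf("Xbox One, 768 ALUs @ 0.853:  %.0f GFLOPS\n", peak_gflops(768, 0.853)); /* ~1310; ~1229 at the pre-launch 800 MHz */
    return 0;
}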
 
I wonder if the grass swaying is running on the GPGPU. That sort of thing is what modern GPUs are great at while CPUs would clunk on.
Very possible.
I forget where I read this, but the GPU was rumored to have been only 15% weaker than the X1's.

The CPU is more powerful than the X360's, but runs at a lower clock rate.
Both of your points are just flat wrong.

A 176 GFLOPS GPU is in no way 15% weaker than a 1.2 TFLOPS GPU. You're nearly at an order of magnitude performance differential.

As for the CPU it really depends on what you throw at it. Under certain cases it will perform better than 360's, but anything integer heavy is going to kill it in comparison.
 
Nintendo doesn't push for 1080p like a large portion of other devs for a start.
Most Wii U games run at higher resolutions than Xbox 360/PS3 games. Even triple-A exclusive games like the Halo series didn't render at 720p until 5-6 years after the release of the console.

The Wii U is a powerful system. The 2GB of RAM (you can't use all of it), for example, is huge compared to the 512MB you (don't really) have on PS360.
 
Ni no Kuni I'll give you, because that has amazing graphics, but Blue Dragon...are you serious???

You must not have played Blue Dragon. It has some of the most beautiful and jaw dropping scenes I've seen not just in JRPGs, but in any game. It's a beautiful game with some awesome art direction.
 
You must not have played Blue Dragon. It has some of the most beautiful and jaw dropping scenes I've seen not just in JRPGs, but in any game. It's a beautiful game with some awesome art direction.
It really is.

Too bad it has such uneven performance. Like framerate dropping into the teens.
 
You must not have played Blue Dragon. It has some of the most beautiful and jaw dropping scenes I've seen not just in JRPGs, but in any game. It's a beautiful game with some awesome art direction.

Blue Dragon has pretty shitty tech. Still, Captain Toad (which was the original point of discussion here I think) mostly shines through art as well. Zelda U looks mighty impressive to me though.
 
I was told by a developer buddy that it's more or less a ps3.

I don't mind tho, Nintendo is in HD and 60fps and actually 720p.
 
I was told by a developer buddy that it's more or less a ps3.

I don't mind tho, Nintendo is in HD and 60fps and actually 720p.

The only first-party PS3 game to not hit 720p was Resistance 3, and that game looks much better than most games on Wii U. Some games like Wipeout and GT were running at resolutions >720p (not quite 1920x1080, but definitely 1080p). Most third-party games were also 1280x720, except for a few like CoD (which, if I recall correctly, runs at the unimpressive resolution of 880x720 on Wii U as well).
 
I don't even know what to say to people that think The Last of Us is the best looking last gen game. What in the actual fuck? It's not even the best looking ND PS3 game.

Yeah, I found this remark puzzling as well. Uncharted 3, Killzone 3, GoW3/Ascension, Ni No Kuni, and Journey are visually as impressive. The Last of Us looks very good, but its strongest assets are its character models and animation, whereas its textures and shadows look pretty dull.
 
The gap is generational

Some of the stuff on the Wii U looks pretty damn good, but there's nothing I've seen on it that I'd go as far to say it's a full generation above PS3/360.

But if by generational gap you mean the difference between Gamecube and Wii, then I guess that would be somewhat accurate.
 
People like to evade what matters, so I will say it again: Wii U has been prioritized for 720p/60fps. How many of those did we get on PS360? Plus, more memory alone can have a big impact on how the games look.

Just see how difficult it has been for some devs to hit 60fps on PS4 and XB1 and you know that Wii U was created with the right priorities in mind.

I am more than happy with the games released and the prospects that are to come.
 
People like to evade what matters, so I will say it again: Wii U has been prioritized for 720p/60fps. How many of those did we get on PS360? Plus, more memory alone can have a big impact on how the games look.

I am more than happy with the games released and the prospects that are to come.

Nintendo has always been big on 60fps. It was just the switch to HD that's impactful.
 
Nintendo has always been big on 60fps. It was just the switch to HD that's impactful.

Yes, I know, but the combination has been delicious!! And that I did not get with my PS3, which I have been playing since a year after launch.

I was going to get a PC because of 60fps, but I could not spend what I wanted, so I got a PS4 and am going to save for a PC later when new chips arrive. I really hope that devs prioritize 60fps because it really makes a difference, even more than resolution.
 
For unpredictable and highly sequential code, sure, Espresso's short pipeline and out-of-order capabilities will likely make up a lot of that difference, and outperforming Xenon in some cases would not surprise me.

For some predictable, carefully-structured, patterned parallel computational tasks, Xenon could in theory outperform Espresso by MORE than the clock speed difference.

It's not as simple as "one is more capable than the other." It's also worth noting that Xenon is a significantly larger processor than Espresso; there's more than a paradigmatic difference here.

This is another good point - people keep saying the Espresso is more efficient, and it is, but after a decade of the last gen, developers have gotten very good at working around the inefficiencies of the Cell and Xenon. With Cell you can unroll loops, for instance, to avoid branchy conditions, which the SPEs suck at. And you can program in ways that make in-order matter less and come closer to theoretical performance.

There's no programming "for" out-of-order; it just makes less-optimized code better by itself.
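
A rough sketch of the kind of transformation being described, in plain C rather than actual SPE intrinsics (the function and data are made up for illustration): the data-dependent branch becomes a select, and the loop is unrolled so an in-order core always has independent work in flight.

Code:
/* Illustration of "programming around the branches" -- plain C, not SPE code. */
#include <stdio.h>
#include <stddef.h>

/* Naive version: the branch inside the loop is data-dependent and hard to
 * predict, which is exactly what long-pipeline in-order cores (or SPEs,
 * which have no branch predictor at all) handle worst. */
float clamp_sum_branchy(const float *a, size_t n, float limit)
{
    float s = 0.0f;
    for (size_t i = 0; i < n; i++) {
        if (a[i] < limit)
            s += a[i];
        else
            s += limit;
    }
    return s;
}

/* Reworked version: the branch becomes a select (compilers typically turn
 * the ternary into a conditional move or vector select), and the loop is
 * unrolled with independent accumulators so each iteration doesn't wait on
 * the previous one. Assumes n is a multiple of 4 to keep the sketch short. */
float clamp_sum_branchless(const float *a, size_t n, float limit)
{
    float s0 = 0.0f, s1 = 0.0f, s2 = 0.0f, s3 = 0.0f;
    for (size_t i = 0; i < n; i += 4) {
        s0 += (a[i]     < limit) ? a[i]     : limit;
        s1 += (a[i + 1] < limit) ? a[i + 1] : limit;
        s2 += (a[i + 2] < limit) ? a[i + 2] : limit;
        s3 += (a[i + 3] < limit) ? a[i + 3] : limit;
    }
    return s0 + s1 + s2 + s3;
}

int main(void)
{
    float data[8] = { 0.5f, 2.0f, 1.0f, 3.0f, 0.25f, 5.0f, 0.75f, 1.5f };
    printf("%f %f\n",
           clamp_sum_branchy(data, 8, 1.0f),
           clamp_sum_branchless(data, 8, 1.0f));
    return 0;
}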
 
People like to evade what matters, so I will say it again: Wii U has been prioritized for 720p/60fps. How many of those did we get on PS360? Plus, more memory alone can have a big impact on how the games look.

Just see how difficult it has been for some devs to hit 60fps on PS4 and XB1 and you know that Wii U was created with the right priorities in mind.

I am more than happy with the games released and the prospects that are to come.

Yes, I am loving the traditional 60fps push from Nintendo.

But I think it was a mistake to not make the console a bit more powerful. The bad multiplatform game performance did hurt the perception of the machine greatly when it launched.
 
It really is.

Too bad it has such uneven performance. Like framerate dropping into the teens.

Ironically, the ugliest-looking portion of the game, the wasteland area which serves as part of the game's world map tutorial, is where the game suffers the most from framerate issues. Kluke's phoenix attacks were especially brutal in that area because her shadow is arguably the largest of the group due to the wings.

I found the most beautiful parts of the game, however, to be quite strong in terms of framerate. Remember the first time you go onto that drill vehicle where you find Marumaro? Stunning. May sound like hyperbole (don't mind, to be honest), but it seriously approached Pixar-film beautiful for me at many points. Then there's the mural town, or even the cave you enter on your way to try and find a cure to help the sick people in Maro's village, or the nighttime camp scene where Shu and the rest of the characters get to see their relatives again. There's so much to like in the game.

And don't get me started on the more technological environments, where the lighting, colors, art style and character models just come together so perfectly that it's a real sight to behold on a really nice HDTV. It's the primary reason that the moment I got my new Samsung last year, the first game I tossed in, because I just had to see it in action, was Blue Dragon. Even after every game that released on the 360 and PS3 since, Blue Dragon still has a distinct beauty that is untouched.

This is why I often feel the focus on the overall technical impressiveness of games, based on whichever fancy graphical technique they're using or how closely they approach a true-to-life look, can be so overrated. A beautiful game is a beautiful game no matter what technical feats (or lack thereof) it took to get there. And, let's be honest, more realistic-looking games tend to get the benefit of the doubt over a lot of other games that may be equally or even more beautiful on their own terms. At least that's my impression.

Still, to get back to the topic at hand, I think the Wii-U is a much more powerful/capable system compared to the 360 and PS3, even if there are ultimately questions regarding its CPU. Devs get used to doing things one or two ways for so long that when something new comes along and is slower in some regards, even if better in others, it can end up creating all kinds of problems that lead to some of the situations we've seen where 360 or PS3 versions of a multi-platform game are better than their Wii-U counterpart. I don't think there's any question that the Wii-U tops both, and this will slowly but surely become a lot more apparent as time goes on. If the Wii-U were selling gangbusters and moving third-party software at an incredible pace, you'd see how quickly it would start to do things people didn't think it was capable of, largely because it would have greatly risen on the priority lists of developers.

People like to evade what matters, so I will say it again: Wii U has been prioritized for 720p/60fps. How many of those did we get on PS360? Plus, more memory alone can have a big impact on how the games look.

Just see how difficult it has been for some devs to hit 60fps on PS4 and XB1 and you know that Wii U was created with the right priorities in mind.

I am more than happy with the games released and the prospects that are to come.

I'll admit to not knowing what resolution and fps most Wii U titles are released at, but if there's as big a commitment to 720p and 60fps as you imply, that's something I don't have a problem with.
 
People like to evade what matters, so I will say it again: Wii U has been prioritized for 720p/60fps. How many of those did we get on PS360?
60fps is a game design choice, not a hardware priority choice.

Give the same hardware to other developers and you will see other priorities.
 
M°°nblade;117010439 said:
60fps is a game design choice, not a hardware priority choice.

Give the same hardware to other developers and you will see other priorities.

He means in hardware: the GPU, eDRAM and bandwidth make 720p a no-brainer. This is without mentioning faster shaders, and the fact that you can store all your render targets in that very fast 32MB of eDRAM.
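
Quick napkin math on why 720p render targets sit comfortably in 32MB, assuming plain 32-bit-per-pixel color/depth formats and ignoring tiling overhead:

Code:
/* Rough render-target budget for 720p in 32 MiB of eDRAM.
 * Assumes simple 32-bit-per-pixel buffers, no tiling or MSAA overhead. */
#include <stdio.h>

int main(void)
{
    const double MiB = 1024.0 * 1024.0;
    double t720  = 1280.0 * 720.0  * 4.0 / MiB;   /* one 32bpp 720p buffer  */
    double t1080 = 1920.0 * 1080.0 * 4.0 / MiB;   /* one 32bpp 1080p buffer */

    printf("one 32bpp 720p target:            %.2f MiB\n", t720);                 /* ~3.52 */
    printf("color + depth + two more targets: %.1f MiB of 32 MiB\n", 4.0 * t720); /* ~14.1 */
    printf("one 32bpp 1080p target:           %.2f MiB\n", t1080);                /* ~7.91 */
    return 0;
}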
 
People like to evade what matters, so I will say it again: Wii U has been prioritized for 720p/60fps. How many of those did we get on PS360? Plus, more memory alone can have a big impact on how the games look.

Just see how difficult it has been for some devs to hit 60fps on PS4 and XB1 and you know that Wii U was created with the right priorities in mind.

I am more than happy with the games released and the prospects that are to come.


Hardware isn't "prioritized" for a certain framerate; it has X amount of power and developers can divvy it up however they see fit.

The games struggling to hit 60fps on the XBO/PS4 would likely churn on the Wii U. It has nothing to do with being "more optimized" for 60fps; that doesn't make sense.

Unless you meant prioritizing for 720p resolution rather than shooting for 1080 like the others.
 
I don't think there's much difference at all between the PS360 and Wii U. There is a significant difference between the art style that Nintendo uses on its games and the art styles used on a lot of PS360 games, though. Quite frankly, I haven't seen anything on Wii U that I don't think could have been done on the PS360. Quality art and presentation make Nintendo games seem more graphically impressive than they actually are. Not to take anything away from how great 3D World, MK8, and the like look and play, I just don't see them as a generation ahead graphically.
 
For unpredictable and highly sequential code, sure, Espresso's short pipeline and out-of-order capabilities will likely make up a lot of that difference, and outperforming Xenon in some cases would not surprise me.

For some predictable, carefully-structured, patterned parallel computational tasks, Xenon could in theory outperform Espresso by MORE than the clock speed difference.

It's not as simple as "one is more capable than the other." It's also worth noting that Xenon is a significantly larger processor than Espresso; there's more than a paradigmatic difference here.
I agree with your general conclusion, but just to give you a hint of the magnitude of Xenon/PPE's large-latency and in-order disadvantages: to really achieve better per-clock FLOPS against Gekko, the PPE needs to rely on hyperthreading. Otherwise, in single-threaded scenarios, the twice-wider SIMD units of the PPE could stay effectively at Gekko's paired-singles levels in FLOPS/clock.
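
To put rough numbers on that: Gekko's paired singles are a 2-wide fused multiply-add, while the PPE's VMX unit is 4-wide, so the theoretical per-clock ceilings look like this (peak issue rates only; the whole point above is how much of that ceiling an in-order, long-latency core actually sustains single-threaded):

Code:
/* Peak per-clock FLOPS, counting a fused multiply-add as 2 FLOPs per lane.
 * Theoretical ceilings only -- sustained rates are the real argument. */
#include <stdio.h>

int main(void)
{
    int gekko_peak = 2 /* paired-single lanes */ * 2 /* mul + add */;  /* 4 FLOPs/clock */
    int ppe_peak   = 4 /* VMX lanes */          * 2 /* mul + add */;  /* 8 FLOPs/clock */

    printf("Gekko paired-singles peak: %d FLOPs/clock\n", gekko_peak);
    printf("PPE VMX peak:              %d FLOPs/clock\n", ppe_peak);
    /* If Gekko runs near its peak, the PPE has to sustain more than half of
     * its own (wider) peak just to match it per clock. */
    printf("break-even point for PPE:  %d%% of its peak\n", 100 * gekko_peak / ppe_peak);
    return 0;
}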
 
M°°nblade;117010439 said:
60fps is a game design choice, not a hardware priority choice.

Give the same hardware to other developers and you will see other priorities.

Hardware isn't "prioritized" for a certain framerate; it has X amount of power and developers can divvy it up however they see fit.

The games struggling to hit 60fps on the XBO/PS4 would likely churn on the Wii U. It has nothing to do with being "more optimized" for 60fps; that doesn't make sense.

Unless you meant prioritizing for 720p resolution rather than shooting for 1080 like the others.

Yes, I know, but it kind of goes both ways: the hardware has to be capable and the game has to be designed around that goal. I can put it another way, then: most Wii U games look good while also going for 60fps, and the hardware helps achieve this goal. The same way that the PS4 seems to be capable of achieving 1080p60 if the code is optimized and the game is designed around that goal.

Wii U 60fps games
MK8
SM3DW
Sonic Lost World
DKC TF
W101
Nintendo Land
Rayman Legends
Child of Light
NSMBU/NSLU

Coming
Hyrule Warriors
Bayo 1 and 2
SSBU (actually 1080p60)
Sonic Boom
Splatoon
Yoshi's Woolly World
Kirby and the Rainbow Curse
 
Didn't Kameo have similar long grass that swayed /shrug

Wii U is a small step above the PS3 and 360. Same ballpark. There's a huge gulf between it and PS4/X1, though.
 
Didn't Kameo have similar long grass that swayed /shrug

Wii U is a small step above the PS3 and 360. Same ballpark. There's a huge gulf between it and PS4/X1, though.

Not with the draw distance of Zelda U. I really see no point in bringing up PS4 or XO; it's common knowledge that the Wii U is technically far behind. It's a shame though; some here even think having a memory advantage is not an advantage.
 
How did Nintendo mess up so much as to have a GPU with 1/10 the FLOPS of a PS4? One year shouldn't make that much of a difference, even at 1/4 the power consumption.
 
How did Nintendo mess up so much as to have a GPU with 1/10 the FLOPS of a PS4? One year shouldn't make that much of a difference, even at 1/4 the power consumption.

45/40nm density vs 28nm (remember this is squared, so at the same die size it's a huge difference in the number of transistors, and the Wii U has smaller die sizes on top of that), and half the die size is spent on the eDRAM to make up for the slow DDR3 (which is like the XBO, but that still has a lot more main GPU logic). And with the whole system drawing 33W, I wonder how little the GPU has to sip on. Optimistically, 15W?

I don't think "mess up" is the word if it means they did it by accident; mistake or not, they probably did it on purpose. They believe strongly that consoles should be small, stay out of the way, and have low power consumption. It's a cultural thing, but that culture is also changing faster than they are, I think (see the PS4).
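
The "remember this is squared" point, in numbers (ideal scaling only; real designs shrink worse than this):

Code:
/* Ideal transistor-density gain from a process shrink: the feature size
 * is linear, so density scales roughly with its square. Real-world
 * scaling is worse, so treat these as upper bounds. */
#include <stdio.h>

int main(void)
{
    double old_nodes[] = { 45.0, 40.0 };
    double new_node    = 28.0;

    for (int i = 0; i < 2; i++) {
        double ratio = (old_nodes[i] / new_node) * (old_nodes[i] / new_node);
        printf("%.0fnm -> %.0fnm: ~%.1fx the transistors in the same area\n",
               old_nodes[i], new_node, ratio);
    }
    return 0;
}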
 
Wii U graphics are WAY better than 360 and PS3.

Wii U sound is WAY worse, though. No optical output :(

technically it's better as all games have access to 5.1 uncompressed LPCM audio (not sure if this is the case on 360. It is on PS3)

optical audio is a backwards compatibility feature. Sucks if you are affected by it, but Wii U's audio capability is on par with current-gen systems.

uncompressed 5.1 > Dolby/DTS 5.1 over optical.
 

45/40nm density vs 28nm (remember this is squared, so at the same die size it's a huge difference in the number of transistors, and the Wii U has smaller die sizes on top of that), and half the die size is spent on the eDRAM to make up for the slow DDR3 (which is like the XBO, but that still has a lot more main GPU logic). And with the whole system drawing 33W, I wonder how little the GPU has to sip on. Optimistically, 15W?

I don't think "mess up" is the word if it means they did it by accident; mistake or not, they probably did it on purpose. They believe strongly that consoles should be small, stay out of the way, and have low power consumption. It's a cultural thing, but that culture is also changing faster than they are, I think (see the PS4).

The annoying thing is 28nm was available almost a year before the release of the Wii U. They could have had dev kits on 40nm and planned to release on 28nm. I think they were playing it safe though; 28nm was probably too new for them to risk it.

I think it was a mistake for them to try and get the power consumption down as low as possible (especially when staying on an old process); most people don't care how much power the thing is drawing, and the design of the console isn't exactly great from a size perspective either, since it's not exactly tiny. There's a fair amount of wasted space inside the console (not nearly as bad as the X1, though). But they probably didn't want it ultra-compact, to keep manufacturing costs down.
 
Smash bros could run on a links. There has never been anything technically impressive with smash bros.
Looking at Zelda for Wii U, it just looks so beautiful and, to me, beyond what the PS3 and 360 can do. I also think that a game like Smash Bros Wii U wouldn't run on the PS3 and 360 without framerate issues.

I remember people on here saying that the Wii U has more RAM than the PS3 and 360, but that it's clocked so much slower that it doesn't really help as much. Is that really true now, with the games we see coming out for the Wii U?
 
The annoying thing is 28nm was available almost a year before the release of the Wii U. They could have had dev kits on 40nm and planned to release on 28nm. I think they were playing it safe though; 28nm was probably too new for them to risk it.

I think it was a mistake for them to try and get the power consumption down as low as possible (especially when staying on an old process); most people don't care how much power the thing is drawing, and the design of the console isn't exactly great from a size perspective either, since it's not exactly tiny. There's a fair amount of wasted space inside the console (not nearly as bad as the X1, though). But they probably didn't want it ultra-compact, to keep manufacturing costs down.

Based on what I have read outside of this forum, and what has been discussed on this forum in the Latte thread, the eDRAM limited the die size. You can't shrink the eDRAM; its process is 40nm.
 
I wonder why the Wii U needs so much RAM just for the OS. I wonder if they will ever lower it and give it back for games?

Hedging their bets for future features.

Nobody wants to be caught with their pants down like Sony was on the PS3, so everyone this generation reserved huge chunks of RAM for the OS at the start, so they could add new features they think of later and match features the competition comes up with. Because once you give RAM back to the developers, you can't take it back later.
 