WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Oh, we are finally done comparing work-in-progress games whose budgets possibly differ by more than 10x? Let's try not to post unfinished trailers and pictures from Rockstar; they often bullshot their entire game. If there is a demo video of someone actually playing on a 360/PS3, we could easily discuss that; otherwise there is little point imo.

I posted the gifs because I said, from the start, comparing the two games is stupid.

GTA5 still looks better, and people who say otherwise are in denial. Just because X has a much higher draw distance... filled with mostly nothing, doesn't mean it's better graphically.
 
Back onto the Wii U GPU.

I understand the explanation about the Wii U's 160 exceeding the X360's 216 by means of shader efficiency, but how do you explain that once we include the gamepad overhead? It does not make sense to me. No one has mentioned that yet AFAIK.
 

And when the PS360 versions release, they magically won't look as good as the footage that has been shown (IQ/smoothness wise)! Whodathunk it!

Thank god, let the big boys fight over this embarrassing shit. In fact we're already there. Xbone vs PS4 fights are already the absolute bottom of the human barrel, and it's only going to get worse from here on out. Digital Foundry threads will be legendary in the coming years.

It's funny, because if people cared that much about graphics in their games, to the point where an average of 3 frames per second is the world-crushing blow it's made out to be, there's a platform better suited to them.

I dunno man, I see more dignity in two middleweights slugging it out than I do watching a 30 year old welterweight swing punches at two 80 year old welterweights. Meanwhile, the heavyweight eats popcorn in the corner.

I fixed your analogy for you.
 
Hmm. How do we know that?
If someone thought that was all dynamic lighting then I could understand them coming to that conclusion. I can't think of even many 10.1 PC games that could render a scene like that without turning into a slide show.

That said the demo's IQ isn't anywhere close to those pics, I'd be ecstatic if the final game looked anywhere near that smooth.
Back onto the Wii U GPU.

I understand the explanation about the Wii U's 160 exceeding the X360's 216 by means of shader efficiency, but how do you explain that once we include the gamepad overhead? It does not make sense to me. No one has mentioned that yet AFAIK.
That's an interesting point. Prior to the system's launch I assumed that the pad would either mirror or just show a really simplistic image compared to what's on the TV (kind of like the 4th output on Nvidia's higher-end cards), but I've seen since then that the pad is capable of fully rendering separate scenes, and we've been told the system supports a 3rd image on a 2nd pad.

That's really not adding up to a low-end GPU's capabilities.
 
Back onto the Wii U GPU.

I understand the explanation about the Wii U's 160 exceeding the X360's 216 by means of shader efficiency, but how do you explain that once we include the gamepad overhead? It does not make sense to me. No one has mentioned that yet AFAIK.

If someone thought that was all dynamic lighting then I could understand them coming to that conclusion. I can't think of even many 10.1 PC games that could render a scene like that without turning into a slide show.

That said the demo's IQ isn't anywhere close to those pics, I'd be ecstatic if the final game looked anywhere near that smooth.

That's an interesting point. Prior to the system's launch I assumed that the pad would either mirror or just show a really simplistic image compared to what's on the TV (kind of like the 4th output on Nvidia's higher-end cards), but I've seen since then that the pad is capable of fully rendering separate scenes, and we've been told the system supports a 3rd image on a 2nd pad.

That's really not adding up to a low-end GPU's capabilities.

Yes, that is a good question. I asked a similar one a while ago, but I don't believe we have an answer for that yet.
 
I dunno man, I see more dignity in two heavyweights slugging it out than I do watching a welterweight swing punches at two 80 year old middleweights.

This is my favourite analogy of the week.

Considering that most of the fighting is done by fat/overly skinny internet ninjas, it's actually not very entertaining at all. It's going to be the same crap as last gen. Sony and MS fans who have consoles that for nearly all intents and purposes are the same will argue over the stupidest crap, the most unnoticeable of pixels and frame rate drops, etc. Nintendo fans will be nowhere near as far behind as they were last time, and some will go off the deep end again in what they think the console can do, while PC gamers yuk it up with a brand of elitist scumbaggery that only doesn't seem so bad because Sony and MS fanboys are acting the same in regard to less capable hardware.

Meanwhile, the rest of the world continues to see gamers and gaming in a negative light.
 
If someone thought that was all dynamic lighting then I could understand them coming to that conclusion. I can't think of even many 10.1 PC games that could render a scene like that without turning into a slide show.

That said the demo's IQ isn't anywhere close to those pics, I'd be ecstatic if the final game looked anywhere near that smooth.

That's an interesting point. Prior to the system's launch I assumed that the pad would either mirror or just show a really simplistic image compared to what's on the TV (kind of like the 4th output on Nvidia's higher-end cards), but I've seen since then that the pad is capable of fully rendering separate scenes, and we've been told the system supports a 3rd image on a 2nd pad.

That's really not adding up to a low-end GPU's capabilities.
Well, it's quite simple. If you show an independent, fully 3D scene on the pad, your main screen will undoubtedly suffer. There are only so many resources to go around. Also, apart from the Garden demo, I can't recall seeing a Wii U title doing two fully independent scenes of really high quality.
 
I don't agree at all. Even ignoring gameplay videos of the likes of Killzone Shadow Fall, Infamous Second Son, Driveclub, Dead Rising 3 and Ryse, it's still a massive difference in CPU / GPU speed / cores / shaders and memory bandwidth on paper -


CPU -

PS4 - 8 Core @ 1.6 GHz.
XBO - 8 Core @ 1.6 GHz.
WiiU - 3 Core @ 1.2 GHz.


GPU -

PS4 - 1.8 TFLOPs / DirectX 11 equivalent feature set.
XBO - 1.3 TFLOPs / DirectX 11.1 equivalent feature set.
WiiU - 176 GFLOPs / DirectX 10.1 equivalent feature set.


RAM (Dedicated to games only) -

PS4 - 5GB @ 176GB/s.
XBO - 5GB @ 60GB/s + 32MB of eSRAM @ 102 GB/s.
WiiU - 1GB @ 12.8GB/s + 32MB of eDRAM @ 70GB/s.

All PS4 and XBO games can also be installed to their HDD's.

None of what you list here is even remotely fact, especially the FLOPS, and it certainly doesn't tell you the capability of any of the hardware. The probable FLOPs performance is more likely in the 200-250 range, not 176 or 352.

The clocks mean nothing. If they were the biggest measurement of performance, then that would make the PS4 CPU weaker than the CELL, because the CELL has 8 SPUs at 3.2 GHz that people (including a dev) were swearing were full-fledged cores in the Espresso thread.

You can't just truncate the other RAM, and only 300 MB of the Wii U's RAM is used for system files. That's some of the info that was confirmed through vgleaks. The other 700MB isn't being used at all, or no one knows what it's used for according to the docs, and it will likely be put to use for games or game-related features in future firmware updates.
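For context on the 176 vs 352 GFLOPs numbers being argued over: both come from the same back-of-the-envelope formula, just with different assumed shader counts. This is only a sketch using the commonly reported ~550 MHz Latte clock and the usual 2 FLOPs per ALU per clock (multiply-add) convention; the ~216/240 GFLOPs figures usually quoted for Xenos come from the same kind of arithmetic with slightly different per-ALU counting.

```python
# Back-of-the-envelope sketch, not official figures: peak shader throughput
# as ALUs x FLOPs-per-ALU-per-clock x clock. All inputs are the thread's
# assumptions (160 vs 320 ALUs, ~550 MHz), not confirmed specs.

def peak_gflops(alus, clock_ghz, flops_per_alu_per_clock=2):
    """Peak single-precision GFLOPS for a simple shader array."""
    return alus * flops_per_alu_per_clock * clock_ghz

latte_clock_ghz = 0.550  # commonly reported Wii U GPU clock

print(peak_gflops(160, latte_clock_ghz))  # 176.0 -> the "160 shader" estimate
print(peak_gflops(320, latte_clock_ghz))  # 352.0 -> the "320 shader" estimate
print(peak_gflops(240, 0.500))            # 240.0 -> Xbox 360 Xenos (240 ALUs @ 500 MHz)
```

Under the same assumptions, the "200-250" range argued above would imply roughly 180 to 230 ALUs at 550 MHz.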
 
None of what you list here is even remotely fact, especially the FLOPS, and it certainly doesn't tell you the capability of any of the hardware. The probable FLOPs performance is more likely in the 200-250 range, not 176 or 352.

The clocks mean nothing. If they were the biggest measurement of performance, then that would make the PS4 CPU weaker than the CELL, because the CELL has 8 SPUs at 3.2 GHz that people (including a dev) were swearing were full-fledged cores in the Espresso thread.

You can't just truncate the other RAM, and only 300 MB of the Wii U's RAM is used for system files. That's some of the info that was confirmed through vgleaks. The other 700MB isn't being used at all, or no one knows what it's used for according to the docs, and it will likely be put to use for games or game-related features in future firmware updates.
So you're saying developers could, in the future, have access to 1.7 GB of RAM?
Or maybe Nintendo will introduce their own game streaming service.
 
So you're saying developers could, in the future, have access to 1.7 GB of RAM?
Or maybe Nintendo will introduce their own game streaming service.

Probably more like 1.5.

Even the PS3 and 360 games never had access to all of their 512 MB of RAM, and the number they did have access to grew as Sony and Microsoft released more efficient firmware that cleared up more for use.

Where does the game streaming service thing come from? Is that going to be the next "CELL" after the GDDR5 thing?
 
Probably more like 1.5.

Even the PS3 and 360 never had access to all of their 512 MB of RAM, and the number they did have access to grew as Sony and Microsoft released more efficient firmware that cleared up more for use.

Where does the game streaming service thing come from?
Sony and MS are doing it and Nintendo is taking a step with Mario Kart towards that.
 
Well, it's quite simple. If you show an independent, fully 3D scene on the pad, your main screen will undoubtedly suffer. There are only so many resources to go around. Also, apart from the Garden demo, I can't recall seeing a Wii U title doing two fully independent scenes of really high quality.
I've never played a multi-player game on the Wii U that uses the pad as the 2nd player's screen but I do remember hearing complaints about the visual quality degrading while playing that way in Sonic Racing. I really don't even know how many games support the feature so maybe someone else can comment.

I can say that in the one section of the W101 demo where the action switches to the lower screen inside the hangar, I didn't notice any change in the game's performance at all, even though you can see the large group of characters on both screens at once at times.

I'll pay closer attention the next time that I play to see if the game's performance does dip in that section.
 
I've never played a multi-player game on the Wii U that uses the pad as the 2nd player's screen but I do remember hearing complaints about the visual quality degrading while playing that way in Sonic Racing. I really don't even know how many games support the feature so maybe someone else can comment.

I can say that in the one section of the W101 demo where the action switches to the lower screen inside the hangar, I didn't notice any change in the game's performance at all, even though you can see the large group of characters on both screens at once at times.

I'll pay closer attention the next time that I play to see if the game's performance does dip in that section.
Sonic Racing had horrible compression rates for off TV streaming. It is the only game I have seen that did that.
 
The humans in X look worse than some PS2 games. There's a reason why the trailers all avoid showing their faces except that brief moment in the original trailer.

[image: 650x.jpg]

Humans? Plural? You base this on a picture of Shulk that wasn't even in-game footage, shown at the end of the announcement video while the game was still in its earliest stages, and not the actual in-game humans whose faces and bodies we can already see are clearly substantially more detailed than most of what was seen on the PS3 and 360?

Also, of course, the usual suspects you rarely ever see otherwise jump in to fist-bump and support the unfounded presumption in your post as absolute, indisputable truth, like always.

See this post that was right below yours for examples.

I don't think we know a lot about the more technical stuff.

As said before, the two pieces of footage we saw were only 'teasers'. A real trailer will come somewhere in the future and my guess is that they'll start giving out more information and start talking then.

As for the characters, I think Monolith will give them an upgrade in the coming months. Most of Nintendo's games start with gameplay etc.; polishing comes last (take a look at Pikmin and W101, for example ^^).

First reveal:

[image: qbw.png]


Final result one year later:

[image: 611.jpg]



BTW: why are people here running in circles about PS4 games? I miss the real hardware talk and analysis of Wii U footage :(

Not that I'm expecting you to respond to any argument that uses facts as opposed to pure assumptions like this.

In fact, the bulk of the arguments against the Wii U's capability go exactly like this.

People take the worst images or gameplay examples they can find, truncate everything around them that doesn't support their negative view, and then promote every flaw as the absolute limit of the Wii U's capability, followed by some comment placing it as low on the scale vs the PS3/360 as they can get it without sounding too ludicrous. This was rampant with people using the launch ports as examples and apparently still is even now.

Pretty much all arguments against the Wii U's power take the first picture in his post and limit the Wii U to it with 0% open-mindedness about any other possibility.
 
Sony and MS are doing it and Nintendo is taking a step with Mario Kart towards that.

Sony is using Gaikai primarily to get around the complete lack of backwards compatibility in the PS4, and Microsoft was already doing it with the 360 alongside Kinect as a community feature. What Microsoft is doing with the XboxOne is just beefing it up more.

Nintendo has no need for such a thing as they already have backwards compatibility in their system and they have Miiverse as a community feature.

What are you talking about with Mario Kart? Nintendo allowed ghost data uploading with Mario Kart Wii.

I've said it before, and I'll say it again. Why do people want 3 of the same console? I like the fact that all of the hardware is different. The less saturation in the game industry the better.

Though this is all off topic.

People are making a way bigger deal out of the Wii U's RAM than it actually is, especially where the "bandwidth" is concerned. The RAM performance is clearly better than the PS3/360's according to all of the comments by devs, despite how people try to twist the numbers to push the "bandwidth starved"/weaker school of thought. It's the same with the GPU performance. The CPU performance is still up in the air, though it looks like it may possibly be better than the 360/PS3's after all as well.

This is why it is irksome when people start slinging numbers around. The most important numbers for CPU/GPU performance aren't hertz and flops these days, but that's all you see people posting, because most have an agenda (painting the hardware as weak), and having people only see low numbers up front with no logical explanation as to how they relate to performance is key to painting a negative picture.
 
Sony is using Gaikai primarily to get around the complete lack of backwards compatibility in the PS4, and Microsoft was already doing it with the 360 alongside Kinect as a community feature. What Microsoft is doing with the XboxOne is just beefing it up more.

Nintendo has no need for such a thing as they already have backwards compatibility in their system and they have Miiverse as a community feature.

What are you talking about with Mario Kart? Nintendo allowed ghost data uploading with Mario Kart Wii.

I've said it before, and I'll say it again. Why do people want 3 of the same console? I like the fact that all of the hardware is different. The less saturation in the game industry the better.

Though this is all off topic.

People are making a way bigger deal out of the Wii U's RAM than it actually is, especially where the "bandwidth" is concerned. The RAM performance is clearly better than the PS3/360's according to all of the comments by devs, despite how people try to twist the numbers to push the "bandwidth starved"/weaker school of thought. It's the same with the GPU performance. The CPU performance is still up in the air, though it looks like it may possibly be better than the 360/PS3's after all as well.

This is why it is irksome when people start slinging numbers around. The most important numbers for CPU/GPU performance aren't hertz and flops these days, but that's all you see people posting, because most have an agenda (painting the hardware as weak), and having people only see low numbers up front with no logical explanation as to how they relate to performance is key to painting a negative picture.
They usually go against what the developers say as they view the developers praising the Wii U as "damage control" or just try and compare it to the PS4 and Xbox One, saying how the developers would do better on those consoles instead of the Wii U. It's just their views.
 
I've never played a multi-player game on the Wii U that uses the pad as the 2nd player's screen but I do remember hearing complaints about the visual quality degrading while playing that way in Sonic Racing. I really don't even know how many games support the feature so maybe someone else can comment.

I can say that in the one section of the W101 demo where the action switches to the lower screen inside the hangar, I didn't notice any change in the game's performance at all, even though you can see the large group of characters on both screens at once at times.

I'll pay closer attention the next time that I play to see if the game's performance does dip in that section.

Black Ops 2 does it very well. Also don't forget about Nintendo Land. The other examples are simple items/maps/etc.

The next games to render 3D images on both the TV and gamepad will, I think, be CoD Ghosts with the dog cam (I wish; not really sure) and Splinter Cell.

The other thing that I would like to know from devs is whether the Wii U not being x86 is hard to develop for. I am guessing it is easy to develop for (the opposite being another common misconception), judging by how quickly Criterion built a good version of NFSMWU. This is my "guess opinion"; like I said, I wish more devs would say something about it.
 
Well, it's quite simple. If you show an independent, fully 3D scene on the pad, your main screen will undoubtedly suffer. There are only so many resources to go around. Also, apart from the Garden demo, I can't recall seeing a Wii U title doing two fully independent scenes of really high quality.
I believe it is a given that it will take resources to have an independent full 3D screen on the pad. The question here is what modifications were done to keep the system from having inconsistent performance. Nintendo has a huge focus on consistent performance. For the DS, for example, they had two 2D GPUs and a 3D GPU that "forced" 60fps on one screen or split 30fps across both so that the system would have consistent performance. In the case of the 3DS, they doubled the GPU clock speed to deal with the drain on resources when the system uses 3D. I would expect that Nintendo has designed the system to still hit its desired performance even with two independent 3D screens.
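To put the "two independent 3D screens" trade-off in concrete terms, here is a minimal frame-budget sketch. The numbers are assumptions for illustration only (one GPU, a naive even split, no shared work), not measurements of any Wii U title:

```python
# Rough frame-budget illustration (assumed numbers, not measurements):
# one GPU rendering two independent views has to fit both into each frame.

def frame_budget_ms(fps):
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / fps

tv_only = frame_budget_ms(60)        # ~16.7 ms if everything goes to the TV image
two_views_at_60 = tv_only / 2        # ~8.3 ms per view if both views run at 60 fps
ds_style_split = frame_budget_ms(30) # ~33.3 ms per view if both drop to 30 fps

print(tv_only, two_views_at_60, ds_style_split)
```

The naive halving is the worst case; in practice some work (streaming, simulation, shadow maps) can be shared between the two views, which is presumably how the Garden demo and similar showcases get away with it.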

None of what you list here is even remotely fact, especially the FLOPS, and it certainly doesn't tell you the capability of any of the hardware. The probable FLOPs performance is more likely in the 200-250 range, not 176 or 352.

The clocks mean nothing. If they were the biggest measurement of performance, then that would make the PS4 CPU weaker than the CELL, because the CELL has 8 SPUs at 3.2 GHz that people (including a dev) were swearing were full-fledged cores in the Espresso thread.

You can't just truncate the other RAM, and only 300 MB of the Wii U's RAM is used for system files. That's some of the info that was confirmed through vgleaks. The other 700MB isn't being used at all, or no one knows what it's used for according to the docs, and it will likely be put to use for games or game-related features in future firmware updates.
Xbox1 and PS4 have reserved at least 2GB of their RAM each, so I wouldn't expect Nintendo to free up much of the Wii U's reserved RAM anytime soon either.
 
Black Ops 2 does it very well. Also don't forget about Nintendo Land. The other examples are simple items/maps/etc.

The next games to render 3D images on both the TV and gamepad will, I think, be CoD Ghosts with the dog cam (I wish; not really sure) and Splinter Cell.

The other thing that I would like to know from devs is whether the Wii U not being x86 is hard to develop for. I am guessing it is easy to develop for (the opposite being another common misconception), judging by how quickly Criterion built a good version of NFSMWU. This is my "guess opinion"; like I said, I wish more devs would say something about it.
For the CPU, my guess is that it is not hard to develop for if you are familiar with Gekko/Broadway. The complications happen when you are trying to port CPU code from current-gen games to the Wii U due to its different architecture. The other next-gen systems would have issues with that too, but they have a bit more raw power to work with.

Latte seems to be easier to work with, though there appear to have been some optimization issues around launch as well.
 
People are making a way bigger deal out of the Wii U's RAM than it actually is, especially where the "bandwidth" is concerned. The RAM performance is clearly better than the PS3/360's according to all of the comments by devs, despite how people try to twist the numbers to push the "bandwidth starved"/weaker school of thought. It's the same with the GPU performance. The CPU performance is still up in the air, though it looks like it may possibly be better than the 360/PS3's after all as well.
I'm pretty sure Wii U's CPU is way better than the 360/PS3's, and may be comparable to the PS4 and Xbox One's. If you go back and look at the original IBM presentation for the GameCube's CPU (the last time Nintendo actually explained their console's specs), you'll see that it was very fast for game oriented design at the time, and far exceeded Nintendo's expectations. The GCN CPU can actually be used to render polygons and lighting effects like a shader. The actual presentation says "SIMD FPU accelerates computation for custom lighting and geometry, supporting more realistic (or fantastic) visual effects". This sounds to me as though the CPU could be used as a fully programmable shader in addition to the fixed function stuff on Flipper. Do we know if any GCN/Wii games used this functionality?

To bring us to Wii U: IGN- "Calling Wii U 'Espresso' CPU "similar to Wii's 'Broadway' [CPU]," Marcan broke down the chip as having three cores (compared to Wii's one), with each clocking at a very precise 1.243125GHz (compared to Wii's 729 MHz and GameCube's 485 MHz) and capable of running one thread of data at a time."

Considering the fact that the Wii U's GPU can write to the same embedded RAM the CPU can, there's no reason to assume the Wii U's CPU can't help its GPU render graphics - in fact it'd have to in order to maintain backward compatibility with the Wii. Reading the original Gekko design explanation really opened my eyes to what the Wii U's CPU likely is, and what the Wii U is. It's very likely the Wii U's entire design (GPU and CPU) is derived directly from the GameCube. And why shouldn't it be? The GameCube was an incredibly efficient console. Now I'm learning why. If the CPU can help render graphics and run game data, then theoretically that gives the developer the option of what he wants to focus on on a game-to-game basis. In a physics-based game, he can put the CPU to work on physics. In a graphics-intensive game, he can put the CPU to work lending a hand to Flipper. Incredible.

I'll have a longer post on this in the CPU thread at some point.
 
For the CPU, my guess is that it is not hard to develop for if you are familiar with Gekko/Broadway. The complications happen when you are trying to port CPU code from current-gen games to the Wii U due to its different architecture. The other next-gen systems would have issues with that too, but they have a bit more raw power to work with.

Latte seems to be easier to work with, though there appear to have been some optimization issues around launch as well.

Exactly. This has been pointed out before in the thread, but I just wanted to add that I think most people confuse the effort of turning a current-gen game into a Wii U game with actually developing for the Wii U.
 
I'm pretty sure Wii U's CPU is way better than the 360/PS3's, and may be comparable to the PS4 and Xbox One's. If you go back and look at the original IBM presentation for the GameCube's CPU (the last time Nintendo actually explained their console's specs), you'll see that it was very fast for game oriented design at the time, and far exceeded Nintendo's expectations. The GCN CPU can actually be used to render polygons and lighting effects like a shader. The actual presentation says "SIMD FPU accelerates computation for custom lighting and geometry, supporting more realistic (or fantastic) visual effects". This sounds to me as though the CPU could be used as a fully programmable shader in addition to the fixed function stuff on Flipper. Do we know if any GCN/Wii games used this functionality?

To bring us to Wii U: IGN- "Calling Wii U 'Espresso' CPU "similar to Wii's 'Broadway' [CPU]," Marcan broke down the chip as having three cores (compared to Wii's one), with each clocking at a very precise 1.243125GHz (compared to Wii's 729 MHz and GameCube's 485 MHz) and capable of running one thread of data at a time."

Considering the fact that the Wii U's GPU can write to the same embedded RAM the CPU can, there's no reason to assume the Wii U's CPU can't help its GPU render graphics - in fact it'd have to in order to maintain backward compatibility with the Wii. Reading the original Gekko design explanation really opened my eyes to what the Wii U's CPU likely is, and what the Wii U is. It's very likely the Wii U's entire design (GPU and CPU) is derived directly from the GameCube. And why shouldn't it be? The GameCube was an incredibly efficient console.

I'll have a longer post on this in the CPU thread at some point.

Sony first-party developers use the PS3's Cell to handle rendering tasks all the time, so I don't think of that as a huge advantage over current gen. And both next-gen consoles have unified memory, so they should be able to do the same as well.
 
Humans? Plural? You base this on a picture of Shulk that wasn't even in-game footage, shown at the end of the announcement video while the game was still in its earliest stages, and not the actual in-game humans whose faces and bodies we can already see are clearly substantially more detailed than most of what was seen on the PS3 and 360?

Also, of course, the usual suspects you rarely ever see otherwise jump in to fist-bump and support the unfounded presumption in your post as absolute, indisputable truth, like always.

See this post that was right below yours for examples.



Not that I'm expecting you to respond to any argument that uses facts as opposed to pure assumptions like this.

In fact, the bulk of the arguments against the Wii U's capability go exactly like this.

People take the worst images or gameplay examples they can find, truncate everything around them that doesn't support their negative view, and then promote every flaw as the absolute limit of the Wii U's capability, followed by some comment placing it as low on the scale vs the PS3/360 as they can get it without sounding too ludicrous. This was rampant with people using the launch ports as examples and apparently still is even now.

Pretty much all arguments against the Wii U's power take the first picture in his post and limit the Wii U to it with 0% open-mindedness about any other possibility.

I don't know where to even begin with this post, but it seems you created a lot of subtext for my post that wasn't intended or present in anything I said. Someone said the character models looked good; I pointed out that they are actually below the norm. Whether they'll be improved in the future is irrelevant to the discussion.

Really, if you hadn't entered the discussion with so much outrage and a template response you would've noticed your mistake. And claiming that I chose "the worst possible image" in the case at hand is absurd, you realize that, right?
 
I'm pretty sure Wii U's CPU is way better than the 360/PS3's, and may be comparable to the PS4 and Xbox One's.

In comparison to the current-gen HD twins: it's actually worse in some ways but better in others (e.g. floating point code would choke it in comparison to Cell, but it has a short pipeline and is more efficient).

In comparison to the next-gen HD twins: it's more comparable, and a bit more apples-to-bigger-apples here, but it's still missing some things in comparison (e.g. AVX), still does less per cycle, and has a deficit of about 3 cores.
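To illustrate the "floating point would choke it" and "does less per cycle" points with rough numbers, here is a back-of-the-envelope peak single-precision FLOPS comparison. The per-cycle figures are assumptions (paired-single FMA for Espresso, VMX128 FMA only for Xenon, 128-bit FPUs for Jaguar) and ignore real-world efficiency entirely, so treat it as a sketch of why clock speed alone is meaningless, not as a benchmark:

```python
# Back-of-the-envelope peak SP FLOPS (assumed per-cycle throughput, no
# real-world efficiency): shows why "GHz" alone says nothing.

def peak_gflops(cores, clock_ghz, flops_per_core_per_cycle):
    return cores * clock_ghz * flops_per_core_per_cycle

print(peak_gflops(3, 1.243, 4))  # Espresso: paired-single FMA   -> ~14.9
print(peak_gflops(3, 3.2, 8))    # Xenon: VMX128 4-wide FMA      -> ~76.8
print(peak_gflops(8, 1.6, 8))    # PS4/XBO Jaguar: 128-bit FPUs  -> ~102.4
print(peak_gflops(6, 3.2, 8))    # Cell: six game-usable SPEs    -> ~153.6
```

Short pipelines, large caches and per-cycle efficiency are exactly the things this kind of peak math ignores, which is why the tiny Espresso figure doesn't translate directly into general game-code performance.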
 
Sony first party developers uses the PS3's Cell to handle rendering tasks all the time, so I don't think of that as a huge advantage over current gen. And both next gen consoles have unified memory, so they should be able to as well.
This is off topic for this thread, but the difference is that the Gekko/Espresso architecture is much more efficient at parallel processing than x86. The PS4's and Xbone's inefficient x86 architecture forces them to have a high number of cores so they can "hack" parallel processing into an architecture that was never designed to have it. Espresso was built for it, which may explain why it was able to run games designed to run on 3 360 cores on its main core.

Cell was also fairly limited for a number of reasons that are well documented. Wii U's implementation of this strategy is certain to be more efficient than Cell's. It is true that all of the consoles have unified memory; however, that RAM isn't as fast as Wii U's. I'm not saying that the Wii U will have better graphics than the Xbone. I am saying that we should take the Wii U's CPU into account while discussing theoretical peak graphics output, however.
 
None of what you list here is even remotely fact, especially the FLOPS, and it certainly doesn't tell you the capability of any of the hardware. The probable FLOPs performance is more likely in the 200-250 range, not 176 or 352.

The clocks mean nothing. If they were the biggest measurement of performance, then that would make the PS4 CPU weaker than the CELL, because the CELL has 8 SPUs at 3.2 GHz that people (including a dev) were swearing were full-fledged cores in the Espresso thread.

You can't just truncate the other RAM, and only 300 MB of the Wii U's RAM is used for system files. That's some of the info that was confirmed through vgleaks. The other 700MB isn't being used at all, or no one knows what it's used for according to the docs, and it will likely be put to use for games or game-related features in future firmware updates.

Whether WiiU uses 300MB or 999MB for the OS is irrelevant; the console reserves 1GB for it, end of. Will some of that RAM be made available for games in the future? Maybe yes, maybe no, just like some of the 3GB on PS4/XBO might be used for games instead of the OS in the future.

I was going on what has been confirmed by all three companies, not some possible event in the future. The bottom line is that 5GB of much faster RAM destroys the pathetic 1GB of RAM @ 12.8GB/s in WiiU; there is no getting away from it.

Considering that the WiiU's main RAM is actually about 43% slower than the 512MB of RAM in PS360, the extra RAM isn't as big of a deal as some people make out imo. The eDRAM seems to be the most important aspect WiiU is built around, but we have very little info on its speed, never mind anything else.

The RAM was perhaps the biggest disappointment in WiiU for me, there was talk in WUST 4 and 5 that the console may have 1GB of GDDR5 RAM for games with normal DDR3 for the OS.

As for the GPU, whether it's 176 GFLOPs (the best candidate), 200, 250 or even 352 (highly unlikely) it's still a joke for a console released in late 2012 for $350 imo.

Again, in the WUSTs, most of us expected 1TF for the GPU, then 800 GFLOPs, then 600 GFLOPs (BG was a big proponent of 600 GFLOPs), then 352, and it looks like the more tech-savvy people in this thread have settled on 176 GFLOPs after almost 8000 posts.

You can't seem to handle that people have a different opinion of the console than you, tbh. You seem utterly obsessed with trying to prove that this 33-watt console with a $100 controller is suddenly going to start competing with the best PS360 games as well as PS4/XBO launch games; it just isn't going to happen, I'm afraid, let it go...

Nintendo focused on the tablet for WiiU; its whole design reeks of a cheap console built around the fact that they had to leave a huge chunk of the build cost reserved for the controller.

The WiiU tablet is the reason they went with the ancient CPU design, the pathetic 12.8GB/s RAM and the laughably weak 176GFLOP GPU, IMHO.

If you love Nintendo games then there will be some awesome looking first party games for the system in the coming three years, that's all that matters to me :).
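For reference, the main-RAM bandwidth figures being argued over above all come from the same simple formula, data rate times bus width. A quick sketch using the commonly reported configurations (assumed, not confirmed by any of the manufacturers):

```python
# Peak main-memory bandwidth sketch from commonly reported configurations.
# bandwidth (GB/s) = transfer rate (MT/s) x bus width (bytes) / 1000

def bandwidth_gbs(mt_per_s, bus_bits):
    return mt_per_s * (bus_bits / 8) / 1000.0

print(bandwidth_gbs(1600, 64))   # Wii U DDR3-1600, 64-bit bus      -> 12.8
print(bandwidth_gbs(1400, 128))  # 360 GDDR3 @ 700 MHz, 128-bit bus -> 22.4
print(bandwidth_gbs(2133, 256))  # XBO DDR3-2133, 256-bit bus       -> ~68.3
print(bandwidth_gbs(5500, 256))  # PS4 GDDR5 @ 5.5 Gbps, 256-bit    -> 176.0
print(1 - 12.8 / 22.4)           # Wii U main RAM ~43% slower than 360's
```

That last line is where the "43%" figure above comes from; it ignores the PS3's split XDR/GDDR3 pools and, more importantly, everything the Wii U's 32MB eDRAM absorbs.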
 
So the tablet is why they spent a whole lot of engineering resources on the CPU? Not, you know, other things like backwards compatibility and library compatibility?
 
I doubt you were intending to be, but that's how he took it. It was a bit condescending, to be honest.

I made an edit to that post, several in fact, trying not to be confrontational. My personal experience with my PC, the PS3 and the Wii U tells me the Wii U is more capable. (Than the PS3, obviously.) The problem seems to be that those launch games were built for systems with GPUs incapable of pulling their own weight that have to offload to the CPU (that's generally how both the 360 and PS3 were designed), but the Wii U has a more feature-rich GPU with a CPU designed to mainly do normal CPU tasks. This, combined with the fact that the Wii U CPU only uses one core unless you specifically program it to use the other two (something a lot of devs didn't seem to know until recently), means that it had a lot more overhead when running code for the older systems. Even then, most of those multiplats ran better on Wii U than PS3, with the 360 version being the best. This last bit isn't surprising, as the games were built specifically for the strengths of the 360 and ported to the others.

For reference, check out ZOE HD if you want to see what code unoptimized for the system it's on can do to a game. That's a PS2 collection that ran worse on the PS3 because the game was made specifically for the PS2 architecture.

That the Wii U did so well with those ports, considering how different everything about its architecture is from the previous gen, is telling.

In the words of Criterion: "It punches above its weight."

It's not going to compete with PC, but neither are the PS4 or Xbox1. However, if you ask me, I expect the Wii U is much closer to those consoles than either of them will be to high-end PCs, or even mid-tier PCs, in a couple of years. That's why I don't see what the big deal is. In the end, it's nowhere near as large a divide as there was last gen between consoles, and none of the new consoles are bleeding edge in any way. Unless you want to count PS4's memory, but then a large portion of that will be used for processes that aren't going to get the proper use out of its higher bandwidth anyway.

At least, that's my impression so far.

What a fantastic post. Couldn't have put it better myself.

All anyone (without an agenda, of course) needs to do to realise that the Wii U is a step above the PS3 and 360 is take a look at the latest trailers of Bayonetta 2 and X, tbh. But like I said a while back, you could have a game that looks like the best thing since sliced bread and you would still get the same people saying that the Wii U is on par with, or just marginally more powerful than, the PS3 and 360.

And those trailers are early builds too, they'll only get better.
 
So the tablet is why they spent a whole lot of engineering resources on the CPU? Not, you know, other things like backwards compatibility and library compatibility?

Backwards compatibility was likely a good excuse as to why they went with an 11 year old design for a 2012, $350 console.

Who the hell is actually going to use WiiU to play Wii games? Every man and his dog owns a Wii already (most of which got bored with it after a few months of Wii Sports / Play / MK), and it was the Nintendo console with the least amount of high-quality first-party output.

If they truly did build the console around the CPU for backward compatibility, and not to save R&D money on designing a new CPU, then Iwata is even more clueless than I already thought he was.

Iwata has said on record that they had to perform a balancing act with regards to the console's hardware power, because if they went with high-end tech as well as the tablet then the console would have been very, very expensive (as if it isn't already).

The tablet won out and they were left with $200 to build the rest of the console, which is why IMO it uses the basis of a 2001 CPU, a small amount of horribly slow RAM and a laughably weak GPU for a 2012 console.
 
@Apophis2036
lol, that 2nd-to-last post is almost exactly the same thing people said about GC hardware in 2001. I'm pretty sure it's more powerful than we think it is... or rather, more efficient. Efficiency is the key.
 
I've never played a multi-player game on the Wii U that uses the pad as the 2nd player's screen but I do remember hearing complaints about the visual quality degrading while playing that way in Sonic Racing. I really don't even know how many games support the feature so maybe someone else can comment.

I can say that in the one section of the W101 demo where the action switches to the lower screen inside the hangar, I didn't notice any change in the game's performance at all, even though you can see the large group of characters on both screens at once at times.

I'll pay closer attention the next time that I play to see if the game's performance does dip in that section.
Black Ops 2 used the GamePad screen for a 2nd player.
 
This is off topic for this thread, but the difference is that the Gekko/Espresso architecture is much more efficient at parallel processing than x86.

I usually don't comment on everything because I don't want to derail the thread. But at this point it is getting absolutely ridiculous. The original PPC7xx line had no support for multiple cores at all and still is limited to 64 bit SIMD operations. Besides that there is no parallelism.
If anything it is relatively strong at single threaded performance without data level parallelism, at least when compared to Cell or Xenon.

Espresso was built for it, which may explain why it was able to run games designed to run on 3 360 cores on its main core.

That's an unconfirmed and quite unrealistic claim.

It is true that all of the consoles have unified memory however, though that RAM isn't as fast as Wii U's.

The actual numbers on main RAM bandwidth are well known. Wii U has less than a fifth of Xbox One's.
 
Gameguru59 said:
Considering the fact that the Wii U's GPU can write to the same embedded RAM the CPU can, there's no reason to assume the Wii U's CPU can't help its GPU render graphics

The Wii U's CPU is not in any way geared well for floating point code. It's not beefy at all in that regard, and is actually worse than the PS360's design. The Wii U's GPU is what's going to be rendering graphics in this console.

Backwards compatibility was likely a good excuse as to why they went with an 11 year old design for a 2012, $350 console.

Do you know how long ago Jaguar's design is rooted? I'll give you a clue: it starts around the same time as Gekko's development - well, a couple of years after.

Who the hell is actually going to use WiiU to play Wii games?

Lots of the current Wii userbase does.

every man and his dog owns a Wii already (most of which got bored with it after a few months of Wii Sports / Play / MK)

The attach ratio that's comparable to PS360 and the over 880 million pieces of software sold don't reflect this common misconception.

and it was the Nintendo console with the least amount of high quality first party output.

That's your opinion. Many others believe it's been their best first party in a long while as well.

If they truly did build the console around the CPU for backward compatibility, and not to save R&D money on designing a new CPU, then Iwata is even more clueless than I already thought he was.

You think they SAVED money engineering the CPU as opposed to going with a nearly vanilla x86 setup? I think you should head over to the CPU thread and have a good read - it required significant customization of the base design.

The tablet won out and they were left with $200 to build the rest of the console which is why IMO it uses the basis of a 2001 CPU

Again, you're using a year to describe a CPU that didn't exist in 2001. That's almost like saying the CPU in the PS4One is a 1978 CPU.
 
I usually don't comment on everything because I don't want to derail the thread. But at this point it is getting absolutely ridiculous. The original PPC7xx line had no support for multiple cores at all and still is limited to 64 bit SIMD operations. Besides that there is no parallelism.
If anything it is relatively strong at single threaded performance without data level parallelism, at least when compared to Cell or Xenon.



That's an unconfirmed and quite unrealistic claim.



The actual numbers on main RAM bandwidth are well known. Wii U has less than a fifth of Xbox One's.
1) IBM considered putting multiple cores in GCN. They decided to go with a superscalar design instead (I actually had no idea what that meant but then I Wikipedia'd it... see the toy example at the end of this post)

"A superscalar CPU architecture implements a form of parallelism called instruction level parallelism within a single processor. It therefore allows faster CPU throughput than would otherwise be possible at a given clock rate. A superscalar processor executes more than one instruction during a clock cycle by simultaneously dispatching multiple instructions to redundant functional units on the processor. Each functional unit is not a separate CPU core but an execution resource within a single CPU such as an arithmetic logic unit, a bit shifter, or a multiplier."

So regardless of what was available on an off the shelf Power chip at the time, IBM was going to do whatever it took to get the best performance for GCN... they did have a $1 billion contract.

2) That was info derived from the CPU thread that I have yet to fully investigate. I have no idea whether it's true or not, but what is true is that devs have ported current-gen games, with very inefficient tools, to the Wii U with no real performance decrease on Wii U. That alone suggests the Wii U is significantly more powerful than the 360 or PS3.

3) I was referring to the CPU's and GPU's ability to use the embedded RAM. Is this true for the embedded RAM?
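Since point 1 above leans on the term, here is a toy illustration of what superscalar (instruction-level parallel) execution buys you: a 2-wide in-order core can retire two instructions per cycle only when the second one doesn't depend on the first. This is a deliberately simplified model, not a simulation of Gekko or any real core:

```python
# Toy model of 2-wide in-order superscalar issue (illustration only).
# Each op is (destination, set of source registers).

def min_cycles(ops, width=2):
    """Greedy dual-issue: pack ops into a cycle until a dependency appears."""
    cycles, i = 0, 0
    while i < len(ops):
        issued = [ops[i]]
        i += 1
        while i < len(ops) and len(issued) < width:
            written = {dest for dest, _ in issued}
            dest, srcs = ops[i]
            if written & (srcs | {dest}):
                break  # depends on (or clobbers) something issued this cycle
            issued.append(ops[i])
            i += 1
        cycles += 1
    return cycles

independent = [("a", {"b", "c"}), ("d", {"e", "f"}), ("g", {"h", "i"}), ("j", {"k", "l"})]
chained     = [("a", {"b", "c"}), ("d", {"a", "f"}), ("g", {"d", "i"}), ("j", {"g", "l"})]

print(min_cycles(independent))  # 2 cycles: independent ops pair up and issue together
print(min_cycles(chained))      # 4 cycles: each op waits on the previous result
```

That is all "instruction-level parallelism" means here: independent work gets interleaved so the extra execution units are actually used, which is a very different kind of parallelism from having more cores.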
 
Yes it did and it resulted in a reduced frame rate for both players, you are not going to get performance for free.

It's pretty impressive that the console can run the CoD engine twice at the same time though, I will give it that.

Source? I've never heard of this until today.
Not trying to call you out or anything.
 
This is off topic for this thread, but the difference is that the Gekko/Espresso architecture is much more efficient at parallel processing than x86. The PS4's and Xbone's inefficient x86 architecture forces them to have a high number of cores so they can "hack" parallel processing into an architecture that was never designed to have it. Espresso was built for it, which may explain why it was able to run games designed to run on 3 360 cores on its main core.

Not sure if serious.jpg

So, uh, what exactly do you mean by this?
 
I was going on what has been confirmed by all three companies, not some possible event in the future. The bottom line is that 5GB of much faster RAM destroys the pathetic 1GB of RAM @ 12.8GB/s in WiiU; there is no getting away from it.

Considering that the WiiU's main RAM is actually about 43% slower than the 512MB of RAM in PS360, the extra RAM isn't as big of a deal as some people make out imo. The eDRAM seems to be the most important aspect WiiU is built around, but we have very little info on its speed, never mind anything else.

The RAM was perhaps the biggest disappointment in WiiU for me, there was talk in WUST 4 and 5 that the console may have 1GB of GDDR5 RAM for games with normal DDR3 for the OS.
I believe you are oversimplifying the memory system of the Wii U. You are right that the 32MB of eDRAM is likely the most important piece, but there are also the caches and the other small banks of SRAM/eDRAM in the system. Nintendo cares more about the performance of the sum of the parts than about overpowering certain components that may not be worth the investment due to other bottlenecks. From what we know, I believe the memory system works a lot better than it seems if we just focus on the speed of the main RAM.
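To illustrate the "sum of the parts" point with some arithmetic: the headline 12.8GB/s only hurts for traffic that actually has to go out to the DDR3. A very rough sketch, with every number assumed (the 70GB/s eDRAM figure quoted earlier in the thread, 256MB of memory traffic per frame picked purely for illustration, and worst-case serialized access):

```python
# Illustration only: how much frame time memory traffic costs if a fraction
# f of it is served by the 32MB eDRAM instead of the DDR3 main RAM.
# All numbers are assumptions, not measurements.

def traffic_ms(gb_per_frame, f_edram, edram_gbs=70.0, main_gbs=12.8):
    """Worst case: eDRAM and DDR3 transfers are fully serialized."""
    return 1000.0 * (gb_per_frame * f_edram / edram_gbs
                     + gb_per_frame * (1 - f_edram) / main_gbs)

frame_traffic_gb = 0.25  # assume ~256MB of reads/writes per frame
for f in (0.0, 0.5, 0.8):
    print(f, round(traffic_ms(frame_traffic_gb, f), 2), "ms")
# f=0.0 -> ~19.5 ms (blows a 16.7 ms / 60 fps budget on memory traffic alone)
# f=0.5 -> ~11.6 ms
# f=0.8 -> ~6.8 ms (framebuffer and frequently reused targets kept in eDRAM)
```

The exact percentages are made up; the point is just that a small, fast, on-die pool changes the effective picture far more than the DDR3 headline number suggests.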
 
Source? I've never heard of this until today.
Not trying to call you out or anything.

http://www.eurogamer.net/articles/digitalfoundry-black-ops-2-wii-u-face-off

"Where Treyarch deserves credit is for the expansion to the local multiplayer options, specifically in terms of split-screen. While the standard two-player mode of the existing console version is supported, the Wii U has an excellent alternative: discrete screens for each participant - HDTV output for one player, GamePad for the other.

This mode - indeed, split-screen in general - is not without its trades. Frame-rate definitely takes a hit (which varies according to the map and definitely muddies controller response), but resolution is maintained and the only noticeable visual downgrade comes from the removal of dynamic shadows. Even this isn't quite as impactful as it sounds as shadows "baked" into the environments are still there."
 
Yes it did and it resulted in a reduced frame rate for both players, you are not going to get performance for free.

It's pretty impressive that the console can run the CoD engine twice at the same time though, I will give it that.

The game has quite the same framerate drops in single-player and local multiplayer. Still nothing dramatic, though. If you play co-op online it plays very nicely, so the issue is not with the resources allocated to the gamepad.
 
http://www.eurogamer.net/articles/digitalfoundry-black-ops-2-wii-u-face-off

"Where Treyarch deserves credit is for the expansion to the local multiplayer options, specifically in terms of split-screen. While the standard two-player mode of the existing console version is supported, the Wii U has an excellent alternative: discrete screens for each participant - HDTV output for one player, GamePad for the other.

This mode - indeed, split-screen in general - is not without its trades. Frame-rate definitely takes a hit (which varies according to the map and definitely muddies controller response), but resolution is maintained and the only noticeable visual downgrade comes from the removal of dynamic shadows. Even this isn't quite as impactful as it sounds as shadows "baked" into the environments are still there."

thanks
 
Not sure if serious.jpg

So, uh, what exactly do you mean by this?
Not sure all of this applies to the Wii U, but regarding the Power architecture generally:

"Let’s also look at x86 versus Power servers. Although x86 is good at processing many fast threads, it can only execute two threads per cycle. Power Systems servers can execute four. What this means is right off the bat, Power technology is twice as powerful. Power servers are also known to do compute-intensive jobs more efficiently. Both systems perform well in processing parallel tasks, but to scale x86, you must throw more processing cores into the configuration—more cores burn more power and vendors often charge for applications based on the number of cores. So using x86 solutions may drive up license costs. My point is that, despite what many IT buyers think, x86 is not the answer to running the most optimized solutions, as it doesn’t do every job optimally."
 
Not sure all of this applies to the Wii U, but regarding the Power architecture generally:

"Let’s also look at x86 versus Power servers. Although x86 is good at processing many fast threads, it can only execute two threads per cycle. Power Systems servers can execute four. What this means is right off the bat, Power technology is twice as powerful. Power servers are also known to do compute-intensive jobs more efficiently. Both systems perform well in processing parallel tasks, but to scale x86, you must throw more processing cores into the configuration—more cores burn more power and vendors often charge for applications based on the number of cores. So using x86 solutions may drive up license costs. My point is that, despite what many IT buyers think, x86 is not the answer to running the most optimized solutions, as it doesn’t do every job optimally."

That's cool, but what does that have to do with the Wii U? The Wii U uses a completely different processor, which can only handle one thread per core at a time.
 
Not sure all of this applies to the Wii U, but regarding the Power architecture generally:

"Let’s also look at x86 versus Power servers. Although x86 is good at processing many fast threads, it can only execute two threads per cycle. Power Systems servers can execute four. What this means is right off the bat, Power technology is twice as powerful. Power servers are also known to do compute-intensive jobs more efficiently. Both systems perform well in processing parallel tasks, but to scale x86, you must throw more processing cores into the configuration—more cores burn more power and vendors often charge for applications based on the number of cores. So using x86 solutions may drive up license costs. My point is that, despite what many IT buyers think, x86 is not the answer to running the most optimized solutions, as it doesn’t do every job optimally."

Hey GameGuru59, start your reading here and here if you've got some free time :)
 
That's cool, but what does that have to do with the Wii U? The Wii U uses a completely different processor.

Well, if you read the post (and the IBM document I originally quoted), the Gekko CPU was designed to run parallel operations (superscalar) even though it was one core. And IBM went with this design in lieu of multiple cores for Gekko, according to the IBM document. Then the second article states that the Power design in general requires fewer cores than x86 to run the same parallel operations. It's not that much of a leap to arrive at the conclusion that Power is by design more efficient than x86 (which is why Power is still very popular in server applications). All of this makes sense in light of the fact that x86 was created in the 70's and Power was developed in the late 80's and early 90's.
 