WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

No AA! Looked pretty smooth. I think E3 is going to bring out what the GPU is capable of at the hands of first parties.

The jaggies were actually the first thing I noticed about it. Around Link's cap, the floor shadows, the spider's edges, etc.

[image: zelda_hd.jpg]


Granted that's a downscaled photo, but they're even more apparent in video form.
 
The jaggies were actually the first thing I noticed about it. Around Link's cap, the floor shadows, the spider's edges, etc.

[image: zelda_hd.jpg]


Granted that's a downscaled photo, but they're even more apparent in video form.

True, but in motion (depending on how aggravating it is) I find I can be forgiving of it.
 
Regarding the alleged 282-352 GFLOPS for the Wii U GPU, this implies that the PS4 GPU is 5.2 to 6.5 times as powerful. I know comparing FLOPS doesn't tell the whole story, but let's humor this comparison for a second. The other way to look at it is that the Wii U GPU has about 15-19% of the pixel-crunching power of the PS4 GPU. So, a 1080p title on PS4 could potentially be rendered more or less equivalently on Wii U at around 744x418 to 837x470, which is a tolerable range for upscaling to 480p:

[image: 22s0HWd.gif]


Sure, not so impressive for display on an HDTV, but suitable for off-TV play on the GamePad. The texture quality reduction due to less RAM is a non-issue as super high quality textures are wasted at sub-480p. The question is whether any developers will actually put in the effort to support off-TV play in this way instead of just downscaling from HD.

Note that PS4 titles that opt for 720p to push more effects would surely need to be significantly toned down on Wii U even at sub-480p resolutions.
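For anyone who wants to sanity-check that resolution range, here's a minimal sketch of the arithmetic, assuming the ~1.84 TFLOPS figure commonly cited for the PS4 GPU (an assumption on my part, not a number from this thread); the small mismatch with the 744x418/837x470 figures above just means slightly different inputs were plugged in:

```python
# Rough sketch of the scaling argument above: take the alleged 282-352 GFLOPS
# for Latte, divide by an assumed ~1843 GFLOPS for the PS4 GPU, and scale each
# axis of a 1080p frame by the square root of that ratio (so pixel count scales
# by the ratio itself). Illustrates the back-of-the-envelope method only, not
# real-world performance.

PS4_GFLOPS = 1843.0           # assumed figure, not confirmed in the thread
WIIU_GFLOPS = (282.0, 352.0)  # the alleged range discussed above
BASE_W, BASE_H = 1920, 1080

for wiiu in WIIU_GFLOPS:
    ratio = wiiu / PS4_GFLOPS        # fraction of the PS4's pixel-pushing power
    scale = ratio ** 0.5             # per-axis scale so that area scales by `ratio`
    w, h = round(BASE_W * scale), round(BASE_H * scale)
    print(f"{wiiu:.0f} GFLOPS -> ~{w}x{h} ({ratio:.0%} of the pixel budget)")
```

With those inputs it lands around 751x422 to 839x472, the same ballpark as the figures quoted above.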
 
Regarding the alleged 282-352 GFLOPS for the Wii U GPU, this implies that the PS4 GPU is 5.2 to 6.5 times as powerful. I know comparing FLOPS doesn't tell the whole story, but let's humor this comparison for a second. The other way to look at it is that the Wii U GPU has about 15-19% of the pixel-crunching power of the PS4 GPU. So, a 1080p title on PS4 could potentially be rendered more or less equivalently on Wii U at around 744x418 to 837x470, which is a tolerable range for upscaling to 480p:

[image: 22s0HWd.gif]


Sure, not so impressive for display on an HDTV, but suitable for off-TV play on the GamePad. The texture quality reduction due to less RAM is a non-issue as super high quality textures are wasted at sub-480p. The question is whether any developers will actually put in the effort to support off-TV play in this way instead of just downscaling from HD.

Note that PS4 titles that opt for 720p to push more effects would surely need to be significantly toned down on Wii U even at sub-480p resolutions.
Yeah... I've said from the get-go that Wii U owners should hope 1080p60 is the standard on the other next gen consoles. Definitely makes downports seem more viable.
 
Well, on the positive side of that, it was done in a few weeks and on much weaker dev kits than the final ones.

True, I suppose.

Don't 3D Zelda games intentionally run at a low frame rate?

Yeah...but considering it was in a closed area, at 720p with no AA, I thought they could have achieved 60 fps minimum. The old GC 2000 tech demo (with Link fighting Ganon) was 60, even though TP ended up being 30.
 
True, I suppose.



Yeah...but considering it was in a closed area, at 720p with no AA, I thought they could have achieved 60 fps minimum. The old GC 2000 tech demo (with Link fighting Ganon) was 60, even though TP ended up being 30.

Given that Zelda games run at 30fps, I do not think they needed to go the extra mile this time around, since it would have been brought back down anyway. I am expecting them to wring the Wii U's neck with the actual game.

It's a shame that I have never owned a Nintendo system, never played a Zelda game, and have no intention of buying a Wii U. That said, I hope the owners get to enjoy another fine addition to a fine franchise, befitting the technical prowess of the console.
 
I know what he's saying. I think 2-3 years from now, so at the very least with 2nd-gen games, Joe Schmoe will definitely be able to tell the difference between PS4 and PS3 games. The Wii U is much closer to PS3/360 than the next gen consoles, so they'll end up seeing the difference there too.

People that think Wii U falls directly in the middle of the current gen consoles and the next gen ones are sorely mistaken. Wii U is 2x 360, whereas Durango is about 8x and PS4 is 10x. It isn't all about flops either.

I agree Joe Schmoe won't be able to tell the difference between Durango and PS4. Except I do think gamers will be able to tell the difference with games running at much higher resolutions and framerates.

And what are you basing that 2x multiplier on? The 360 has 512MB of RAM compared to 2GB, 10MB of eDRAM for the GPU on a daughter die compared to 32MB of eDRAM on-die, and 1MB of CPU cache compared to 3MB. The 360's CPU is not only in-order but also has to handle sound and I/O, compared to an out-of-order CPU with a DSP and an ARM processor to handle sound and I/O. And the 360 has a GPU with a DX9-equivalent feature set, compared to the Wii U's GPU having a DX11-equivalent feature set and RAM with considerably less latency.

And that's before we take into account that we don't have a Scooby Doo what half the silicon in the GPU does lol. My money would be on some sort of evolution of the TEV unit which will allow developers the use of 'free' tessellation, HDR and depth of field, with one important difference - it'll be easier for developers to implement these functions in engines, i.e. not saddling the Wii U with the nonstandard rendering pipeline that the Wii suffered from.

Nintendo must have known that both Sony and Microsoft were going to go for powerful machines and I'm pretty sure they would have added some sort of fixed functions/secret sauce/call it whatever you want lol to keep the console available for multiplatform titles a few years down the line.

People seem to be convinced that Nintendo haven't planned the Wii U's life to last for more than a couple of years as far as third parties go, which is absurd tbh.
 
And that's before we take into account that we don't have a Scooby Doo what half the silicon in the GPU does lol. My money would be on some sort of evolution of the TEV unit which will allow developers the use of 'free' tessellation, HDR and depth of field, with one important difference - it'll be easier for developers to implement these functions in engines, i.e. not saddling the Wii U with the nonstandard rendering pipeline that the Wii suffered from.

A lot of people seem to think this. Is there any visual evidence in any titles, released or unreleased, or any developer quotes indicating that something like this may be present? Custom silicon allowing cheap and programmable versions of these features is not something that strikes me as easy or insignificant to implement.
 
And that's before we take into account that we don't have a Scooby Doo what half the silicon in the GPU does lol. My money would be on some sort of evolution of the TEV unit which will allow developers the use of 'free' tessellation, HDR and depth of field, with one important difference - it'll be easier for developers to implement these functions in engines, i.e. not saddling the Wii U with the nonstandard rendering pipeline that the Wii suffered from.

Nintendo must have known that both Sony and Microsoft were going to go for powerful machines and I'm pretty sure they would have added some sort of fixed functions/secret sauce/call it whatever you want lol to keep the console available for multiplatform titles a few years down the line.

"Free" in quotes is correct. Nothing is free. The notion that fixed function is superior or even desirable is misguided IMO. That's exactly what development is moving away from. If turning on tesselation, HDR, and depth of field effects are free, what about opposite case? If my game doesn't need those effects, turning them off gains me no additional performance? That silicon resource is then sitting there useless.

I just don't see evidence that Nintendo paid AMD enough to design something truly novel. Latte is almost definitely based closely on an existing AMD Radeon part, with some bits added to optimize for backwards compatibility and to interface with the various eDRAM pools.
 
Nintendo should put that shit on the store, along with the Zelda tech demo; sure, some fiddling might be needed since it was running on unfinished hardware, but I'll be damned.

You know, hold the fort and serve as an example of what the console can do on-spec.

I did suggest such a thing in the Zelda Miiverse community but my post was mostly ignored...got a few yeahs and a couple of comments but it fizzled out pretty quickly unfortunately.
 
I'm pretty sure it was as well. There have been many changes made since the prototype days - like how the original Wii U GamePad had sliders instead of analog sticks, and how they still used the Wii Classic Controller Pro back then.

Also, I wonder why no one uses the garden tech demo for analysis. http://www.youtube.com/watch?v=6OHUwDShrD4 I notice that the water in this looks just like the water in Monolith's X.

Not really; the water there has accurate reflections of objects in the scene, while X seems to be using fake cubemaps.
 
A lot of people seem to think this. Is there any visual evidence in any titles, released or unreleased, or any developer quotes indicating that something like this may be present? Custom silicon allowing cheap and programmable versions of these features is not something that strikes me as easy or insignificant to implement.

I'm pretty sure I remember reading (maybe in one of the many Wii U Speculation threads) that Nintendo were working with either Crytek or Epic or both at some point...unless I'm remembering incorrectly..? If they are using some form of fixed functions, it would make sense that it won't mess up the rendering pipeline the way they did with the Wii, which made ports so much of a pain in the bum that developers like Treyarch decided to develop an entirely separate SKU for the Wii instead.

I just find it difficult to believe that Nintendo aren't doing something to make ports to and from the Wii U possible three years from now and onwards. I'd say that they're going for another 5-6 year lifespan for the Wii U, and it would be daft to have a console that will only have third party support for two or three years. It makes absolutely no business sense at all, particularly when they said right at the beginning of the Wii U reveal that they were aiming to have a console that's going to attract the 'hardcore' gamer.

We already know that Nintendo are using some sort of texture compression which would help to negate the RAM difference a little...wouldn't surprise me if the compression algorithm was either the same as or an evolution of the compression algorithm used to compress WiiWare games into just 40MB. Some of those games were insane for their compressed size! :Oo
 
I'm pretty sure I remember reading (maybe in one of the many Wii U Speculation threads) that Nintendo were working with either Crytek or Epic or both at some point...unless I'm remembering incorrectly..? If they are using some form of fixed functions, it would make sense that it won't mess up the rendering pipeline the way they did with the Wii, which made ports so much of a pain in the bum that developers like Treyarch decided to develop an entirely separate SKU for the Wii instead.

I'm sure Nintendo would have worked with companies like Epic and Crytek to get their engines working well on the system, and by all accounts they succeeded in that. However, neither of those companies is likely to have recommended the use of custom fixed shading units, especially not for ensuring performance into the future, where more SPUs would be more effective and flexible. The fixed function thing seems to have gotten some traction because 1) it's what Nintendo have done in the past 2) Pikmin and 101 had a similar looking DoF and 3) first party Nintendo titles have shown off some pretty lighting.

1) would be understandable if Nintendo had mentioned anything about wanting to bridge the gap between predictable performance and programmability, but instead everything we've heard is mostly about the learning curve their staff have been on.

2) was discussed by others and me in this thread. The DoF isn't even the same in W101 and Pikmin, and nothing either of those titles is doing should be beyond the Wii U's ordinary capabilities.

3) seems to me to be a combination of Nintendo prioritising the use of GPU power in their titles on improved rendering and lighting specifically, as opposed to (more expensive) asset improvements, and fans noticing these improvements more easily.

At this stage, there's no real evidence of any fixed function units on the die (beyond say an R700 tessellator or the like), discussion of any by developers, or even evidence of it in games, released or not. It's true we don't know the exact purpose of large chunks of the GPU silicon, but the people examining it who have an idea of what they're looking at aren't the ones suggesting it could be fixed function, at least not in the sense of the TEV or the 3DS' GPU. I could be wrong, but until a semi-solid case is put forward I think it's best to let this notion go.
 
And what are you basing that 2x multiplier on? The 360 has 512MB of RAM compared to 2GB, 10MB of eDRAM for the GPU on a daughter die compared to 32MB of eDRAM on-die, and 1MB of CPU cache compared to 3MB. The 360's CPU is not only in-order but also has to handle sound and I/O, compared to an out-of-order CPU with a DSP and an ARM processor to handle sound and I/O. And the 360 has a GPU with a DX9-equivalent feature set, compared to the Wii U's GPU having a DX11-equivalent feature set and RAM with considerably less latency.

And that's before we take into account that we don't have a Scooby Doo what half the silicon in the GPU does lol. My money would be on some sort of evolution of the TEV unit which will allow developers the use of 'free' tessellation, HDR and depth of field, with one important difference - it'll be easier for developers to implement these functions in engines, i.e. not saddling the Wii U with the nonstandard rendering pipeline that the Wii suffered from.

Nintendo must have known that both Sony and Microsoft were going to go for powerful machines and I'm pretty sure they would have added some sort of fixed functions/secret sauce/call it whatever you want lol to keep the console available for multiplatform titles a few years down the line.

People seem to be convinced that Nintendo haven't planned the Wii U's life to last for more than a couple of years as far as third parties go, which is absurd tbh.

It has 1GB of RAM available for games, not 2GB, half the bandwidth, a weaker CPU, and it has to render to a second screen. FLOP-wise it's 1.5x the 360 GPU (so we think; it could be almost equal). I think 2x is a pretty fair real-world performance assessment. It could be 2.5x if a lot of this stuff in the GPU that we don't know about ends up making it much more efficient than we think, or adds a lot of fixed-function hardware.

That doesn't change my point though. At 2.5x or even 3x the 360 - which it isn't - it still would be waaaay closer to 360/PS3 than to the next gen consoles.
 
It has 1GB of RAM available for games, not 2GB, half the bandwidth, a weaker CPU, and it has to render to a second screen. FLOP-wise it's 1.5x the 360 GPU (so we think; it could be almost equal). I think 2x is a pretty fair real-world performance assessment. It could be 2.5x if a lot of this stuff in the GPU that we don't know about ends up making it much more efficient than we think, or adds a lot of fixed-function hardware.

That doesn't change my point though. At 2.5x or even 3x the 360 - which it isn't - it still would be waaaay closer to 360/PS3 than to the next gen consoles.
It has 2GB of RAM, and the 1GB available for games can be expanded in the future. Weren't both the PS3 and the Xbox 360 using a lot more RAM for OS functions when they launched in 2005/2006?
Then the same could happen to the Wii U.


It doesn't have "half the bandwidth", because it has a much bigger pool of eDRAM (32+3MB compared to 10MB) that can be used in ways the Xbox 360's eDRAM couldn't (in fact, MEM1 is the 32MB of eDRAM, while the Xbox 360's GPU couldn't even read from its eDRAM daughter die, only write to it). You can't compare such different designs without considering that.

The CPU on the Wii U is DIFFERENT, not weaker. It destroys both the Xbox 360 and PS3 CPUs when it comes to general processing capacity. When it comes to SIMD it's weaker, but all of this reflects a deliberate design: the CPU on the Xbox 360 was used to perform a lot of tasks normally done on the GPU, and that's not the case on the Wii U.

Only if the developers choose to do that. If they want to use the second screen as a simple menu, then the cost of rendering that is about the same as it would be to render it on the TV screen.
 
It has 2GB of RAM, and the 1GB available for games can be expanded in the future. Weren't both the PS3 and the Xbox 360 using a lot more RAM for OS functions when they launched in 2005/2006?
Then the same could happen to the Wii U.


It doesn't have "half the bandwidth", because it has a much bigger pool of eDRAM (32+3MB compared to 10MB) that can be used in ways the Xbox 360's eDRAM couldn't (in fact, MEM1 is the 32MB of eDRAM, while the Xbox 360's GPU couldn't even read from its eDRAM daughter die, only write to it). You can't compare such different designs without considering that.

The CPU on the Wii U is DIFFERENT, not weaker. It destroys both the Xbox 360 and PS3 CPUs when it comes to general processing capacity. When it comes to SIMD it's weaker, but all of this reflects a deliberate design: the CPU on the Xbox 360 was used to perform a lot of tasks normally done on the GPU, and that's not the case on the Wii U.

Only if the developers choose to do that. If they want to use the second screen as a simple menu, then the cost of rendering that is about the same as it would be to render it on the TV screen.

What are you trying to say ultimately here? Do you believe the Wii U is closer to the PS4/Durango tech than to the Xbox 360?
 
What are you trying to say ultimately here? Do you believe the Wii U is closer to the PS4/Durango tech than to the Xbox 360?

I GUESS that the Wii U is able to set itself way above PS360 visually, but of course it won't reach PS4/720. I see it as right smack in the middle between PS360 and PS720.

Remember... I said I GUESS, not that it will!

Considering the "Wii U punches above its weight" comment from Criterion, I believe that this could be realistic.

This E3 (3D Mario U) will be very telling.
 
It has 2GB of RAM, and the 1GB available for games can be expanded in the future. Weren't both the PS3 and the Xbox 360 using a lot more RAM for OS functions when they launched in 2005/2006?
Then the same could happen to the Wii U.

Holy bold and italics, Batman - they start to lose their meaning if every word in three paragraphs uses them, you know :P

But about the argument that the Wii U OS could be shrunk down, it doesn't sway me much, because that same hypothetical could be applied to the PS4 and the Durango. If the Wii U OS is cut down by half (while somehow managing to fulfill their promises of a speed-up at the same time - that's tough, since less in RAM means less prefetching), they gain 512MB usable by games. If the rumored Durango OS is cut by half, it gains 1.5GB. There's more to gain for the operating systems that use more to start with, and the overall RAM numbers would still be far in favor of the PS4/Durango, possibly even more so than before if they all shrink down.

I GUESS that the Wii U is able to set itself way above PS360 visually, but of course it won't reach PS4/720. I see it as right smack in the middle between PS360 and PS720.

Smack in the middle would be what, 4.5x-ish the 360/PS3 (~200 GFLOP GPUs vs. 1.8 TF, giving 9x, half being 4.5x)? If that were the case, would we be seeing so many 720p games, even with the added resolution for the controller? Right now usable RAM is only double the 360's, the upper end for the GPU (ignoring the theoretical fixed-function parts) is about 1.5x, etc. The CPU may be far more efficient than Xenon, but halfway to an 8-core Jaguar at 1.33x the clock speed, with only 3 cores itself? Doubtful.

With that much more power under the hood, even launch games should have details like AA cranked up if they're at 720p, but they don't. The PS4 launch games, meanwhile, look markedly better than last gen.
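To make that back-of-the-envelope explicit, here's a minimal sketch using the figures above (~200 GFLOPS for 360/PS3-class GPUs, 1.8 TFLOPS for PS4, 2x RAM and ~1.5x GPU for the Wii U); these are this post's assumptions, not measurements:

```python
# "Smack in the middle" arithmetic from the post above, using its own numbers.

LAST_GEN_GFLOPS = 200.0   # rough figure for Xenos/RSX-class GPUs
PS4_GFLOPS = 1800.0       # rough figure for the PS4 GPU

full_gap = PS4_GFLOPS / LAST_GEN_GFLOPS   # ~9x last gen
halfway = full_gap / 2.0                  # ~4.5x, the "middle" being argued against

# Multipliers actually attributed to the Wii U above:
wiiu_ram = 2.0   # ~1GB usable vs ~512MB
wiiu_gpu = 1.5   # upper-end GFLOPS estimate vs Xenos

print(f"'Middle' would need roughly {halfway:.1f}x last gen;")
print(f"the known Wii U multipliers sit around {wiiu_gpu}x-{wiiu_ram}x.")
```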
 
Holy bold and italics, Batman - they start to lose their meaning if every word in three paragraphs uses them, you know :P

But about the argument that the Wii U OS could be shrunk down, it doesn't sway me much, because that same hypothetical could be applied to the PS4 and the Durango. If the Wii U OS is cut down by half (while somehow managing to fulfill their promises of a speed-up at the same time - that's tough, since less in RAM means less prefetching), they gain 512MB usable by games. If the rumored Durango OS is cut by half, it gains 1.5GB. There's more to gain for the operating systems that use more to start with, and the overall RAM numbers would still be far in favor of the PS4/Durango, possibly even more so than before if they all shrink down.

Honestly, having more can help more than hurt. Doesn't really matter what the other guys do.
 
Smack in the middle would be what, 4.5x-ish the 360/PS3 (~200 GFLOP GPUs vs. 1.8 TF, giving 9x, half being 4.5x)? If that were the case, would we be seeing so many 720p games, even with the added resolution for the controller? Right now usable RAM is only double the 360's, the upper end for the GPU (ignoring the theoretical fixed-function parts) is about 1.5x, etc. The CPU may be far more efficient than Xenon, but halfway to an 8-core Jaguar at 1.33x the clock speed, with only 3 cores itself? Doubtful.

I stopped reading there.

X multipliers are bullshit. Even when we used them in the WUSTs we got told by, erm... Nintendo "disagreers" and by knowledgeable spec guys on GAF that hardware doesn't work that way.

Edit: It's 320 GFLOPS, btw.
 
I stopped reading there.

So what does "smack in the middle between PS360 and PS720" mean, in any meaningful measure? Where did you come up with that, then?

and by knowledgeable spec guys on GAF that hardware doesn't work that way.

No kidding. But when comparing different ballparks they generally work. They aren't an exact measure of performance, but if you look at the teraflop rating of any graphics card you can generally tell where it will fall on the charts.
 
From my view, if the Wii U gets optimized versions of Unreal Engine 4 and the other engines that are working on PS4 & 720, the only things that will be different are physics, texture resolution, rendering resolution and anti-aliasing.

Everything will be scaled down for the Wii U and will be of good to medium quality.

Don't forget also that the Wii U supports a DX11-equivalent feature set, and textures will be acceptable at 720p.
 
So what does "smack in the middle between PS360 and PS720" mean, in any meaningful measure? Where did you come up with that, then?

It's a very rough GUESS/estimation, as I said in my post!

Nothing more, nothing less. But you try to sell your "numbers and multipliers" as facts, and that's just not possible.

1. We know final dev kits came very late! Criterion said themselves they got the final kits in NOVEMBER(!) last year.

2. The SDK/documentation was lacking until after launch.

3. Example from the past: going by the spec sheet alone, the Xbox would have destroyed the GameCube, but the GameCube could hold its own in visuals. The Xbox had 21 GFLOPS compared to the GameCube's 8. So using numbers and multipliers gets you nowhere.
 
So what does "smack in the middle between PS360 and PS720" mean, in any meaningful measure? Where did you come up with that, then?



No kidding. But when comparing different ballparks they generally work. They aren't an exact measure of performance, but if you look at the teraflop rating of any graphics card you can generally tell where it will fall on the charts.

So in your scenario the PlayStation 3 is on par with Durango and PS4 in raw calculating power? According to Sony the PS3 is at 1.8 TFLOPS.

http://playstation.about.com/od/ps3/a/PS3SpecsDetails_3.htm

Also, you're stating rumors as facts about the processing power of the Wii U's custom die, when if those were true, games like Mass Effect 3 would not be possible on the console, much less the better version, developed by an outsourced studio.

I am not trying to be a smartass, but please explain to me how you deduct your conclusions, and according to what facts exactly?

edit: No, the PS3 is better; as the source states at the above link, the GPU is at 1.8 TFLOPS and the full system is at 2 TFLOPS in general. Sony are masters of numbers, I presume.
 
So in your scenario the PlayStation 3 is on par with Durango and PS4 in raw calculating power? According to Sony the PS3 is at 1.8 TFLOPS.

http://playstation.about.com/od/ps3/a/PS3SpecsDetails_3.htm

Also, you're stating rumors as facts about the processing power of the Wii U's custom die, when if those were true, games like Mass Effect 3 would not be possible on the console, much less the better version, developed by an outsourced studio.

I am not trying to be a smartass, but please explain to me how you deduct your conclusions, and according to what facts exactly?


First off, the PS3's "System Floating Point Performance: 2 TFLOPS" figure was completely bonkers and arbitrary; that's not even a GPU rating. The RSX is more likely to put out 200-250 real-world GFLOPS. Sony has a history of saying crazy things like that.

What conclusion did I, er, deduct (that word makes me want to wear a top hat and monocle)? I was just trying to make heads or tails of what the other guy said; I figured "smack in the middle" was a more literal statement, and I was just saying how that seemed dead off from what we know so far. More powerful than the PS360? Sure. Smack in the middle of the PS360 and the PS4/Durango? Doesn't seem like it.
 
First off, the PS3's "System Floating Point Performance: 2 TFLOPS" figure was completely bonkers and arbitrary; that's not even a GPU rating. The RSX is more likely to put out 200-250 real-world GFLOPS. Sony has a history of saying crazy things like that.

What conclusion did I, er, deduct (that word makes me want to wear a top hat and monocle)? I was just trying to make heads or tails of what the other guy said; I figured "smack in the middle" was a more literal statement, and I was just saying how that seemed dead off from what we know so far. More powerful than the PS360? Sure. Smack in the middle of the PS360 and the PS4/Durango? Doesn't seem like it.

The other guy also said it's a rough guess/estimation. Don't underestimate Nintendo's crazy level of hardware optimisation.
 
The other guy also said it's a rough guess/estimation. Don't underestimate Nintendo's crazy level of hardware optimisation.

Yes you did, sorry then. Misunderstanding.

"In the middle" I would most certainly believe, just not dead in the center (and certainly not leaning more towards the PS4)
 
Yes you did, sorry then. Misunderstanding.

"In the middle" I would most certainly believe, just not dead in the center (and certainly not leaning more towards the PS4)

Nintendo's AAA games this E3 should be a great indicator of what the platform is capable of, though... although they come a year too late...
 
First off, the PS3's "System Floating Point Performance: 2 TFLOPS" figure was completely bonkers and arbitrary; that's not even a GPU rating. The RSX is more likely to put out 200-250 real-world GFLOPS. Sony has a history of saying crazy things like that.

So what makes you so sure the on-paper processing power of the Orbisrango is real and not just theoretical performance? It has been done before: cooking the numbers the way they want to present the system to the tech-savvy crowd and generate buzz. We saw some tech demos which were far off the IQ and geometry of the real-time demo of Killzone: Shadow Fall. In the case of the Wii U we have not seen ANYTHING except rushed jobs and dirty ports on the system. If you want to see whether the Wii U can stay on par with the Orbisrango, wait to see GAMES that are made for the system by the very best teams of Nintendo's studios.

What conclusion did I, er, deduct (that word makes me want to wear a top hat and monocle)?

It would match the level at which I make conversation, dear sir, but your smirking comments only add to the misinformation you are spreading. It is only elementary, going by the "facts" you are stating.
 
Finally found the reason for the lack of the multicore ARM: That chip is only found in devkits and is part of the bridge between the Wii U hardware and the host.
 
And what are you basing that 2x multiplier on? The 360 has 512MB of RAM compared to 2GB, 10MB of eDRAM for the GPU on a daughter die compared to 32MB of eDRAM on-die, and 1MB of CPU cache compared to 3MB. The 360's CPU is not only in-order but also has to handle sound and I/O, compared to an out-of-order CPU with a DSP and an ARM processor to handle sound and I/O. And the 360 has a GPU with a DX9-equivalent feature set, compared to the Wii U's GPU having a DX11-equivalent feature set and RAM with considerably less latency.

And that's before we take into account that we don't have a Scooby Doo what half the silicon in the GPU does lol. My money would be on some sort of evolution of the TEV unit which will allow developers the use of 'free' tessellation, HDR and depth of field, with one important difference - it'll be easier for developers to implement these functions in engines, i.e. not saddling the Wii U with the nonstandard rendering pipeline that the Wii suffered from.

Nintendo must have known that both Sony and Microsoft were going to go for powerful machines and I'm pretty sure they would have added some sort of fixed functions/secret sauce/call it whatever you want lol to keep the console available for multiplatform titles a few years down the line.

People seem to be convinced that Nintendo haven't planned the Wii U's life to last for more than a couple of years as far as third parties go, which is absurd tbh.

They have (there's also the point that not even they can defeat the laws of physics in terms of performance per watt, but let's not bother ourselves with that now), but they planned it around a scenario which IMHO assumed they'd do much better with their one-year head start than they have been doing, and that they'd get developers on board in such a way that the Wii U would be the lead platform for multiplatform games, with the Durango and Orbis versions essentially being ports of the Wii U version. That scenario would have opened a window for first-party developers to really show what those platforms could do, as they would not be bound by catering to the Wii U as the lowest common denominator/baseline performance.
 
Holy bold and italics, Batman - they start to lose their meaning if every word in three paragraphs uses them, you know :P

But about the argument that the Wii U OS could be shrunk down, it doesn't sway me much, because that same hypothetical could be applied to the PS4 and the Durango. If the Wii U OS is cut down by half (while somehow managing to fulfill their promises of a speed-up at the same time - that's tough, since less in RAM means less prefetching), they gain 512MB usable by games. If the rumored Durango OS is cut by half, it gains 1.5GB. There's more to gain for the operating systems that use more to start with, and the overall RAM numbers would still be far in favor of the PS4/Durango, possibly even more so than before if they all shrink down.

Well, the discussion was about the Wii U and 360/PS3, not XBox3 or PS4. But as far as the difference between the Wii U's and XBox3's RAM amounts goes, before and after that potential change: 5GB is five times 1GB, while 6.5GB is about 4.33x 1.5GB.

Also, on the optimisation/speed boost: they could be doing a lot of that by improving the code/removing redundant code, which would also decrease RAM requirements. Just a suggestion.
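To put numbers on that, here's a tiny sketch of the ratio arithmetic, assuming the rumored 8GB total / 3GB OS reservation for Durango and the 2GB / 1GB split for the Wii U discussed above (both reservations are rumors, not confirmed figures):

```python
# Game-usable RAM ratios before and after the hypothetical OS shrink.

def games_ram(total_gb, os_gb):
    """RAM left over for games after the OS reservation."""
    return total_gb - os_gb

now = games_ram(8, 3) / games_ram(2, 1)                 # 5GB vs 1GB     -> 5.0x
after_halving = games_ram(8, 1.5) / games_ram(2, 0.5)   # 6.5GB vs 1.5GB -> ~4.33x

print(f"Today: {now:.2f}x more game RAM on Durango")
print(f"If both OS footprints halve: {after_halving:.2f}x")
```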
 
And what are you basing that 2x multiplier on? The 360 has 512MB of RAM compared to 2GB, 10MB of eDRAM for the GPU on a daughter die compared to 32MB of eDRAM on-die, and 1MB of CPU cache compared to 3MB. The 360's CPU is not only in-order but also has to handle sound and I/O, compared to an out-of-order CPU with a DSP and an ARM processor to handle sound and I/O. And the 360 has a GPU with a DX9-equivalent feature set, compared to the Wii U's GPU having a DX11-equivalent feature set and RAM with considerably less latency.

And that's before we take into account that we don't have a Scooby Doo what half the silicon in the GPU does lol. My money would be on some sort of evolution of the TEV unit which will allow developers the use of 'free' tessellation, HDR and depth of field, with one important difference - it'll be easier for developers to implement these functions in engines, i.e. not saddling the Wii U with the nonstandard rendering pipeline that the Wii suffered from.

Nintendo must have known that both Sony and Microsoft were going to go for powerful machines and I'm pretty sure they would have added some sort of fixed functions/secret sauce/call it whatever you want lol to keep the console available for multiplatform titles a few years down the line.

People seem to be convinced that Nintendo haven't planned the Wii U's life to last for more than a couple of years as far as third parties go, which is absurd tbh.


You hit the nail right on the head. I think people are forgetting that the Wii U is handling the PC port of Need for Speed: Most Wanted in the launch window. That shows that, at a base level, the Wii U is easily able to handle PC ports of current-generation games. Monster Hunter 3 Ultimate and Pikmin 3, coming to the console in Q2 of this year, are nothing more than HD versions of SD Wii games. So once again, Nintendo hasn't shown a title developed by Nintendo for the Wii U from the ground up.

I think at this year's E3 '13, when Sony and Microsoft are showing all the next generation games coming to their respective systems, Nintendo will show some unannounced first-party games that start to show what's really under the hood of the Wii U. It will be very interesting to see at E3 '13 what Nintendo's Super Smash Bros. game looks like on the Wii U, as well as the new 3D Mario and Mario Kart games. I think this will actually give a good basis for what we can expect in terms of power on the Wii U.
 
What are you trying to say ultimately here? Do you believe the Wii U is closer to the PS4/Durango tech than to the Xbox 360?
I believe that we can't actually know that for sure, since we also don't know how capable both Durango and PS4 are.
Even when we know in detail which GPU/CPU/RAM they will have, I think that what can be done with that hardware in a focused, closed environment is still unknown to us as users, and the same applies to the Wii U.

Having a usable tessellator unit (surely inherited from the 2007 design, but customized and improved up to 2011), new compression algorithms, updated TEV-like features (we don't know to what extent, but we do know they are there, since Iwata himself confirmed that GPU emulation was done thanks to the fact that the Wii U used those parts, enhanced and optimized for the stronger hardware), 32+3MB of eDRAM that avoids tiling and can be accessed like normal RAM, plus the benefits of being on-die instead of in a separate chip, the MCM design (that plus the 35MB of eDRAM could open the door to some crazy CPU+GPU algorithms, like the "light scattering effect" that Factor 5 implemented on the GC in Star Wars: Rebel Strike, but obviously at a much greater level)...

I think that all the consoles of this generation will show greater results than most of us expect. I mean, there are even people saying that Crysis 3 at 1080p is what they expect from the PS4, and I think that's a pretty low estimation of what can be done with that hardware.
And the same goes for those expecting the Wii U to be only a 360+, like how the Wii was a GC+.
 
Finally found the reason for the lack of the multicore ARM: That chip is only found in devkits and is part of the bridge between the Wii U hardware and the host.

So there's no multicore ARM in the retail version then? Is the OS running on "Espresso"?
 
I believe that we can't actually know that for sure, since we also don't know how capable both Durango and PS4 are.
Even when we know in detail which GPU/CPU/RAM they will have, I think that what can be done with that hardware in a focused, closed environment is still unknown to us as users, and the same applies to the Wii U.

Having a usable tessellator unit (surely inherited from the 2007 design, but customized and improved up to 2011), new compression algorithms, updated TEV-like features (we don't know to what extent, but we do know they are there, since Iwata himself confirmed that GPU emulation was done thanks to the fact that the Wii U used those parts, enhanced and optimized for the stronger hardware), 32+3MB of eDRAM that avoids tiling and can be accessed like normal RAM, plus the benefits of being on-die instead of in a separate chip, the MCM design (that plus the 35MB of eDRAM could open the door to some crazy CPU+GPU algorithms, like the "light scattering effect" that Factor 5 implemented on the GC in Star Wars: Rebel Strike, but obviously at a much greater level)...

I think that all the consoles of this generation will show greater results than most of us expect. I mean, there are even people saying that Crysis 3 at 1080p is what they expect from the PS4, and I think that's a pretty low estimation of what can be done with that hardware.
And the same goes for those expecting the Wii U to be only a 360+, like how the Wii was a GC+.

We'll see how it all goes. It's a long generation. The 33 watts or whatever it is during gameplay will be a massive hindrance to Wii U performance. I do not anticipate it shooting much above that. Though we don't know what the new HD twins will draw, I will be very surprised if it isn't far more than that. So far I haven't seen anything on Wii U that was "impressive", just little steps above the Xbox 360.
 
We'll see how it all goes. It's a long generation. The 33 watts or whatever it is during gameplay will be a massive hindrance to Wii U performance. I do not anticipate it shooting much above that. Though we don't know what the new HD twins will draw, I will be very surprised if it isn't far more than that. So far I haven't seen anything on Wii U that was "impressive", just little steps above the Xbox 360.
Considering that most Wii U games are Xbox 360 ports at best (with little budget behind them and time constraints), that final dev kits weren't ready until November 2012 (most if not all the games currently on sale were finished at least a whole month before that), and that it's a completely different architecture we are speaking of, the fact that we already have games at 360 level, or even a bit better in some cases, is proof of how capable this piece of hardware is.

I personally think that we will be surprised by Pikmin 3. I mean, the game was finished for launch, and since the past E3 it has shown graphical upgrades in every new trailer and screenshot released.
It wasn't shown at the January Nintendo Direct besides a few images that (again) showed an upgrade. Considering that this time Nintendo is going for the hardcore, the similarity of the Wii/Wii U architectures (enough to do hardware emulation), and that the console needs to prove what it's capable of, delaying it a few months in order to have both a solid product and an early technical showcase months before the competition releases its products is a plausible scenario. Of course, this is 100% speculation, but what matters is that this E3 will be an opportunity for Nintendo, and I doubt they will miss it.
 
If there even really is an "OS" in any traditional sense. At this point, I don't think there is.

You may be right; OSes are traditionally responsive. :P
 