WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Assuming the 360's lower end of FLOP output holds true, we are seeing AT LEAST 300 GFLOPS, and 4 times that is 1.2 TFLOPS... which the Wii U is not...
That's a (very) dumb way to look at it. Xbox 360 is 240 GFlops, full stop. CPU GFlops are not used for graphics, therefore they're just overhead for something else: AI, physics, calculations, you name it. Adding them up is simply dumb as fuck, as it's never that linear.

Sure, Wii U software might be looking to move some of that to GPGPU, for the sound processor and other dedicated parts to take part of the load off, or perhaps some of those these days belong on the GPU anyway, but *perhaps* most code will instead be optimized to run on the CPU in general purpose, who knows. As said, it's totally not linear; most GFlops on the GPU will in the end be used solely for graphics though, that's a given.
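For reference, a minimal sketch of where the 240 GFlops figure for Xenos usually comes from (assuming the commonly quoted configuration of 48 unified shader ALUs, each co-issuing a vec4 + scalar multiply-add per clock at 500 MHz):

[code]
# Back-of-the-envelope Xenos (X360 GPU) shader FLOPS estimate.
# Assumed figures: 48 unified shader ALUs, each co-issuing a 4-wide vector
# plus a scalar multiply-add (2 FLOPs per lane) every clock, at 500 MHz.
alus = 48
lanes_per_alu = 4 + 1       # vec4 plus scalar
flops_per_lane = 2          # a multiply-add counts as 2 FLOPs
clock_hz = 500e6

gflops = alus * lanes_per_alu * flops_per_lane * clock_hz / 1e9
print(f"Xenos shader throughput: {gflops:.0f} GFLOPS")  # -> 240 GFLOPS
[/code]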


Also, most people here are forgetting something: if you want a 720p X360 game running "as is" at 1080p you need about 2.5 times the fillrate/floating point performance; that amounts to going from 240 GFlops to 600 GFlops. Resolution is the single thing whose cost hasn't been softened by accelerations, optimizations and the like; it simply costs that much, and it scales directly with the pixel count whilst doing the very same thing, merely pushing more pixels.

In Wii U's case, assuming it's a 372 GFlop part and comparable that way, then PS4 and X720 need to have 930 GFlops to do at 1080p whatever Wii U is doing at 720p. Of course they surpass that, but X720 is going to be 1.2 Teraflops, which is not a huge overhead provided it's doing 1080p (if it's doing 720p though, the difference is very palpable of course).
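As a sanity check on that arithmetic (note the raw 720p-to-1080p pixel ratio is 2.25x; the 2.5x figure above rounds that up a bit, which I'm treating as headroom rather than a hard number):

[code]
# Rough scaling of shader/fillrate cost with output resolution,
# assuming cost grows linearly with pixel count (a simplification).
def pixels(w, h):
    return w * h

ratio = pixels(1920, 1080) / pixels(1280, 720)
print(f"1080p / 720p pixel ratio: {ratio:.2f}x")   # 2.25x

for name, gf in [("X360 (240 GFlops)", 240), ("Wii U (372 GFlops, assumed)", 372)]:
    print(f"{name}: ~{gf * ratio:.0f} GFLOPS for the same output at 1080p "
          f"(~{gf * 2.5:.0f} GFLOPS with the 2.5x figure)")
[/code]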

I'm not commending Nintendo on the system's final specs, make no mistake; playing it low is a risk, it's certainly not doing them any favours, and I'm not saying it'll benefit from multiplatform across the whole generation. But I also believe that's probably due to ill developer intentions and bad preconceptions, as most of the devs dissing the system haven't bothered to take a dip in it; they probably looked at the specs and concluded they'd be bound by the same bottlenecks the current generation has, while we know Nintendo is all about removing them. Of course Nintendo is still to blame for putting themselves in the situation where support depends on developer goodwill; but it is nevertheless very doable. Unlike GC to Wii, they moved the GPU well past 2005/2006 tech (the current generation's starting point).
Anyway, is he right to think Wii U is closer to PS4/Durango than Wii was to PS360?

I think he is technically right, but I was wondering, does it matter? The difference still seems very large.
Read above.

As for the rest of the equation, it really depends on what the basis for next gen development becomes. See, if you do a game with tessellation geometry in mind, then the Wii U version and the PS4/X720 versions might have the same assets with varying LODs. That's very different from past generations, when downports had to be optimized and lower polygon geometry had to be created and optimized.

[Image: G8ea0FD.png]


The wireframe difference is palpable, yes, but with good texturing perhaps not so much; plus, provided it runs in a fluid manner across all systems it's inconsequential, just like running a game on a computer across various configurations, sans the lack of optimization typical of PC software.

Of course, most developers have no experience using this tech and would prefer to use it as a means to spam insane LODs over an already overdetailed 3D model, rather than making a game barebones on its knees so that it can run across all systems easily; but the latter is really the best way to use it, as the end result would be pretty much the same on the higher end system (using tessellation as a core technique rather than a plus). Also note, rendering tech and tools are becoming more and more scalable, so it only makes sense that games follow that tendency, if developers can get past their pride of "being focused on the higher end" and the "we look forward, not backward" bullshit line of thought. Coding close to the silicon of a high end machine (or any machine) can be considered nuts by now; instead code and assets must strive to be abstract enough.

Also, perhaps the Wii U is to X360 "the same Wii was to GC" by an order of magnitude; but you gotta consider the rubberbanding of the whole scenario. PS2 was 39 times less powerful than the X360 is, GC was 28 times less powerful (doing only GPU versus GPU math), and the Wii was 18 times less powerful than an X360 (not to mention lacking in feature parity). Wii U will be at most 5 times less powerful compared to PS4 (with X720 staying somewhere in the middle); it's not the same situation.
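A minimal sketch of that last ratio, using only the GPU figures being floated in this thread (the 372 GFlops Wii U estimate and the rumored 1.2/1.8 TFlops for X720/PS4 are assumptions, not confirmed specs):

[code]
# Rough "how many times less powerful" ratios, GPU FLOPS only.
# All figures are rumored/estimated numbers quoted in this thread.
wii_u = 372     # GFLOPS (estimate discussed above)
x720  = 1200    # GFLOPS (rumored 1.2 TFLOPS)
ps4   = 1800    # GFLOPS (rumored 1.8 TFLOPS)

print(f"PS4  / Wii U: {ps4 / wii_u:.1f}x")   # ~4.8x, i.e. "at most ~5x"
print(f"X720 / Wii U: {x720 / wii_u:.1f}x")  # ~3.2x
[/code]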

Somewhere, sometime, not supporting a platform because of such a power differential won't be justifiable anymore, be it because it's already powerful enough or because also selling said software on it is a worthwhile option to get the whole investment back. That time could be now, but developers often behave like juvenile indie-wannabe elitist clusterfucks, so the jury is out. One thing is for sure: the market probably won't grow this generation and development costs for PS4/X720 are supposed to increase, so making it a two console environment all over again might not be wise, with a huge chunk of games not providing an experience that couldn't be achieved on current gen. One could pull a lot more than what's being pulled out of current iOS/Android mobile devices, but it's not worth the investment; on a similar line of thinking, one could pull more out of current gen and will always be able to pull more out of the upcoming next generation systems, for they're very complex systems at this point; but is it worth it? The answer, paired with the industry risk, flops, losses and layoff shenanigans that are common now, is increasingly no.

What I'm saying, though, is that the Wii U having some features the past generation lacks (or lacks in a usable way) has the potential of single-handedly making it a whole different system when it comes to capabilities. Tessellation in theory makes it so that you could pull the best X360 closed-scenario graphics in an open world setting while keeping the 240 GFlop spec, because it can keep the further away geometry simple (keeping the end result the same), sparing you loads of polygons and coordinate tracking or complex geometry swaps. You can effectively do more with less if you tackle it, but at the same time it's not like going into the code and adding tessellation=yes.

No one knows how to use it for that effect still.
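Still, to make the "keep far-away geometry simple" idea a bit more concrete, here's a hypothetical sketch of the kind of distance-based factor a tessellation stage would evaluate per patch; the function name, thresholds and factors are made up purely for illustration, not how any shipping engine does it:

[code]
# Hypothetical distance-based tessellation factor: nearby patches get
# subdivided heavily, distant ones stay at base mesh density, so the
# same low-poly asset could ship on every platform and just scale.
def tess_factor(distance, near=5.0, far=100.0, max_factor=16, min_factor=1):
    # Clamp the distance into [near, far] and interpolate the factor linearly.
    t = min(max((distance - near) / (far - near), 0.0), 1.0)
    return round(max_factor + t * (min_factor - max_factor))

for d in (2, 10, 40, 150):
    print(f"patch at {d:>3} units -> tessellation factor {tess_factor(d)}")
[/code]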
 

so, can i simply put things like this?

wiiu (720p 30fps) = xbox 720 (1080p 30fps) = ps4 (1080p 60fps)
 
so, can i simply put things like this?

wiiu (720p 30fps) = xbox 720 (1080p 30fps) = ps4 (1080p 60fps)

The problem comes when the games start doing many more complex graphical effects and 720/PS4 run at 720p/30fps. Those games will not be able to run on Wii U. I'd expect this to start when the 2nd and 3rd gen games start coming out. It will be easy for Wii U to get ports when they're cross-gen games.
 
so, can i simply put things like this?

wiiu (720p 30fps) = xbox 720 (1080p 30fps) = ps4 (1080p 60fps)
From X720's rumored 1.2 Teraflops to PS4's 1.8 Teraflops there's not enough overhead for a 30 to 60 fps jump (which is often linear, along the lines of: if you can pull 500 polygons at 60 frames per second then you can pull 1000 at 30).
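A small sketch of why that jump is hard: 60 fps halves the per-frame budget, so at equal image quality you need roughly 2x the throughput, while the rumored FLOPS gap here is only 1.5x (both TFLOPS figures are the rumored ones quoted above):

[code]
# Per-frame time budget and the throughput needed for a 30 -> 60 fps jump,
# assuming cost scales linearly with frame rate (a simplification).
budget_30 = 1000 / 30   # ms per frame at 30 fps (~33.3 ms)
budget_60 = 1000 / 60   # ms per frame at 60 fps (~16.7 ms)
print(f"30 fps budget: {budget_30:.1f} ms, 60 fps budget: {budget_60:.1f} ms")
print(f"Throughput needed for 60 vs 30 fps: {budget_30 / budget_60:.1f}x")

x720, ps4 = 1.2, 1.8    # rumored TFLOPS
print(f"PS4 / X720 FLOPS ratio: {ps4 / x720:.2f}x (short of the ~2x needed)")
[/code]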

Multiplatform across those platforms will probably be very similar, with X720 serving as the lead platform. Sony probably expected to be less favoured by Western developers, so they chose to up the specs and remove possible development difficulties in order to ensure parity and perhaps a little edge; said edge will probably be used in the same way X360's edge was: slightly better framerate, not going as low on sub-1080p or sub-720p solutions (whatever the target is), or extra graphical effects and AA being applied.

The 30 to 60 fps difference in some games should only happen if X720 is somehow bandwidth starved at 1080p, like how this gen 720p @ 60 frames and 1080p altogether meant a huge undertaking, not just for the fillrate capabilities of the machines but also their memory bandwidth. Similarly, Wii U seems to be held to 720p first and foremost by its RAM (sure, it can pull 1080p on select titles, but its RAM configuration was clearly thought out for 720p).
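As a rough illustration of the RAM point, here are back-of-the-envelope framebuffer sizes for a plain 32-bit colour target plus a 32-bit depth buffer, no MSAA; real renderers vary a lot, so treat these strictly as ballpark numbers:

[code]
# Ballpark framebuffer memory: one 32-bit colour target plus a
# 32-bit depth/stencil buffer, no MSAA and no extra render targets.
def framebuffer_mb(width, height, bytes_per_pixel=4 + 4):
    return width * height * bytes_per_pixel / (1024 ** 2)

for name, w, h in [("720p", 1280, 720), ("1080p", 1920, 1080)]:
    print(f"{name}: ~{framebuffer_mb(w, h):.1f} MB "
          f"(against the Wii U's 32 MB of eDRAM)")
[/code]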

X720 is also using DDR3 (according to rumors), so despite having an eDRAM framebuffer, RAM might be a bottleneck against PS4's behemoth GDDR5 configuration. I don't expect that to be the case in a significant way.


Whether X720/PS4 will be 720p or 1080p consoles is unpredictable at this point. I'd guess that if Wii U earns its space in the multiplatform race before they release, then a 1080p standard is doable; otherwise most developers might see more advantage in going 720p for their titles, seeing as those consoles will be the lowest common denominator against the PC (and they're already gonna be lacking in performance against top PC parts at launch).
 
I think whether most games this gen come to Wii U is down to whether the pubs want to put them on Wii U.

From what we've seen of Thief I definitely think the Wii U could handle that.

The only thing is if Nintendo wants to do what they said at E3 2011 and do anything they can to get third party games on Wii U.

If the Wii U can handle the game, and the console starts selling better, then pubs wouldn't have an excuse not to make a Wii U version; they can't just say...

Pub: "We don't want to make a Wii U version because...It's Nintendo"

That would be beyond stupid and Nintendo would laugh in their face and pay up.
 
Read the forums and everyone's analysis of the architecture. VGLeaks also detailed how PS4 was enhanced for compute. It basically just means the system is much more efficient. Also, thuway and Reiko got ahold of an MS document that said Durango was aiming for "100% efficiency". It was in comparison to how inefficient Xenos was.

Here are the threads I'm referring to.

http://www.neogaf.com/forum/showthread.php?t=515882&highlight=

http://www.neogaf.com/forum/showthread.php?t=515746&highlight=

The entire concept of the GPU being 100% efficient is ridiculous. It's much more likely that the chart he saw had the new GPU as a baseline of efficiency vs Xenos. As in if XBox3 was 100% efficient then in comparison Xenos would be 60%. That in itself is questionable but it's the kind of thing you'd expect in a PR document. There's no way they're actually claiming it's 100% efficient though.
 
So the 1.8 and 1.2 TFlops inside PS4 and the 720 will never be met? Interesting if true. Makes you wonder why so many people put so much into these numbers.

I judge a console on what I see on screen, E3 is Nintendo's last chance to prove WiiU is anything other than a tiny step above PS360 for me.

It's very similar to horsepower in a car. Horsepower is never a direct measure of a car's performance, but it is an indicator of it. For instance, a 100 hp mass produced car can be safely dismissed as a performance vehicle. You need to start boasting at least 300 horsepower to be taken seriously. Even then, all vehicle manufacturers state the horsepower the engine generates, but rarely mention what the car is able to output to "the street" due to powertrain efficiency (in fact, it's usually companies which do aftermarket modifications that'll quote the horsepower put to the tires). Now, hp alone doesn't dictate performance. You also have to factor in suspension, tires, aerodynamics, the driver, etc.

This is all very similar to the teraflop argument. A 100 gigaflop GPU is no longer impressive. We talk of at least 1 teraflop (like one possibility entertained in the several Wii U threads leading up to its reveal). The bigger the number, the more confident we are in a machine's power, though this is not the sole indicator of performance. This is why we entertain the possibility of Wii U hitting way above the belt. Though conversely, the competition may actually be far more powerful than it. We'll have to wait and see.
 
The problem comes when the games start doing many more complex graphical effects and 720/PS4 run at 720p/30fps. Those games will not be able to run on Wii U. I'd expect this to start when the 2nd and 3rd gen games start coming out. It will be easy for Wii U to get ports when they're cross-gen games.
Not with that level of detail, no.

Then again, next gen consoles are not up to Epic's original Unreal Engine 4 vision either and yet we're seeing the compromise being made:

[Images: HHmJiiW.jpg, WOAvThn.jpg (Unreal Engine 4 Elemental demo, PC versus PS4 comparison shots)]


Apparently roughly half the particles and some missing effects. But scaling to this degree is not a difficult thing to do; feature parity is there.

Next gen bullet points go on and on about how everything is dynamic and calculated in realtime instead of prebaked, but as nice as that is, a lot of games can use prebaked lighting just the same and obtain the same results. Which is to say Uncharted, The Last of Us, Halo 4 and other games with prebaked lighting from this generation won't be humiliated all of a sudden; most of the time you won't notice the difference. The advantage is for developers, as pre-baking took extra time and resources before, whereas in any 3D editor where real-time framerate isn't the issue, applying the lights is only a few clicks away.

Meaning a lot of multiplatform titles might opt to pre-bake for the Wii U what they're doing in real time elsewhere, or perhaps later pre-bake even on PS4/X720 if they become that obsolete against their PC counterparts. Anyway, most things can be toned down or done in another, less taxing way and still look good enough.

Time, each system sales, conjecture, development tools/middlewares and developer goodwill will tell.
 
The entire concept of the GPU being 100% efficient is ridiculous. It's much more likely that the chart he saw had the new GPU as a baseline of efficiency vs Xenos. As in if XBox3 was 100% efficient then in comparison Xenos would be 60%. That in itself is questionable but it's the kind of thing you'd expect in a PR document. There's no way they're actually claiming it's 100% efficient though.

GCN is supposed to be "low flops" but high efficiency. The whole "100%" efficiency came from them as a part of this architecture. Of course it won't hit 100, but it might come super close.
 
Those pictures are interesting. PC version of that demo was running in 1920x540 (my estimate) resolution actually. Download HD file from Nvidia, and you'll see that it's half res (or about half res) vertically.

How do we know the PS4 version was running at 720p?

lostinblue, the advantage of not prebaking the lighting is obviously that you can have any lighting in the scene be dynamic, move around or anything. Also, from the sounds of it, UE4 GI lighting will have to be prebaked regardless of platform.
 
GCN is supposed to be "low flops" but high efficiency. The whole "100%" efficiency came from them as a part of this architecture. Of course it won't hit 100, but it might come super close.
It's not that linear; the more complex the hardware is, the harder it is to tax it 100%. Or rather, you can make a CPU go full load on crappy code, but that's not taking advantage of it, right?

GPUs are harder to tax as they have so many units/parts and subparts to them, and not all of them can hit 100% usage every time; it's like how framerate varies depending on what's on screen. Take this situation: you assigned hardware to do your stencil extrude map work, and for that purpose you assigned X overhead that you had; its usage, though, varies in magnitude depending on how many characters you have on screen or how big/complex they are, and if you don't optimize on a case by case basis then you're always leaving overhead going unused.

Trying to use all of it though is nuts, and yet that's the tip of the iceberg, even if you could tax it 100% in a very predictable manner you're still bound to inefficiencies such as having to wait cycles until the information you want to access next becomes available due to the RAM refreshing cycles or for any other reason.


GCN efficiency is also relative, like the decisions that led to VLIW4 for the top of the HD6xxx range previously (of which GCN is an evolution). Basically, PC APIs wouldn't take advantage of the 5-way nature of AMD's VLIW5 stream processors, so they were often only 80% effective in that environment, and said overhead was a waste of silicon space compared to the proposition of just having more stream processor units; and so they did.
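A tiny sketch of that 80% point: if shaders on average keep only 4 of the 5 VLIW5 slots busy, achieved throughput lands well under the peak figure (the example part below is an HD 5870-class configuration chosen purely for illustration, and the 4-of-5 occupancy is the figure cited above, not something I've measured):

[code]
# Effect of VLIW5 slot occupancy on achieved shader throughput.
# Illustrative part: 320 five-wide stream processors (1600 ALU lanes)
# at 850 MHz, roughly an HD 5870-class configuration.
lanes = 1600
clock_hz = 850e6
peak_gflops = lanes * 2 * clock_hz / 1e9       # multiply-add = 2 FLOPs/lane

avg_slots_used = 4.0                            # out of 5, i.e. ~80% effective
achieved = peak_gflops * (avg_slots_used / 5.0)
print(f"peak: {peak_gflops:.0f} GFLOPS, "
      f"achieved at 80% occupancy: {achieved:.0f} GFLOPS")
[/code]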

That's not an issue on closed systems/consoles. GCN is more modern, robust and meant for GPGPU performance; but the effectiveness percentage with tailor made code on a closed architecture should be the same. Efficiency per clock, though, might vary.
 
Those pictures are interesting. PC version of that demo was running in 1920x540 (my estimate) resolution actually. Download HD file from Nvidia, and you'll see that it's half res (or about half res) vertically.
Probably not.

That sounds like interlaced video; Epic wouldn't go for that on a PC demo, they'd go SLI (massively so) before resorting to non-square pixels, stretching or underscan on a PC.


I've seen contradicting reports; I found a source claiming it was running at 2560x1400, but I doubt that (a lot). I recall the Unreal Engine 3.9 Samaritan demo ran at 1920x1080 on PC, so it seems likely that Epic would run this demo shooting for that as well.
How do we know the PS4 version was running at 720p?
I believe that's what they said, or rather, not 1080p.
lostinblue, the advantage of not prebaking the lighting is obviously that you can have any lighting in the scene be dynamic, move around or anything. Also, from the sounds of it, UE4 GI lighting will have to be prebaked regardless of platform.
Well, "real-time dynamic global illumination" sure is in their spec sheet of promises and intentions for what they consider "next gen".

It's certainly doable on X720/PS4, albeit with caveats, like their 1 million "free" particles promise/intention.
 
Those pictures are interesting. PC version of that demo was running in 1920x540 (my estimate) resolution actually. Download HD file from Nvidia, and you'll see that it's half res (or about half res) vertically.

How do we know the PS4 version was running at 720p?

lostinblue, the advantage of not prebaking the lighting is obviously that you can have any lighting in the scene be dynamic, move around or anything. Also, from the sounds of it, UE4 GI lighting will have to be prebaked regardless of platform.

I remember reading somewhere that it was 1080p. In fact everything at the event was shown in 1080p.
 
I remember reading somewhere that it was 1080p. In fact everything at the event was shown in 1080p.
The PS4 thing? perhaps.

We should also note that Elemental was running on a single GTX680 on the PC. We also know that Elemental was running at 1080p and 60fps on the PC. Epic Games has not revealed the resolution and the framerate of the PS4 version, though we expect it to be at 1080p and 30fps.
Source: http://www.dsogaming.com/news/unrea...pc-comparison-between-elemental-pc-ps4-demos/

Meh, damn google, on some other forum they were all worked up because "it was 720p", who knows. Unless there are more recent news confirming just that.

EDIT: Edited the images above, they no longer talk about resolution, it's better that way.
 
Tim Sweeney said the demo was running in 1080p.

The features and effects in our new "Elemental" demo on PlayStation 4 are just the tip of the iceberg, showing dynamic lighting and shadowing, subsurface scattering and GPU-powered particle effects at full 1080p resolution.

http://ca.ign.com/articles/2013/02/25/developers-react-to-playstation-4

Also, the archway in the PS4 demo doesn't even have a texture on it. It's been speculated that the demo was quickly put together with very little optimization (the PC version likely being more optimized).
 
Tim Sweeney said the demo was running in 1080p.

http://ca.ign.com/articles/2013/02/25/developers-react-to-playstation-4

Also, the archway in the PS4 demo doesn't even have a texture on it. It's been speculated that the demo was quickly put together with very little optimization(PC version likely more optimized).
Edited said images in order to remove the resolution mention; that said, I wouldn't want to focus on PS4 from here on out. That popped up as a sidenote of sorts, seeing as effect scalability was being discussed and will happen all across the board.

As for the demo, sure; I'm not even gonna touch the texture situation, as I guess the memory architecture being different (and them expecting 4 GB of RAM, perhaps) is the culprit. But particles and other effects are pretty straightforward and a matter of available overhead. This console is also pretty much an x86 PC with an AMD/ATi GPU, so code should be tight enough.

Bottom line was that the overhead for outputting the same amount of particles is quite obviously missing, and there's no way around it; it is known that Epic was shooting higher for next gen minimum specs (and I'm always wishing for them and Crytek to quite honestly eat crow on their demands), and as such PS4 has less fillrate to go around; X720 will too.
 
There was also some speculation that it was targeting 4GB as opposed to 8GB of RAM. Baseless really, but it wouldn't be surprising either.
Not really. We know the switch from 4GB to 8GB was very recent so it's not hard to imagine that the games/demos shown were targeting 4GB of RAM.

Sorry for going off topic.

Edited said images in order to remove the resolution mention, that said I wouldn't want to focus on PS4.

As for the demo, sure; bottom line was that the overhead for the same amount of particles is missing, and there's no way around it, as PS4 has less fillrate to go around.
True.
 
Also, the archway in the PS4 demo doesn't even have a texture on it. It's been speculated that the demo was quickly put together with very little optimization(PC version likely more optimized).

It looks textured to me. Is it possible the effect in the PC demo was related to voxel lighting and they simply swapped in a texture?
 
It looks textured to me. Is it possible the effect in the PC demo was related to voxel lighting and they simply swapped in a texture?
DOF and per pixel motion blur are also missing; the specular highlights or equivalent effect on the eyes is also missing.

It's probably a quick adaptation; particles and global illumination are really the core innovations of this tech demo/engine, so they focused on putting them up and running. I guess all the rest was simply turned off if it was in the way of getting the most particles possible out. As for the archway, it sticks out, but it isn't the only thing changed; the rock platform where the guy is standing also has a more rugged look to it. I wouldn't focus on such differences.


On a side note it sure would be fun to see this tech demo running across all next gen consoles, including the Wii U.
 
DOF and per pixel motion blur are also missing; the specular highlights or equivalent effect on the eyes is also missing.

It's probably a quick adaptation; particles and global illumination are really the core innovations of this tech demo/engine, so they focused on putting them up and running, all the rest was simply turned off if it was in the way of getting the most particles possible out.


On a side note it sure would be fun to see this tech demo running across all next gen consoles, including the Wii U.

I liked the Wii U tech demos more betterer to be honest.
 
The problem becomes when the games start doing many more complex graphical effects and 720/PS4 run at 720p/30fps. Those games will not be able to run on Wii U. I'd expect this to start when the 2md and 3rd gen games start coming out. It will be easy for Wii U to get ports when their cross gen games.
I would like to mention we haven't seen a Wii U game made for the pad only (854 x 480).

Not that I'm a big believer in Wii U hardware but just saying the option is there.
 
I would like to mention we haven't seen a Wii U game made for the pad only (854 x 480).

Not that I'm a big believer in Wii U hardware but just saying the option is there.
Hah, reading that resolution associated with an HD system, I instantly thought of Alan Wake (960x544) and Star Ocean 4 (936x512 battles) on the X360.

That (a game where the graphically intensive output is not meant to run on the TV) probably won't happen; separate rendering paths to bump up the graphics in a big way when running on the controller are also unlikely.
I liked the Wii U tech demos more betterer to be honest.
Me too, but fun as a means to compare.

Kinda curious how many particles the Wii U could output attempting the same tech demo at 720p; I'm guessing half, but I have no clue how that would look.
 
Not sure why there's so much talk about Wii U and PS4/720 running the same games. It doesn't look like it will happen much next gen without a PS360 version also being made. That right there is very telling...
 
Not sure why there's so much talk about Wii U and PS4/720 running the same games. It doesn't look like it will happen much next gen without a PS360 version also being made. That right there is very telling...

To be fair, we haven't heard a lot about multiplatform titles purely for "next-gen" consoles, and probably won't until later this year. If a lot of titles are announced for PS4/720 at E3, with no Wii U counterparts, then yeah, it might be telling.
 
To be fair, we haven't heard a lot about multiplatform titles purely for "next-gen" consoles, and probably won't until later this year. If a lot of titles are announced for PS4/720 at E3, with no Wii U counterparts, then yeah, it might be telling.
Well Thief was announced today/yesterday which is next-gen only.
 
Not sure why there's so much talk about Wii U and PS4/720 running the same games. It doesn't look like it will happen much next gen without a PS360 version also being made. That right there is very telling...

To be fair, we haven't heard a lot about multiplatform titles purely for "next-gen" consoles, and probably won't until later this year. If a lot of titles are announced for PS4/720 at E3, with no Wii U counterparts, then yeah, it might be telling.

Well Thief was announced today/yesterday which is next-gen only.

Some games were never going to get Wii U ports no matter how powerful the hardware was.
 
Some games were never going to get Wii U ports no matter how powerful the hardware was.

This is speculation I don't agree with. Not saying it would get every port, but it would certainly get all the important cross platform titles if they would run with minimum effort.
 
This is speculation I don't agree with. Not saying it would get every port, but it would certainly get all the important cross platform titles if they would run with minimum effort.
SQEX didn't even release Tomb Raider on Wii U, and that would have required even less effort. We probably all agree that there were no technical reasons, and Wii U devkits have been out for close to two years now, so time wasn't an issue, either. Business decisions.
 
SQEX didn't even release Tomb Raider on Wii U, and that would have been even less effort. We probably all agree that there were no technical reasons, and Wii U devkits have been out for close to two years now, so time wasn't an issue, either. Business decisions.

Final dev kit just went out around launch.
 
SQEX didn't even release Tomb Raider on Wii U, and that would have required even less effort. We probably all agree that there were no technical reasons, and Wii U devkits have been out for close to two years now, so time wasn't an issue, either. Business decisions.

Woops, I kinda meant next-gen cross platform titles. But, I won't dispute your opinion, I just disagree with it.

A potential scenario in my mind would have been something along these lines. Let's imagine a more powerful Wii U launched at the same price (imagine they did whatever it took to get there, like sticking solely to motion controls):

Super Wii U lives up to the hype and is powerful enough to run current gen games with ease at 1080p and in many cases even 60 fps and better textures.

Developers are stoked and put a good effort in ports.

Public sees Xbox 360 and PS3 games running AMAZING on the Wii U and it just poops on their versions.

Wii U would sell much better based on the hype of this new, hot machine.

Publishers don't start canceling shit left and right, and the Wii U just gets more and more ports announced, and their next-gen games on their fancy new engines get ports as well.

This is based on absolutely nothing of course, because this is not what occurred and is 100% fancy. But I like to dream.
 
Woops, I kinda meant next-gen cross platform titles. But, I won't dispute your opinion, I just disagree with it.

A potential scenario in my mind would have been something along these lines. Let's imagine a more powerful Wii U launched at the same price (imagine they did whatever it took to get there, like sticking solely to motion controls):

Super Wii U lives up to the hype and is powerful enough to run current gen games with ease at 1080p and in many cases even 60 fps and better textures.

Developers are stoked and put a good effort in ports.

Public sees Xbox 360 and PS3 games running AMAZING on the Wii U and it just poops on their versions.

Wii U would sell much better based on the hype of this new, hot machine.

Publishers don't start canceling shit left and right and Wii U just get more and more ports announced and their next-gen games on their fancy new engines get ports as well.

This is based on absolutely nothing of course, because this is not what occurred and is 100% fancy. But I like to dream.

The problem with this dream is that it assumes the public's issue with the Wii U is fundamentally based on its graphical capabilities - and we have no real data to back up that idea. Also, it doesn't change the "wait and see" approach many publishers apparently take with Nintendo hardware - with Sony and Microsoft's platforms, their dedication is often a given, but no matter what track record of sales Nintendo achieves with an earlier platform, publishers always seem wary about Nintendo's offering. Of course, there are business reasons for this attitude, I'm sure.

The Wii U's current failing is that there are really no games that push the system from a technical or conceptual point of view. Nintendo Land is probably the best game in both regards, but it's not exactly rippling with ambition. Without software to back the system, marketing suffers. If Nintendo and its few loyal partners can push out must-have software, then the system will live and third parties might give it the time of day.
 
Do we know if the Wii U has "real" tessellation support? What I mean is, will games be able to make heavy use of tessellation, or will limited hardware resources make it kind of a minor bonus feature? I mean, doesn't the 360 also have a tessellation unit, but we don't see much of it?

We talk about the number of shader cores available on a GPU. Is there an equivalent count of tessellation units?
 
The problem with this dream is that it assumes the public's issue with the Wii U is fundamentally based on its graphical capabilities - and we have no real data to back up that idea. Also, it doesn't change the "wait and see" approach many publishers apparently take with Nintendo hardware - with Sony and Microsoft's platforms, their dedication is often a given, but no matter what track record of sales Nintendo achieves with an earlier platform, publishers always seem wary about Nintendo's offering. Of course, there are business reasons for this attitude, I'm sure.

The Wii U's current failing is that there are really no games that push the system from a technical or conceptual point of view. Nintendo Land is probably the best game in both regards, but it's not exactly rippling with ambition. Without software to back the system, marketing suffers. If Nintendo and its few loyal partners can push out must-have software, then the system will live and third parties might give it the time of day.

There is a distinct lack of games, partly IMO due to the weaker hardware not inspiring anyone to port anything to the console. This is not the sole reason, as third parties are wary of Nintendo, but add to this another weak console and there is no incentive to make anything for the platform. The casuals have moved on to smartphones and tablets; graphics are not an issue to them... they already have hardware that caters to them. Nintendo is left with core gamers... and they do give a crap about graphics to a certain extent.
 
Do we know if the Wii U has "real" tessellation support? What I mean is, will games be able to make heavy use of tessellation, or will limited hardware resources make it kind of a minor bonus feature? I mean, doesn't the 360 also have a tessellation unit, but we don't see much of it?

We talk about the number of shader cores available on a GPU. Is there an equivalent count of tessellation units?

No, it's like the X360 pretty much.
 
Woops, I kinda meant next-gen cross platform titles. But, I won't dispute your opinion, I just disagree with it.

A potential scenario in my mind would have been something along these lines. Let's imagine a more powerful Wii U launched at the same price (imagine they did whatever it took to get there, like sticking solely to motion controls):

Super Wii U lives up to the hype and is powerful enough to run current gen games with ease at 1080p and in many cases even 60 fps and better textures.

Developers are stoked and put a good effort in ports.

Public sees Xbox 360 and PS3 games running AMAZING on the Wii U and it just poops on their versions.

Wii U would sell much better based on the hype of this new, hot machine.

Publishers don't start canceling shit left and right and Wii U just get more and more ports announced and their next-gen games on their fancy new engines get ports as well.

This is based on absolutely nothing of course, because this is not what occurred and is 100% fancy. But I like to dream.

If it's true about pubs canning Wii U projects because sales aren't going all out, did this happen with the PS3 in its first few years? Because I heard pubs had confidence in the PS3 turning around; if that's true then they shouldn't be canning projects because of the first few months' sales. The 3DS is an example of that.

I think there is more to the story.
 
If it's true about pubs canning Wii U projects because sales aren't going all out, did this happen with the PS3 in its first few years? Because I heard pubs had confidence in the PS3 turning around; if that's true then they shouldn't be canning projects because of the first few months' sales. The 3DS is an example of that.

I think there is more to the story.

PS2 and the DS were monsters in their respective times and true generational leaps from their predecessors. The Wii was an aberration, a positive one, but it was catching lightning in a bottle. While developers expected turnarounds for the PS3 and 3DS, I don't think they have that confidence at all in the Wii U. The Wii was a difficult enough market to break into for various reasons, and now the Wii U is merely matching last generation's systems. Horrendous sales do not encourage them to make an effort.
 
Sounds like a pretty bleak future for Wii U.

People should have known this from the beginning. Power of the system really wasn't going to matter that much. This should have been known even before E3 last year. I guess a lot of people just have a hard time accepting that.
 
PS2 and the DS were monsters in their respective times and true generational leaps from their predecessors. The Wii was an aberration, a positive one, but it was catching lightning in a bottle. While developers expected turnarounds for the PS3 and 3DS, I don't think they have that confidence at all in the Wii U. The Wii was a difficult enough market to break into for various reasons, and now the Wii U is merely matching last generation's systems. Horrendous sales do not encourage them to make an effort.

What happens if the PS4 and Xbox 720 don't sell as expected? Will devs just keep making 360/PS3 versions of Elder Scrolls 6, GTA 6, etc.?

Some devs made a mistake not supporting the Wii last gen (when they made high budget PS3/360 games that didn't sell and went out of business). I just think ignoring Nintendo on the console front this gen like last gen is not a smart idea, and I think that about any of the systems. If pubs aren't happy when a game sells only 2 million, then it's time for something to change; maybe supporting 3 systems could help, or not, who knows.

I just think it's too early to count Nintendo out.
 
...Just how did this thread end up talking about the PS4 and the mind/market share for the Wii U? I thought this was a thread for analyzing Latte?
 
There is a distinct lack of games, partly IMO due to the weaker hardware not inspiring anyone to port anything to the console. This is not the sole reason, as third parties are wary of Nintendo, but add to this another weak console and there is no incentive to make anything for the platform. The casuals have moved on to smartphones and tablets; graphics are not an issue to them... they already have hardware that caters to them. Nintendo is left with core gamers... and they do give a crap about graphics to a certain extent.


People are also forgetting the fact that there are fewer players on the market making games.
Where are Atari, THQ, Eurocom, Cing, etc.? Each one of those could have provided one or two decent games during the launch window.

And a company the size of EA not providing support like Ubisoft does really has an effect as well.
 
The 360 had tessellators? Please explain.
X360 had a tessellation unit.

It was borderline unusable because it didn't support vertex compression, so it would bloat geometry data in a very big way, and it required multiple passes tied to the unified shaders.

AFAIK it was only used for Halo 4's water.


Not comparable to Wii U's worst case tessellation unit scenario, let alone the best.


EDIT: I didn't realize it was USC-fan doing the trolling honors.
Can't what? I stated a fact: Xbox 360 does have a tessellation unit. Try again...
So does 3DS.

Saying Wii U's tessellation feature set is the same as X360's is like saying the Xbox 1 had shaders/wasn't fixed function, therefore it's pretty close to an X360/PS3; it makes no sense.
 