WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Both questions are a bit loaded. A game that runs at 60fps is going to need more CPU power than the same game running at 30fps, independent of resolution. And even with enough CPU power, it could still be limited by memory bandwidth.

It's not debatable that both the Xbox One and PS4 are orders of magnitude faster than the Wii U on pretty much all fronts. But that doesn't mean that some of the same games aren't possible on all platforms. There are countless PC games out there playable across systems with similarly large performance differentials.

The hardest thing to get around is the difference in memory size. A game using 5+ gigabytes of RAM on anything other than texture data is pretty much impossible on a system with just 1GB of usable RAM.

Which is blatantly false. There's not a single spec to my knowledge that's even ONE order of magnitude (10x) faster.

edit: I see you meant multiples. Carry on
 
Both questions are a bit loaded. A game that runs at 60fps is going to need more CPU power than the same game running at 30fps, independent of resolution. And even with enough CPU power, it could still be limited by memory bandwidth.

It's not debatable that both the Xbox One and PS4 are orders of magnitude faster than the Wii U on pretty much all fronts. But that doesn't mean that some of the same games aren't possible on all platforms. There are countless PC games out there playable across systems with similarly large performance differentials.

The hardest thing to get around is the difference in memory size. A game using 5+ gigabytes of RAM on anything other than texture data is pretty much impossible on a system with just 1GB of usable RAM.

Just like how the HD twins have existed next to PCs with a lot of memory? There are always ways around it; it's just a question of how the results pan out.

Is 720p60 equally taxing as 1080p30?
 
Another question: is 720p60 comparable to 1080p30? Which one is more taxing?

That's a good question. People often point to the 2.25X increase in pixels, directly compare it to the 2X increase in framerate, and conclude that the former is more taxing. But that's not a fair direct comparison, because in one case you're just rendering more pixels. In the other case, the GPU has extra frames of pixels to render, textures for those new frames, lights for those new frames, etc. Double the work in every area. Then there are new frames of animation, sounds that need to be synced to those frames, and whatever work the CPU has cut out for it. It's a lot of extra work all around!

I've seen a dev (on GAF, I believe) say something to the effect of: "boosting resolution is much easier than doubling framerate, so if devs have a bit of performance to spare, most would opt for an increase in pixels." This is the reason why 60fps is normally a goal decided upon very early in development. The whole engine must be built around it.

So although it's hard to quantify by a direct comparison of numbers, I'm inclined to believe that 720p-60fps may be more taxing, in general.
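
For what it's worth, the 2.25X vs. 2X comparison above can be put into raw numbers. Here's a back-of-the-envelope sketch in Python; it only counts pixels pushed per second and deliberately ignores all the per-frame CPU, animation, and sync work described above:

```python
# Raw pixel throughput only; ignores the per-frame CPU, animation,
# and sync costs discussed above.
def pixels_per_second(width, height, fps):
    return width * height * fps

p720_60 = pixels_per_second(1280, 720, 60)    # 55,296,000 pixels/s
p1080_30 = pixels_per_second(1920, 1080, 30)  # 62,208,000 pixels/s

# 1080p has 2.25x the pixels per frame, but only 1.125x the pixels
# per second once the 2x framerate is factored in (2.25 / 2 = 1.125).
print(p1080_30 / p720_60)  # 1.125
```

On the fill side alone the two targets are within ~12.5% of each other, which is why the resolution-independent per-frame work ends up deciding which one is harder.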
 
Is 720p60 equally taxing as 1080p30?

I'm nowhere near as knowledgeable as some others in here, but I would say it depends on your game code (non-graphics).

If you update the graphics code at 60fps but keep the rest of your code at 30, it's less taxing than running all the code at 60, but more taxing than running it all at 30. No idea how much more taxing 1080p is than 720p, though.
 
Slandering Montpelier without a source, not nice...

I'm not slandering anyone, mate. When Ancel said that Rayman Legends, as it was, wasn't possible on PS3 and 360 hardware, it follows that they would have had to downgrade the game in some way to make the port possible. There was one early screenshot that showed well over a hundred enemies onscreen at once. I haven't got the game yet thanks to Zavvi being shite, but if anyone who has the game can check it out, I'd be interested to see if the final multiplatform game has that many enemies.

Edit: And I can't for the life of me find that bloomin' quote either. He specifically mentioned the number of enemies onscreen at once, and it coincided with the screenshot that showed well over a hundred. It might have been a video interview, which would explain the lack of quotes. He definitely said it anyway.
 
That's a good question. People often point to the 2.25X increase in pixels, directly compare it to the 2X increase in framerate, and conclude that the former is more taxing. But that's not a fair direct comparison, because in one case you're just rendering more pixels. In the other case, the GPU has extra frames of pixels to render, textures for those new frames, lights for those new frames, etc. Double the work in every area. Then there are new frames of animation, sounds that need to be synced to those frames, and whatever work the CPU has cut out for it. It's a lot of extra work all around!

I've seen a dev (on GAF, I believe) say something to the effect of: "boosting resolution is much easier than doubling framerate, so if devs have a bit of performance to spare, most would opt for an increase in pixels." This is the reason why 60fps is normally a goal decided upon very early in development. The whole engine must be built around it.

So although it's hard to quantify by a direct comparison of numbers, I'm inclined to believe that 720p-60fps may be more taxing, in general.

Thanks, so it seems 1080p is more taxing on the graphics card, while taking the game to 60fps affects the system in general. IMO the Wii U seems indeed very balanced for 720p60.

Interesting. The Wii U seems built for strong 720p games IMO, with either 60 or 30 fps depending on the effects, etc. that devs are willing to push.
 
That's a good question. People often point to the 2.25X increase in pixels, directly compare it to the 2X increase in framerate, and conclude that the former is more taxing. But that's not a fair direct comparison, because in one case you're just rendering more pixels. In the other case, the GPU has extra frames of pixels to render, textures for those new frames, lights for those new frames, etc. Double the work in every area. Then there are new frames of animation, sounds that need to be synced to those frames, and whatever work the CPU has cut out for it. It's a lot of extra work all around!

I've seen a dev (on GAF, I believe) say something to the effect of: "boosting resolution is much easier than doubling framerate, so if devs have a bit of performance to spare, most would opt for an increase in pixels." This is the reason why 60fps is normally a goal decided upon very early in development. The whole engine must be built around it.

So although it's hard to quantify by a direct comparison of numbers, I'm inclined to believe that 720p-60fps may be more taxing, in general.

I think I saw a mod mention that keeping a consistent 60 fps is closer to 3x the performance requirement of 30 fps due to those issues.
 
Both questions are a bit loaded. A game that runs at 60fps is going to need more CPU power than the same game running at 30fps, independent of resolution. And even with enough CPU power, it could still be limited by memory bandwidth.

It's not debatable that both the Xbox One and PS4 are orders of magnitude faster than the Wii U on pretty much all fronts. But that doesn't mean that some of the same games aren't possible on all platforms. There are countless PC games out there playable across systems with similarly large performance differentials.

The hardest thing to get around is the difference in memory size. A game using 5+ gigabytes of RAM on anything other than texture data is pretty much impossible on a system with just 1GB of usable RAM.

You are correct. It's not that it's impossible to adapt game engines to the system despite the massive gulf in performance (for example, see the debate around Frostbite 2/3 despite Frostbite's existence on the 360, UE4 and its mobile variants, etc.). However, the issue with the Wii U is that there is near-zero business incentive on the platform when it comes to the bean counters (both prior to launch, and obviously after).
 
Thanks, so it seems 1080p is more taxing on the graphics card and taking the game to 60fps affects the system in general. IMO Wii U seems indeed very balanced for 720p60.
One thing is that framerate decisions often come down to how long it takes to process a frame. CPU and even GPU calculations take time, and the more a developer wants to do during a frame, the more of that time it takes up. Eventually the processing time goes beyond 16ms, which makes a constant 60fps impossible. The decision then is to either cut back on things that need processing, or drop the fps to 30 or something lower (or variable).

1080p/30 vs 720p/60 is pretty hard to judge.

I wouldn't necessarily say that the Wii U is balanced for 60fps, more that the games targeting the system (that are 60fps) are balanced well for 720p/60fps...
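
To illustrate the 16ms budget point, here's a toy model in Python. The 60Hz refresh interval is real; the rounding behavior is a simplification that assumes double-buffered vsync with no triple buffering:

```python
import math

VSYNC_INTERVAL_MS = 1000 / 60  # ~16.7 ms between 60 Hz refreshes

def effective_fps(frame_work_ms):
    # With double-buffered vsync, a frame that misses its slot waits
    # for the next refresh, so cost rounds up to whole intervals.
    intervals = math.ceil(frame_work_ms / VSYNC_INTERVAL_MS)
    return 1000 / (intervals * VSYNC_INTERVAL_MS)

print(effective_fps(15.0))  # 60.0 -> fits the ~16.7 ms budget
print(effective_fps(18.0))  # 30.0 -> just over budget halves the framerate
```

Which is exactly the choice described above: shave work until the frame fits in ~16.7ms, or accept 30fps (or a variable rate).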
 
One thing is that framerate decisions often come down to how long it takes to process a frame. CPU and even GPU calculations take time, and the more a developer wants to do during a frame, the more of that time it takes up. Eventually the processing time goes beyond 16ms, which makes a constant 60fps impossible. The decision then is to either cut back on things that need processing, or drop the fps to 30 or something lower (or variable).

1080p/30 vs 720p/60 is pretty hard to judge.

Thanks for your input, this is very interesting. Basically there are so many variables in play (engine, effects, non-graphics game calculations, etc.) that comparing the two is very hard to do.

To be honest, I did not expect so many Wii U games to be 720p60. While some are simpler, others like Bayo2, W101 and MK8 seem to be doing a lot.
 
Thanks for your input, this is very interesting. Basically there are so many variables in play (engine, effects, non-graphics game calculations, etc.) that comparing the two is very hard to do.

To be honest, I did not expect so many Wii U games to be 720p60. While some are simpler, others like Bayo2, W101 and MK8 seem to be doing a lot.

Yeah, for most games the higher framerate will be more taxing. Back when I had a weaker PC, switching from 1080p to 720p pretty much never gave a framerate boost equivalent to going from 30 to 60.
 
I have a question too, and sorry if it has been discussed already, but do we know why the Wii U and PS360 versions of Rayman look the same? Before it went multiplat, didn't someone at Ubisoft say that they were doing things graphically that could only be done on the Wii U? Was that BS or what?

Here is the PC picture of Rayman Legends:
http://images.eurogamer.net/2013/articles//a/1/6/1/2/3/1/6/PC_000.bmp.jpg

Here is the 360 version of the same scene. Use the cage to compare the two; the aliasing there makes the differences easy to see.
http://images.eurogamer.net/2013/articles//a/1/6/1/2/3/1/6/360_000.bmp.jpg

Here is the Wii U picture, just for comparison.
http://images.eurogamer.net/2013/articles//a/1/6/1/2/3/1/6/WiiU_000.bmp.jpg

Same native resolutions.

The PC and Wii U versions look about the same, minus particles (fireflies). The 360 version's lighting looks different. Compare the sides of the dragon: the Wii U and PC look identical, while the 360 looks darker... a different lighting engine, maybe? Dunno, hard to tell.
 
The PC and Wii U versions look about the same, minus particles (fireflies). The 360 version's lighting looks different. Compare the sides of the dragon: the Wii U and PC look identical, while the 360 looks darker... a different lighting engine, maybe? Dunno, hard to tell.

No, the 360 has a known gamma issue.
 
Just like how the HD twins have existed next to PCs with a lot of memory? There are always ways around it; it's just a question of how the results pan out.

Yes, but that's also a very different scenario. PC games are still shipping 32-bit executables, which are limited to 2GB of memory usage. Some of that memory is replicated between video and system RAM, bringing unique memory usage much lower than 2GB in the vast majority of cases. This kind of memory duplication isn't as necessary on consoles.

A console-to-PC comparison involves many overheads that don't exist in a console-to-console comparison.
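
A toy illustration of the duplication point, with invented numbers (the 2GB ceiling for 32-bit executables is real; the split is purely hypothetical):

```python
# Hypothetical numbers for a 32-bit PC game near its 2 GB ceiling.
allocated_gb = 1.8   # total address space in use (assumed)
duplicated_gb = 0.6  # assets mirrored in both system RAM and VRAM (assumed)

# The mirrored copies inflate the total; the unique data is smaller.
unique_gb = allocated_gb - duplicated_gb
print(unique_gb)  # 1.2 GB of unique data despite ~1.8 GB allocated
```

So a "2GB" PC game may carry notably less unique data than the headline number suggests, which narrows the gap to a console with 1GB of usable RAM.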

Is 720p60 equally taxing as 1080p30?

There's not a 'yes or no' answer to that. Both take roughly the same amount of GPU resources but the rest of the system also needs to be up to snuff.

To put it in context, a game designed explicitly to run at 1080p30 on a PS4 would be far too CPU intensive to run at 720p60 on a Wii U.
 
I'm really not the type to care about the graphics between the 3 consoles, but it kinda grinds my gears that Nintendo mentioned 2-GamePad support 2 years ago and then just fell silent about it.

I wonder if they're spending so much time learning the console that they just put that idea on the back burner.
 
I'm really not the type to care about the graphics between the 3 consoles, but it kinda grinds my gears that Nintendo mentioned 2-GamePad support 2 years ago and then just fell silent about it.

I wonder if they're spending so much time learning the console that they just put that idea on the back burner.

I think the console supports it, but GamePads are still too expensive to produce and sell on the shelves of Walmart just yet.
 
I'm really not the type to care about the graphics between the 3 consoles, but it kinda grinds my gears that Nintendo mentioned 2-GamePad support 2 years ago and then just fell silent about it.

I wonder if they're spending so much time learning the console that they just put that idea on the back burner.

I think they are waiting for an established base and a reason to do it; until then it's unnecessary consumer confusion with that extra box on the shelf (it brings back the whole "is the Wii U a pad attachment for the Wii" problem), IMO.
 
There's not a 'yes or no' answer to that. Both take roughly the same amount of GPU resources but the rest of the system also needs to be up to snuff.
Actually, it's not so simple, even just for the GPU. In terms of framebuffer and bandwidth, yeah, they're pretty similar, but things like shader calculations take time as well, and doubling how many you need to do per second is non-trivial, as resolution has little to no effect on a shader's performance.

Add to that, with GPGPU happening more and more (perhaps even on the Wii U), those "general purpose" calculations will also eat into GPU processing time, which is something to consider.

In the end, 30fps is easier to do all around. Kudos to all those studios that can maintain 60fps in a complicated game, at whatever resolution!
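
One way to picture that split is a rough cost model: a resolution-independent chunk (simulation, vertex/shader setup, GPGPU) plus a per-pixel chunk. A Python sketch follows; both coefficients are invented purely for illustration:

```python
# Hypothetical frame-cost split; the two coefficients are invented.
PER_FRAME_MS = 8.0      # paid once per frame regardless of resolution
PER_MEGAPIXEL_MS = 4.0  # fill/pixel-shader cost per million pixels

def frame_ms(width, height):
    return PER_FRAME_MS + PER_MEGAPIXEL_MS * (width * height / 1e6)

# 720p60 must fit in ~16.7 ms; 1080p30 gets ~33.3 ms.
print(frame_ms(1280, 720))   # ~11.7 ms against a 16.7 ms budget (tight)
print(frame_ms(1920, 1080))  # ~16.3 ms against a 33.3 ms budget (roomy)
```

The bigger the resolution-independent chunk is relative to the per-pixel chunk, the more punishing 60fps becomes compared to the extra pixels, which matches the shader and GPGPU point above.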
 
That's a good question. People often point to the 2.25X increase in pixels, directly compare it to the 2X increase in framerate, and conclude that the former is more taxing. But that's not a fair direct comparison, because in one case you're just rendering more pixels. In the other case, the GPU has extra frames of pixels to render, textures for those new frames, lights for those new frames, etc. Double the work in every area. Then there are new frames of animation, sounds that need to be synced to those frames, and whatever work the CPU has cut out for it. It's a lot of extra work all around!

I've seen a dev (on GAF, I believe) say something to the effect of: "boosting resolution is much easier than doubling framerate, so if devs have a bit of performance to spare, most would opt for an increase in pixels." This is the reason why 60fps is normally a goal decided upon very early in development. The whole engine must be built around it.

So although it's hard to quantify by a direct comparison of numbers, I'm inclined to believe that 720p-60fps may be more taxing, in general.

You are correct. It's not that it's impossible to adapt game engines to the system despite the massive gulf in performance (for example, see the debate around Frostbite 2/3 despite Frostbite's existence on the 360, UE4 and its mobile variants, etc.). However, the issue with the Wii U is that there is near-zero business incentive on the platform when it comes to the bean counters (both prior to launch, and obviously after).

One thing is that framerate decisions often come down to how long it takes to process a frame. CPU and even GPU calculations take time, and the more a developer wants to do during a frame, the more of that time it takes up. Eventually the processing time goes beyond 16ms, which makes a constant 60fps impossible. The decision then is to either cut back on things that need processing, or drop the fps to 30 or something lower (or variable).

1080p/30 vs 720p/60 is pretty hard to judge.

I wouldn't necessarily say that the Wii U is balanced for 60fps, more that the games targeting the system (that are 60fps) are balanced well for 720p/60fps...
Thank all of you for those elaborate answers. I did underestimate how many more resources are needed to push a game from 30 to 60 frames.
I'm really not the type to care about the graphics between the 3 consoles, but it kinda grinds my gears that Nintendo mentioned 2-GamePad support 2 years ago and then just fell silent about it.

I wonder if they're spending so much time learning the console that they just put that idea on the back burner.
For that idea, they may also want to reduce the production cost of the controller first. Nintendo would likely want to increase the popularity of the console before those production issues significantly improve.
 
Thanks, so it seems 1080p is more taxing on the graphics card, while taking the game to 60fps affects the system in general. IMO the Wii U seems indeed very balanced for 720p60.

In terms of pixel fillrate, yes, 1080p-30fps is a bit more taxing. Also keep in mind that the oft-mentioned 2.25X difference only applies to the pixels on screen; it's a slight exaggeration. The actual performance (fillrate) difference is considerably smaller when you account for the 2X higher rate at which 720p-60fps must get those pixels onscreen. But like I said, in general there's a LOT more going on, performance-wise, on both the GPU and CPU, that doubling the framerate must account for. Much more than increasing pixels alone. Yeah, I think 720p-60fps is a sweet spot for the Wii U. It seems like it's built for that WITH GamePad interactions in mind.

*My Wii U just arrived (FedEx), in between texting. Sweeeeet!*

I think I saw a mod mention that keeping a consistent 60 fps is closer to 3x the performance requirement of 30 fps due to those issues.

I don't know how to quantify all those areas where the CPU and GPU have double the work, but 3X overall wouldn't be surprising.
 
I'm really not the type to care about the graphics between the 3 consoles, but it kinda grinds my gears that Nintendo mentioned 2-GamePad support 2 years ago and then just fell silent about it.

I wonder if they're spending so much time learning the console that they just put that idea on the back burner.
Well, we already know that the 2nd GamePad will add limitations to the experience (half the refresh rate). Given Nintendo is already having trouble selling the GamePad, and spreading the message about it, I don't think it's in Nintendo's best interest to open the floodgates for negativity about the experience when using two GamePads.

I wouldn't be surprised if the 2nd GamePad becomes the Vitality Sensor of this generation: mentioned at the start, and never implemented by Nintendo afterwards. Personally, I think announcing it was just a knee-jerk reaction to fans when they requested it initially.

I could totally see the use for it, but knowing that the console wasn't designed for it initially (no dedicated antenna for a 2nd one), and that there will be compromises, makes me less enthusiastic about it. I would have loved for them to put 4 dedicated antennas in there, go hog wild with supporting 4 of them, and have games really make use of it.
 
All very good answers about 2-GamePad support. But you would think that with the 2DS and the Wii U price drop coming, they would have at least announced their summer update along with the upgraded GamePad battery, IMO.
 
Well, we already know that the 2nd GamePad will add limitations to the experience (half the refresh rate). Given Nintendo is already having trouble selling the GamePad, and spreading the message about it, I don't think it's in Nintendo's best interest to open the floodgates for negativity about the experience when using two GamePads.


I believe early on Nintendo said that they were focusing on one GamePad this year so developers could first get familiar with developing for it. The console hasn't even been out a year yet, so it's hard to say if they are still on track to release the second GamePad with games next year.

Nintendo is probably being coy about the second GamePad because they want Sony and MS to show their hands. Now we are hearing rumors that Sony might show off their own VR glasses; Nintendo could counter that with a second GamePad announcement.
 
Could they not just disable the display to the TV when using 2 GamePads?
While it's yet to be seen how general gaming performance will be affected by having 2 GamePads, the limitation that we know of isn't where you're thinking (I believe). It's actually the data communication between the Wii U and the GamePad: the Wii U is only able to send 60 frames of data to the GamePad per second (not related to GPU performance), so to support 2 GamePads, it has to split those 60 frames between the two devices.

There will also likely be an additional performance hit if 2 GamePads are used, in terms of CPU/GPU (how much is unknown), but yes, if Nintendo allowed disabling the main display, that would most definitely offset it.
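
A minimal sketch of the splitting idea (the 60 frames/s cap on the GamePad link is from the post above; the round-robin scheduling is an assumption for illustration):

```python
def per_pad_fps(num_pads, link_fps=60):
    # Assumed round-robin scheduling: the link pushes link_fps frames
    # per second in total, so each pad sees an equal share of them.
    return link_fps // num_pads

print(per_pad_fps(1))  # 60 fps with one GamePad
print(per_pad_fps(2))  # 30 fps each with two GamePads
```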
 
I wouldn't be surprised if part of the reason they don't release a standalone GamePad is that they are afraid people will think it's just a peripheral for the Wii.
 
Thanks, so it seems 1080p is more taxing on the graphics card, while taking the game to 60fps affects the system in general. IMO the Wii U seems indeed very balanced for 720p60.

Interesting. The Wii U seems built for strong 720p games IMO, with either 60 or 30 fps depending on the effects, etc. that devs are willing to push.
It's a development decision.

If developers wanted to, every single console game nowadays would be 1080p/60fps. They would just have to make their games simpler.

Many people keep directly comparing resolution and framerate between games while completely ignoring what each game is actually doing every frame, and that's just wrong.
 
Just wondering if it's possible that Nintendo could have disabled some of the shaders until the PS4/XB1 started production, then unlocked them in the October update. It may start pushing out warm air with unlocked shaders.
 
I'm really not the type to care about the graphics between the 3 consoles, but it kinda grinds my gears that Nintendo mentioned 2-GamePad support 2 years ago and then just fell silent about it.

I wonder if they're spending so much time learning the console that they just put that idea on the back burner.

The GamePad appeals to only one person at a time, while motion controls were new and fun for the whole family with the original Wii. This is part of the reason they are suffering... the gimmick is so dramatically less impressive.
 
To put it in context, a game designed explicitly to run at 1080p30 on a PS4 would be far too CPU intensive to run at 720p60 on a Wii U.

Not necessarily. As with everything else being discussed here, it depends on a lot of factors beyond just saying "it won't run". Lots of other compromises may/could be made, but saying unilaterally that it can't be done isn't accurate.
 
Just wondering if it's possible that Nintendo could have disabled some of the shaders until the PS4/XB1 started production, then unlocked them in the October update. It may start pushing out warm air with unlocked shaders.

Not a chance.

So does the Wii U, but it isn't represented in the shots...

Well, to be more accurate, the Wii U needs a set that plays well with Limited RGB. Some capture equipment (like what Lens of Truth uses) doesn't play well with this, but my TV sets and projectors certainly do, so it looks "normal".
 
Just wondering if it's possible that Nintendo could have disabled some of the shaders until the PS4/XB1 started production, then unlocked them in the October update. It may start pushing out warm air with unlocked shaders.

Didn't Iwata say around January this year that people have not seen 50% of what the Wii U is capable of? He could be, and most probably is, talking about developers learning the hardware, though.

I really wouldn't be surprised, though, if clock changes were possible through a system update, although obviously not to the ridiculous degree that the rumour from a few months ago suggested. Maybe a small boost to the CPU and/or GPU would be possible.

My biggest hope is that they can really optimize the OS over the coming year and eventually unlock a further half a gig of RAM for developers.

The Wii U specs are what they are, but I believe if you gave Nintendo the chance to change/tweak one thing, they would have put another 2GB of RAM in the system, giving it 3GB for games with the possibility of 3.5GB in the future. That isn't all that far behind the PS4's and Xbox One's 5GB of RAM for games, and it would make porting games from the other two next-gen consoles far, far easier.
 
Well, to be more accurate, the Wii U needs a set that plays well with Limited RGB. Some capture equipment (like what Lens of Truth uses) doesn't play well with this, but my TV sets and projectors certainly do, so it looks "normal".

Good point. It could be just that.
 
...Can't the Internet Browser take up a hefty chunk of RAM?

But they could disable browser access for certain titles, couldn't they? They disable Home button functionality for online-enabled games to prevent network connectivity issues. They could do the same for games that need the full 1GB of RAM, to prevent issues with the game's use of that RAM.
 
Didn't Iwata say around January this year that people have not seen 50% of what the Wii U is capable of? He could be, and most probably is, talking about developers learning the hardware, though.

Let's be honest here, Iwata talks absolute crap. You'd have to be a fool to believe anything that guy or Nintendo themselves say.

This is the same guy that told us (paraphrasing here):

We've learned from our mistakes with the 3DS. The Wii U will have a strong lineup of games within the launch window.

And:

Nintendo have built the Wii U with 3rd parties in mind. We've invested significant resources and time into ensuring the hardware is as appealing as possible to 3rd parties.

And:

One issue we had with the Wii was that developers couldn't bring their games to the platform, as its performance and architecture simply weren't capable of meeting developer requirements. With the Wii U, we've ensured its performance and architecture will be competitive with our rivals' upcoming consoles; there won't be a massive gulf in performance, and our architecture will be modern. Developers won't have issues bringing games from the Xbox One, PC, and PS4 to this platform.

Iwata is either one of the most ignorant men in the games industry, if he honestly believed the crap he was saying prior to launch, or he was flat-out lying through his teeth about the Wii U.

I'm going with lying, as surely no one can be that ignorant. Either way, he's already shot up any credibility he had with me, as the claims and commitments he's made have all fallen down.
 
No, its launch window was not fine. Even now, it's not fine.

Not even Nintendo agrees with you. Only last month they went on record saying they believe their biggest issue is a lack of compelling software.
 
No, its launch window was not fine. Even now, it's not fine.

Not even Nintendo agrees with you. Only last month they went on record saying they believe their biggest issue is a lack of compelling software.

Which is why they had games with 2:3 attach rates. Lol, yeah, that's not compelling at all...

I'm not saying it couldn't have been better (too many old ports, for example), but it had multiple solid-to-great titles, so it wasn't ass or anything.
 
Which is why they had games with 2:3 attach rates. Lol, yeah, that's not compelling at all...

If the Wii U had such compelling software, why is it breaking records as the worst-selling console since the Saturn?

If the Wii U had such compelling software, why did Nintendo go on record only a few weeks ago saying a lack of compelling software is one of their biggest issues and concerns?

You're kidding yourself if you think the Wii U's current lineup of software is compelling. If it were, the console wouldn't be selling like ass.

Also, games like Pikmin and 101 have sold like ass to boot. Shows how compelling they were.
 
Wii U's launch window was fine. It was everything between then and about now that blew.
Its launch day was fine. Everything after 2012 was a bungled mess. IIRC, the main reason so many parties came to launch was because Nintendo promised a better royalty percentage for launch-day games. In all honesty, they should have extended that offer through the window, so they would have seen a more consistent release schedule instead of 25-some-odd games with varying levels of port quality (many bad) pretty much all on one day.
 
So Nintendo are lying then are they? Even they're saying the lack of compelling software is hurting the Wii U.

Also, it's the major reason why the Wii U is breaking records as the worst-selling major console since the Saturn.

You're dreaming if you think the Wii U's software lineup even now is acceptable. It's pathetic.

Gee, I should have remembered to say that the Wii U lacked compelling software after launch. Well, here it is now:

Wii U's launch window was fine. It was everything between then and about now that blew.

As for shitty sales, the Wii U was selling okay up until it fell off a cliff in January (conveniently, when the launch-window titles just about stopped coming), so I guess that further supports my argument that the launch-window software wasn't that bad.
 
Wow, this thread moved a lot faster than expected. Too bad we lost a few posters along the way...
No, its launch window was not fine. Even now, it's not fine.

Not even Nintendo agrees with you. Only last month they went on record saying they believe their biggest issue is a lack of compelling software.
There is a difference between not understanding what is going on, and knowing the issue but being unable to do much about it. The software wasn't ready, and they decided not to rush it. They figured the first half of the year would be rough for the Wii U. That was why they did that epic January Nintendo Direct.

Anyway, back on topic.

My slightly different take :)

- Added non-anamorphic 16:9, renders at 1080p
- Reworked graphical overhead from Flipper/TEV Pipeline to a modern GPU/programmable shader environment
- High Dynamic Range (HDR)
- Screen space Ambient Occlusion (SSAO)
- Self-shadowing
- Removed horrible fake-DOF blur filter
- Loads up entire ocean at once to improve traveling/pacing
- 32-bit color framebuffer, up from 16-bit with dithering (see the size sketch after this list)
- Redone sky and clouds
- Higher-resolution texture assets
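
As a quick size check on the framebuffer item above (a sketch; it assumes a plain color buffer at the 1080p output resolution, with no AA or depth/stencil counted):

```python
# Color buffer size only; depth/stencil and any AA buffers are extra.
def framebuffer_mb(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / (1024 * 1024)

print(framebuffer_mb(1920, 1080, 2))  # ~3.96 MB at 16-bit color
print(framebuffer_mb(1920, 1080, 4))  # ~7.91 MB at 32-bit color
```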

Nothing wrong with the way you were wording it all, though. It did; notice the background texture:

[image: u1kBCUU.gif]


It's not even the same asset; it was clearly painted over and reinterpreted. It doesn't 100% match.


Other comparison, since we've been at it:

[image: E3Zznsv.jpg]

[image: 0144.png]



It's a crying shame they didn't spruce up the geometry a little.
Thanks for the reply. Looks like all the enhancements were due to the upgraded shader effects and RAM. I personally found that pretty cool. Nintendo may have a fun time taking advantage of the complete hardware.


Didn't Iwata say around January this year that people have not seen 50% of what the Wii U is capable of? He could be, and most probably is, talking about developers learning the hardware, though.

I really wouldn't be surprised, though, if clock changes were possible through a system update, although obviously not to the ridiculous degree that the rumour from a few months ago suggested. Maybe a small boost to the CPU and/or GPU would be possible.

My biggest hope is that they can really optimize the OS over the coming year and eventually unlock a further half a gig of RAM for developers.

The Wii U specs are what they are, but I believe if you gave Nintendo the chance to change/tweak one thing, they would have put another 2GB of RAM in the system, giving it 3GB for games with the possibility of 3.5GB in the future. That isn't all that far behind the PS4's and Xbox One's 5GB of RAM for games, and it would make porting games from the other two next-gen consoles far, far easier.
The machine was designed to use a lower amount of RAM than the other next-gen consoles. It will be interesting to see how devs use all of that RAM; devs were able to do some impressive stuff while managing less than 512MB.
 
As for shitty sales, the Wii U was selling okay up until it fell off a cliff in January (conveniently, when the launch-window titles just about stopped coming), so I guess that further supports my argument that the launch-window software wasn't that bad.

It doesn't support your argument at all. The correlation between the launch window and sales is not substantiated.

Nintendo launched the system globally during the busiest shopping period of the year, etc.

We can also look at the impact Pikmin and W101 had on Wii U sales: both very poor, and after a small, short spike in hardware sales, the Wii U went back down.
 