WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

The PS4 also isn't technically compatible with DX11; it just has the equivalent features. The Wii U uses DX10.1 features, which isn't a big difference, going by what people have said.
There is an actual difference. DX11 tessellation adds two new pipeline stages that aren't available in DX10.1. I'm not sure about the rest of the changes that aren't backwards compatible.

edit: I remember seeing another engine where certain post-processing could only be handled by DX11 GPUs, even though the engine supported DX10.1 as well.
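For what it's worth, the two stages in question are the programmable hull and domain shaders, with the fixed-function tessellator sitting between them. A minimal D3D11 sketch of binding them, assuming the shader and context objects have been created elsewhere:

```cpp
// Minimal sketch of the tessellation stages DX11 adds over DX10.1: the hull
// shader and domain shader (the fixed-function tessellator sits between them).
// All objects passed in are assumed to have been created elsewhere.
#include <d3d11.h>

void BindTessellation(ID3D11DeviceContext* ctx,
                      ID3D11HullShader* hs,
                      ID3D11DomainShader* ds)
{
    // Tessellation consumes control-point patches rather than plain triangles.
    ctx->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);
    ctx->HSSetShader(hs, nullptr, 0);  // hull shader: new in DX11
    ctx->DSSetShader(ds, nullptr, 0);  // domain shader: new in DX11
    // A DX10.1 device simply has no HS/DS stages to bind.
}
```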
 
Nah. The publishers who are still on board are going to be on board for a little longer as they try to find the audience. The rest of them are waiting it out for a while longer... Or the rest of the generation if there's little money to be made.

So it will heavily depend on how the Wii U does this fall and onward. Makes sense. Nintendo is fortunate to have Ubisoft's support to fill out some of the game genre gaps this year.

There is an actual difference. DX11 tessellation adds two new pipeline stages that aren't available in DX10.1. I'm not sure about the rest of the changes that aren't backwards compatible.

edit: I remember seeing another engine where certain post-processing could only be handled by DX11 GPUs, even though the engine supported DX10.1 as well.

As StevieP stated earlier, the Wii U is using OpenGL instead of DX, so there will be some differences in its feature set compared to DX10.1/DX11.
 
Obviously there are multiple reasons why a console fails. It's never just one.

For example, I would say a $499 Wii U with Xbone specs would be a drastically worse failure than the current Wii U, and as the lack of current-generation ports on the system shows (which is why people bring up the GameCube in the first place), raw power wouldn't guarantee any level of third-party support.

The lack of ROI and perceived demographics are much bigger factors in something like that.

Don't forget about other factors which led to the GameCube's third-place finish, such as the lack of online support, which really began to take off with sports titles and games like SOCOM and Halo 2. And the fact that Nintendo marketed it as a console for kids, along with the whole form factor of the console, made getting titles like GTA practically impossible. The fact that compression and multiple discs would be needed for such games was the final nail in the coffin, if it even got that far. The N64 balanced the family-friendly and "mature" gamer demographics much better. But Japan apparently didn't like that console, so Nintendo did the classic overreaction and designed the GameCube in the quirky, Japan-centric way that it came out.

Who is to say that we would be talking about a $499 box with the Wii U? Seeing as Sony is likely at least breaking even at $399 with a more powerful system, they could surely get the price lower than that, especially if the GamePad were nixed in favor of the proven success of a Wii Remote form factor (maybe throw in a classic controller if third parties were truly averse; I doubt this was a major issue, though). Nintendo needn't even match the Xbone in specs, just get more in the ballpark. The lack of current-gen ports can be seen as a direct result of a) Nintendo failing to build enthusiasm (likely from the lukewarm reception to the GamePad, the lack of a noticeable visual leap to Joe Gamer, and other negative press like the slow OS) and b) the combination of underwhelming performance from the system (as attested by arkam, 4A Games, DICE) and an architecture that you must wrestle with in order to achieve decent results, which makes for a poor ROI when taking "a" into account.

The PS4 also isn't technically compatible with DX11; it just has the equivalent features. The Wii U uses DX10.1 features, which isn't a big difference, going by what people have said.

I think a major issue with next gen engines is gonna be compute shader support. This is something which got a major upgrade in DX11 hardware and especially the GCN architecture Xbone/PS4 employ. Plus, I mean, there's just such a huge gap in terms of FLOPS, available RAM, RAM speed, CPU cores...if Wii U ends up getting any games specifically built around the next gen consoles, it will be a miracle.
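To make the compute shader point concrete: full compute shaders (cs_5_0) need DX11-class hardware, while DX10.1-class parts expose at best a restricted cs_4_x profile. A hedged D3D11-style sketch of what "compute shader support" means in practice, with all objects assumed to be created elsewhere:

```cpp
// Hedged sketch of a basic DX11 compute dispatch: bind a compute shader plus
// an unordered access view, then launch thread groups. Everything passed in
// is assumed to be created elsewhere; this is illustration, not engine code.
#include <d3d11.h>

void RunCompute(ID3D11DeviceContext* ctx,
                ID3D11ComputeShader* cs,
                ID3D11UnorderedAccessView* uav)
{
    ctx->CSSetShader(cs, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);
    ctx->Dispatch(64, 1, 1);  // launch 64 thread groups
}
```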
 
Don't forget about other factors which led to the GameCube's third-place finish, such as the lack of online support, which really began to take off with sports titles and games like SOCOM and Halo 2. And the fact that Nintendo marketed it as a console for kids, along with the whole form factor of the console, made getting titles like GTA practically impossible. The fact that compression and multiple discs would be needed for such games was the final nail in the coffin, if it even got that far. The N64 balanced the family-friendly and "mature" gamer demographics much better. But Japan apparently didn't like that console, so Nintendo did the classic overreaction and designed the GameCube in the quirky, Japan-centric way that it came out.

Online gaming was a very small factor in that generation. It was a factor, but a very small one. Even this generation, its penetration on the most popular network (Live) was only at about 60%-66% last I recall reading. As core gamers, we are very myopic about this sometimes. It matters a hell of a lot to us, but the vast majority of the 150m PS2 owners weren't playing online.

Who is to say that we would be talking about a $499 box with the Wii U? Seeing as Sony is likely at least breaking even at $399 with a more powerful system, they could surely get the price lower than that, especially if the GamePad were nixed in favor of the proven success of a Wii Remote form factor (maybe throw in a classic controller if third parties were truly averse; I doubt this was a major issue, though).

Sony isn't breaking even. Let's just leave it at that. The way Nintendo does business (i.e. selling at a profit or a very small loss) isn't compatible with the way other consoles are made (see: MS attempting to come closer to the break-even point and getting shunned universally on this board). Could they put out a more powerful and, as a result, larger and more expensive box? Absolutely they could. But an even less market-friendly price than what they already have would make the Wii U suffer even worse. Outside of the initial holiday rush and post-rush, I think you're going to see slower uptake of the other next-generation consoles than many here, including me, would like, which will result in a fairly substantial market-wide contraction by the end of the generation.

I would've much preferred an upgraded Wii Remote to the GamePad (I hate dual analog for most of the most popular console genres) and I find it a regression in almost every respect. But the Wii Remote (and Move) were subjects of derision from "hardcore" gamers and developers for an entire generation. And still are (example: try posting "dual analog is garbage for shooters, we should be playing this with Wii Remote infrared or at least gyro Move" in the latest shooter-of-the-month thread and watch what happens). Nintendo surely heard it. So they wanted a differentiator that revolved around a dual-analog pad, for better or worse. Differentiation is what Nintendo is about now, because we don't need a third identical box. They've done differentiation in similar respects across most generations outside the GameCube, though.

I think a major issue with next gen engines is gonna be compute shader support. This is something which got a major upgrade in DX11 hardware and especially the GCN architecture Xbone/PS4 employ. Plus, I mean, there's just such a huge gap in terms of FLOPS, available RAM, RAM speed, CPU cores...if Wii U ends up getting any games specifically built around the next gen consoles, it will be a miracle.

There are plenty of examples of high-end PC games (which require at least one super-powerful GPU or two GPUs, plus gobs of memory, to max out at 1080p) that were ported to the 360 and PS3 this generation. Hell, Battlefield 3 running on Ultra at 1080p and 60fps throughout multiplayer on PC requires much more hardware grunt than even the PS4 has. Where there's a will to scale, there's a way to scale. I'm not saying it would be easy for the Wii U (as we have a good general idea of where it stands), but it's not impossible.

Where there is a will, there is a way. Unfortunately, for the Wii U, there is no will. And there probably won't be even if this thing lights on fire like the Wii, which is extraordinarily unlikely. ROI, demographics, etc. But we've derailed the thread enough.
 
If they continue to use PPC in 2016, then they are drunk as fuck.

x86 is much more ancient than PPC is.

Clock for clock, PPC is still more efficient and uses far fewer transistors, which is why RISC processors still dominate in mobile phones and embedded devices. Nintendo also uses RISC chips in its portable devices, so there is some overlap.
 
Please take a look. I continue to say this is a significant improvement. To be fair, I tried to get the best screens I could from both games at the same resolution, 720p.

I am really happy with this comparison, with Bayo 2 looking a lot better and running at 60fps. And this is Platinum's second game on the Wii U.

Bayonetta 1
[screenshots: ranQcTp.jpg, 0JFR1QJ.jpg]

Bayonetta 2
[screenshots: dK5wZQE.jpg, XYvfToH.jpg]
 
I don't think people are saying that GameCube and Nintendo 64 failed because they were powerful. I think people are merely saying that the power did not really help them be successful. I personally do not have the knowledge to claim that is the truth.
 
I don't think people are saying that GameCube and Nintendo 64 failed because they were powerful. I think people are merely saying that the power did not really help them be successful. I personally do not have the knowledge to claim that is the truth.

And in all honesty, by specs the GameCube looked very inferior to both the Xbox and the PS2 (which boasted unlit/untextured poly counts). In reality, it punched above its weight graphically (sounds familiar) and compared well with the Xbox, but in other areas it still lacked: no real surround sound, small discs, less RAM, no HD support, and poor network infrastructure (or plan). Nintendo had its priorities; they wanted an efficient console that could make them money from day one.

It was Nintendo's most recent console that was on par (or at least in the running) in terms of real-world power.

As for whether Nintendo could have made a Wii U as powerful as the PS4 if they'd left out the GamePad? Of course, but I think Sony is probably taking a larger loss than Nintendo is comfortable with (given that even with a powerful console, they still couldn't guarantee third-party support, or even consumer support, without a gimmick), so it'd probably be priced higher.

For me, I wish they'd have snuck in at around Xbone power, skipped the gimmick (Kinect), and hopefully been able to sell it at $400, so I could get my Nintendo franchises but also have the best chance at third-party support...

The GamePad? Make it a peripheral. I hate to say it, but given how all competing consoles now have secondary screens available, I think third parties will probably have alternative support for these types of features. Given that most Wii U games (even Nintendo's) support off-TV play, a single-screen experience is already possible in those games, so they might as well make the GamePad itself optional too.

Oi, why did we take such a side turn away from the GPU...
 
x86 is much more ancient than PPC is.

Clock for clock, PPC is still more efficient and uses far fewer transistors, which is why RISC processors still dominate in mobile phones and embedded devices. Nintendo also uses RISC chips in its portable devices, so there is some overlap.

So? This is a home console, one that shouldn't be bloated in price. Hell, IBM is having a shitfest trying to get down to 22nm. Not only that, but AFAIK they don't have any foreseeable future with die stacking like x86 (Intel/AMD) does.
 
So? This is a home console, one that shouldn't be bloated in price. Hell, IBM is having a shitfest trying to get down to 22nm. Not only that, but AFAIK they don't have any foreseeable future with die stacking like x86 (Intel/AMD) does.

You know what: http://www.commonplatform.com/about/
IBM uses the exact same processes as AMD for its processors.

20nm production will start later this year (products will probably reach the market next year).
[GlobalFoundries process roadmap image: glofo-roadmap.png]


FinFET will come with the 14nm process.
 
Shin, I do understand; you are basically explaining what differentiates DX10.1 from DX11, and efficiency is a crucial part of the importance you're resting on here. The problem is that the claim that the Wii U supports compute shaders has no reliable source. Or at the least, I would argue that Nomura is as good a source as the one proclaiming what the Wii U supports.

You mean the leaked tech specs that were right on everything? Yeah, I'd say that's a reliable source, more so than Buckleboy Nomura. Besides the fact that the chip the Wii U's GPU is supposed to be based on supports compute shaders.

Also the "tesselator" in the Wii U is supposed to be in the same capacity as the AMD ones prior to the 5000's series of video cards and that one is not very practical to use in gameplay situations. As we saw in how almost no PC game used it.


You are incorrect. The reason it wasn't used in PC games is that they all used DX, which did not support AMD's way of doing tessellation at the time. While it is not as efficient as newer tessellators, it was still usable in games. Plus we have Shin'en saying they are going to be using it in their next games.
 
Please take a look. I continue to say this is a significant improvement. To be fair, I tried to get the best screens I could from both games at the same resolution, 720p.

I am really happy with this comparison, with Bayo 2 looking a lot better and running at 60fps. And this is Platinum's second game on the Wii U.

Bayonetta 1
[screenshots: ranQcTp.jpg, 0JFR1QJ.jpg]

Bayonetta 2
[screenshots: dK5wZQE.jpg, XYvfToH.jpg]

What I see is Bayonetta 2 looking prettier only because the screenshots of Bayonetta 1 look really damn brown. The Bayonetta 2 screenshots have significantly more color, and more saturated color at that. Also, there is some noticeable screen tearing in the first image (right by the monster's foot).
 
What I see is Bayonetta 2 looking prettier only because the screenshots of Bayonetta 1 look really damn brown. The Bayonetta 2 screenshots have significantly more color, and more saturated color at that. Also, there is some noticeable screen tearing in the first image (right by the monster's foot).

I don't think I've ever rolled my eyes harder or seen more arbitrary reasons not to admit a game looks better... The game is clearly improved. Even looking at Bayonetta herself, the lighting model on her looks much more advanced. Not to mention the "screen tearing" you noticed, which seems to be a graphical effect of her move...
 
I don't think I've ever rolled my eyes harder or seen more arbitrary reasons not to admit a game looks better... The game is clearly improved. Even looking at Bayonetta herself, the lighting model on her looks much more advanced. Not to mention the "screen tearing" you noticed, which seems to be a graphical effect of her move...

Nooooo... that's screen tearing caused by a lack of v-sync. And yes, it does look better, but I think that might be mostly because there is a lot more color. Trust me, a palette change can make games look COMPLETELY different (remove the brown filter from Resi 4 or the blue filter from Battlefield 3).
EDIT: Shadowing is improved, but I think it would help if the person found screenshots of a similar level design to make a better comparison.
 
The Wii U has looked pretty to my eye so far, so why should I care about more graphics?

I've already played NFS, NSMBU, MH3U and many, many games, and they all look great.

Why should I say the Wii U is weak and talk about the Wii U GPU or CPU when everything I see is good?

GAFfers these days keep talking about graphics, graphics and only graphics. Please stop; the Wii U already gives us games that are great in graphics and in their IPs.

And I am okay with the other consoles being good, and better, at graphics, but the Wii U still gives us really great games.

Please stop talking about graphics and enjoy your games.

We can talk only technical here; please keep this thread for analysis.
 
As StevieP stated earlier, the Wii U is using OpenGL instead of DX, so there will be some differences in its feature set compared to DX10.1/DX11.

The Wii U doesn't fully use OpenGL. IIRC there are some GL 4.x calls that can be made, but you are mostly going to use Nintendo's API, called GX... or was it GX2? It's been a while. GX could be Nintendo's derivative of GL, though. Regardless, N7 is referenced as having DX10.1+/GL 4.x+ features.

Actually, the textures are pretty low res, so there's nothing to scale back.

There are quite a few texture compression algorithms at one's disposal on the Wii U other than S3TC.
 
Marketing mistakes have been made, yes, but apart from that your reasoning does not convince me.

The GamePad is not a "GIMMICK", just as the Wiimote wasn't one (the misuse of this word to dismiss something people don't like drives me crazy!!). It's a standard controller with extra functionality and an extra screen. It is NOT a fucking gimmick, goddamnit!

Is the PS4 controller a gimmick, because it comes with a touchpad?

Is the PS4 touchpad a gimmick? Absolutely. Due to the awkward physical arrangement of controller features (sticks in the way), the touchpad requires the player to take one hand off the controller to use it properly. The fast-paced, immersive nature of modern gaming requires control schemes that allow the player to be oblivious of the controller; the player should not think about how to use the controller to trigger an action, it should come as second nature through muscle memory. The current implementation of the touchpad is cumbersome and immersion-breaking, and I don't see it being used much outside of tertiary functions like menu browsing.

With the way we interact with games, due to the slow-paced, evolutionary development of the controller, each "radical" novelty is viewed as a gimmick at first. It ceases to be a gimmick when it has proven its worth, i.e. it has seen proper utilization that enhances the gaming experience beyond what was possible before, and the majority of the software uses it in a gameplay-enhancing way. You could definitely argue that the Wii Remote transcended the gimmick stage, since it saw proper use and triggered industry-wide imitation. The Wii U GamePad, on the other hand, has not shown much of anything that would justify its inclusion. Outside of one or two titles that also offer alternative gameplay mechanics (ZombiU), the large majority of the software, both released and upcoming, uses the tablet controller in ways I could only describe as superfluous (honk honk, watch out for Mario's kart).

So what? Many consoles had early problems during their launch periods. And no, third parties didn't "rightfully" pull their support; what kind of stupid argument is that? Just because something doesn't immediately work as intended doesn't mean you should completely abandon it. What kind of logic is that? By that logic, third parties should have abandoned the 360 with its RROD problems a long time ago.

Some people are so biased on this forum, my god...

Sorry, I made my message somewhat unclear there. The publishers didn't drop support because of the firmware issues but due to a lukewarm reception at launch that continued throughout the holiday season and finally sales tanking after the holidays. In conjunction with traditionally sluggish third party sales on Nintendo home consoles (after N64) they simply made a sensible business decision and put Wii U development on hiatus.

Firmware issues most definitely affected the sales though. After the hardware failure fiasco experienced with both the X360 and the PS3 people are simply less prone to take risks with their purchases. For example, I can easily see myself getting a Wii U sometime in the future but I refuse to do so with the "broken" launch period hardware still lingering on the store shelves. Maybe when they introduce a fixed 64 GB revision.
 
So? This is a home console, one that shouldn't be bloated in price. Hell, IBM is having a shitfest trying to get down to 22nm. Not only that, but AFAIK they don't have any foreseeable future with die stacking like x86 (Intel/AMD) does.

First off, console manufacturers don't choose instruction sets, they choose processors. MS and Sony chose Jaguar because it fit their price and performance requirements. If a PowerPC chip was available which provided the same performance for lower cost, or greater performance for the same cost, that's what they would have gone with. Hell, if there was an ARM chip which met their requirements they would have gone with that.

And it'll be the same the next time around. They'll go for the best chip they can get for the price they're willing to pay, regardless of what instruction set it might run.

Secondly, your die stacking comment is puzzling, as IBM are arguably at the forefront of chip stacking research. Their stacking process is what's going to be used in HMC, for example.
 
Eh, highly unlikely. There's no reason for them to do much more than dump the 360 code onto a wii u disc with swapped out button icons and call it a day. I forget who (might have been Shocking Alberto) but someone relatively notable hinted that the real reason it hasn't been talked about is because Activision is still negotiating some kind of perk package just as they usually do with MS and Sony.

If Ubisoft is anything to go by, we're more likely to see the PC being the lead platform, with ports to the Wii U/PS4/One and the PS3 and 360 SKUs being developed independently. It's going to be a great deal easier for developers to port between the Wii U/PS4/One due to the similarities in console architecture. The PS3 and 360 SKUs will involve the majority of floating-point work being done by the CPU; the PS4 and One are powerful enough to brute-force it, but why bother when you have a GPGPU?

The performance issues with Wii U multiplatform titles so far are very likely due to architectural differences and poor/lazy/rushed optimisation.
 
If Ubisoft is anything to go by, we're more likely to see the PC being the lead platform, with ports to the Wii U/PS4/One and the PS3 and 360 SKUs being developed independently. It's going to be a great deal easier for developers to port between the Wii U/PS4/One due to the similarities in console architecture. The PS3 and 360 SKUs will involve the majority of floating-point work being done by the CPU; the PS4 and One are powerful enough to brute-force it, but why bother when you have a GPGPU?

The performance issues with Wii U multiplatform titles so far are very likely due to architectural differences and poor/lazy/rushed optimisation.

Are devs even really utilizing GPGPU yet? For instance, we have Havok on Wii U running on the CPU, and Killzone for PS4 isn't using GPU compute for anything either, from what I've read. I am sure it will be huge moving forward, but I wouldn't be surprised if Watch Dogs still relies on the CPU for a lot of the floating-point vector code.
 
Wanted to say in response to Watch Dog posts since there were quite a few, we learned earlier this year that Ubisoft started development for the game on PC and PS4/Xbone. At the time Wii U's version was being developed separately and the PS360 versions hadn't started yet.

Was wondering why the loss of interest?

For me I believe I have gotten all I can out of looking at the die shot.

People usually make these comparisons in context. 50% faster than the competition is not the same as 50% faster than last gen consoles. Random number but you get the point.

Whoever is saying the PS4 is a generation ahead of the Xbox One because it has a few advantages is wrong. Just like you, I wouldn't consider a 50% difference to be "on par", but it's definitely in the same ballpark and the kind of difference I would expect from systems released within the same generation.

You should see my discussion in the TRUTHFACT thread. That's what Donnie is essentially talking about. Start here if you're interested.

http://www.neogaf.com/forum/showthread.php?p=64017081#post64017081
 
I've been wondering: has there been anything said about the rumor of Latte having fixed functions?

Nah, that was never a rumor, just hypothetical discussion. There doesn't seem to be anything beyond normal fixed-function components like TMUs, ROPs, and graphics engine components.
 
Are devs even really utilizing GPGPU yet? For instance, we have Havok on Wii U running on the CPU, and Killzone for PS4 isn't using GPU compute for anything either, from what I've read. I am sure it will be huge moving forward, but I wouldn't be surprised if Watch Dogs still relies on the CPU for a lot of the floating-point vector code.
According to StreamComputing (a company specialized in optimizing software), the Wii U SDK ships with OpenCL libraries at least. But it's a new paradigm, and not something most game programmers have much experience in, so it'll probably take a while to become widespread.
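As a rough illustration only (standard OpenCL 1.1 host calls, nothing Wii U specific or confirmed by Nintendo), the kind of setup such libraries would allow looks like this:

```cpp
// Hedged sketch of generic OpenCL host-side device discovery; this is the
// standard Khronos C API, not anything taken from the Wii U SDK.
#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_platform_id platform = nullptr;
    cl_device_id device = nullptr;
    clGetPlatformIDs(1, &platform, nullptr);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);

    char name[256] = {0};
    clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, nullptr);
    std::printf("GPU compute device: %s\n", name);

    // A real title would go on to create a context and queue, build kernels,
    // and move physics or post-processing work off the CPU.
    return 0;
}
```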
 
According to StreamComputing (a company specialized in optimizing software), the Wii U SDK ships with OpenCL libraries at least. But it's a new paradigm, and not something most game programmers have much experience in, so it'll probably take a while to become widespread.

Looking at what Nintendo has shown at E3, would you say the low memory bandwidth theory was way off?
 
Looking at what Nintendo has shown at E3, would you say the low memory bandwidth theory was way off?
The system sure as hell isn't bandwidth starved. In fact, I actually wouldn't be all that surprised if the Wii U had the highest aggregate memory bandwidth of all next generation systems, due to its rather strange memory subsystem.
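Just to put rough numbers on "aggregate", here's a back-of-the-envelope sketch. The DDR3 figure follows from the known 64-bit DDR3-1600 interface; the eDRAM figure assumes a 1024-bit interface at the ~550 MHz GPU clock, which is speculation rather than a confirmed spec:

```cpp
// Back-of-the-envelope aggregate bandwidth estimate for the Wii U. The eDRAM
// bus width is an assumption for illustration, not a confirmed figure.
#include <cstdio>

int main() {
    double ddr3_gbs  = 1600e6 * 8 / 1e9;           // DDR3-1600, 64-bit bus: ~12.8 GB/s
    double edram_gbs = 550e6 * (1024 / 8) / 1e9;   // assumed 1024-bit eDRAM @ 550 MHz: ~70.4 GB/s
    std::printf("DDR3 ~%.1f GB/s + eDRAM ~%.1f GB/s = ~%.1f GB/s aggregate\n",
                ddr3_gbs, edram_gbs, ddr3_gbs + edram_gbs);
    return 0;
}
```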
 
The system sure as hell isn't bandwidth starved. In fact, I actually wouldn't be all that surprised if the Wii U had the highest aggregate memory bandwidth of all next generation systems, due to its rather strange memory subsystem.

Yeah, the Bayonetta 2 alphas are insane. That's what caused me to ask. I've been wondering if the SRAM on the chip, particularly what has been determined to be the ROPs or ALUs, is limited to a select vendor. Let's say Renesas was able to use wider-bus SRAM during the chip fab process.
 
Yeah, the Bayonetta 2 alphas are insane. That's what caused me to ask. I've been wondering if the SRAM on the chip, particularly what has been determined to be the ROPs or ALUs, is limited to a select vendor. Let's say Renesas was able to use wider-bus SRAM during the chip fab process.

SRAM? I thought that it was DRAM.
 
There is no downside. At lower capacities, SRAM is better. At higher capacities, eDRAM is better. Latte has several memory pools of different capacities, so they can mix and match.

I see. So they included a small pool of SRAM along with the 32MB of eDRAM to help with the lower bandwidth of the DDR3?
 
Looks like Project C.A.R.S. might be getting treated like a worthwhile port to the Wii U. Hopefully this game shows us how a high-IQ realistic racer can perform on the Wii U. I'm excited.

With 12 months remaining until release, what is the main feature(s) you see for Project CARS as being different than its competitors, once it lands on the market? On PC we will have Assetto Corsa, rFactor2, iRacing, Game Stock Cars... on consoles there will be Gran Turismo and Forza series. What will Project CARS deliver to distinct itself among other titles?

There's a number of things… firstly Project CARS is beautiful. All the images and trailers you've seen are created by actual gamers. So they haven't been rendered or passed via a marketing department for touching up. That's why it's the most beautiful racing game out there right now.
On the Wii U, there's an opportunity there for us to 'be the Forza' on that platform. The Wii U is a great machine and the gamepad holds lots of possibilities so it's really exciting to bring a title like Project CARS to that platform and let Nintendo fans finally get their hands on a realistic car game.
 
You should see my discussion in the TRUTHFACT thread. That's what Donnie is essentially talking about. Start here if you're interested.

http://www.neogaf.com/forum/showthread.php?p=64017081#post64017081
Seems like you are discussing a subjective definition of what represents a huge jump. My view is that a 50% difference is pretty significant when comparing two machines releasing at the same time and going after the same market, especially considering the price.

Thanks for the link.
 
Wanted to say in response to Watch Dog posts since there were quite a few, we learned earlier this year that Ubisoft started development for the game on PC and PS4/Xbone. At the time Wii U's version was being developed separately and the PS360 versions hadn't started yet.

May the Digital Foundry Face-Off be ever in Wii U's favor.
 