WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

I haven't really been here for long but I always hear you mentioning "WUST days". What does that mean? Just curious.

Wii U Speculation Thread. People were quite hyped about the Wii U, and there were plenty of rumors going around and lots of discussion. People expected Wii U to be quite powerful, but...
 
I haven't really been here for long but I always hear you mentioning "WUST days". What does that mean? Just curious.

It stands for Wii U Speculation Thread(s). There were about 4 or 5 official (and very popular) speculation threads that provided a place mainly to discuss/speculate about the hardware, based on rumors or whatever bits of insider info got out. If I remember correctly - someone correct me if I'm wrong - the exact size of the Wii U's eDRAM (32MB) got 'leaked' in one of those threads.

Edit: Beaten. *shakes fist at Vermillion*
 
If this company doesn't make technically advanced games, then the same goes for Bayonetta 2.

For now, Bayonetta 2 is at a level that is ahead of what was possible in the last generation. Even The Last of Us (the highest achievement seen on current gen) is below it technically.
What pushes Bayonetta 2 above current gen?

It seems to share the compromises current gen games make to run at 60fps: 720p, no AA, pixelated shadows and even textures, baked shadows, relatively simple geometry, low quality depth of field...
 
What pushes Bayonetta 2 above current gen?

It seems to share the compromises current gen games make to run at 60fps: 720p, no AA, pixelated shadows and even textures, baked shadows, relatively simple geometry, low quality depth of field...

This is why I think the only reason people think it looks better is that it's significantly more colourful. Nothing more. They see the nicer colours and start coming up with a bunch of hollow statements trying to convince themselves and other people that it looks like a generational jump.
 
This is why I think the only reason people think it looks better is that it's significantly more colourful. Nothing more. They see the nicer colours and start coming up with a bunch of hollow statements trying to convince themselves and other people that it looks like a generational jump.

While I mostly agree, I have to say that I like this choice Platinum Games made. I always thought that previous PG games (Bayonetta, Vanquish, Metal Gear Rising) looked unnecessarily bland in terms of colors (grey/brown). I wonder if it is due to them thinking that the typical (western) 360/PS3 audience wants that "realistic" color set while the Wii audience is different.
 
This is why I think the only reason people think it looks better is that it's significantly more colourful. Nothing more. They see the nicer colours and start coming up with a bunch of hollow statements trying to convince themselves and other people that it looks like a generational jump.

It isn't a generational jump. But it looks better than the PS360 game.

USC-fan said:
No.

They are about as similar as x360 is to xbone.

Actually... no. The Wii U's architecture shares more similarities with the other 8th gen consoles than it does with the PS360. It's just that there's a lot less of it there (less memory, fewer cores, slower everything, etc). The design paradigm is similar, albeit nowhere near as efficient (MCM vs APU, memory bandwidth, etc). But what do you expect when you compare 35W with an APU that draws more than 100W? Physics never, ever lie.
 
This is why I think the only reason people think it looks better is that it's significantly more colourful. Nothing more. They see the nicer colours and start coming up with a bunch of hollow statements trying to convince themselves and other people that it looks like a generational jump.

I don't think it's evidence of a generational leap but at the very least, Bayo2 looks better than Bayo1. Textures are noticeably higher res, the framerate is much more stable around 60fps, polycount seems pushed farther between the boss and the hair-demon-thingy, the lighting is better, etc.

But ultimately I don't even think Bayo2 is the best looking Wii U game, it just performs admirably with a lot of flair.
 
I don't think it's evidence of a generational leap but at the very least, Bayo2 looks better than Bayo1. Textures are noticeably higher res, the framerate is much more stable around 60fps, polycount seems pushed farther between the boss and the hair-demon-thingy, the lighting is better, etc.

But ultimately I don't even think Bayo2 is the best looking Wii U game, it just performs admirably with a lot of flair.

Gaffers, there is still plenty of time to complain about Bayonetta's graphics, because the game releases next year and there are more than 6 months left to upgrade the graphics further, like what happened with Pikmin 3.
 
Although I'm not dismissing potential improvements due to the hardware having some strong points above the PS360, it is natural that after all the experience that Platinum gained since the first Bayonetta, doing stuff like Vanquish and MGR, the sequel would look better, even if it was on the old consoles.
 
I think Nintendo is doing fine.

You know that the difference between 1080p and 720p comes down to power and RAM.

For example, a PS4/Xbox One game released at 1080p uses 7GB of RAM.

To release the same game on Wii U at 720p, it would need about 1.75GB of RAM.

And this is the math:

1080p uses 4 times the power of 720p.

8GB of RAM is 4 times the size of 2GB.

PS4/Xbox One is 4 times the power of the Wii U.

If the graphics card in the Wii U has the customized functions that are already in the PS4/Xbox One graphics cards, then the only difference between them is that factor of 4 in power and size.

That would mean the Wii U is safe for multiplatform games, because it has those customized functions.

That is the theory.

But if we go by practice, it's clear that any game released on PS4/Xbox One at 720p will never be released on Wii U, and to release it on Wii U they would have to drop it to 576p with many techniques removed.

I think 3rd parties will never release multiplatform games on Wii U because of all the techniques that would have to be removed, and they don't want to make their games look much worse.
 
I think Nintendo is doing fine.

You know that the difference between 1080p and 720p comes down to power and RAM.

For example, a PS4/Xbox One game released at 1080p uses 7GB of RAM.

To release the same game on Wii U at 720p, it would need about 1.75GB of RAM.

And this is the math:

1080p uses 4 times the power of 720p.

8GB of RAM is 4 times the size of 2GB.

PS4/Xbox One is 4 times the power of the Wii U.

If the graphics card in the Wii U has the customized functions that are already in the PS4/Xbox One graphics cards, then the only difference between them is that factor of 4 in power and size.

That would mean the Wii U is safe for multiplatform games, because it has those customized functions.

That is the theory.

But if we go by practice, it's clear that any game released on PS4/Xbox One at 720p will never be released on Wii U, and to release it on Wii U they would have to drop it to 576p with many techniques removed.

I think 3rd parties will never release multiplatform games on Wii U because of all the techniques that would have to be removed, and they don't want to make their games look much worse.

Uhhhmmm. No.. People have been gaming at 1080p@30fps for years on PC systems less "powerful" than a Wii U, with less RAM and a worse graphics card.


edit: You have to play to your system's strengths and set a minimum acceptable target. Optimization takes you miles with proper hardware utilization.
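
Just to put numbers on the "4 times the power" part - this is my own back-of-the-envelope arithmetic, nothing from any dev, and it only counts raw pixels (it ignores shading cost, bandwidth, post-processing and so on):

```cpp
// Back-of-the-envelope check on the "4 times" claims above: raw pixel counts only.
#include <cstdio>

int main() {
    const double px720  = 1280.0 * 720.0;   //   921,600 pixels
    const double px1080 = 1920.0 * 1080.0;  // 2,073,600 pixels

    std::printf("1080p vs 720p pixel ratio: %.2fx\n", px1080 / px720); // 2.25x, not 4x
    std::printf("8GB vs 2GB RAM ratio:      %.2fx\n", 8.0 / 2.0);      // 4x
    return 0;
}
```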
 
Uhhhmmm. No.. People have been gaming at 1080p@30fps for years on PC systems less "powerful" than a Wii U, with less RAM and a worse graphics card.


edit: You have to play to your system's strengths and set a minimum acceptable target. Optimization takes you miles with proper hardware utilization.

In a way you are right.

Last time I visited my friend, who is a PC gamer, I took my Wii U with me.

I showed him my copy of Trine 2 and he was impressed by the Wii U; he said he never thought the Wii U could make games look that nice, and that the graphics came close to his PC and looked nearly the same.

Then he turned on Trine 2 on his PC with FXAA, 1080p, 60fps.

He commented that my Wii U, at only $300, shows really good graphics, while his $3000 computer shows nearly the same graphics with only a few advantages on his PC.

He was also happy with Nintendo Land & NSMB, and said the Wii U is really nice when you consider it only costs $300-350 and comes with the best controller ever.

And he wishes he had the GamePad on PC.
 
In a way you are right.

Last time I visited my friend, who is a PC gamer, I took my Wii U with me.

I showed him my copy of Trine 2 and he was impressed by the Wii U; he said he never thought the Wii U could make games look that nice, and that the graphics came close to his PC and looked nearly the same.

Then he turned on Trine 2 on his PC with FXAA, 1080p, 60fps.

He commented that my Wii U, at only $300, shows really good graphics, while his $3000 computer shows nearly the same graphics with only a few advantages on his PC.

He was also happy with Nintendo Land & NSMB, and said the Wii U is really nice when you consider it only costs $300-350 and comes with the best controller ever.

And he wishes he had the GamePad on PC.

I'm not saying that it has bad graphics. I have a Wii U. I also have a damn fine PC. If your friend paid $3k for his PC, he either has an extremely overpowered PC or he did not build it himself.

I'm not knocking the Wii U, I just wanted you to know that PCs with less "power", as you like to put it, have been doing the same games as the Wii U, but at 1080p@30fps.

I love the off-TV play, and I think the GamePad is comfortable as hell. But, more than anything, the Wii U is going to be a system that is left with a ton of exclusive titles, first-party or 3rd-party, as the multiplatform games that are coming out for the PS4/Xbox One are very soon not going to be worth the effort to scale down to Wii U capabilities. Which in turn means that the games released on the platform will most likely be tailor-made to work the strengths of the system, but as time goes by, we won't see much parity with the Sony/M$ twins either.

It'll be an interesting future for sure.


Completely unrelated, are you using a text translator? It seems English is not your first language, and I'm curious as to what it is.
 
I'm not saying that it has bad graphics. I have a Wii U. I also have a damn fine PC. If your friend paid $3k for his PC, he either has an extremely overpowered PC or he did not build it himself.

I'm not knocking the Wii U, I just wanted you to know that PCs with less "power", as you like to put it, have been doing the same games as the Wii U, but at 1080p@30fps.

I love the off-TV play, and I think the GamePad is comfortable as hell. But, more than anything, the Wii U is going to be a system that is left with a ton of exclusive titles, first-party or 3rd-party, as the multiplatform games that are coming out for the PS4/Xbox One are very soon not going to be worth the effort to scale down to Wii U capabilities. Which in turn means that the games released on the platform will most likely be tailor-made to work the strengths of the system, but as time goes by, we won't see much parity with the Sony/M$ twins either.

It'll be an interesting future for sure.


Completely unrelated, are you using a text translator? It seems English is not your first language, and I'm curious as to what it is.

Yes, my English is not good.

I think he has two Nvidia 680 GPUs.
 
What pushes Bayonetta 2 above current gen?

It seems to share the compromises current gen games make to run at 60fps: 720p, no AA, pixelated shadows and even textures, baked shadows, relatively simple geometry, low quality depth of field...
Geometry is higher than that seen in The Last of Us, and the same goes for texture quality.
I mean, this is the geometry of the Killzone 4 environment:
[image: Killzone Shadow Fall environment geometry]

From a game that runs at below 30fps and from a company that pushes the hardware much more than Platinum Games usually does.

Regarding the "pixelated" shadows... feel free to point me to a game with higher resolution shadows at 60fps on the current/past generation. In fact, the direct footage video from IGN hasn't got a single pixelated shadow, so I think the things you point out are what you want to see and not what was really shown.

Smurfman256 said:
This is why I think the only reason people think it looks better is that it's significantly more colourful. Nothing more. They see the nicer colours and start coming up with a bunch of hollow statements trying to convince themselves and other people that it looks like a generational jump.
It's funny to see how some people seem to think the WiiU has some kind of "color trick".
If this game is "more colorful" than the first one, it is not due to "magical colors" but to much better and higher quality global lighting.

TheGuardian said:
Although I'm not dismissing potential improvements due to the hardware having some strong points above the PS360, it is natural that after all the experience that Platinum gained since the first Bayonetta, doing stuff like Vanquish and MGR, the sequel would look better, even if it was on the old consoles.
The jump between Bayonetta 2 and any game made by this studio on past-gen consoles is MUCH bigger than the jump between any of the games they did there.
If Bayonetta 2 looks how it looks only because the studio has "more experience" (on what? I would ask. Because this is their second attempt on WiiU, and the first one was made mostly on unfinished development kits), the jump would be comparable to the one seen between Bayonetta and Vanquish for example, or MGR.

This game has more polygons, better textures and better lighting than The Last of Us, arguably the best looking game of the past generation, and at a constant 60fps.
 
Geometry is higher than that seen in The Last of Us, and the same goes for texture quality.
I mean, this is the geometry of the Killzone 4 environment:

The funny thing about that Killzone 4 pic was that it was from one of the most impressive scenes shown in the game so far. The geometry is really simple, the colors and textures are reused everywhere, yet it gets a pass but Bayonetta 2 doesn't? Now, don't get me wrong, the overall image still looked pretty good, but it was funny how people were quick to point out similarly simple geometry in the buildings in Bayo2.

Never mind the fact that you were speeding past those buildings on a jet, at 60fps during gameplay (vs a 30fps cutscene for KZ:SF), or that in the Gomorrah boss fight, toward the end, you see a large cityscape similar in scope to the one in that opening sequence of KZ:SF, much less monotonous textures, insane water particle FX, a ton of stuff happening on screen at any given time, or the fact that close-ups of the character models hold their own - quite easily too - against Killzone's. Nope, people chose not to see those things because a Wii U game is not allowed to be impressive, even while they praise a PS4 game that doesn't offer as much to marvel at.
 
The funny thing about that Killzone 4 pic was that it was from one of the most impressive scenes shown in the game so far. The geometry is really simple, the colors and textures are reused everywhere, yet it gets a pass but Bayonetta 2 doesn't? Now, don't get me wrong, the overall image still looked pretty good, but it was funny how people were quick to point out similarly simple geometry in the buildings in Bayo2.

Never mind the fact that you were speeding past those buildings on a jet, at 60fps during gameplay (vs a 30fps cutscene for KZ:SF), or that in the Gomorrah boss fight, toward the end, you see a large cityscape similar in scope to the one in that opening sequence of KZ:SF, much less monotonous textures, insane water particle FX, a ton of stuff happening on screen at any given time, or the fact that close-ups of the character models hold their own - quite easily too - against Killzone's. Nope, people chose not to see those things because a Wii U game is not allowed to be impressive, even while they praise a PS4 game that doesn't offer as much to marvel at.

I often wonder if the people claiming Bayonetta 2 looks like current gen even watched the whole Gomorrah fight in the demo playthroughs. Never have I seen anything looking this good on PS360. As you said, during that fight scene we see the following:

-> you fight a huge, super detailed monster which is scaling a building
-> the windows of the building reflect the sunlight and other buildings of the city
-> said city is also fully rendered in the background
-> a lot of small chips from the buildings crumble off and fly through the air
-> WHILE it appears to be raining... (which might only be water from the building's broken pipes though)

No one is claiming the Wii U is as strong as PS4/Xbox One. It clearly isn't. But can we now please all agree on the fact that the Wii U is a pretty good step beyond PS360?
 
Although I'm not dismissing potential improvements due to the hardware having some strong points above the PS360, it is natural that after all the experience that Platinum gained since the first Bayonetta, doing stuff like Vanquish and MGR, the sequel would look better, even if it was on the old consoles.

And Forza Motorsport 5 would look exactly the same on Xbox 360 due to the experience gained by the development team. But they just wanted it only on Xbox One for some strange reason.
 
I often wonder if the people claiming Bayonetta 2 looks like current gen even watched the whole Gomorrah fight in the demo playthroughs. Never have I seen anything looking this good on PS360. As you said, during that fight scene we see the following:

-> you fight a huge, super detailed monster which is scaling a building
-> the windows of the building reflect the sunlight and other buildings of the city
-> said city is also fully rendered in the background
-> a lot of small chips from the buildings crumble off and fly through the air
-> WHILE it appears to be raining... (which might only be water from the building's broken pipes though)

No one is claiming the Wii U is as strong as PS4/Xbox One. It clearly isn't. But can we now please all agree on the fact that the Wii U is a pretty good step beyond PS360?

The monster seems low-poly at times though, and the background was a bit dark. What's impressive is how colorful it was, not to mention the content, action, and gameplay.
 
No one is claiming the Wii U is as strong as PS4/Xbox One. It clearly isn't. But can we now please all agree on the fact that the Wii U is a pretty good step beyond PS360?

Depends on how you define "a pretty good step beyond". To me it's like the step between PS2 and Gamecube/Xbox. The difference can be noticeable no doubt but it's still in the same ballpark.
 
What does it mean?

This would imply that the WiiU OS doesn't have a scheduler; or a traditional one anyway.

Normally threads run for a period of time (a 'slice') then check to see if they should carry on running or handover processing time to another thread.

Time slicing allows for multitasking. The lack of a scheduler would imply that the WiiU does not pre-emptively multitask. This would imply that it would be up to the developer to add handover instructions within their code to stop something hogging a hardware thread/core.

The 'main thread' is probably a reference to the main loop (int main();) which is always running. This generally does little other than some initial dispatch instructions. If this has the same 'nice' value as other threads then a core could be sat there doing nothing much at all; and this could be what Ideaman was talking about when he said some devs had to alter their code in order to get it running across all cores.

The lack of a scheduler would mean that more CPU time is freed to process instructions undisturbed.. but it would also mean that developers have to be more aware of what they're doing in order to make the most out of the power.


/cpu_derail
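
To make the cooperative part concrete, here's a tiny sketch in plain standard C++. It's purely illustrative - nothing here is the Wii U SDK, and yield_point() is a made-up placeholder for whatever explicit handover call such a system would actually expose:

```cpp
// Illustrative only: a worker loop that must explicitly hand the core back,
// the way a cooperatively scheduled system would require. std::this_thread::yield()
// stands in for the (unknown) real handover call; nothing here is Wii U SDK code.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

std::atomic<bool> running{true};

// Hypothetical handover point: forget to call this on a cooperative core and
// the loop hogs the hardware thread forever.
void yield_point() {
    std::this_thread::yield();
}

void worker(int id) {
    while (running.load()) {
        // ... do one slice of game work (audio mix, AI tick, streaming, etc) ...
        std::printf("thread %d did a slice of work\n", id);
        yield_point(); // explicit handover instead of relying on a pre-emptive scheduler
    }
}

int main() {
    std::thread t1(worker, 1), t2(worker, 2);
    std::this_thread::sleep_for(std::chrono::milliseconds(5));
    running = false;
    t1.join();
    t2.join();
    return 0;
}
```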
 
I've asked this in another thread but didn't get a response. Perhaps it's been answered before.

Is there a comprehensive comparison between the Wii and the Wii U hardware?

What's the difference in RAM, CPU, and GPU?

What's the difference in FLOPs?

Maybe the Wii U is considerably more powerful than the Wii. In that case, the complaints from gamers would mostly stem from people who played on 360/PS3 and PC.

Otherwise, Wii gamers would probably be impressed by the jump, right?
 
I've asked this in another thread but didn't get a response. Perhaps it's been answered before.

Is there a comprehensive comparison between the Wii and the Wii U hardware?

What's the difference in RAM, CPU, and GPU?

What's the difference in FLOPs?

Maybe the Wii U is considerably more powerful than the Wii. In that case, the complaints from gamers would mostly stem from people who played on 360/PS3 and PC.

Otherwise, Wii gamers would probably be impressed by the jump, right?

People who have only ever played on a Wii, maybe. I never owned a PS360, personally, but I don't see anything beyond a PS360, as I have played both systems.
 
This would imply that the WiiU OS doesn't have a scheduler; or a traditional one anyway.

Normally threads run for a period of time (a 'slice') then check to see if they should carry on running or handover processing time to another thread.

Time slicing allows for multitasking. The lack of a scheduler would imply that the WiiU does not pre-emptively multitask. This would imply that it would be up to the developer to add handover instructions within their code to stop something hogging a hardware thread/core.

The 'main thread' is probably a reference to the main loop (int main();) which is always running. This generally does little other than some initial dispatch instructions. If this has the same 'nice' value as other threads then a core could be sat there doing nothing much at all; and this could be what Ideaman was talking about when he said some devs had to alter their code in order to get it running across all cores.

The lack of a scheduler would mean that more CPU time is freed to process instructions undisturbed.. but it would also mean that developers have to be more aware of what they're doing in order to make the most out of the power.


/cpu_derail
Very interesting, and that does seem to match up with the situation that Ideaman described. Thanks for your explanation.
I've asked this in another thread but didn't get a response. Perhaps it's been answered before.

Is there a comprehensive comparison between the Wii and the Wii U hardware?

What's the difference in RAM, CPU, and GPU?

What's the difference in FLOPs?

Maybe the Wii U is considerably more powerful than the Wii. In that case, the complaints from gamers would mostly stem from people who played on 360/PS3 and PC.

Otherwise, Wii gamers would probably be impressed by the jump, right?

Well, the Wii U is definitely a big leap over the Wii. Easily beyond one generation. No one is arguing with that.
 
Very interesting, and that does seem to match up with the situation that Ideaman described. Thanks for your explanation.

There's still a lot of presumptions within my reply, so please add as many grains of salt as is necessary until others can back it up. If the WiiU OS doesn't bear any resemblance to a unix-y system then I wouldn't believe a word I type.
 
The funny thing is that this strong rumor about the CPU being underutilized early on has not been reported by the media. That Project Cars log seems to support it.

I guess there is no fun in reporting it; the only fun comes from bashing the Wii U CPU.

Seeing as the current gen consoles' graphics cards are based on a DirectX 9 feature set, the Wii U on DirectX 10.1 and the PS4/XB1 on DirectX 11, I found the following interesting. Note, as has been clarified many times, this refers to the feature set and not the actual DirectX API, which I think only MS uses.

Disclaimer: I know it is not as simple as this, as many other factors weigh in, but I thought it was interesting nonetheless. There are other components in the PS4 and XB1, and various mysteries in the Wii U, like the tessellation unit, which we know exists but whose implementation we are not sure about.

http://www.overclock.net/t/597046/dx11-vs-dx10-vs-dx-9-pics

From there I added the comparisons below. From the pictures, IMO there is a bigger difference from DX9 to DX10 than from DX10 to DX11 in the lighting, shadows, DOF and detail.
This seems to be around the same difference I am seeing in the batch of Wii U games shown at E3: even SSB, whose Brawl roots are still obvious, looks very, very good with just the lighting, and for the WW port, being HDfied with better lighting goes a long way.

Road DX9 / DX10 / DX11: [comparison screenshots]

Dragon DX9 / DX10 / DX11: [comparison screenshots]

House DX9 / DX10 / DX11: [comparison screenshots]

Another shot from Call of Juarez: Bound in Blood comparing DX9 and DX10: [comparison screenshot]

Other DX9 vs DX10:

Crysis: [comparison screenshot]

Age of Conan: Hyborean Adventures: [comparison screenshot]


Just lighting, shadows and AA go a long way in creating more life-like scenes.
 
The lighting does indeed look better, but textures are also important, and DX10 doesn't support tessellation, does it?

What has been said around here (please correct me if I am wrong) is that earlier non DX11 cards had tessellation units but were not as efficient as those in DX11.

The Wii U has a tessellation unit and a developer (Shinen) is using it. It is not clear which tessellation unit the Wii U has as the chip is custom.
 
The funny thing is that this strong rumor about the CPU being underutilized early on has not been reported by the media. That Project Cars log seems to support it.

I guess there is no fun in reporting it; the only fun comes from bashing the Wii U CPU.

Seeing as the current gen consoles' graphics cards are based on a DirectX 9 feature set, the Wii U on DirectX 10.1 and the PS4/XB1 on DirectX 11, I found the following interesting. Note, as has been clarified many times, this refers to the feature set and not the actual DirectX API, which I think only MS uses.

Disclaimer: I know it is not as simple as this, as many other factors weigh in, but I thought it was interesting nonetheless. There are other components in the PS4 and XB1, and various mysteries in the Wii U, like the tessellation unit, which we know exists but whose implementation we are not sure about.

http://www.overclock.net/t/597046/dx11-vs-dx10-vs-dx-9-pics

From there I added the comparisons below. From the pictures, IMO there is a bigger difference from DX9 to DX10 than from DX10 to DX11 in the lighting, shadows, DOF and detail.
This seems to be around the same difference I am seeing in the batch of Wii U games shown at E3: even SSB, whose Brawl roots are still obvious, looks very, very good with just the lighting, and for the WW port, being HDfied with better lighting goes a long way.

Road DX9 / DX10 / DX11: [comparison screenshots]

Dragon DX9 / DX10 / DX11: [comparison screenshots]

House DX9 / DX10 / DX11: [comparison screenshots]

Another shot from Call of Juarez: Bound in Blood comparing DX9 and DX10: [comparison screenshot]

Other DX9 vs DX10:

Crysis: [comparison screenshot]

Age of Conan: Hyborean Adventures: [comparison screenshot]


Just lighting, shadows and AA go a long way in creating more life-like scenes.

All of these photos are telling me one thing: we really ARE reaching the point where graphics aren't improving much from generation to generation.
 
What has been said around here (please correct me if I am wrong) is that earlier non DX11 cards had tessellation units but were not as efficient as those in DX11.

The Wii U has a tessellation unit and a developer (Shinen) is using it. It is not clear which tessellation unit the Wii U has as the chip is custom.
AMD cards have had hardware tessellation since 2001. It only recently became part of the DirectX standard, with DX11.
 
From what I understand, the Wii U supports many of the key features in DirectX 11, but is actually running DirectX 10.1...?

Thinking about when Tetsuya Nomura said that both Kingdom Hearts 3 and Final Fantasy XV will not be coming to Wii U due to it not supporting DirectX 11, can someone explain to me the limitations of a DirectX 10.1 system that supports DirectX 11 effects?

My questions are:

  • Can DirectX 11 games not run on DirectX 10.1 at all?
  • What is stopping DirectX 11 games from running on DirectX 10.1?
  • What features are missing that might not allow optimizing and porting?
 
From what I understand, the Wii U supports many of the key features in DirectX 11, but is actually running DirectX 10.1...?

Thinking about when Tetsuya Nomura said that both Kingdom Hearts 3 and Final Fantasy XV will not be coming to Wii U due to it not supporting DirectX 11, can someone explain to me the limitations of a DirectX 10.1 system that supports DirectX 11 effects?

My questions are:

  • Can DirectX 11 games not run on DirectX 10.1 at all?
  • What is stopping DirectX 11 games from running on DirectX 10.1?
  • What features are missing that might not allow optimizing and porting?

Nintendo isn't using ANY DirectX.
The feature set is said to be comparable to DirectX 10.1 at least, so really not so far behind DirectX 11, because, as already said above, AMD cards already supported tessellation before, for example.

I think the feature set is the least of the problems Nintendo faces with the hardware.
 
A few comments:

Seeing as the current gen consoles' graphics cards are based on a DirectX 9 feature set, the Wii U on DirectX 10.1 and the PS4/XB1 on DirectX 11, I found the following interesting. Note, as has been clarified many times, this refers to the feature set and not the actual DirectX API, which I think only MS uses.

PS4/XB1 GPUs support DirectX 11.1 (not that it is a big difference, just saying). Even the upcoming DX11.2 will be supported.


The most obvious difference here is tessellation. Just because you have hardware that supports it doesn't mean you can use it for free though. For example, on my GPU (GTX 460), the same scene will run at half the fps with the high tessellation factor used in this screenshot.
You can achieve a decent effect for the road without having to use tessellation btw. It's called parallax occlusion mapping and can be enabled in Crysis using DX9 (example).

Another shot from Call of Juarez BiB comparing DX9 and DX10.

[comparison screenshot]

The difference in texture quality has nothing to do with DX10. They could've enabled it in DX9, they just chose not to. Not your fault of course; sadly, some comparisons are made to be misleading.

Other DX9 vs DX10
Crysis
[comparison screenshot]

This is a similar case. All features of Crysis' DX10 mode (except the motion blur solution) could be enabled in DX9 mode, just not through the main menu options but through the .ini files. It might have performed slightly better in DX10 though.

Age of Conan: Hyborean Adventures
[comparison screenshot]

This features different lighting, no doubt. Keep in mind though that god rays like this are achievable in DX9 without problems, as we have seen in many games. The developers chose not to implement it specifically for DX9 because it would've meant additional effort. Most people with a decent PC would just use the DX10 mode anyway.
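
Since parallax occlusion mapping came up: here's a rough CPU-side sketch of the core idea, just so it's clear what the technique does. On real hardware this loop runs per pixel in a shader; the heightmap, scale and layer count below are all made up for illustration, so don't read it as anyone's actual implementation:

```cpp
// Sketch of the parallax occlusion / steep parallax ray march: step along the
// projected view direction in UV space until the ray sinks below the height
// (depth) map, then use that displaced UV for texturing. Toy data, CPU-side only.
#include <cmath>
#include <cstdio>

// toy procedural depth map: 0 = surface, 1 = deepest crevice
static float sample_depth(float u, float v) {
    return 0.5f + 0.5f * std::sin(u * 40.0f) * std::cos(v * 40.0f);
}

static void parallax_uv(float u, float v,
                        float view_x, float view_y, float view_z, // tangent-space view dir, z > 0
                        float height_scale, int layers,
                        float* out_u, float* out_v) {
    const float layer_depth = 1.0f / layers;
    // UV shift per layer, from the view direction projected onto the surface
    const float step_u = (view_x / view_z) * height_scale / layers;
    const float step_v = (view_y / view_z) * height_scale / layers;

    float ray_depth = 0.0f;
    float map_depth = sample_depth(u, v);
    while (ray_depth < map_depth && ray_depth < 1.0f) {
        u -= step_u;              // march the UV along the view direction
        v -= step_v;
        map_depth = sample_depth(u, v);
        ray_depth += layer_depth; // the ray sinks one layer per step
    }
    *out_u = u;
    *out_v = v;
}

int main() {
    float u = 0.0f, v = 0.0f;
    parallax_uv(0.25f, 0.25f, 0.4f, 0.1f, 0.9f, 0.05f, 32, &u, &v);
    std::printf("displaced UV: (%.4f, %.4f)\n", u, v);
    return 0;
}
```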
 
From what I understand, the Wii U supports many of the key features in DirectX 11, but is actually running DirectX 10.1...?

Thinking about when Tetsuya Nomura said that both Kingdom Hearts 3 and Final Fantasy XV will not be coming to Wii U due to it not supporting DirectX 11, can someone explain to me the limitations of a DirectX 10.1 system that supports DirectX 11 effects?

My questions are:

  • Can DirectX 11 games not run on DirectX 10.1 at all?
  • What is stopping DirectX 11 games from running on DirectX 10.1?
  • What features are missing that might not allow optimizing and porting?

The Wii U (and the PS4 for that matter) is not running DirectX. It's solely a Microsoft thing. Hence "X" Box. The Wii U and PS4 run on a (probably custom) version of OpenGL which supports basically all the stuff DirectX does as well.

EDIT: beaten lol
 
The other part of the Project Cars stuff I wanted to understand was:

-Set default memory alignment on WiiU to be 64 bytes due to FS_IO_BUFFER_ALIGN requirement

Does that actually divulge anything or is this again already a given based on what we know?

Any clever people have time to explain? :)
 
From what I understand, the Wii U supports many of the key features in DirectX 11, but is actually running DirectX 10.1...?

Thinking about when Tetsuya Nomura said that both Kingdom Hearts 3 and Final Fantasy XV will not be coming to Wii U due to it not supporting DirectX 11, can someone explain to me the limitations of a DirectX 10.1 system that supports DirectX 11 effects?

My questions are:

  • Can DirectX 11 games not run on DirectX 10.1 at all?
  • What is stopping DirectX 11 games from running on DirectX 10.1?
  • What features are missing that might not allow optimizing and porting?
No. There's a BIG misunderstanding around this. What happened is that the WiiU's GPU (Latte) started development in 2008 from an R700, which is a DX10.1 compliant card with some other features not included in the DX10.1 standard.
Because we don't know what we currently have, the safe bet is to assume that the WiiU GPU will be at least an R700.

That being said, EVERY SINGLE GPU ON THE MARKET STARTS DEVELOPMENT FROM AN OLDER DESIGN, and not the previous year's design like some people say. A GPU is not designed, tested, tuned, tested again, fabricated, distributed and commercialized in 1 year.

The HD5870 was the first ATI DX11 GPU, and some people here act as if it was made from scratch. The reality is that this GPU not only was really similar to the HD4870 (R700) in terms of design, but it was also designed from an even older design (maybe the R600 or even the R500). And the same goes for the R700: in terms of architecture it was an evolution of the R600, but this doesn't mean that its development didn't start until the R600 was finalized and commercialized.

We have rumours that the final silicon of the WiiU GPU wasn't finished until early 2012. We know that the devkits of the console weren't at version 1.0 (that is, complete) until AFTER the console was released (December 2012, confirmed by Nintendo), which gives more strength to the rumour of the GPU not being finalized until very late.

So in other words, starting from the DX10.1-plus-some-other-things that was the R700, Nintendo has had 4 whole years to modify, redesign and adapt the GPU design. I insist, the final silicon wasn't finished until early 2012, which means that all the tuning (that is, increasing or decreasing frequencies and testing its reliability under multiple scenarios) was done during the first half of that year.

What this means is nothing and everything. If the design wasn't finalized until that late, and if, when examined, the GPU turned out to have a completely hand-made layout to squeeze every single mm^2 of chip area, a LOT of changes could have been made.
Maybe this GPU has features not included in the DX11 standard, and maybe it lacks others of those DX11 features.

Only time will tell us what this console is good at, but judging by recent games like Bayonetta 2 or X, it is at least a noticeable jump from Xbox 360 and PS3, and only developers know how far it can go.
 
Memory alignment (64 bytes) only means that parts of the system need starting addresses to be n*64 to perform optimally, e.g. cache lines, buffers, etc. Misalignment can cost you 5-10% on data access. IOW, proper alignment gives you a little performance boost.
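
For anyone wondering what that looks like in practice, here's a small sketch in generic C++17. The 64-byte figure is just taken from the FS_IO_BUFFER_ALIGN line in that log; nothing below is an actual Wii U SDK call:

```cpp
// Allocate an I/O buffer on a 64-byte boundary and verify the alignment.
// Generic C++17; the 64 mirrors the FS_IO_BUFFER_ALIGN requirement quoted above.
#include <cstdint>
#include <cstdio>
#include <cstdlib>

constexpr std::size_t kIoAlign = 64; // cache-line / DMA-friendly boundary

int main() {
    // std::aligned_alloc requires the size to be a multiple of the alignment
    void* buf = std::aligned_alloc(kIoAlign, 4096);
    if (!buf) return 1;

    const auto addr = reinterpret_cast<std::uintptr_t>(buf);
    std::printf("buffer at %p, address %% 64 = %zu\n",
                buf, static_cast<std::size_t>(addr % kIoAlign)); // prints 0
    std::free(buf);
    return 0;
}
```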
 
No. There's a BIG misunderstanding around this. What happened is that the WiiU's GPU (Latte) started development in 2008 from an R700, which is a DX10.1 compliant card with some other features not included in the DX10.1 standard.
Because we don't know what we currently have, the safe bet is to assume that the WiiU GPU will be at least an R700.

That being said, EVERY SINGLE GPU ON THE MARKET STARTS DEVELOPMENT FROM AN OLDER DESIGN, and not the previous year's design like some people say. A GPU is not designed, tested, tuned, tested again, fabricated, distributed and commercialized in 1 year.

The HD5870 was the first ATI DX11 GPU, and some people here act as if it was made from scratch. The reality is that this GPU not only was really similar to the HD4870 (R700) in terms of design, but it was also designed from an even older design (maybe the R600 or even the R500). And the same goes for the R700: in terms of architecture it was an evolution of the R600, but this doesn't mean that its development didn't start until the R600 was finalized and commercialized.

We have rumours that the final silicon of the WiiU GPU wasn't finished until early 2012. We know that the devkits of the console weren't at version 1.0 (that is, complete) until AFTER the console was released (December 2012, confirmed by Nintendo), which gives more strength to the rumour of the GPU not being finalized until very late.

So in other words, starting from the DX10.1-plus-some-other-things that was the R700, Nintendo has had 4 whole years to modify, redesign and adapt the GPU design. I insist, the final silicon wasn't finished until early 2012, which means that all the tuning (that is, increasing or decreasing frequencies and testing its reliability under multiple scenarios) was done during the first half of that year.

What this means is nothing and everything. If the design wasn't finalized until that late, and if, when examined, the GPU turned out to have a completely hand-made layout to squeeze every single mm^2 of chip area, a LOT of changes could have been made.
Maybe this GPU has features not included in the DX11 standard, and maybe it lacks others of those DX11 features.

Only time will tell us what this console is good at, but judging by recent games like Bayonetta 2 or X, it is at least a noticeable jump from Xbox 360 and PS3, and only developers know how far it can go.

Wow, thank you for such informative information; it settles one of the questions that has been going around in my mind for a while about how GPUs succeed one another.

I remember an interview with a developer of Ninja Gaiden 3: Razor's Edge saying that the number of enemies on screen was decreased on Wii U - due to the slow CPU or something... But then the comments in a lot of those "slow CPU" threads were saying that it uses a combined CPU+GPU approach, and developers just aren't utilizing it right (same with Mass Effect, I think).

X and Bayonetta 2 show that the system is capable of gigantic open worlds and large enemies, so I'm sure Platinum Games and Monolith Soft know what they are doing.

I find most of the graphics talk really annoying: when people complain about models and textures, the game isn't magically going to look like that; developers and graphic designers create them and spend a lot of effort and time on them. It's like a quality over quantity thing - it depends on what they want to spend their resources on.
 
A few comments:



PS4/XB1 GPUs support DirectX 11.1 (not that it is a big difference, just saying). Even the upcoming DX11.2 will be supported.



The most obvious difference here is tessellation. Just because you have hardware that supports it doesn't mean you can use it for free though. For example, on my GPU (GTX 460), the same scene will run at half the fps with the high tessellation factor used in this screenshot.
You can achieve a decent effect for the road without having to use tessellation btw. It's called parallax occlusion mapping and can be enabled in Crysis using DX9 (example).



The difference in texture quality has nothing to do with DX10. They could've enabled it in DX9, they just chose not to. Not your fault of course; sadly, some comparisons are made to be misleading.



This is a similar case. All features of Crysis' DX10 mode (except the motion blur solution) could be enabled in DX9 mode, just not through the main menu options but through the .ini files. It might have performed slightly better in DX10 though.



This features different lighting, no doubt. Keep in mind though that god rays like this are achievable in DX9 without problems, as we have seen in many games. The developers chose not to implement it specifically for DX9 because it would've meant additional effort. Most people with a decent PC would just use the DX10 mode anyway.

Thanks for your input. Very interesting indeed; it makes the whole diminishing returns thing make more sense, and makes other components like the CPU, available memory and memory bandwidth all the more important in building up the differences between consoles. So IQ, resolution and FPS play a major role too, but that is mostly determined by power.

So can I ask, what are the advantages of a DX10-compliant card over a DX9 one, and of a DX10.1-compliant card over a DX10 one?

One thing I read is that they can do the same "techniques" in a more efficient way. Certainly important, but I am more curious to know if there is some technique that cannot be done in DX9.

And again, please remember this is a graphics card that is compliant with the DX10.1 feature set, not one that uses the DX10.1 API.

Also, what do Shader Model 4.1 and geometry shaders bring to the table? Or better, can someone translate the following into dummy mode, if that's even possible:

From http://en.wikipedia.org/wiki/Microsoft_Direct3D
The DirectX 10 SDK became available in February 2007.[9]

New features:

Fixed pipelines[10] are being done away with in favor of fully programmable pipelines (often referred to as unified pipeline architecture), which can be programmed to emulate the same.
New state object to enable (mostly) the CPU to change states efficiently.
Shader model 4.0 enhances the programmability of the graphics pipeline. It adds instructions for integer and bitwise calculations.
Geometry shaders, which work on adjacent triangles which form a mesh.
Texture arrays enable swapping of textures in GPU without CPU intervention.
Predicated Rendering allows drawing calls to be ignored based on some other conditions. This enables rapid occlusion culling, which prevents objects from being rendered if they are not visible or are too far away to be visible.
Instancing 2.0 support, allowing multiple instances of similar meshes, such as armies, or grass or trees, to be rendered in a single draw call, reducing the processing time needed for multiple similar objects to that of a single one.[11]

Direct3D 10.1

Direct3D 10.1 was announced by Microsoft shortly after the release of Direct3D 10 as a minor update. The specification was finalized with the release of November 2007 DirectX SDK and the runtime was shipped with the Windows Vista SP1, which is available since mid-March 2008.

Direct3D 10.1 sets a few more image quality standards for graphics vendors, and gives developers more control over image quality.[12][13] Features include finer control over anti-aliasing (both multisampling and supersampling with per sample shading and application control over sample position) and more flexibilities to some of the existing features (cubemap arrays and independent blending modes). Direct3D 10.1 level hardware must support the following features:

Mandatory 32-bit floating point filtering.
Mandatory support for 4x anti-aliasing
Shader model 4.1

The bit about 4x anti-aliasing being mandatory is interesting.
 
Thanks for your input. Very interesting indeed; it makes the whole diminishing returns thing make more sense, and makes other components like the CPU, available memory and memory bandwidth all the more important in building up the differences between consoles. So IQ, resolution and FPS play a major role too, but that is mostly determined by power.

So can I ask, what are the advantages of a DX10-compliant card over a DX9 one, and of a DX10.1-compliant card over a DX10 one?

One thing I read is that they can do the same "techniques" in a more efficient way. Certainly important, but I am more curious to know if there is some technique that cannot be done in DX9.

And again, please remember this is a graphics card that is compliant with the DX10.1 feature set, not one that uses the DX10.1 API.

Also, what do Shader Model 4.1 and geometry shaders bring to the table? Or better, can someone translate the following into dummy mode, if that's even possible:

From http://en.wikipedia.org/wiki/Microsoft_Direct3D


The bit about 4x anti-aliasing being mandatory is interesting.

Was anything significant changed between DirectX 11 and 11.1?
 
That and 11.2 seem to be centered more on the API itself, but someone more techie than me could chime in.

so no new features like, say, hardware-based ambient occlusion (if that wouldn't cause a card to melt) or improved displacement mapping (modern displacement mapping makes things look too jagged and unnatural)?
 
Am I the only one who finds it funny that people expected games back in the early PS3 era to be using ray tracing while the only film to use it up to this point was Monsters University?
 
Once again, devs with actual hardware experience chime in on the Wii U's hardware strength.

http://www.nintendolife.com/news/20..._nintendo_is_struggling_to_explain_its_appeal

We are still considering SteamWorld Dig for Wii U, but I also think we have other ideas that suit the Wii U better. The Wii U is a very powerful console, but I think Nintendo has a hard time explaining to consumers who should buy it and why. It becomes a paradox: if not enough people buy the console, developers are not going to flock to it – which means that it takes longer for the console to establish itself.


http://www.nintendolife.com/news/20...rt_and_is_interested_in_future_wii_u_projects

Julius: We have really enjoyed working with the Wii U hardware. It was rather easy to port our modern proprietary engine to it, and it does pack the punch to bring to life some really awesome visuals. The Wii U is a very modern console with a lot of RAM which helped us out a lot during development. The hardware capabilities have improved quite a lot from the original Wii, and the Wii U is a truly powerful console. The console is definitely more powerful than the Xbox 360 and PS3.

It was mostly praising how much better the eshop was than Xbla and PSN, but there was also this comment.
With Splot though the situation is a bit more complex as the technology we’re using isn’t the same as for Trine 2: Director’s Cut, so there’s a lot of technical work in getting Splot running on “high end” consoles like the Wii U.

Of course, people are more than likely just going to ignore everything Frozenbyte says, "again", like how they ignored all the graphical enhancements in Trine 2: DC (which were directly stated as not being possible on the 360/PS3 without downgrades), in favor of promoting games that show some form of inconsistency or inferiority to the last gen console versions as absolute examples of the Wii U's power and peak potential, without even remotely considering any reasoning other than those inferiorities being the result of incapability on the Wii U's part - which is still prevalent despite examples to the contrary.

I'm placing my chips on the Wii U GPU being absolutely no less than 200% stronger than the last gen consoles. The GPU is clearly more than a few notches ahead. How much of that power is put to use is always up to the dev, but the better ones like Frozenbyte will definitely give us something nice to look at. I'm also fairly certain that the GPU contains 2 polygon computation units. It is impossible to ignore the increase in geometrical detail at this point.
 