WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Just because Bayonetta 2 looks better than the first one does not mean the gap in power between the 360 and Wii U is the same as the visual difference.

The first Bayonetta came out in January 2010, and visuals have progressed since then; 2013 visuals are better than 2010 visuals on the 360/PS3.

Even if B2 had been made on PS3/360, do people believe it would not have had any visual improvements?

Seeing how games like The Last of Us, God of War: Ascension, Halo 4 and Skyrim have made large visual advancements, saying B2 looks better purely because of the Wii U's extra RAM and more advanced GPU does not make sense.

Imagine, in a different dimension, that Metal Gear Solid V was in development for 360/PS3 and ran fine on those platforms, but for some crazy reason those versions were scrapped and the game became Wii U/X1/PS4 only, with the Wii U version very similar to the scrapped 360/PS3 version. If this happened and no one knew about the scrapped 360/PS3 versions, everyone would say, "look at the difference between MGS5 and MGS4, the Wii U is clearly visually superior to 360/PS3 and this proves it."

This is the same sort of logic being applied to the B1 vs B2 situation. It may be different from my imaginary scenario, but the idea that the visual advancements are due to the Wii U's hardware is by no means certain, and it looks even less likely when you compare it to the visual progression made by other games across the industry.
 

Man, the lighting and shading, as well as the textures, look better. Deal with it; your comments are just confusing.

In the end, what all this goes to show is that when Uncharted 3, MGS V on current gen, GTA V and The Last of Us are compared to next-gen launch games, people are not going to see the difference. It's the same situation with the Wii U, but of course the argument now will be that the PS4/XB1 are new architectures and need time to be maxed out, yet god forbid anyone mention the Wii U, that shit is supposedly maxed the hell out already.

And as some are saying here, the PS4 and XB1 being very similar to PC is somewhat worrying, looking at certain games that are not looking or running as expected; they should be taking advantage of the similar architecture.

People jump all over the place, when in fact the Wii U seems like a fair step up from current gen, as if they held stock in the other platforms. The Wii U seems to be strategically placed somewhere in the middle, and the games shown are looking promising with more development time pending. Just deal with it.
 

You obviously did not read my post.

And deal with what, exactly? All I said was that game engines advance; don't they?

Your faith that B2's visual advancement is due to the Wii U's hardware seems far from clear cut, for the reasons stated in my first post. If you think I'm wrong, then please share your reasoning.

And I don't know why you're bringing up the X1 and PS4; this thread is about discussing the Wii U's visual capabilities.
 

The thing is, we have to go with what we have and not what could have been. In a previous post I said that, Bayo being a 2010 game, one would believe a sequel on the HD twins would have looked better, but this is just guesswork; it was not made, so we will never know for sure. We have Bayo 1 and Bayo 2, and huge differences between the two.

The games that were shown at E3 are looking very promising and I am happy with that. Until a very capable studio builds a realistic game on the Wii U and takes advantage of its architecture and memory subsystem, many here will never be convinced. The chances of this happening are slim, though.
 
Even if Wii U were a good step up, honestly the difference would be nearly invisible and it would still look very close to last gen.

It's difficult to see a huge improvement from PS360 to PS4/X1, and those are vastly more powerful than the Wii U is. I mean, the improvement there is still visible, but it's more difficult to see than the step from Gen 6 to Gen 7.

I don't expect to see much variance with the Wii U, tbh.
 
Building off my post comparing Latte to Brazos, I believe we can now positively identify another two blocks on Latte. It's funny how taking a break from things for a while can allow you to see things you never noticed before when you come back to them.

Strangely enough, it appears that the UVD is actually Block B on Latte. Meanwhile Block G is the block labeled "TAVC" on Brazos. It seems highly likely that this is another block related to video transcoding - possibly AMD's "Advanced Video Processor" or some offshoot. The giveaway is the group of small SRAM blocks on the right side of Block B, which borders on Block G - just as with the UVD and TAVC blocks on Brazos. And actually, it does make sense for UVD to be close to the CPU interface, since the CPU still lends a hand in decoding (although greatly reduced thanks to UVD).

This also shakes up my previous labeling slightly. It now seems that Latte's Block H is Primitive Assembly/Vertex Setup/Tessellation.

Also, after some more thought last night, I am going with Latte's Q Blocks being Local Data Shares. I had previously speculated M might be a combined LDS block, but my current reasoning is that we should be seeing something symmetrical if that were the case, since Wii U has two SIMD cores, and there should be a LDS for each. Thus, it should be one symmetrical block or two separate identical blocks. I'm going with the latter as "M" was the only other real candidate based on SRAM amount and position. The position of Q makes some sense as the LDS blocks on RV770 actually sat between the SPUs and texture units/caches. The amount of LDS would be 16kB each. The extra SRAM pools in the Qs beyond what we would count towards the 16kB seem to be a common finding if you look at Llano, Brazos, etc.
 

What would you say the chip is closer to: RV770 or Brazos?
 

That's sort of a tough question to answer. Latte is really its own beast, although certain aspects of Latte are shared with both of the chips you name. Unlike Brazos, Latte is not meant to function as a SoC (despite the presence of the ARM core) - it still relies on a 60x interface out to Espresso rather than an on-chip bus like Brazos has to its CPU cores. Still, it has in common with Brazos a focus on low power draw and is a much more lean design than RV770.

It's hard to say for sure, because we lack more clear pictures of the RV770 die, but it appears as if most of the blocks on Latte resemble that die more. The shaders are apparently of the DX10.1 variety and the SRAM banks in the SPU blocks indicate that they contain 20 shaders each - more like RV770 than Brazos. Similarly, the TMU blocks appear to be similar to those on RV770, and it also has interpolation hardware similar to that chip. The actual floor plan of the blocks, however, is quite different, although I wouldn't exactly call it similar to Brazos.

The memory subsystem is still somewhat of a mystery. I have not found a block that I can firmly identify as the memory controller/north bridge, although I have had a few ideas. We can say for certain that it's nothing like RV770 in that regard, as Latte likely has one DDR3 memory controller rather than the former's four GDDR3 controllers. Still, I wonder if it doesn't also have a Graphics Memory Controller, as Brazos, Llano, and Trinity feature. That's still something to work out, although I am not sure if it's possible. Then again, the UVD location just kind of jumped out at me from nowhere, so perhaps I or someone else will have a similar eureka moment, haha.
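As a purely illustrative aside, here is a back-of-envelope comparison of those two memory setups. The DDR3-1600/64-bit figures for the Wii U and the GDDR3 numbers for an RV770 board are assumptions based on commonly reported specs, not anything read off the die shot:

```python
# Back-of-envelope peak bandwidth math (assumed figures, not confirmed specs).

def peak_bandwidth_gbs(transfer_rate_mtps, bus_width_bits):
    """Peak bandwidth in GB/s: transfers per second times bytes per transfer."""
    return transfer_rate_mtps * 1e6 * (bus_width_bits / 8) / 1e9

# Wii U main memory: assumed DDR3-1600 on a 64-bit bus behind a single controller.
wii_u_ddr3 = peak_bandwidth_gbs(1600, 64)      # ~12.8 GB/s

# RV770 board: assumed GDDR3 around 2000 MT/s on a 256-bit bus (4 x 64-bit controllers).
rv770_gddr3 = peak_bandwidth_gbs(2000, 256)    # ~64 GB/s

print(f"Wii U DDR3 (assumed)  : {wii_u_ddr3:.1f} GB/s")
print(f"RV770 GDDR3 (assumed) : {rv770_gddr3:.1f} GB/s")
# The size of that gap is one reason the 32 MB eDRAM pool matters so much for Latte.
```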
 
Totally off-topic, but I just played through Spec Ops: The Line...and now I feel cold and empty inside; as if the lingering flame of my belief in the inherent goodness of man were extinguished in a shower of blood, water, and white phosphorus...
 
Just because Bayonetta 2 looks better than the first one does not mean the gap in power between the 360 and Wii U is the same as the visual difference.

The first Bayonetta came out in January 2010, and visuals have progressed since then; 2013 visuals are better than 2010 visuals on the 360/PS3.

Even if B2 had been made on PS3/360, do people believe it would not have had any visual improvements?

Seeing how games like The Last of Us, God of War: Ascension, Halo 4 and Skyrim have made large visual advancements, saying B2 looks better purely because of the Wii U's extra RAM and more advanced GPU does not make sense.

Imagine, in a different dimension, that Metal Gear Solid V was in development for 360/PS3 and ran fine on those platforms, but for some crazy reason those versions were scrapped and the game became Wii U/X1/PS4 only, with the Wii U version very similar to the scrapped 360/PS3 version. If this happened and no one knew about the scrapped 360/PS3 versions, everyone would say, "look at the difference between MGS5 and MGS4, the Wii U is clearly visually superior to 360/PS3 and this proves it."

This is the same sort of logic being applied to the B1 vs B2 situation. It may be different from my imaginary scenario, but the idea that the visual advancements are due to the Wii U's hardware is by no means certain, and it looks even less likely when you compare it to the visual progression made by other games across the industry.

You know why Bayonetta 2 is a good comparison? Because it's made by the same developer.

Bayonetta came out 4 1/2 years into the Xbox 360's lifespan, from a very competent team with a lot of experience on the 360 at that point (360 = lead platform).

Bayonetta 2 is coming from the same team, on their second Wii U game, and is coming out a bit over a year into the Wii U's lifespan.

If Bayonetta 2 is already looking this much better than Bayonetta 1, how will games look when the Wii U has been on the market for 4 1/2 years, compared to PS3/360? (Which may be dead in 4 1/2 years, but still...)

But if time is such a factor for you, there's still NFS: Most Wanted U, which looks significantly improved on Wii U. There's a Digital Foundry comparison for it as well.

Also, "X", Mario Kart and SSB4 are looking really damn good, with the last two confirmed to run at 60fps and SSB4 even at 1080p60. I think we have more than enough proof now that the Wii U is more capable than PS360.
 
Bayonetta came out 4 1/2 years into the Xbox 360's lifespan, from a very competent team with a lot of experience on the 360 at that point (360 = lead platform).

It was their first 360 game. And it was no looker.

Anarchy Reigns and MGS:R are pretty ugly too. Platinum don't make technically advanced games.
 
That's sort of a tough question to answer. Latte is really its own beast, although certain aspects of Latte are shared with both of the chips you name. Unlike Brazos, Latte is not meant to function as a SoC (despite the presence of the ARM core) - it still relies on a 60x interface out to Espresso rather than an on-chip bus like Brazos has to its CPU cores. Still, it has in common with Brazos a focus on low power draw and is a much more lean design than RV770.

It's hard to say for sure, because we lack more clear pictures of the RV770 die, but it appears as if most of the blocks on Latte resemble that die more. The shaders are apparently of the DX10.1 variety and the SRAM banks in the SPU blocks indicate that they contain 20 shaders each - more like RV770 than Brazos. Similarly, the TMU blocks appear to be similar to those on RV770, and it also has interpolation hardware similar to that chip. The actual floor plan of the blocks, however, is quite different, although I wouldn't exactly call it similar to Brazos.

The memory subsystem is still somewhat of a mystery. I have not found a block that I can firmly identify as the memory controller/north bridge, although I have had a few ideas. We can say for certain that it's nothing like RV770 in that regard, as Latte likely has one DDR3 memory controller rather than the former's four GDDR3 controllers. Still, I wonder if it doesn't also have a Graphics Memory Controller, as Brazos, Llano, and Trinity feature. That's still something to work out, although I am not sure if it's possible. Then again, the UVD location just kind of jumped out at me from nowhere, so perhaps I or someone else will have a similar eureka moment, haha.

Very interesting. I need to do some more research on Brazos, but I find it very intriguing how different this GPU is from the original R700 processors. I appreciate your continued analysis of what we are looking at.

It was their first 360 game. And it was no looker.

Anarchy Reigns and MGS:R are pretty ugly too. Platinum don't make technically advanced games.

The "ugly" thing is very subjective. It is more accurate to say that Platinum focuses on fast/insane gameplay.
 
It was their first 360 game. And it was no looker.

Anarchy Reigns and MGS:R are pretty ugly too. Platinum don't make technically advanced games.

I have to respectfully disagree with you. I was surprised at the sheer amount of detail that they packed into Metal Gear Rising visually. I was stunned at a freaking tree stump because of the resolution of the textures on it.
 
It was their first 360 game. And it was no looker.

Anarchy Reigns and MGS:R are pretty ugly too. Platinum don't make technically advanced games.
If this company doesn't make technically advanced games, then the same goes for Bayonetta 2.

For now, Bayonetta 2 is at a level that is ahead of what was possible last generation. Even The Last of Us (the highest achievement seen on current gen) is below it technically.

The jump from PS360 to Wii U is there, and it will only grow bigger as time passes.

Very interesting. I need to do some more research on Brazos, but I find it very intriguing how different this GPU is from the original R700 processors. I appreciate your continued analysis of what we are looking at.
It's not intriguing at all. The GPU's silicon design was finalized in early 2012; the fact that the design started from an R700 base means absolutely nothing beyond that.

ALL THE GPUs YOU CAN FIND ON THE MARKET STARTED DEVELOPMENT FROM AN OLDER DESIGN. The R800 started out as at least an R600, or something even older, and was developed alongside the R700, from which it inherited all the advancements made to that architecture.

On consoles this is a bit more variable and depends entirely on the level of customization a company wants for its chips. As we can see, the Latte GPU is surely a custom GPU made specifically for Nintendo, perhaps with support for some functions that wouldn't be feasible in a PC GPU, since PC games are coded so that they work with everything. We have plenty of examples of that; take the tessellation unit included on ATI GPUs that wasn't even touched until DX11 and the HD 5000 series.
 
I never tire of watching that Gomorrah boss fight. The game has to be pushing a ridiculous number of polys there, unless they've used tessellation to disguise a lower-poly model. And there are still people denying that Bayonetta 2, X, Mario Kart 8, Donkey Kong and Super Smash Bros. look like 'next gen' titles. The mind really does boggle. :o/

Edit: And why the fuck don't these useless gaming journalists ask Sega or Nintendo whether Sega is going to port the first game to the eShop at a budget price..? It would be a no-brainer question imo, and even if Sega haven't considered doing this it might give them the idea to do it!!!

I don't think you quite understand how tessellation works. Tessellation doesn't "disguise" low-poly models, it literally increases the poly count of the model dynamically. So if a game is pushing a lot of polygons via tessellation, then the game is simply pushing a lot of polygons.

In the end, tessellation doesn't actually increase the polys a system can push; it's just a more effective means of the LOD scaling that games already use anyway.
 
Of course tessellation increases the polys a system can push; in fact, that is its purpose at this point.
It can't be used directly as a free, perfect dynamic LOD system because of artifacts that tessellated objects display at different tessellation levels: the transition from, say, tessellation factor 5 to factor 7 is not smooth, so the object visibly deforms.

Nowadays tessellation is used to increase the polycount a GPU can display on screen. Because the model is tessellated on the GPU, you do vertex transformation and manipulation BEFORE tessellating, and in doing so you save both bandwidth (because you transfer a low-poly model from memory) and GPU processing power per object, so you can draw more simple objects that then become high-poly.
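A rough sketch of that bandwidth argument, with entirely made-up mesh sizes and an arbitrary amplification factor (none of these numbers come from a real game):

```python
# Rough illustration of why tessellating a coarse mesh saves memory bandwidth.
# All figures below are invented for illustration, not real game data.

BYTES_PER_VERTEX = 32            # e.g. position + normal + UV in a packed layout

def mesh_bytes(vertex_count, bytes_per_vertex=BYTES_PER_VERTEX):
    """Approximate bytes of vertex data fetched from memory for a mesh."""
    return vertex_count * bytes_per_vertex

detailed_vertices = 400_000      # hypothetical fully detailed mesh stored in RAM
control_vertices  = 25_000       # hypothetical coarse control mesh
amplification     = 16           # extra triangles generated per patch on the GPU

full_detail_traffic = mesh_bytes(detailed_vertices)
tessellated_traffic = mesh_bytes(control_vertices)   # only the coarse mesh crosses the bus

print(f"Detailed mesh fetched from RAM            : {full_detail_traffic/1e6:.1f} MB")
print(f"Coarse mesh fetched, amplified x{amplification} on-GPU : "
      f"{tessellated_traffic/1e6:.1f} MB")
# On-screen polygon counts end up comparable, but the memory traffic (and the
# per-object vertex transform work done before tessellation) is far lower.
```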
 
I think the links below show what next gen can do, and that's enough.

Wii U Graphics card tech demo
http://www.youtube.com/watch?v=BzquM5Td6bM

PS4 Graphics card tech demo
http://www.youtube.com/watch?v=4gIq-XD5uA8

A lot of people here will tell you not to get your hopes up about that tech demo. While that is running on an RV770, it's probably also running with way more memory than the Wii U has. I want to say faster memory as well, but the Wii U's built-in eDRAM/SRAM helps with speed. So faster, but not WAY faster, I suppose. Though I will say the Bird demo (on the show floor) looks close to the RV770 tech demo (though people here are saying not even to expect graphics at the level of the Bird demo).

By the way, http://www.youtube.com/watch?v=7fzkHGch12c

There is the full RV770 tech demo. Edit: OK, just watched the full demo again. Wow, that is no longer that advanced-looking or impressive. A lot of those textures were really awful, and while some of the effects were OK, it's nothing all that impressive. I think the Wii U can pull that off.
 
Of course tessellation increases the polys a system can push; in fact, that is its purpose at this point.
It can't be used directly as a free, perfect dynamic LOD system because of artifacts that tessellated objects display at different tessellation levels: the transition from, say, tessellation factor 5 to factor 7 is not smooth, so the object visibly deforms.

Nowadays tessellation is used to increase the polycount a GPU can display on screen. Because the model is tessellated on the GPU, you do vertex transformation and manipulation BEFORE tessellating, and in doing so you save both bandwidth (because you transfer a low-poly model from memory) and GPU processing power per object, so you can draw more simple objects that then become high-poly.

Well, I stand corrected. Though at the very least my point stands that tessellation is not simply "disguising low-poly models".
 
A lot of people here will tell you not to get your hopes up about that tech demo. While that is running on an RV770, it's probably also running with way more memory than the Wii U has. I want to say faster memory as well, but the Wii U's built-in eDRAM/SRAM helps with speed. So faster, but not WAY faster, I suppose. Though I will say the Bird demo (on the show floor) looks close to the RV770 tech demo (though people here are saying not even to expect graphics at the level of the Bird demo).

By the way, http://www.youtube.com/watch?v=7fzkHGch12c

There is the full RV770 tech demo. Edit: OK, just watched the full demo again. Wow, that is no longer that advanced-looking or impressive. A lot of those textures were really awful, and while some of the effects were OK, it's nothing all that impressive. I think the Wii U can pull that off.

Ah, I see, you've edited your comments. Just wanted to say that I don't see why the Wii U should be able to pull that off.

Anyway, I'm sure games will look better than that Bird Demo, simply because consoles have always surpassed their own early tech demos. Every time. I don't know why it should be different this gen.

And for a console that can output graphics like Bayonetta 2, X, and MK8 only a year or so into its life cycle, has to render some games twice (TV + GamePad), AND only runs on 33 watts, it's fucking impressive!
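For a rough sense of what rendering to both screens adds, here is a simple pixel-throughput count; the 720p/60 TV target and the GamePad's 854x480 panel at 60fps are assumptions for illustration, not measurements of any specific game:

```python
# Back-of-envelope pixel throughput for dual-screen rendering.
# Resolutions and frame rates are assumptions for illustration, not measurements.

def pixels_per_second(width, height, fps):
    return width * height * fps

tv_only     = pixels_per_second(1280, 720, 60)                 # TV image alone
tv_plus_pad = tv_only + pixels_per_second(854, 480, 60)        # TV + GamePad screen

print(f"720p60 TV only       : {tv_only/1e6:.1f} Mpixels/s")
print(f"TV + GamePad (60fps) : {tv_plus_pad/1e6:.1f} Mpixels/s "
      f"({tv_plus_pad/tv_only:.2f}x the fill/shading work, very roughly)")
```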
 
Well, I stand corrected. Though at the very least my point stands that tessellation is not simply "disguising low-poly models".
Yes, you were accurate in that regard. Tessellation creates new polygons from a low-polygon mesh, so once this mesh is tessellated what you have is a high-poly mesh.
 
Ah, I see, you've edited your comments. Just wanted to say that I don't see why the Wii U should be able to pull that off.

Anyway, I'm sure games will look better than that Bird Demo, simply because consoles have always surpassed their own early tech demos. Every time. I don't know why it should be different this gen.

And for a console that can output graphics like Bayonetta 2, X, and MK8 only a year or so into its life cycle, has to render some games twice (TV + GamePad), AND only runs on 33 watts, it's fucking impressive!

Go through this thread and you'll see claims that the Wii U won't be able to reach the Bird Demo quite a few times.
 
I have a few questions for anyone who cares to give some feedback.

1) On Latte, are we expecting dedicated compute units that work independently of the shaders? I'm kind of getting this vibe, but wasn't sure if it had been discussed already.

This would be a bit different from the PS4/XBO, which - from what I understand - have CUs that can be used for either shading or compute tasks, with the trade-off left up to the programmer. I'm asking because the notion (or rather, one assumption) was that the Wii U's GPGPU functions would take away vital processing power from graphical tasks, but that wouldn't be the case if it has dedicated compute units, right?

Which leads to my next questions...

2) If Latte does have dedicated compute-only units, does that mean - in theory - that they would be better tuned for that sole purpose? Could the Wii U get by with just 2 or 4 highly specialized units?

3) Which blocks (on Latte) could most likely house these units? Or where would make the most sense (next to the shaders? close to the eDRAM? etc.)? Can we take any hints from other GPUs?
 
This is technically CPU related, but I'm not sure if that thread is still alive...

Just reading through some changelogs and things, I noticed the mention of:
Set main thread to (normal priority + 1) so that normal pri threads get cpu time as WiiU threads don't time slice.

Not sure if that's new info or not, thought I'd check here. I got excited by an actual definitive statement regarding Wii U hardware!
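As a purely conceptual illustration of what "don't time slice" implies, here is a toy model. This is not Wii U SDK code, and the "larger number = lower priority" convention is just an assumption for the sketch:

```python
# Toy model of a strict-priority scheduler with no time slicing between
# equal-priority threads. NOT Wii U SDK code; the numeric convention that a
# larger value means lower priority is only an assumption for this sketch.

def pick_next(threads):
    # The scheduler always runs the best-priority ready thread. With no time
    # slicing, ties never rotate: the current pick keeps winning until it blocks.
    return min(threads, key=lambda t: t["prio"])

main_busy = {"name": "main loop",     "prio": 16}   # render loop that rarely blocks
worker    = {"name": "worker thread", "prio": 16}   # background job at normal priority

# Same priority: the busy main loop is always chosen, so the worker starves.
print(pick_next([main_busy, worker])["name"])        # -> main loop

# Demote main to (normal priority + 1): the worker now outranks it and gets CPU time.
main_demoted = dict(main_busy, prio=17)
print(pick_next([main_demoted, worker])["name"])     # -> worker thread
```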
 

Where did you see this?
 
I have a few questions for anyone who cares to give some feedback.

1) On Latte, are we expecting dedicated compute units that work independently of the shaders? I'm kind of getting this vibe, but wasn't sure if it had been discussed already.

This would be a bit different from the PS4/XBO, which - from what I understand - have CUs that can be used for either shading or compute tasks, with the trade-off left up to the programmer. I'm asking because the notion (or rather, one assumption) was that the Wii U's GPGPU functions would take away vital processing power from graphical tasks, but that wouldn't be the case if it has dedicated compute units, right?

Which leads to my next questions...

2) If Latte does have dedicated compute-only units, does that mean - in theory - that they would be better tuned for that sole purpose? Could the Wii U get by with just 2 or 4 highly specialized units?

3) Which blocks (on Latte) could most likely house these units? Or where would make the most sense (next to the shaders? close to the eDRAM? etc.)? Can we take any hints from other GPUs?

The GPGPU comments from Iwata are a bit quizzical. At the risk of sounding inflammatory, I believe he was in damage-control mode there, as at the time the CPU was very much being painted in a negative light by the media/devs. As you mention, there is not much power to spare when we look at the shaders. That still doesn't rule out implementing compute shaders in circumstances where they are the most convenient way for programmers to achieve what they are aiming for. Devs, as always, will just have to balance their resources.

It's very unlikely that there are any blocks on the GPU dedicated to compute - besides the 2 LDS blocks, rather large constant cache, and robust GDS (all things I'm fairly confident in identifying at this point). Those pools of super-fast memory, in combination with the 32 MB eDRAM and the fast connection to the CPU, should enable devs to get more bang for their buck out of the limited number of shaders by freeing up clock cycles. I don't believe we are looking at anything extremely exotic in the architecture besides the use of eDRAM. The AMD representative who was interviewed around E3 specified that Nintendo basically licensed the AMD tech (in contrast with MS/Sony, who worked closely with AMD to develop/modify their system architectures), so I wouldn't expect any significant customizations in the unified shader architecture beyond some tweaks to the base R700 ISA and the GX2 API.
 
The AMD representative who was interviewed around E3 specified that Nintendo basically licensed the AMD tech (in contrast with MS/Sony, who worked closely with AMD to develop/modify their system architectures), so I wouldn't expect any significant customizations in the unified shader architecture beyond some tweaks to the base R700 ISA and the GX2 API.

Doesn't this mean that Nintendo could have just customized it on their own? Otherwise, wouldn't the die shot have more familiar elements?
 

It wouldn't make much sense to take an AMD design and not ask for their input in customizing it when that help is certainly available. Although I'm sure Nintendo did get some support in the implementation of components.

Besides, the die shot does have quite a few familiar elements. How do you think we have been able to identify so many of them? The SRAM looks a bit different because we are comparing Renesas' SRAM to that of GF and TSMC. The main differences are the overall floor plan (the arrangement of the blocks on the die) and the inclusion of the large eDRAM/SRAM pools, plus the south bridge, which includes the ARM core, DSP, USB control, etc.
 
Might be worth looking at the vid again. I figured he meant more that, since Sony and MS have it all on a single die with AMD CPUs, there was a lot of integration work to do.
 
Go through this thread and you'll see claims that the Wii U won't be able to reach the Bird Demo quite a few times.

Who made this claim, and based on what? I've seen no such claims from any of the people who are actually knowledgeable about the hardware and moving the discussion forward.

Anyone can come into this thread and claim anything. Was it substantiated with facts, details or any form of logical evidence? I have a feeling that whoever said that was one of the same people who claimed that every Wii U game could be done on the PS3/360, even when the game is an enhanced version of a PS3/360 game or when the devs say otherwise. Going by the rumors, the hardware in the Wii U at the time they made that demo was less powerful than what is in the final hardware. The RAM rumors especially were all lower, aiming for 768 MB or 1.5 GB at the time.
 
http://mynintendonews.com/2013/07/0...ble-on-wii-u-but-the-console-is-low-priority/

It was sad the way many users (whom I shall not name) rushed to support and defend, by any means necessary, the claim that the Wii U could not run the Frostbite engine after the statement was made that EA wasn't bringing it.

They attacked every aspect of the hardware they could, but just like with CryEngine 3, it turned out to have absolutely nothing to do with hardware strength and everything to do with EA's ego.


Also, I was going to bring this up before but I couldn't find any major source to support it. Isn't Bayonetta 2 supposed to be 1080p 60FPS? http://gamingbolt.com/list-of-next-...cted-to-be-running-at-1080p-resolution-60-fps


Just the graphical jump from Bayonetta 1 to Bayonetta 2 already shows far more than "just a 360 with some extra effects". http://www.neogaf.com/forum/showpost.php?p=66720001&postcount=6901 The 360 ran B1 at 720p at >48 FPS. If it pulls it off at 1080p on the Wii U, then that pretty much destroys everyone's arguments against the hardware's strength. It would also finally allow us to put this 160 shader claim to rest (as it should be already). It seems too improbable at this point.
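For scale, here is the raw arithmetic behind the 720p-vs-1080p comparison and the "160 shader" talk; the ~550 MHz clock and 2 FLOPs per ALU per cycle are assumptions based on commonly reported figures, not confirmed specs:

```python
# Rough arithmetic behind the resolution and shader-count claims.
# The 550 MHz clock and FLOP-per-ALU figures are assumptions, not confirmed specs.

px_720p  = 1280 * 720
px_1080p = 1920 * 1080
print(f"1080p / 720p pixel ratio: {px_1080p / px_720p:.2f}x")   # 2.25x the pixels

# Back-of-envelope shader throughput if the 160-shader reading were right:
shaders   = 160
clock_ghz = 0.55                       # commonly reported ~550 MHz (assumption)
gflops    = shaders * 2 * clock_ghz    # 1 multiply-add = 2 FLOPs per ALU per cycle
print(f"160 ALUs @ {clock_ghz*1000:.0f} MHz: ~{gflops:.0f} GFLOPS")
```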
 

Didn't a ton of people remark that the engine is making a mobile appearance, and that the Wii U version wouldn't be up to snuff compared with the other console versions, but would be entirely possible nonetheless?



I'll believe Bayo 2 at 1080p/60fps when it's officially announced. It would be fantastic if it were, and in line with my hopes for what the Wii U could do way back during the WUST days. But ultimately, the Wii U is still a slouch, and closer to the Xbox 360/PS3 than to the Xbox One/PS4.
 
Do you guys have any idea why the Wii U version of Sonic & All-Stars Racing Transformed runs at sub-HD resolution (1024×576) @30 fps?

Seeing that the PS3 version runs at the highest resolution, I wouldn't be surprised if that were the lead platform (though it does have its own set of hiccups). They may have used Cell/Xenon for some of the dynamic landscape transformations and water physics. The tracks have a lot going on.

Despite the lower res, the game is great on Wii U. Worth it for off-TV play alone.
 
Do you guys have any idea why the Wii U version of Sonic & All-Stars Racing Transformed runs at sub-HD resolution (1024×576) @30 fps?

I will use math, as it is easy to understand.

time + resources = money

A new platform with a low userbase will get minimum effort. I have the game and it looks good IMO and runs just fine; great game.

edit: beaten

The Wii U hardware is being put to good use in the coming games. I feel very happy with what has been shown so far. The second half of the year has a lot of games coming, and I intend to keep playing my PS3 as well. I will wait on the other next-gen consoles or a PC until late 2014 or early 2015; I might have trouble justifying it with the games I have yet to beat on PS3 and the coming games on Wii U.

The problem is the userbase: if the Wii U does not end up selling well, then ports of third-party games might never become good indicators of the hardware's capabilities, with some exceptions. I'm really hoping AC4 and Watch Dogs do not disappoint; Project CARS might also be interesting and a good fit for the Wii U, as it lacks that type of game. If Project CARS runs very well on Wii U, I will pick it up.
 
I'll believe Bayo 2 at 1080p/60fps when it's officially announced. It would be fantastic if it were, and in line with my hopes for what the Wii U could do way back during the WUST days. But ultimately, the Wii U is still a slouch, and closer to the Xbox 360/PS3 than to the Xbox One/PS4.

I haven't really been here for long, but I always hear you mentioning the "WUST days". What does that mean? Just curious.
 
A lot of people here will tell you not to get your hopes up about that tech demo. While that is running on an RV770, it's probably also running with way more memory than the Wii U has. I want to say faster memory as well, but the Wii U's built-in eDRAM/SRAM helps with speed. So faster, but not WAY faster, I suppose. Though I will say the Bird demo (on the show floor) looks close to the RV770 tech demo (though people here are saying not even to expect graphics at the level of the Bird demo).

By the way, http://www.youtube.com/watch?v=7fzkHGch12c

There is the full RV770 tech demo. Edit: OK, just watched the full demo again. Wow, that is no longer that advanced-looking or impressive. A lot of those textures were really awful, and while some of the effects were OK, it's nothing all that impressive. I think the Wii U can pull that off.

Wait, what? People are saying that we shouldn't expect games to look as good as the Garden demo (which was most likely running on half-speed dev kits)? Where does that reasoning come from?
 

I was more referring to people doubting the demos were running on dev kits at all, as opposed to actual PCs.
 