WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

I'd like to thank everyone who has properly contributed to this thread. It has been very informative, and I'm glad that bringing up Latte's possible connection to Brazos has resulted in some good discussions.

Yeah, people got a little carried away while the tech heads were gone.

Brazos actually does do some GPGPU work. In fact, a university decided to use this processor and ended up nearly doubling everyday compute speed thanks to some homemade drivers that used the GPU for GPGPU tasks whenever possible.

I'm not sure Renesas is really going to make that big of a difference; that is part of the point I'm trying to make. Since they are so good at memory modules, is it also possible that the register count is wrong and they are using larger modules there? They have always looked bigger to me compared to the Brazos shots. Even if they aren't, there is the possibility that the cache doesn't need to be as big, thanks to the chip not doing everything a GPU normally does (mostly on the video side) and having all the extra eDRAM and SRAM elsewhere on the die.

Power consumption is the last thing I would use to argue for only 160 ALUs. Between the targeted low clock, the mature 40nm process in a low-power variant, and the fact that it sits on an MCM, it could be saving quite a bit of power, and none of the power consumption guesses I've seen take those things into account.

MEM0 could be used for a number of things; my guess is that it is being used exactly like the GameCube's texture memory, as the eDRAM should be large enough to cache the CPU pool as needed, leaving the lower-latency SRAM for latency-sensitive, GPU-specific work. As I mentioned above, I also think some of this memory could stand in for some register space, for things like GPGPU instruction caching or other tasks where you want quick access but don't need it to be strictly immediate.

Disclaimer: Although I do some light programming, this is a hobby for me and I do not have a computer science degree, nor do I code in OpenCL.

That story about Brazos is cool. Do you have a link to it? It seems to be a processor that punches above its weight. How does its performance compare to the other GPUs in the R700 series?
 
He was one of the biggest evangelists of the Wii U's hardware superiority over the 360/PS3, which is what got him his tag: he claimed to be able to count the polygons in Wii U games, proving that they were far superior to PS3/360 titles.
Uh, what? That's not what happened.

Someone was questioning whether the Zelda demo was real because of the CPU, but I chimed in by asking: what if the Zelda demo wasn't actually special (i.e. other games surpassed it)? I don't recall ever saying the demo (or any Wii U game) pushed more polys on this forum.
 
Yes, now this is what I like to see. Progressive analysis.

After reading Fourth Storm's explanation, 160 is starting to seem more plausible, though I still can't fully agree with it until it is better explained why the units are physically larger than standard 20-ALU blocks.

I'm still leaning toward it being something along the lines of 24 or 28 ALUs. Technological efficiency can go in many directions.

This is interesting though. http://www.youtube.com/watch?v=00IRlhoH0XQ&feature=youtu.be The lighting in the Wii U version is definitely a few notches up. Like at :45. There are more particle effects present in the Wii U version than in the 360 version. As far as the overall "brightness" goes, it's hard to tell whether that is a difference in the footage or in the balance of lighting. I remember Trine 2 DC was originally much brighter than the PS3/360 version.

What I'm really curious about now is GX2. Given the DX11 back and forth, I believe it's safe to say that the Wii U doesn't support DX11 but is capable of producing much the same effects, in the same way the GC didn't support Shader Model 1.1 but could still produce the same texture effects.

We have Nomura saying the Wii U isn't getting KH3 due to a lack of DX11 (which is obviously a bogus excuse, just like when Deep Silver said the Wii U didn't support the Dead Island 2 game engine, only for the engine's creators to come out and say they had made it work earlier and that Deep Silver was lying), but then we have Project C.A.R.S. listed as using DX11 features specifically.

What is GX2? Maybe someone can find some scrap documentation lying around for it.
 
We have Nomura saying the Wii U isn't getting KH3 due to a lack of DX11 (which is obviously a bogus excuse)...

Squeenix may be having a hissy-fit with Nintendo...how well did Kingdom Hearts 3D sell?
 
This is interesting though. http://www.youtube.com/watch?v=00IRlhoH0XQ&feature=youtu.be The lighting in the Wii U version is definitely a few notches up. Like at :45. There are more particle effects present in the Wii U version than in the 360 version...
The 360 one looks sharper and less washed out but other than that they look pretty much identical.
 
We have Nomura saying the Wii U isn't getting KH3 due to a lack of DX11 (which is obviously a bogus excuse)...

Square never said that; it was a translation error, plus people changing what he said.
 
No, my biggest reasons for rejecting anything other than 160 shaders at this point are the register banks within the shaders, the TMU count, and the TDP.

Wasn't the biggest reason for your opinion on the number of TMUs based on your opinion of 160 shaders? Or am I misremembering?

I believe TDP is a strong argument in favor of a 160 ALU part. Wii U has been shown to commonly draw 33 watts during gameplay without any USB drain or anything like that. Let's look at the identically clocked Redwood LE, a 320 ALU card: 39 watts. Without breaking down all the individual components for the Wii U and this comparison card, let's just use a little common sense. What is going to output more heat? A graphics card or an entire system? The answer is obvious. All the hardware in the Wii U is going to negate the difference of any mundane graphics card circuitry and then some! Thus, I just can't see how it could be 320 ALUs.

Now, the 160 ALU parts seem a lot more reasonable. 18 watts for a 625 MHz card sounds about right, as I'd peg Wii U's GPU at ~15 watts after doing some rough guesstimation for RAM, CPU, disc drive, etc.
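(For anyone who wants to sanity-check that kind of guesstimate, here is a rough sketch of the arithmetic in Python. Every per-component wattage below is an illustrative assumption, not a measured figure; only the ~33W system draw comes from the posts above.)

# Rough Wii U power-budget sketch. All per-component wattages are
# illustrative assumptions, not measurements; only the ~33 W system
# draw is taken from the discussion above.
measured_system_draw_w = 33.0

assumed_draw_w = {
    "cpu": 5.0,          # assumption: tri-core Espresso
    "ddr3": 3.0,         # assumption: 2GB DDR3
    "disc_drive": 3.0,   # assumption
    "misc_board": 7.0,   # assumption: wifi, flash, regulators, fan, losses
}

gpu_estimate_w = measured_system_draw_w - sum(assumed_draw_w.values())
print(f"Implied GPU power budget: ~{gpu_estimate_w:.0f} W")  # ~15 W with these assumptions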

A few things to consider here. That Redwood LE you're referring to uses GDDR5, which is extremely power hungry compared to DDR3. So much so that in gameplay power consumption tests, a 512MB GDDR5-based Redwood LE (320 ALUs, 16 TMUs, 550 MHz) uses the same amount of power (37W) at absolute peak as a 1GB GDDR3-based Redwood Pro (400 ALUs, 20 TMUs, clocked at 650 MHz). On average over 12 in-game tests, the Redwood Pro actually used less power (33W average for the 512MB GDDR5 Redwood LE vs 31W for the 1GB DDR3 Redwood Pro).

Also, 39W is a theoretical TDP, whereas 33W is a measurement taken from Wii U launch games (almost certainly not the highest it will hit, even if it might end up close to it).

Although I agree 320 seems very unlikely, I'm still very unsure about 160 as well. I wonder if we'll ever know for a fact.
 
Wasn't the biggest reason for your opinion on the number of TMUs based on your opinion of 160 shaders? Or am I misremembering?

Nope, it was based on very similar SRAM structures found in the 8 TMU Brazos die.


A few things to consider here. That Redwood LE you're referring to uses GDDR5, which is extremely power hungry compared to DDR3. So much so that in gameplay power consumption tests, a 512MB GDDR5-based Redwood LE (320 ALUs, 16 TMUs, 550 MHz) uses the same amount of power (37W) at absolute peak as a 1GB GDDR3-based Redwood Pro (400 ALUs, 20 TMUs, clocked at 650 MHz). On average over 12 in-game tests, the Redwood Pro actually used less power (33W average for the 512MB GDDR5 Redwood LE vs 31W for the 1GB DDR3 Redwood Pro).

Also, 39W is a theoretical TDP, whereas 33W is a measurement taken from Wii U launch games (almost certainly not the highest it will hit, even if it might end up close to it).

Although I agree 320 seems very unlikely, I'm still very unsure about 160 as well. I wonder if we'll ever know for a fact.

Mind linking to your source for those figures? The effective power draw of GDDR5 chips is tough to find info on, but from what I've seen, we're not talking about a significant increase over DDR3 - maybe a few watts, but nothing that's going to close the gap we're looking at here. The numbers you see in whole system benchmarks are not good indicators, because if there is an increase in performance, that means the GPU is working harder (the RAM is getting more out of it), and that will produce more heat. You would have to find benchmarks with exactly identical performance in order to observe the true difference between DDR3 and GDDR5 power draw. (edit: saw where you got your figures and it's for the graphics card only, so nevermind the part about whole system benchmarks)

I have no reason to believe AMD's reference number to be merely theoretical, nor do I have any reason to believe that Wii U is going to all of a sudden start sucking a significantly higher amount of juice. Its max TDP appears to be about 33 watts, granted no USB ports are in use.

Edit: Were you getting your figures here? If so, you'll see the max TDP for the HD 5550 was 40 watts (and that's what we're comparing for now, the max TDP). The max TDP of the HD 5570 with GDDR3 and a higher clock was 50 watts - 10 watts higher. Pretty much as expected.
 
Nope, it was based on very similar SRAM structures found in the 8 TMU Brazos die.




Mind linking to your source for those figures? The effective power draw of GDDR5 chips is tough to find info on, but from what I've seen, we're not talking about a significant increase over DDR3 - maybe a few watts, but nothing that's going to close the gap we're looking at here. The numbers you see in whole system benchmarks are not good indicators, because if there is an increase in performance, that means the GPU is working harder (the RAM is getting more out of it), and that will produce more heat. You would have to find benchmarks with exactly identical performance in order to observe the true difference between DDR3 and GDDR5 power draw.

I have no reason to believe AMD's reference number to be merely theoretical, nor do I have any reason to believe that Wii U is going to all of a sudden start sucking a significantly higher amount of juice. Its max TDP appears to be about 33 watts, granted no USB ports are in use.

Edit: Were you getting your figures here? If so, you'll see the max TDP for the HD 5550 was 40 watts (and that's what we're comparing for now, the max TDP). The max TDP of the HD 5570 with GDDR3 and a higher clock was 50 watts - 10 watts higher. Pretty much as expected.

and what of a 5550 with regular old DDR3 RAM; what's the TDP of that card? Also, aren't the TDPs of PC graphics cards higher because the RAM and GPU aren't the only things on the PCB getting powered?
 
This is interesting though. http://www.youtube.com/watch?v=00IRlhoH0XQ&feature=youtu.be The lighting in the Wii U version is definitely a few notches up. Like at :45. There are more particle effects present in the Wii U version than in the 360 version.
The lighting models look just about identical, and as far as I can tell there's close parity in particle counts (at :45, the WiiU seems to have its particles distributed a little wider, but they're less dense). If there are any major differences in the source footage besides colour balance, that video doesn't have the bitrate to show it.
 
and what of a 5550 with regular old DDR3 RAM; what's the TDP of that card? Also, aren't the TDPs of PC graphics cards higher because the RAM and GPU aren't the only things on the PCB getting powered?

I haven't been able to find the power consumption for the DDR3 variant. I'd be interested if someone can dig it up. Been a long day for me, and I'm getting tired.

What you mentioned about what else is on the PCB is exactly my point. If we're comparing max TDPs of the Wii U and a graphics card, the Wii U also has a PCB, and there's a lot more on it. Setting aside RAM and fan, there's the CPU, 2x flash storage, 2x wifi (one for the Gamepad), the disc drive, and probably a few other things I'm forgetting as well. Whatever additional overhead the Radeon HD 5550 is sporting, it is rendered completely moot in light of what's present on the Wii U motherboard.
 
I haven't been able to find the power consumption for the DDR3 variant. I'd be interested if someone can dig it up. Been a long day for me, and I'm getting tired.

What you mentioned about what else is on the PCB is exactly my point. If we're comparing max TDPs of the Wii U and a graphics card, the Wii U also has a PCB, and there's a lot more on it. Setting aside RAM and fan, there's the CPU, 2x flash storage, 2x wifi (one for the Gamepad), the disc drive, and probably a few other things I'm forgetting as well. Whatever additional overhead the Radeon HD 5550 is sporting, it is rendered completely moot in light of what's present on the Wii U motherboard.

38 watts under full load...

http://www.kitguru.net/components/graphic-cards/zardon/his-hd5550-silence-1gb-ddr3-review/6/ The card doesn't even have a fan.

The correct range would be 12-20 watts at most for the GPU in the Wii U, and 20 is on the very high side. Doing the math, it looks like it needs to be under 15 watts for the GPU.

320 is completely ruled out. It has to be 160 ALUs, or some weird number like 30 ALUs per block, but that wouldn't match the other things on the chip. 160 is confirmed...

I'd also like to point out that it's a myth that GDDR5 uses a ton more power than DDR3. GDDR5 is based on DDR3. Comparing the GDDR5 RAM in the PS4 to the DDR3 in the Xbox One, you are looking at a couple of watts at most, and that is with 8GB!
 
38 watts under full load...

http://www.kitguru.net/components/graphic-cards/zardon/his-hd5550-silence-1gb-ddr3-review/6/ The card doesn't even have a fan.

The correct range would be 12-20 watts at most for the GPU in the Wii U, and 20 is on the very high side. Doing the math, it looks like it needs to be under 15 watts for the GPU.

320 is completely ruled out. It has to be 160 ALUs, or some weird number like 30 ALUs per block, but that wouldn't match the other things on the chip. 160 is confirmed...

I'd also like to point out that it's a myth that GDDR5 uses a ton more power than DDR3. GDDR5 is based on DDR3. Comparing the GDDR5 RAM in the PS4 to the DDR3 in the Xbox One, you are looking at a couple of watts at most, and that is with 8GB!

http://www.amd.com/us/products/desk...hd-5000/hd-5550/Pages/hd-5550-overview.aspx#2

If this is really the Wii U GPU, that means it supports DX11 & 320 Stream Processing Units, Shader Model 5.0, 352 GigaFLOPS.....!!!??

It's good power for a 720p resolution target!!
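(For what it's worth, the 352 GigaFLOPS figure on that page is just shader count x 2 ops per clock (multiply-add) x clock speed. A quick Python sketch of that arithmetic, using the ~550 MHz clock discussed in this thread as an assumption, shows why the shader count matters so much:)

# Peak GFLOPS = ALUs * 2 ops per clock (multiply-add) * clock in GHz
def gflops(alus, clock_mhz):
    return alus * 2 * clock_mhz / 1000.0

print(gflops(320, 550))  # 352.0 -> the HD 5550 figure from AMD's page
print(gflops(160, 550))  # 176.0 -> a hypothetical 160-ALU Latte at ~550 MHz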
 
Oh, lawdy. Fourth Storm and USC-fan find the shader count. Nice detective work you two.

The GPU seems to be very impressive for what it is. Great engineering.

http://www.amd.com/us/products/desk...hd-5000/hd-5550/Pages/hd-5550-overview.aspx#2

If this is really the Wii U GPU, that means it supports DX11 & 320 Stream Processing Units, Shader Model 5.0, 352 GigaFLOPS.....!!!??

It's good power for a 720p resolution target!!

We've long come to the conclusion that the GPU you linked to isn't the one in the Wii U.
 
That's an interesting fact about Brazos. Just goes to show what 80 shaders can do!

On the topic of the fab house, I suppose this is another place where we differ. You don't see it as having much of an impact on component size, and I can see it as potentially having a huge one. Despite the maturity of whatever process (40nm, 45nm...) they used, the fact is that they don't have much experience with Radeon blocks whereas TSMC do. And this is nothing against Renesas. It just doesn't seem to be a familiar market for them, whereas I do know they are quite successful in making microcontrollers for cars and such. Even if each slice of SRAM in the GPRs was 8 kB (and not 4 kB, as they appear to be), it wouldn't work, because the threads need independent access to a certain number of them. It would require dual-ported SRAM. Any other type of memory trying to make up for a lack of register space/access would be too slow. Register access needs to be like lightning. Unless there is just a drastic reworking of the VLIW architecture, which I don't see as being very likely at all.

I believe TDP is a strong argument in favor of a 160 ALU part. Wii U has been shown to commonly draw 33 watts during gameplay without any USB drain or anything like that. Let's look at the identically clocked Redwood LE, a 320 ALU card: 39 watts. Without breaking down all the individual components for the Wii U and this comparison card, let's just use a little common sense. What is going to output more heat? A graphics card or an entire system? The answer is obvious. All the hardware in the Wii U is going to negate the difference of any mundane graphics card circuitry and then some! Thus, I just can't see how it could be 320 ALUs.

Now, the 160 ALU parts seem a lot more reasonable. 18 watts for a 625 MHz card sounds about right, as I'd peg Wii U's GPU at ~15 watts after doing some rough guesstimation for RAM, CPU, disc drive, etc.

80 shaders for GPGPU is great, yes; it's like having 80 very small CPU cores doing the work, just very limited work. 160 should be fine at replacing Xenon code, and I still haven't ruled out that Latte is a 160 shader part; I simply don't know, and I find it funny when other people "do".

You can't compare TDPs to actual power draw. TDP is the top wattage the chip is designed to handle, and it almost never uses that much; it's almost always 2/3 to 3/4 of this number, meaning this card's GDDR5 version would run between 26 and 30 watts on average. Stuff like FurMark can overheat a card and push it beyond its TDP, but that doesn't happen in normal use like you'd see on a console, especially one targeting a specific low clock, which usually brings an efficiency gain with a lower power draw for the same performance. The same is true for the MCM, which could shave another 1-2 watts thanks to sharing the package with a CPU that is also drawing less than 8 watts (they consume some of that power together). The e6760, for instance, uses 35 watts, comes with 1GB of GDDR5, and has 480 shaders clocked at 600MHz. I love how everyone says this is a binned part, yet it is used in millions of devices worldwide, from PoS units to casino machines.

I actually think we could be looking at the e6460, which is a 160 shader part. Nintendo would likely want to use an existing chip and modify it; in 2009 this chip wasn't ready, so they built their kits on the (at the time) high-end AMD HD 4800 series GPU and simply downclocked it until it was usable. I'm almost positive there are no TEVs in the shader blocks, thanks to the Wii U Iwata Asks where the engineers who built the Wii U said that they integrated the GameCube's design into the AMD GPU they used. This could be an interesting aspect, though, because they may have had to add ALUs to the shader blocks to emulate TEV freely. Also, something that has been bugging me is that the GameCube had 3 rasterizers. I'm not an expert at this, so maybe someone like blu could answer this: does that cause a problem with the idea that Latte only has 1? Or could it be hiding a couple more without the full five-block duality that the HD 6970 and HD 7000 series display?

Fourth Storm, while you have answered some questions definitively, the truth is you also are not an expert on AMD designs, and you could simply be seeing what you want to see. That is why I keep bringing up the idea that you could be wrong, because, to be frank, you probably are about quite a few of the things you think you know. I'm sure plenty of people who might have insider information have helped you through PMs, but you need to realize that most of those people are probably fake, and the people that do have an idea might not actually know anything. I've been fine with the idea of 160 shaders in Latte for a long time; it really doesn't make any difference whether it is 160, 256 or 320, but it is quite possible that we have this wrong and it is a bit different.

Also, just because the chip is VLIW (if it is), it doesn't mean that it has to be VLIW5 or VLIW4. VLIW has been around a lot longer than the HD 2000 series; it's not even ATI's idea afaik, it's just another instruction format. Nintendo could have worked with AMD and their tech team to have something that is a bit different, which could explain why the blocks are 90% bigger than 20-ALU blocks, which is a huge, huge difference. And while you are sure that Renesas is going to make it bigger, there are certainly things they should be able to make smaller, especially since these are hand layouts, which usually improve on density over machine layouts like we see in Brazos. You are just too sure to be taken seriously, and maybe that is because you know some things you can't divulge, but since I don't know those things, it just looks silly that you are so sure.

PS: let's throw out the notion that Wii U games are even using all of the GPU's resources and thus pushing its limits as far as TDP goes. I would guess we will see 37-38 watts becoming the norm in the future, which could give 3+ watts to the GPU. I really don't think 360 ports are going to push an AMD GPU with even 160 shaders to its full power potential.

To anyone interested in the e6460, you can find the PDF here: http://www.amd.com/us/Documents/AMD-Radeon-E6760-Discrete-GPU-product-brief.pdf (it is on the second page), and the TDP is 16 watts.
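(To make the TDP-versus-typical-draw point concrete, here is that 2/3-to-3/4 rule of thumb in a couple of lines of Python. The ratio is the rule of thumb from the post above, not a datasheet figure, so treat the outputs as ballpark numbers only.)

# Rule of thumb from the post above: typical gaming draw tends to land
# around 2/3 to 3/4 of the rated TDP.
def typical_draw_range_w(tdp_w):
    return (tdp_w * 2 / 3, tdp_w * 3 / 4)

print(typical_draw_range_w(39))  # Redwood LE, 39 W TDP -> roughly 26 to 29 W
print(typical_draw_range_w(16))  # e6460, 16 W TDP -> roughly 11 to 12 W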
 
So basically, the more we think we know, the less we actually do. Btw, you still owe me a story. ;)



That's an interesting fact about Brazos. Just goes to show what 80 shaders can do!

On the topic of the fab house, I suppose this is another place where we differ. You don't see it as having much of an impact on component size, and I can see it as potentially having a huge one. Despite the maturity of whatever process (40nm, 45nm...) they used, the fact is that they don't have much experience with Radeon blocks whereas TSMC do. And this is nothing against Renesas. It just doesn't seem to be a familiar market for them, whereas I do know they are quite successful in making microcontrollers for cars and such. Even if each slice of SRAM in the GPRs was 8 kB (and not 4 kB, as they appear to be), it wouldn't work, because the threads need independent access to a certain number of them. It would require dual-ported SRAM. Any other type of memory trying to make up for a lack of register space/access would be too slow. Register access needs to be like lightning. Unless there is just a drastic reworking of the VLIW architecture, which I don't see as being very likely at all.

I believe TDP is a strong argument in favor of a 160 ALU part. Wii U has been shown to commonly draw 33 watts during gameplay without any USB drain or anything like that. Let's look at the identically clocked Redwood LE, a 320 ALU card: 39 watts. Without breaking down all the individual components for the Wii U and this comparison card, let's just use a little common sense. What is going to output more heat? A graphics card or an entire system? The answer is obvious. All the hardware in the Wii U is going to negate the difference of any mundane graphics card circuitry and then some! Thus, I just can't see how it could be 320 ALUs.

Now, the 160 ALU parts seem a lot more reasonable. 18 watts for a 625 MHz card sounds about right, as I'd peg Wii U's GPU at ~15 watts after doing some rough guesstimation for RAM, CPU, disc drive, etc.


What about laptop GPUs though? The Mobility Radeon HD 5650 (RV830) has a TDP of 15 and 19 watts respectively (400 ALUs, clocked at 450 and 650 MHz respectively, with 1GB DDR3 and 1GB GDDR3 included respectively).
I'm just not sure if it's of much use to compare desktop GPU cards' TDP with the supposed one in the WiiU.
The Wii U, in terms of the PCB size, structure, cooling solutions, overall TDP, absolute size, PSU etc. is much closer to a laptop (and not even a gaming laptop) than to a PC.
People also still sort of disregard the fact that the 40nm process back in 2012 was pretty mature. It's pretty likely that AMD/Nintendo/Renesas were able to reduce voltages compared to GPUs that launched back in 2010.
 
If we compared the Wii U chipset to an existing product (don't bother with the RAM, I'm talking CPU+GPU combo), what kind of power is it packing? Netbook? High end tablet?
 
If we compared the Wii U chipset to an existing product (don't bother with the RAM, I'm talking CPU+GPU combo), what kind of power is it packing? Netbook? High end tablet?

How could we do this when we don't know the GPU's specs? If it's 160 shaders, and comparable to, say, R800 shaders, I would have to say it is above a netbook but below AMD's A series. http://www.newegg.com/Product/Product.aspx?Item=N82E16819113282 This could be the closest; you'd have to downclock the CPU to 1.8-2GHz and the GPU to ~500MHz, but this would be the closest you could do. Given how a console would have games coded specifically to it, though, you'd be best off not downclocking it at all, because while this chip's specs are higher than Wii U's minimum, it will waste a lot of resources on its OS and on pushing details that the Wii U game might lack. Also, you'd likely be using DirectX, which just isn't as good as OpenGL.

I do think, to a certain degree, we need to start throwing some of the "logic" out of the window. In the Iwata Asks on the hardware, they stated the three bolded companies got together and created some weird hybrid that NO ONE outside of the group assigned to design Latte and Espresso is going to be able to figure out. I just don't think people should go off conventional knowledge of AMD GPUs and say it has to be this, end of story... The reality is the engineers at these three companies worked their ASS off to achieve what they have with the Wii U. Until we get some leaked information from a developer or someone in the KNOW, we need to be careful calling an end to the story just because in "OUR" minds, going by logic, there could be no other answer.

Exactly. If it was just a run-of-the-mill 160 ALU part, AMD wouldn't have had to spend much time on it at all. Yet we know that they did, and we know Nintendo worked hard at getting a certain performance out of this part.
 
And the reality is we know Nintendo has a history (GameCube) of designing something weird that VASTLY outperforms the specs released for the console. Nintendo purposely didn't design this console to go head-on, spec for spec, with the XB1 or PS4, but I just think those who believe it's basically on par with, or just a notch up from, the PS360 are not giving it enough credit. To their credit, though, yes, we haven't seen many games on Wii U that outperform the best of the PS360, but I truthfully believe 2014 is going to be the year that what the Wii U can do will be on full display. Bayonetta 2 is already running at 60 FPS, possibly 1080p (not confirmed); Mario Kart 8 is 60 FPS, possibly 1080p (not confirmed); Smash 4 is confirmed 1080p 60 FPS... and we don't even know what else is coming, but X should have hands-on and we can see what that's like. It just boggles my mind that there is a group out there that believes we have seen the best from Wii U and that it's basically a 360 with 1GB of slow RAM... 2014 is going to wake some people up; it's an in-between console... and that's the way Nintendo wanted it to be.
I would bet large sums of money that Bayonetta 2 and Mario Kart will be 720p. If Bayonetta 2 were 1080p, wouldn't Wonderful 101 be too? (it's not). As for Nintendo in house games, they're ALL 720p: NSMBU, Nintendoland, Pikmin 3... the only games in 1080p are Wario, which has extremely simple graphics, and the Windwaker remake/port thing, and that's an updated gamecube game. And it's still 30fps (I know, I know, it's that way because of the animations, which makes sense). 1080p 60fps? Those games will be the Wario-like exceptions, and Smash.

I agree we're going to have very pretty games on the Wii U, but the power just isn't there for much fancy 1080p.
 
And the reality is we know Nintendo has a history (GameCube) of designing something weird that VASTLY outperforms the specs released for the console.

Only if you didn't know how to read the specs.

Also, with this generation, comparisons are a lot easier than they were back then. All three competitors use AMD GPUs. And there really isn't a chance that AMD offered one company some sort of super secret technology which wasn't available to all the other customers if they wanted it. Even if Latte isn't a stock R700 design, there is no reason to believe that it is as advanced or efficient as GCN.

2014 is going to wake some people up; it's an in-between console... and that's the way Nintendo wanted it to be.

One thing that Nintendo certainly didn't want is the slow start of Wii U. If anything, they would have wanted people to notice the difference right away, before PS4 and XBO were released.
 
I would bet large sums of money that Bayonetta 2 and Mario Kart will be 720p. If Bayonetta 2 were 1080p, wouldn't Wonderful 101 be too? (it's not). As for Nintendo in house games, they're ALL 720p: NSMBU, Nintendoland, Pikmin 3... the only games in 1080p are Wario, which has extremely simple graphics, and the Windwaker remake/port thing, and that's an updated gamecube game. And it's still 30fps (I know, I know, it's that way because of the animations, which makes sense). 1080p 60fps? Those games will be the Wario-like exceptions, and Smash.

I agree we're going to have very pretty games on the Wii U, but the power just isn't there for much fancy 1080p.

And it doesn't even need 1080p desperately, because somehow the Wii U upscales 720p insanely well to 1080p. I played Pikmin 3 with the Wii U set to 1080p, and it's really, really hard to tell whether it's rendered in 720p or 1080p because the image is so clean even if it is rendered in 720p. And I wouldn't be surprised if Nintendo had AMD implement a superb scaler into the GPU.

Even Wii games profit from the GPU's scaling capabilities. (Not as much as 720p games, though. Wii content is just not built to be played in HD, but the Wii U's GPU still makes it look smoother/cleaner.)
 
Only if you didn't know how to read the specs.

Also, with this generation, comparisons are a lot easier than they were back then. All three competitors use AMD GPUs. And there really isn't a chance that AMD offered one company some sort of super secret technology which wasn't available to all the other customers if they wanted it. Even if Latte isn't a stock R700 design, there is no reason to believe that it is as advanced or efficient as GCN.

Those specs for the GameCube were designed by ArtX, from which Nintendo employs a large number of staff. The team that designed the Wii U also worked on the GameCube, and Nintendo's own ArtX people helped design it as well; it was a collaboration. So what you are saying is completely out of context with reality: AMD didn't design the Wii U's GPU as an AMD GPU with only AMD patents and without Nintendo input, and it certainly isn't an off-the-shelf part either. What likely happened was that Nintendo's assembled team took an AMD architecture and built the things they wanted on top of it. The chip is clearly custom, and that is likely because the people who worked on it from its conception didn't make an AMD graphics card, but modified an existing one to become something new and different.
 
Those specs for the GameCube were designed by ArtX, from which Nintendo employs a large number of staff. The team that designed the Wii U also worked on the GameCube, and Nintendo's own ArtX people helped design it as well; it was a collaboration. So what you are saying is completely out of context with reality: AMD didn't design the Wii U's GPU as an AMD GPU with only AMD patents and without Nintendo input, and it certainly isn't an off-the-shelf part either. What likely happened was that Nintendo's assembled team took an AMD architecture and built the things they wanted on top of it. The chip is clearly custom, and that is likely because the people who worked on it from its conception didn't make an AMD graphics card, but modified an existing one to become something new and different.

Nintendo is no hardware developer (and they can't be since they'd have to invest huge sums in that branch). Surely they have a few engineers who communicated their special wishes to AMD who then implemented them. There even was an interview with an AMD representative where this was talked about. It's still all AMD tech.
What I wanted to express is that one shouldn't expect that any secret technological breakthroughs were made during Wii U development. It's an AMD GPU with some console specific enhancements, just like what the competition has done (but based on a less advanced design since Wii U was released earlier).
 
Nintendo is no hardware developer (and they can't be since they'd have to invest huge sums in that branch). Surely they have a few engineers who communicated their special wishes to AMD who then implemented them. There even was an interview with an AMD representative where this was talked about. It's still all AMD tech.
What I wanted to express is that one shouldn't expect that any secret technological breakthroughs were made during Wii U development. It's an AMD GPU with some console specific enhancements, just like what the competition has done (but based on a less advanced design since Wii U was released earlier).

I'm not trying to say there is secret sauce in the Wii U; it could very well be just a 160 shader part. But it wasn't made by AMD alone. You need a history lesson, because Nintendo and ATI go back quite a ways: AMD bought ATI, who bought ArtX (who made the GameCube), who formed from the team that made the N64. Nintendo hired people from the original N64 team, then again hired people from ArtX as they were bought out by ATI. TEV is also a primitive programmable shader, and ATI's design largely came from the ArtX teams/ideas they had acquired. Seriously, to say that Nintendo doesn't have a hardware team for this sort of thing is a grossly wrong assumption; they have the best in the console hardware industry. If AMD had kept their mobile division (which is now part of Qualcomm, and yes, Snapdragon graphics are based on old ATI mobile ideas), I have no doubt that the 3DS's GPU would have been based on AMD's designs, since those teams are very familiar with them.
TL;DR: All the personnel Nintendo has acquired for their graphics team over the years come from the exact same place as the origin of all of modern AMD's (ATI's) graphics products.
 
Nintendo is no hardware developer (and they can't be since they'd have to invest huge sums in that branch). Surely they have a few engineers who communicated their special wishes to AMD who then implemented them. There even was an interview with an AMD representative where this was talked about. It's still all AMD tech.
What I wanted to express is that one shouldn't expect that any secret technological breakthroughs were made during Wii U development. It's an AMD GPU with some console specific enhancements, just like what the competition has done (but based on a less advanced design since Wii U was released earlier).

Nintendo has their own chip designers... Like z0m3le already pointed out... They CAN design a custom chip if they want to.
 
I'm not trying to say there is secret sauce in the Wii U; it could very well be just a 160 shader part. But it wasn't made by AMD alone. You need a history lesson, because Nintendo and ATI go back quite a ways: AMD bought ATI, who bought ArtX (who made the GameCube), who formed from the team that made the N64. Nintendo hired people from the original N64 team, then again hired people from ArtX as they were bought out by ATI. TEV is also a primitive programmable shader, and ATI's design largely came from the ArtX teams/ideas they had acquired.

I know that. But I don't really get your point.

Seriously to say that Nintendo doesn't have a hardware team for this sort of thing is a grossly wrong assumption,[...]

I never said that. What I'm saying is: they obviously don't have the means to develop a whole GPU by themselves.
Just to put some numbers to this: AMD had R&D expenses of > 1.3 billion $ last year.

[...]they have the best in the console hardware industry.

This is by no means a fact.


Thing is, every time I read something like "Wii U is a really efficient machine!" I ask myself: efficient in comparison to what? More efficient than the PS360? Certainly, there's 7 years separating them. More efficient than the PS4/XBO? Highly doubtful.
 
I know that. But I don't really get your point.



I never said that. What I'm saying is: they obviously don't have the means to develop a whole GPU by themselves.
Just to put some numbers to this: AMD had R&D expenses of > 1.3 billion $ last year.



This is by no means a fact.


Thing is, every time I read something like "Wii U is a really efficient machine!" I ask myself: efficient in comparison to what? More efficient than the PS360? Certainly, there's 7 years separating them. More efficient than the PS4/XBO? Highly doubtful.

"the company is currently spending more than ten times as much on research and development as it was five years ago, and since the Wii was launched in 2006, R&D spending has more than tripled. While this could be attributed to any number of additional projects, the level of spending suggests that a large project is in the works. In 2003, Nintendo declared that $34 million was spent on R&D. This figure steadily climbed to $103 million in 2006 (the year that the Wii launched) and the following year bumped dramatically to $370 million. When asked to explain the escalated spending, Nintendo representatives were unable to provide comment."

The thing about console makers is that their R&D budget is spent over 5+ years on hardware, and Nintendo only releases 2 platforms, so most of the R&D budget is spread across 6 years (in the case of Wii U and 3DS) and includes well over $500 million USD (just the 2 years counted here) for just 2 products, whereas AMD is releasing CPUs, GPUs, APUs, motherboard chipsets, RAM, mobile parts and embedded parts yearly. Nintendo didn't develop the GPU by themselves, but it's pretty plain that they didn't just use an off-the-shelf AMD GPU; even the pictures point to something bizarre. That is what I'm saying, and that is why you can't compare it to the PS4 and XB1 and say they are all going to be completely comparable based on ALU count and clock speed. I think that the Wii U simply does some things differently than you'd expect a basic AMD GPU to do them.

So no, even in R&D spending, Nintendo is there. I think you need to change targets; I'm not someone who thinks the Wii U is extremely efficient. What I believe it is, is a cool and quiet gaming device that will produce good visuals that are a step up from the previous generation.
 
I'd love to see a die shot of an e6760 for comparison. I know Latte is not going to look too similar, but part of me believes it's just a cut back e6760 at heart. That thing only uses 35w with 480 SP, 24 TMU, 8 ROP, 600mhz core and 1GB 800mhz GDDR5. Perfect candidate to tune down to WiiU levels ;)
 
Nintendo is no hardware developer (and they can't be since they'd have to invest huge sums in that branch). Surely they have a few engineers who communicated their special wishes to AMD who then implemented them. There even was an interview with an AMD representative where this was talked about. It's still all AMD tech.
What I wanted to express is that one shouldn't expect that any secret technological breakthroughs were made during Wii U development. It's an AMD GPU with some console specific enhancements, just like what the competition has done (but based on a less advanced design since Wii U was released earlier).
I think the 'secret technology' argument has gone beyond bastardization, and we all have our part in that. It's not about secret technology, it's about design priorities. Nintendo could have had different design priorities for the WiiU than the other two vendors have had for their respective consoles, or AMD for their consumer devices from that timeframe.
 
The Wii U is a test case where BC held back the console design. If you read the statements from the design team, they just talk about getting new parts to work with Wii code. They should have just put a Wii chip in there and moved to a newer design, like Sony did with the early PS3. A Wii chip would only cost a couple of dollars.

I'd love to see a die shot of an e6760 for comparison. I know Latte is not going to look too similar, but part of me believes it's just a cut back e6760 at heart. That thing only uses 35w with 480 SP, 24 TMU, 8 ROP, 600mhz core and 1GB 800mhz GDDR5. Perfect candidate to tune down to WiiU levels ;)
Wiiu has nothing to do with that gpu core design.
 
I think the 'secret technology' argument has gone beyond bastardization, and we all have our part in that. It's not about secret technology, it's about design priorities. Nintendo could have had different design priorities for the WiiU than the other two vendors have had for their respective consoles, or AMD for their consumer devices from that timeframe.

Thanks, that cuts more to the point: Nintendo spent a lot of money on R&D, has a team that knows the GPU intimately, and probably designed it differently than any off-the-shelf AMD GPU.

I'd love to see a die shot of an e6760 for comparison. I know Latte is not going to look too similar, but part of me believes it's just a cut back e6760 at heart. That thing only uses 35w with 480 SP, 24 TMU, 8 ROP, 600mhz core and 1GB 800mhz GDDR5. Perfect candidate to tune down to WiiU levels ;)

That is why I brought up the e6460, which is a cut-down e6760 (160 ALUs vs the original 480) with a 16-watt TDP (the 512MB of GDDR5 consuming at least a watt of that by itself); it's also a 600MHz clock on the 40nm process, so consumption of the actual GPU is going to be ~12-13 watts there... It should be very comparable to what we have pegged as the minimum for Latte.

The Wii U is a test case where BC held back the console design. If you read the statements from the design team, they just talk about getting new parts to work with Wii code. They should have just put a Wii chip in there and moved to a newer design, like Sony did with the early PS3. A Wii chip would only cost a couple of dollars.

Wiiu has nothing to do with that gpu core design.

Unless Wii U is the new architecture Nintendo is building their future around, including all of their tools and VC, so that instead of redoing all that and wasting resources every hardware cycle, they could just build Wii U's architecture up or down to make future platforms.

e6460/e6760 is redwood iirc, which is R800 series and could potentially be Wii U's base. R700 and R800 are not even really that different.
 
If we compared the Wii U chipset to an existing product (don't bother with the RAM, I'm talking CPU+GPU combo), what kind of power is it packing? Netbook? High end tablet?

Higher than both. Netbooks and Tablets don't draw nearly that much power. I think they pull closer to something like 10w.
 
"the company is currently spending more than ten times as much on research and development as it was five years ago, and since the Wii was launched in 2006, R&D spending has more than tripled. While this could be attributed to any number of additional projects, the level of spending suggests that a large project is in the works. In 2003, Nintendo declared that $34 million was spent on R&D. This figure steadily climbed to $103 million in 2006 (the year that the Wii launched) and the following year bumped dramatically to $370 million. When asked to explain the escalated spending, Nintendo representatives were unable to provide comment."

The thing about console makers is that their R&D budget is spent over 5+ years on hardware, and Nintendo only releases 2 platforms, so most of the R&D budget is spread across 6 years (in the case of Wii U and 3DS) and includes well over $500 million USD (just the 2 years counted here) for just 2 products, whereas AMD is releasing CPUs, GPUs, APUs, motherboard chipsets, RAM, mobile parts and embedded parts yearly.

On the other hand AMD only designs Chips and not whole platforms. I wouldn't be surprised if a large portion of the R&D costs at Nintendo went to the controllers or castaway ideas. Still, the numbers are surprisingly high no doubt. I guess we can agree that the comparison is somewhat flawed (my mistake).

Nintendo didn't develop the GPU by themselves, but it's pretty plain that they didn't just use an off-the-shelf AMD GPU; even the pictures point to something bizarre. That is what I'm saying, and that is why you can't compare it to the PS4 and XB1 and say they are all going to be completely comparable based on ALU count and clock speed. I think that the Wii U simply does some things differently than you'd expect a basic AMD GPU to do them.

Thing is: PS4's GPU isn't off the shelf either, and who knows what MS has done. Overall we don't have any evidence that Nintendo is doing something out of the regular here in comparison to the competitors.

I think you need to change targets; I'm not someone who thinks the Wii U is extremely efficient. What I believe it is, is a cool and quiet gaming device that will produce good visuals that are a step up from the previous generation.

I didn't target you directly. My initial statement largely consisted of the fact that, as opposed to the sixth generation, this time all console GPUs are based on AMD designs, and that therefore the differences won't be as huge as back then. Afterwards I just responded to the comments I earned for that.

Anyway, our views don't seem to differ that much (only that in my opinion the step up is relatively small - but that depends on the definition of "small" ;)).
 
On the other hand AMD only designs Chips and not whole platforms. I wouldn't be surprised if a large portion of the R&D costs at Nintendo went to the controllers or castaway ideas. Still, the numbers are surprisingly high no doubt. I guess we can agree that the comparison is somewhat flawed (my mistake).



Thing is: PS4's GPU isn't off the shelf either, and who knows what MS has done. Overall we don't have any evidence that Nintendo is doing something out of the regular here in comparison to the competitors.



I didn't target you directly. My initial statement largely consisted of the fact that, as opposed to the sixth generation, this time all console GPUs are based on AMD designs, and that therefore the differences won't be as huge as back then. Afterwards I just responded to the comments I earned for that.

Anyway, our views don't seem to differ that much (only that in my opinion the step up is relatively small).

Better shaders and more RAM than the 360/PS3 for 720p games will make a huge difference, and that's not even taking the performance difference into account (which seems to be quite a bit higher, considering Bayonetta 2 is running at a solid 60fps while Bayonetta 1 lacked a lot of the detail we see in the sequel and only pushed ~45fps).

However, relative to the other next-gen consoles, yes, it will be a much smaller step up; the others will be closer to a leap. I think that is most people's view, and the real difference of opinion is whether or not this is satisfactory to the individual. Some play up the differences quite a bit and say that Wii U is a last-gen console (whatever that means), while some say that Wii U will only lack a bit against next-gen consoles. I doubt it will be a small difference, but I do think diminishing returns happen every generation, and thanks to a lot of CG and bullshots from this last gen, people will be scratching their heads looking for the "next gen graphics" in many titles.

What will look better is a solid frame rate and no pop-in or screen tearing; that is more important to me than faking a few less things. I was really excited about global illumination, but it seems this gen is going to lack it in all but the most basic implementations.

I also think that if you had posted the XB1 specs during WUST 1 and said "this is a next-gen console", everyone would have said "those must be Wii U's specs"; everyone had expected 2-3 TFLOPs for the successors to the XB360 and PS3. That is one reason I am just not impressed with any of these consoles and really only care about better textures, lighting, and getting rid of graphical errors like pop-in, tearing and frame rate drops. Considering what we could fake in this last generation (GTAV, Uncharted 3, The Last of Us look amazing), those things are about as good as I can hope for ATM.

So I think Wii U fits the minimum of what I needed to see as a jump, thanks to my newly lowered expectations. It would be different had XB1 and PS4 been true beasts with the ability to do GI and much higher IQ overall (correct me if I'm wrong, but isn't Thief 4 even 720p?), but with no indication that we will have solid framerates (PS4's E3 showing was pretty bumpy) or Vsync (to avoid screen tearing), I'm just a bit afraid that this industry is going to continue to give us horrible compromises to push whatever they can out of these machines while lacking what I consider true next-gen jumps.
 
One thing that Nintendo certainly didn't want is the slow start of Wii U. If anything, they would have wanted people to notice the difference right away, before PS4 and XBO were released.
But they brought nothing in the way of content to show it. I really think Nintendo just doesn't care about a console's relative power. They play their own game and aim for cheap hardware plus unique, compelling games. It's the latter they failed at. Nintendo would stick with GC graphics if they thought people would still buy their machines.
 
Only if you didn't know how to read the specs.

Also, with this generation, comparisons are a lot easier than they were back then. All three competitors use AMD GPUs. And there really isn't a chance that AMD offered one company some sort of super secret technology which wasn't available to all the other customers if they wanted it. Even if Latte isn't a stock R700 design, there is no reason to believe that it is as advanced or efficient as GCN.



One thing that Nintendo certainly didn't want is the slow start of Wii U. If anything, they would have wanted people to notice the difference right away, before PS4 and XBO were released.


It's the other way around though. Thinking that Latte is 160 ALUs basically means that Nintendo made a pretty shitty deal considering that it's a tad over 100mm² for just the GPU (+ eDRAM adding up to sth. like 145mm² afaik). Redwood is 104mm² and has 400 ALUs (that is including things like the GDDR5 interface), Turks (HD6xxx) is 118mm² and has 480 ALUs (again including GDDR5 interface).
I'm totally open to anything from 160 to 320 ALUs, but going by the "Why should anybody have gotten a significantly better/worse deal"-argument basically means that one should reject the idea of 160 ALUs unless there is some kind of "secret sauce" on the chip. And by secret sauce I don't necessarily mean sth. that's making Latte much more powerful, but it may very well be sth. that we simply aren't thinking of atm. Maybe sth. with the Wii BC, maybe sth. completely different (I really don't know ^^).
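(A quick way to see why the die area bothers people is to divide the logic area by the candidate ALU counts and compare against the known parts mentioned above. The ~100mm² Latte logic-area figure is the rough number from the post, so everything here is approximate, and the metric is crude since the die also holds TMUs, ROPs and other logic.)

# mm^2 of die per ALU for the candidate shader counts vs. known Radeons.
# Latte's ~100 mm^2 logic area (excluding eDRAM) is the rough figure from
# the post above; all values are approximate.
parts = {
    "Latte @ 160 ALUs (hypothetical)": (100, 160),
    "Latte @ 320 ALUs (hypothetical)": (100, 320),
    "Redwood": (104, 400),
    "Turks (HD 6xxx)": (118, 480),
}
for name, (area_mm2, alus) in parts.items():
    print(f"{name}: {area_mm2 / alus:.3f} mm^2 per ALU")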
 
I think the one difference is the on-die eDRAM and its bandwidth; I just get the feeling it's insanely high. Also, the SRAM blocks located where Fourth Storm and others believe the shader cores to be may have wider buses or more than 64kB. Since it only has 8 ROPs, don't expect any AA higher than 2x; bandwidth from the eDRAM alone isn't enough for that. The shader core clock is probably very high, and I would put eDRAM bandwidth at 256GB/s.
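(For reference, peak on-die eDRAM bandwidth is basically bus width times clock, so a 256GB/s guess implies a very wide interface. Both numbers in the sketch below are assumptions for illustration only, since neither the eDRAM bus width nor its clock is public.)

# Peak bandwidth in GB/s = (bus width in bits / 8) * clock in Hz / 1e9
def edram_bandwidth_gbs(bus_width_bits, clock_mhz):
    return bus_width_bits / 8 * clock_mhz * 1e6 / 1e9

# Assumed values for illustration only: a 4096-bit interface at 500 MHz
# lands right on the 256 GB/s ballpark mentioned above.
print(edram_bandwidth_gbs(4096, 500))  # 256.0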
 
Does anyone who posts in here have the required tools to take a power consumption reading from the Wii U while it's running?

It would be good to see how much it uses when W-101 is being played, as it looks to be the game that pushes the Wii U the hardest so far (720p native / 60fps / more modern graphical effects).

If it's still only pushing 33W, then that can be assumed to be the max while gaming; the extra 10W is most probably reserved for USB devices.

Also, with the talk about Bayonetta running at 45fps, wasn't that just the poor PS3 port? I'm sure it runs at 60fps on the 360. If Bayonetta does run at 45fps on the 360 as well, then Bayo 2 running at 720p native / 60fps with improved visuals on the Wii U is a nice upgrade indeed.
 
Better shaders and more RAM than the 360/PS3 will make a huge difference for 720p games, and that's not even taking the raw performance difference into account (which seems to be quite a bit higher, considering Bayonetta 2 is running at a solid 60 fps while Bayonetta 1 lacked a lot of the detail we see in the sequel and only pushed ~45 fps).
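As a rough way to put numbers on that gap: going from ~45 fps to a locked 60 fps shrinks the time budget per frame. A quick sketch of the arithmetic, using the ~45 fps figure mentioned above:

```python
# Frame-time budgets: ~45 fps vs a locked 60 fps.
fps_before, fps_after = 45, 60

ms_before = 1000 / fps_before   # ~22.2 ms per frame
ms_after = 1000 / fps_after     # ~16.7 ms per frame

print(f"{fps_before} fps -> {ms_before:.1f} ms per frame")
print(f"{fps_after} fps -> {ms_after:.1f} ms per frame")
# Hitting 60 fps needs ~1.33x the per-frame throughput;
# equivalently, the frame-time budget shrinks by about 25%.
print(f"Throughput needed: {ms_before / ms_after:.2f}x")
print(f"Budget cut: {(1 - ms_after / ms_before) * 100:.0f}%")
```

And that's before counting the extra detail in the sequel, assuming the 60 fps target actually holds.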

As I added in my stealth edit, it depends on the definition of a "large step". It's subjective, and that might actually be a frequent source of misunderstanding.
To me, the differences between PS2, Gamecube and Xbox were relatively small. And that's where I would classify Wii U in comparison to PS360: like the step between PS2 and Gamecube.

Regarding Bayonetta 2: How do we know if it will run at 60 fps constantly? I've read this claim often, yet it's quite common that devs state that they "aim for xx fps" but don't reach it in the end. I think we should wait until the game is released before making any definite claims.

However, relative to the other next-gen consoles, yes, it will be a much smaller step up; the others will be closer to a leap. I think that is most people's view, and the real difference of opinion is whether or not that is satisfactory to the individual. Some play up the differences quite a bit and say the Wii U is a last-gen console (whatever that means), while others say it will only lag a bit behind the next-gen consoles. I doubt the difference will be small, but I do think diminishing returns happen every generation, and thanks to a lot of CG and bullshots last gen, people will be scratching their heads looking for the "next-gen graphics" in many titles.

What will look better is a solid frame rate and no pop-in or screen tearing; that is more important to me than faking a few less things. I was really excited about global illumination, but it seems this gen is going to lack it in all but the most basic implementations.

I also think that if you had posted the XB1 specs during WUST 1 and said "this is a next-gen console", everyone would have said "those must be the Wii U's specs"; everyone had expected 2-3 TFLOPS for the 360's and PS3's successors. That is one reason I am just not impressed with any of these consoles and really only care about better textures, better lighting, and getting rid of graphical errors like pop-in, tearing and frame rate drops. Considering what could be faked last generation (GTA V, Uncharted 3 and The Last of Us look amazing), those things are about as good as I can hope for at the moment.

So I think the Wii U fits the minimum of what I needed to see as a jump, thanks to my newly lowered expectations. Had XB1 and PS4 been true beasts, able to do GI and much higher IQ overall (correct me if I'm wrong, but isn't Thief 4 only 720p?), it might be different; but with no indication that we will get solid framerates (PS4's E3 showing was pretty bumpy), and no mention of vsync (to avoid screen tearing), I'm just a bit afraid that this industry is going to keep giving us horrible compromises to push whatever it can out of these machines while lacking what I consider true next-gen jumps.

I agree with that. PS4 and especially XBO don't strike me as particularly impressive either.

And another personal thing: I have no problem with the Wii U's technical capabilities. I buy consoles for games, not graphics; the Wii was my most-played console last gen.
It's just that, when I look at the technical aspects, I try to mask that out and be as neutral about it as possible.
 
I would bet large sums of money that Bayonetta 2 and Mario Kart will be 720p. If Bayonetta 2 were 1080p, wouldn't Wonderful 101 be too? (It's not.) As for Nintendo's in-house games, they're ALL 720p: NSMBU, Nintendo Land, Pikmin 3... The only games in 1080p are Wario, which has extremely simple graphics, and the Wind Waker remake/port, and that's an updated Gamecube game which is still 30 fps (I know, I know, it's that way because of the animations, which makes sense). 1080p 60 fps? Those games will be the Wario-like exceptions, and Smash.

I agree we're going to have very pretty games on the Wii U, but the power just isn't there for much fancy 1080p.

Your logic is off. Kamiya stated quite clearly that he personally chose 720p 60 fps instead of 1080p because "it was more important to him". He also remarked that it was because there were so many dynamic characters on screen at once eating up detail. That is not the case in Bayonetta 2, which has been largely rumored to be 1080p.

There are many 1080p 60 fps games on the Wii U eShop as well; most of the eShop games are 1080p 60 fps. (Nano Assault ran fine at 1080p, but Shin'en decided to use the resources 1080p ate up to add post-FX at 720p instead.) Rayman Legends runs at 1080p 60 fps.

Tri Ultimate was listed as 1080p 60 fps by Capcom, though some analysis claimed 1080p at 45 fps. From the video I saw, the frame rate was all over the place and looked to be constantly climbing as the game ran. I also believe it was the European version, which would target 50 Hz and therefore cap out at 50 fps. That was also prior to the update that fixed a large number of games that were experiencing slowdown and other performance issues (which is conveniently left out by everyone trying to use launch-game performance issues to downplay the console). I have heard numerous complaints about Lego City having slowdown, but I have not experienced any; I didn't get it until after the first big performance update. If the U.S. version of MH Tri Ultimate were tested now, I believe it would give different results.

All of the other games you listed were more or less launch up-ports. NSMB U and Pikmin 3 are still mostly using Wii assets that wouldn't look much better at 1080p.

Nintendo themselves stated that they had trouble with HD development at first and that they had trouble getting the Wii U working properly. That is one of the many facts and details that people prefer to omit when talking about the Wii U's performance: there were a multitude of problems with developing for the console, and with the console itself, at launch. As it stands, most games are now showing far superior performance, and more games are appearing at 1080p, which I think is stupid, in line with what Shin'en said, because 1080p doesn't add much and the resources it eats would be better spent on more effects.
 
Bayonetta 2 will not be 1080p, neither will MK8, neither will X. The only WiiU games that will be 1080p are Rayman Legends and Smash Bros. Show me where these games are "largely rumoured" to be 1080p; the only place I have seen it mentioned is in this thread.

The console is simply not equipped or designed to handle complex 3D games at 1080p.

Don't you need something like 2.25x the GPU FLOPS to run the same engine at 1080p instead of 720p? That would mean the Wii U's GPU would need to be at least 540 GFLOPS, and since people have shown Bayo 2 looks better than the original, it would need to be even more powerful than that.
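The 2.25x figure is just the pixel-count ratio between the two resolutions, and the 540 GFLOPS number follows if you take Xenos' commonly quoted ~240 GFLOPS as the baseline and assume shading cost scales linearly with pixel count, which real games only loosely do. A sketch of that reasoning:

```python
# Pixel-count ratio of 1080p to 720p, and the naive FLOPS scaling it implies.
# The 240 GFLOPS baseline is the commonly quoted theoretical peak for the
# 360's Xenos; assuming cost scales purely with pixels is a simplification.

pixels_720p = 1280 * 720        # 921,600
pixels_1080p = 1920 * 1080      # 2,073,600
ratio = pixels_1080p / pixels_720p

xenos_gflops = 240
print(f"Pixel ratio: {ratio:.2f}x")                                   # 2.25x
print(f"Naive 1080p requirement: {xenos_gflops * ratio:.0f} GFLOPS")  # 540
```

In practice it isn't that clean: vertex work, CPU load and memory bandwidth don't all scale with resolution, so 2.25x the pixels doesn't strictly require 2.25x the FLOPS, but as a ballpark it explains where the 540 figure comes from.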
 
Maybe resolution is linked to bandwidth more than FLOPS. And the Wii U has the bandwidth in its eDRAM, which in my opinion is much higher than the PS4's.
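There is something to the bandwidth angle, since render-target traffic grows linearly with pixel count. Here's a very rough sketch; the 32-bit colour plus 32-bit depth formats, the 3x average overdraw and the 60 fps target are all assumptions, and texture reads and any MSAA resolve would come on top:

```python
# Very rough render-target traffic estimate (read-modify-write per pixel),
# ignoring texture fetches, MSAA and compression. Assumptions only.
BYTES_PER_PIXEL = (4 + 4) * 2   # 32-bit colour + 32-bit depth, read and write
FPS = 60
AVG_OVERDRAW = 3.0              # assumed average times each pixel is touched

def traffic_gb_s(width: int, height: int) -> float:
    return width * height * BYTES_PER_PIXEL * AVG_OVERDRAW * FPS / 1e9

for w, h in [(1280, 720), (1920, 1080)]:
    print(f"{w}x{h}: ~{traffic_gb_s(w, h):.1f} GB/s of framebuffer traffic")
# 1280x720:  ~2.7 GB/s
# 1920x1080: ~6.0 GB/s
```

The absolute numbers are small next to any plausible eDRAM figure; the real squeeze at 1080p tends to come from everything else that scales with pixels (shading, texture fetches, post-processing), so whether bandwidth or ALU throughput hits the wall first depends on the engine.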
 

I haven't got W-101, but I did take a power reading with (IIRC) NSMBU: it was a flat 33 W all the way from boot to the menu screen to in-game.
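For context on what a 33 W wall reading can and can't tell us: a wall meter measures AC draw before the power brick, so the silicon sees somewhat less, and splitting the rest between components is pure guesswork. A sketch with loudly assumed numbers (the ~90% brick efficiency and the whole component split are made up for illustration):

```python
# What a 33 W wall reading might translate to on the DC side.
# The brick efficiency and the component split are assumptions, not data.
wall_watts = 33.0
assumed_brick_efficiency = 0.90
dc_watts = wall_watts * assumed_brick_efficiency

assumed_split = {            # hypothetical shares of the DC budget
    "GPU + eDRAM (MCM)":        0.45,
    "CPU (Espresso)":           0.15,
    "DDR3 + other board logic": 0.20,
    "Drive, WiFi, misc":        0.20,
}

print(f"~{dc_watts:.0f} W available after the brick")
for part, share in assumed_split.items():
    print(f"  {part}: ~{dc_watts * share:.1f} W")
```

Under those made-up shares the GPU portion lands around 13 W, but the only solid number in there is the 33 W at the wall; everything else is a placeholder, so the split shouldn't be read as evidence for any particular GPU configuration.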
 