WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

No other CPU in the Nintendo CPU family has ever achieved clocks as high as the Espresso. The highest achieved with overclocks was 1.1GHz, I believe, and the performance increase is always small, with a huge increase in heat production and power consumption. This is why it was always annoying when people insisted that the Wii was an overclocked Gamecube. The hardware doesn't overclock well at all.

We do not know the entirety of the Espresso's design, so we can't say it absolutely can't reach a 2x overclock; for all we know, it could very well be underclocked.

I've heard nothing of a GPU power increase.

The CPU would likely not be able to be clocked much higher... a rough guess is that 1.4GHz would be pushing the max you could see from the current design at 45nm. The main reason is the extremely short pipeline stages; nothing has been clocked this high on stages so short. Of course, if Nintendo continues to use this family, they would find higher clocks as they push the processor to lower nodes; bringing it down to 20 or 16nm could yield a very large jump in clocks, still well short of the 3.5GHz that ridiculous rumor talked about, but 2GHz to 2.5GHz is probably possible on smaller nodes, though the stages would still be the largest obstacle.
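
Rough numbers, if anyone wants to sanity-check the node-scaling argument. A minimal sketch, assuming naive linear gate-delay scaling with feature size (optimistic on real nodes, and the short stages cap it further); the figures are this thread's speculation, not anything confirmed:

```python
# Naive node-scaling sanity check for the Espresso clock speculation above.
# Assumes gate delay scales linearly with feature size (classic Dennard
# scaling), which is optimistic on modern nodes; the short pipeline
# stages would cap real clocks well below these ceilings.

BASE_CLOCK_GHZ = 1.24   # Espresso's stock clock at 45nm
BASE_NODE_NM = 45

for node_nm in (32, 28, 20, 16):
    naive_ceiling = BASE_CLOCK_GHZ * (BASE_NODE_NM / node_nm)
    print(f"{node_nm}nm: naive ceiling ~{naive_ceiling:.1f}GHz")

# ~1.7GHz at 32nm up to ~3.5GHz at 16nm; knock a chunk off for the
# short-stage penalty and you land in the 2-2.5GHz range guessed above.
```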
 
That is interesting. I have a question pertaining to this, but I will take it to the CPU thread. I don't want to get too far off the topic of the GPU in here.

Another thing I would like to bring up, in relation to how the Wii U graphics stand versus last gen, that I haven't seen come up yet is the pants on Mario in Brawl and the clothes/hair on Link.
[Image: GbiilaI.jpg]
[Image: j2QM7D3.jpg]

[Image: J5Uxlbt.jpg]
[Image: rMMQvqR.jpg]

I've seen nothing like this in last gen games. Link's shirt neck, his tunic collar, the tunic's tail, his sleeves (self-shadowing :) ), his pants legs, boot openings, his belt: they are all individually rendered to make them pronounced. They are not simply textured on, as was done in most last gen games.

This looks like a substantial leap in power for the GPU to me. That seems like a lot of polygons to spare. These models look more detailed than the ones in NG3 Razor's Edge, and those had 60k polygon counts.

This is the first time I noticed that Link wears earrings.

Might tessellation be used in these rendering techniques?
 
[Image: rMMQvqR.jpg]


While the character models are very nice (Nintendo's artists are incredible!), the rest of the shot, the foreground with its low-res, jaggy grass and the low-res, low-poly, blurry backgrounds, looks very 'current gen' to me.

I never expected Smash to be a visual showcase for WiiU though; both Smash and MK are built around fast, frenetic gameplay running at a rock solid 60fps with multiple players on screen, even online.

Where I expected to see visuals being pushed, first party wise, was on the new Zelda, Metroid and Starfox. Unfortunately two of those games have not even been confirmed to be in development, and after finding out just how tiny Retro are, I'm not convinced they could produce a really high quality 'AAA' HD action game which would blow away the likes of even Killzone 2/3 and Halo 4, never mind competing with KZ Shadow Fall and Ryse.
 
No aspect of anything I'm seeing is low res except for maybe the far, FAR background islands.

That is all completely irrelevant to what I was pointing out and why, though. Did you respond for no other reason than to find flaws to downplay the image?

You should go to the Brawl thread for that. I'm trying to analyze the graphical techniques and features that are present, so that we can get a better idea of what the GPU is capable of. What are you trying to do?
 
What I find most amusing about the WUST threads is that we originally expected XB1 specs for Wii U and said that would be a stopgap, not quite the "next gen" leap we were expecting.

I don't think we were wrong to expect a 1TF GPU with 4GB of RAM, tbh (even if they went with the exact same CPU). Nintendo simply put at least 30% of the machine's build cost into the tablet controller, which as of right now has yet to show why it even exists beyond off-TV play, and after waiting 6 years for HD Nintendo games the last thing I want to do is play them on a 6" 480p screen...

I think it's also a bit disingenuous to start poking fun at XB1's specs, never mind PS4's, esp as it looks like we are dealing with a 176GFLOP GPU and 1GB of RAM for WiiU, compared with a 1.3TF GPU and 5GB of RAM in XBO and a 1.8TF GPU and 5GB of extremely fast RAM for PS4. The difference is still massive; PS4 is a 10x tech leap over PS3 according to Cerny.

I've said this before, but if you're a massive Nintendo fan then all that should matter is that WiiU is a massive leap (the biggest leap in hardware from the company in a single generation) over the Wii.

It's a 12GFLOP GPU vs a 176GFLOP GPU, 88MB of RAM vs 1GB of RAM, and a much faster clocked CPU! :).
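
For anyone wondering where these GFLOP figures come from: it's just ALUs x 2 ops per cycle (multiply-add) x clock. A quick sketch using the numbers this thread has settled on; the Latte ALU count is still speculation:

```python
# GFLOPS = ALUs * 2 (one multiply-add per cycle) * clock in GHz.
# Clocks and ALU counts are the figures quoted in this thread;
# Latte's 160 ALUs @ 550MHz is speculation, not confirmed.

def gflops(alus, clock_ghz):
    return alus * 2 * clock_ghz

print(f"Wii U (Latte, speculated): {gflops(160, 0.550):.0f} GFLOPS")   # 176
print(f"Xbox One:                  {gflops(768, 0.853):.0f} GFLOPS")   # ~1310
print(f"PS4:                       {gflops(1152, 0.800):.0f} GFLOPS")  # ~1843

# The Wii's Hollywood is fixed-function, so its ~12 GFLOPS figure is an
# estimate rather than an ALU * clock product.
```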
 
The grass is very low res, and the geometry on the ground is pretty simple.

There's not much impressive about Link's model. Some creases in clothes look like normal maps, some geometry. His hair uses alpha textures. Par for the course.
 
No aspect of anything I'm seeing is low res except for maybe the far, FAR background islands.

That is all completely irrelevant to what I was pointing out and why, though. Did you respond for no other reason than to find flaws to downplay the image?

You should go to the Brawl thread for that. I'm trying to analyze the graphical techniques and features that are present, so that we can get a better idea of what the GPU is capable of. What are you trying to do?

You really need to get a grip, Krizz... It seems like, according to you and a few others, if you're not going to post something positive about the GPU then you're not welcome in this thread.

I'm not trolling, it's my opinion; the trolls are the people that come in and say 'lol looks like Brawl running on Dolphin' and leave after one post.

I'm interested in what the GPU can do just as much as you are and like discussing it.

We have gone as far as we can go with regards to arguing about the die shot; most think it's 176GFLOPs, and the games shown so far, along with the cost of the tablet and the console's power envelope, back it up.

It seems to be fine for you to post (shrunk down) screenshots of WiiU games that you think are amazing looking, but if you dare disagree with how fantastic they look then you get accused of taking the discussion off topic or posting to get a reaction...

Edit

Also, you must have missed the part where I praised Link's model? (the part you were talking about)...
 
The grass is very low res, and the geometry on the ground is pretty simple.

There's not much impressive about Link's model. Some creases in clothes look like normal maps, some geometry. His hair uses alpha textures. Par for the course.

That is so not what I was asking or pointing to.
 
This comes across as some rather desperate reaching, from someone who either hasn't played any of the major fighting games this gen or is wilfully blind.

Try a quick Google search of Marvel vs Capcom 4. It's certainly arguable that Smash Bros is better looking, but to claim that it is some kind of technical leap ahead is simply delusional.
 
I've said this before, but if you're a massive Nintendo fan then all that should matter is that WiiU is a massive leap (the biggest leap in hardware from the company in a single generation) over the Wii.

It's a 12GFLOP GPU vs a 176GFLOP GPU, 88MB of RAM vs 1GB of RAM, and a much faster clocked CPU! :).

You couldn't be more wrong. The leaps from the NES to the SNES, SNES to the N64, and N64 to the Gamecube were all much larger. The leap from the SNES to the N64 was over a hundred times in raw power; that was the biggest leap in console history, by the way.
 
This comes across as some rather desperate reaching, from someone who either hasn't played any of the major fighting games this gen or is wilfully blind.

Try a quick Google search of Marvel vs Capcom 4. It's certainly arguable that Smash Bros is better looking, but to claim that it is some kind of technical leap ahead is simply delusional.

There is no Marvel vs. Capcom 4, and I've made no such claim to dispute. You are attacking things that are all off to the side of what I was addressing. I have had 3 responses that did not address a single thing that I pointed out.
 
This comes across as some rather desperate reaching, from someone who either hasn't played any of the major fighting games this gen or is wilfully blind.

Try a quick Google search of Marvel vs Capcom 4. It's certainly arguable that Smash Bros is better looking, but to claim that it is some kind of technical leap ahead is simply delusional.

He is comparing it directly to All-Stars in most cases, and to be fair it does look better; by how much is up for debate, and I don't imagine All-Stars had even half the budget of Smash U or people half as talented working on it.
 
You couldn't be more wrong. The leaps from the NES to the SNES, SNES to the N64, and N64 to the Gamecube were all much larger. The leap from the SNES to the N64 was over a hundred times in raw power; that was the biggest leap in console history, by the way.

I meant the specs on paper obviously...

I lived and played through the jump from 2D to 3D :p.
 
There is no Marvel vs. Capcom 4, and I've made no such claim to dispute. You are attacking things that are all off to the side of what I was addressing. I have had 3 responses that did not address a single thing that I pointed out.

Yeah, a typing error invalidates my point.

Any discussion is obviously pointless with you, so I'll gladly leave.
 
AFAIK it was an OS update and mostly dealt with how the system's memory was used, as well as allowing interaction before the page is 100% ready. The GPU hasn't changed clock speeds, or there would probably be a bump in power consumption, but I guess we could ask a hacker who has already gotten to the clocks before.
Ahhh! Ok.

No other CPU in the Nintendo CPU family has ever achieved clocks as high as the Espresso. The highest achieved with overclocks was 1.1GHz, I believe, and the performance increase is always small, with a huge increase in heat production and power consumption. This is why it was always annoying when people insisted that the Wii was an overclocked Gamecube. The hardware doesn't overclock well at all.

We do not know the entirety of the Espresso's design, so we can't say it absolutely can't reach a 2x overclock; for all we know, it could very well be underclocked.

I've heard nothing of a GPU power increase.

AFAIK, the CPUs the Nintendo chips are based on are naturally low-clocked CPUs. I think there is more info on this in the Espresso thread.

That is so not what I was asking or pointing to.

While I'm not fond of Apophis's posts, those things you pointed out in those pictures aren't as big as you make them out to be. The Wii U is a much more powerful console than the Wii. The gap between the Wii and Wii U is the largest out of all of the Nintendo consoles since gaming went 3D.
 
...after finding out just how tiny Retro are, I'm not convinced they could produce a really high quality 'AAA' HD action game which would blow the likes of even Killzone 2/3 and Halo 4...

Granted, H4 had a huge budget, but to call it "really high quality" only works for its visual presentation, which ironically made the gameplay part of the, you know, game, suffer.

H4 is (the finest?) example of bean counters spearheading the design with a focus on "accessibility", and a corporate exercise in box-ticking to wantonly copy CoD elements. It should not be heralded as some "bar" for a "high quality AAA HD action game". This is further highlighted by the fact that the vast bulk of the online population has ditched the game, leaving 343i's efforts to "fix" (read: make more like Halo, less like CoD) the game relevant to only a small stalwart population. (As an FYI, I loved H1-ODST, and whilst Reach was different, props to Bungie for genuinely trying to mix it up.)

So whilst those games can be used for "AAA comparison" nonsense, using them as a bar for game quality is not ideal, given the subjective nature of "really high quality".

This is off-topic though and I apologize.
 
2014 will reveal the truth.... can't say much more than that... but it's going to be surprising.

*Starts working on a plan to kidnap and brutally torture Tron#1 to get info*

*Picks up dentist drill*

Is it safe..?

:Oo

Might have to watch that film again now that I'm thinking about it, lol.
 
I don't think we were wrong to expect a 1TF GPU with 4GB of RAM, tbh (even if they went with the exact same CPU). Nintendo simply put at least 30% of the machine's build cost into the tablet controller, which as of right now has yet to show why it even exists beyond off-TV play, and after waiting 6 years for HD Nintendo games the last thing I want to do is play them on a 6" 480p screen...

I think it's also a bit disingenuous to start poking fun at XB1's specs, never mind PS4's, esp as it looks like we are dealing with a 176GFLOP GPU and 1GB of RAM for WiiU, compared with a 1.3TF GPU and 5GB of RAM in XBO and a 1.8TF GPU and 5GB of extremely fast RAM for PS4. The difference is still massive; PS4 is a 10x tech leap over PS3 according to Cerny.

What we expected with those specs was 1.2TFLOPs (XB1 uses 10% of its GPU for the OS), so yeah, it's almost exactly what we expected for Wii U's GPU. The 8GB is obviously more, but if you go back to WUST 1, you would see that the bandwidth is also very close to what we expected (~50GB/s). We saw these specs as a stopgap, a half step between generations; since 2008 PC gamers have enjoyed 1.2TFLOP GPUs, and a year later the HD 5870 came out to declare itself king with an enormous ~2.9TFLOPs. As a PC gamer I expected XB1 to keep up a bit more with the times, though I did speculate that they might go for a weak console just to hug Wii U and change the "twins" dynamic from PS and XB to XB and WU. While that seems to have happened for Microsoft, Sony chose the exact same architecture, while Nintendo went far below anyone's expectations, probably even USC-fan's, who speculated that the Wii U originally would only have 320 ALUs. Funnily enough, that is the MAX that Wii U could have, and even then it is probably 160 ALUs (or, as he and I have speculated, some oddball 30-32 ALUs per SPU).

I'm not "downplaying XB1's power"; I'm saying that next gen specs are disappointing all around. If PS4 could at least perform GI, I'd be satisfied with one of the consoles. Luckily, when the specs were revealed and I saw Unreal Engine 4's compromises, I bought an HD 7950. Pretty sure the 4.3TFLOPs (I have it clocked to 1.2GHz) is enough to handle next gen at a higher fidelity than PS4, and I'm not really worried about the larger memory, since it is too slow to fill, while my card's 384-bit memory bus blows it away.

Personally I'll probably buy all 3 consoles again; the exclusives look interesting, but I don't like the paywall, so I might ignore them for a few years and just enjoy Kart, Smash and X, which all thankfully look great for what they are and will not have tearing and the other graphical irritations I had to put up with for the last 8 years (frame rate issues and pop-ins, to name a few).

Basically, I never expected Nintendo to give us next gen graphics, just to run the same games at a lower fidelity. It was OK because Microsoft was going to be around to push PCs to a new level, but honestly these consoles can't produce anything that PCs haven't already been doing since 2009.
 

It just looks like Sony esp didn't want to eat $250+ losses this time, along with avoiding the heat issues that 3TF+ GPUs and 3GHz+ CPUs would bring; it's really as simple as that.

Both new consoles' games look amazing to me, a real generational leap, whether it's Ryse, Dead Rising 3 and Forza on XBO or Killzone SF, Driveclub and Infamous SS on PS4. BF4 running at 1080p/60fps with a 64-player MP mode is really nice as well.

If you don't game on PC then XBO and esp PS4 are a massive upgrade over PS360 games.

I just really wish Nintendo had gone with the Pro Controller as standard, while offering Wiimotes as an alternative control scheme, and combined a 1TF GPU and 4GB of RAM with the same CPU.

It would have been profitable at $249, the launch games would have been far more impressive (PS360 games running at 1080p/60fps), and they would have had much more third party support than they currently have, including in terms of getting next-gen-only down ports.

I remember being bitterly disappointed when BG said it would be 600GFLOPs at the most in September; I think if I had heard it was 176GFLOPs I would have offed myself, haha.

WiiU is what it is though, although some people in this thread can't seem to accept it.

Nintendo games will look amazing, and that, at the end of the day, is all I wanted: Nintendo games in HD.
 
So I was in the Miiverse snapshot thread, and these 3 Pikmin 3 shots blew my mind.

I'm simply blown away. I'm not a big graphics guy, but these pics really got me.

As I've mentioned before, for me personally the diffuse mapping is the standout feature of Pikmin 3's graphics. It blows away any diffuse mapping we've seen last gen on the PS3 and 360. It's what makes the fruit look so realistic.

I did ask earlier if this was a DX11-equivalent feature but everyone ignored me :o(

I think the lower res floor textures are down to the depth of field effects rather than a lack of power. It's a bit pointless having high res textures on something that's going to be out of focus most of the time.
 
It just looks like Sony esp didn't want to eat $250+ losses this time, along with avoiding the heat issues that 3TF+ GPUs and 3GHz+ CPUs would bring; it's really as simple as that.

Both new consoles' games look amazing to me, a real generational leap, whether it's Ryse, Dead Rising 3 and Forza on XBO or Killzone SF, Driveclub and Infamous SS on PS4. BF4 running at 1080p/60fps with a 64-player MP mode is really nice as well.

If you don't game on PC then XBO and esp PS4 are a massive upgrade over PS360 games.

I just really wish Nintendo had gone with the Pro Controller as standard, while offering Wiimotes as an alternative control scheme, and combined a 1TF GPU and 4GB of RAM with the same CPU.

It would have been profitable at $249, the launch games would have been far more impressive (PS360 games running at 1080p/60fps), and they would have had much more third party support than they currently have, including in terms of getting next-gen-only down ports.

I remember being bitterly disappointed when BG said it would be 600GFLOPs at the most in September; I think if I had heard it was 176GFLOPs I would have offed myself, haha.

WiiU is what it is though, although some people in this thread can't seem to accept it.

Nintendo games will look amazing, and that, at the end of the day, is all I wanted: Nintendo games in HD.

I get the best of both worlds with my PC setup. I'll probably pick up a Wii U by the end of the year, though I might wait until next year so I can finally close the case on the ALU count; people care far too much what it is. Basically, if you don't have enough for GI, then you aren't doing anything that last gen wasn't already doing, especially if you are a PC gamer who has had "next gen" graphics for years. So whether it's 176GFLOPs at 720p with a solid, playable frame rate, or 1080p at 60fps with slightly better lighting, it's pretty much the same game; you can even do this with a PC, just by limiting your frame rate and resolution. The difference to the average gamer is on the unimportant side of noticeable, IMO. (There are certainly some people who can't stand the lower resolution; however, I wear glasses, so the difference is about as unimportant as could be.)

It's a matter of opinion, but I don't see PS4 as next gen, and when I look at XB1, I see a stopgap. When I look at Wii U, I see an improvement over last gen only in the things that really hurt the experience (again, that is sub-HD, screen tearing, frame rate issues and low resolution textures). That is what I see when I look at all 3 consoles. As for being satisfied with XB1's specs, considering that all Microsoft makes is hyper (fantasy) realism games, except for kid stuff on Kinect, I just don't think XB1 offers enough, even though its money hatting is very impressive... If it was smaller, it would probably make a good HTPC, since those lack real gaming performance to stay quiet anyway. I don't know, this is pretty far off topic now, I'm going to stop.

To bring this back on topic, I do wonder exactly what feature set Latte really does carry. Before I hear "DirectX 10.1", I have to point out again that the API could lack DirectX 11-type features even when the hardware has them. If people aren't following, it's like a driver for a GPU that only exposes DirectX 10 when the hardware really supports DirectX 10.1 (DX10 to DX10.1 was actually a bigger update than DX10.1 to DX11, IMO). The HD 3800 series actually did this; if you coded in DirectX 10 for that card, you couldn't get it to do DirectX 10.1 without some trickery. That is what I think of when I hear DirectX 11 isn't natively supported, because, well... it's OpenGL, and a custom API at that. Nintendo's entire goal last year was probably fleshing out all of the original dev kit's GPU features, which were designed around DirectX 10.1. This is where I think it is a bit strange for them to not use a higher feature set; Latte is fairly custom, even by custom-design standards, and we were even comparing it to a DirectX 11 APU. These sorts of ideas are the reason I keep coming back to this thread, not whether Wii U runs on 1 cylinder or 2. I couldn't care less, as the results are right in front of us.
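
To make the driver-versus-silicon point concrete, here's a toy sketch of the situation being described: an API whose feature table lags what the hardware can actually do. The feature names are purely illustrative; Wii U's actual API isn't public, so treat this as a cartoon of the HD 3800 scenario, not documentation:

```python
# Toy illustration of an API exposing less than the silicon supports,
# like the HD 3800's DX10.1-class hardware behind a DX10 driver.
# Feature names are illustrative, not from any real driver.

HARDWARE_FEATURES = {
    "geometry_shaders",    # DX10 class
    "cube_map_arrays",     # DX10.1 class
    "gather4",             # DX10.1 class
    "compute_shaders",     # DX11 class (hypothetical for Latte)
}

API_EXPOSED = {
    "geometry_shaders",
    "cube_map_arrays",
}

hidden = HARDWARE_FEATURES - API_EXPOSED
print("Silicon supports, API doesn't expose:", sorted(hidden))
# Reaching these would take vendor extensions or "trickery", exactly
# the HD 3800 situation described above.
```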
 
I don't see the point in these long combative posts anymore. If you didn't understand my reasoning in the last post, I don't think you ever will.

Edit: We will know very soon if you are right about the manufacturing group making a large difference; every console sold in 2014 should be produced in TSMC's factories, so if the GPU die is significantly smaller, you'd certainly be right. However, if it more or less stays the same size, it would be clear that your assumption is wrong, as die size is the most expensive part of making a chip. AFAIK every chip made for Wii U this year was done by TSMC, but since it is selling so badly, we will need to wait for holiday sales to clear old stock before we can really be sure we have a new Wii U. I haven't bought mine yet, so if I end up waiting until next year, I'll open it up myself and measure the GPU for this thread. It would be an easy case closed if the GPU shrinks significantly.

If you don't want a long combative post then knock it off with the inflammatory remarks and stick to debating the points. Simple as that. I'm done if you are.

I have not found any report that TSMC has done manufacturing this year for Nintendo. The articles I've read say that the Renesas factory closure should not affect Nintendo for another 2-3 years. After that, a complete redesign may be necessary, as it is unclear whether Renesas would allow TSMC to touch their proprietary eDRAM tech. So either they do and sell them the actual machines (in which case nothing will change), or they don't, in which case there will be a more drastic change than just smaller shader blocks.

As you've just pointed out, these cards only hit those numbers in synthetic benchmarks, not in game. During gameplay, for instance, the 5570 hit only 31W average and occasionally peaked at 37W, but in synthetic benchmarks it hit 50W. The WiiU tests are not synthetic/theoretical; they're in game. So they aren't comparable, which was my point.

At no point have I said WiiU will significantly increase its power draw. But the fact is, as shown consistently throughout console power usage testing, power draw does increase as more complex games are released. The fact is 33W is a gameplay result from launch games (I haven't even seen any tests on newer games) and not a max TDP, as you keep referring to it. Whether any of this "closes the gap" isn't really the issue. The issue is with accuracy, and comparing apples to apples.

The benchmarks simulate gameplay conditions and are repeated many times over. I would be more hesitant to make the comparison if there were any indication that Wii U's power consumption were more variable, but the average and peak are basically one and the same! In all the games DF tested, there were no significant spikes. None! In fact, Nintendo tested Wii U's cooling system more than 2,000 times. This sheds some light on what we're seeing: they didn't want any surprises down the road. If anything, maybe they'll optimize a bit more and get the average power consumption down (in the Wii U menu at the very least). But it's probably not a priority, because 32-33W is low already and their cooling system does a good job of keeping up.

As to DDR3 vs GDDR5, I think the difference in power draw is quite large (relative to the power draw of DDR3), and to be honest those tests suggest that. I mean, I don't think only 25% higher power draw should be expected from a card with 25% more ALUs, 25% more TMUs, a 25% higher clock and 100% more RAM. But I'll look into more specific numbers.

2 gigabytes (so likely 4 or 8 chips) of Samsung's GDDR5 @ 1000MHz, 1.5V, on a 128-bit bus is 5.9 watts.

http://www.samsung.com/us/business/oem-solutions/pdfs/Green-GDDR5.pdf
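
If anyone wants to scale that Samsung figure to other clocks or voltages, dynamic power goes roughly as voltage squared times frequency. A crude sketch under that assumption (it ignores static leakage and I/O termination, so it's ballpark only):

```python
# Rough scaling of the Samsung datapoint above: 2GB of GDDR5 at
# 1000MHz, 1.5V on a 128-bit bus = 5.9W. Uses P ~ V^2 * f for the
# dynamic component; ignores leakage and termination.

BASE_WATTS, BASE_VOLTS, BASE_MHZ = 5.9, 1.5, 1000.0

def scaled_power(volts, mhz):
    return BASE_WATTS * (volts / BASE_VOLTS) ** 2 * (mhz / BASE_MHZ)

# The same chips pushed to 1375MHz (5.5Gbps effective) at 1.6V:
print(f"{scaled_power(1.6, 1375):.1f} W")   # ~9.2W
```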
 
He is comparing it directly to All-Stars in most cases, and to be fair it does look better; by how much is up for debate, and I don't imagine All-Stars had even half the budget of Smash U or people half as talented working on it.

In most cases? I compared it to All-Stars once and only once, along with a few other comparisons, when the difference between ports was the topic being discussed.

Please, stop derailing the thread by attacking nonexistent arguments.


While I'm not fond of Apophis's posts, those things you pointed out in those pictures aren't as big as you make them out to be. The Wii U is a much more powerful console than the Wii. The gap between the Wii and Wii U is the largest out of all of the Nintendo consoles since gaming went 3D.

I didn't say they were. That was their own conjuration. I said they were minute details that I never saw done in such a way in last gen games. Then they proceeded with a chain of strawman arguments, attacking the background resolution and aspects around the things I pointed out, without ever actually addressing what I said, but speaking as if those were things I had stated (which is what a strawman argument is).
 
Also, on paper, the Wii to the Wii U is the smallest leap after the Gamecube to the Wii.

How is that? Not only is the memory jump from the Wii to Wii U significantly greater than the one from GC to Wii, but the raw performance AND the ability to use programmable shaders are huge. Not to mention all the features that the GPU has over the Wii.

I didn't say they were. That was their own conjuration. I said they were minute details that I never saw done in such a way in last gen games. Then they proceeded with a chain of strawman arguments, attacking the background resolution and aspects around the things I pointed out, without ever actually addressing what I said, but speaking as if those were things I had stated (which is what a strawman argument is).

Last gen as in...?
PS3 and 360?
Or Wii?
 
How is that? Not only is the memory jump from the Wii to Wii U significantly greater than the one from GC to Wii, but the raw performance AND the ability to use programmable shaders are huge. Not to mention all the features that the GPU has over the Wii.



Last gen as in...?
PS3 and 360?
Or Wii?

All of the above. Go reread the post and examine the areas I pointed out up close. Show me some games with character models that had so many "minute" details, on top of the cloth actually moving. The cloth is not only rendered separately from the body (as opposed to being textured on, like in most cases last gen), but it moves around as well. Minute details like freely moving shirt collars, sleeves, and pants legs, as well as boots with spacing that you can see down into, are things that I don't remember seeing in last gen games.

Things like that would be seen as a waste of resources on the last gen consoles, but their presence here suggests that the GPU has resources to waste.
 
I've seen nothing like this in last gen games. Link's shirt neck, his tunic collar, the tunic's tail, his sleeves (self-shadowing :) ), his pants legs, boot openings, his belt: they are all individually rendered to make them pronounced. They are not simply textured on, as was done in most last gen games.

Eh, looks on par with DOA5 for example.

[Image: DOA5U_GroupB_screenshot_07_EIN_Win.jpg]


Super Smash Bros. doesn't really look good. The character models are OK, the rest is mostly bad.
 

That doesn't even look remotely close.

You can't see into his sleeve or into his shirt. They are textured onto the body, like I said earlier that last gen games did it. The textures themselves are also "far" less detailed and dynamic. Also, the sleeves and collar don't move around with the movement of the body. The openings in the clothes on Link are casting shadows on his body at the collar and sleeve. The picture doesn't show his boots, so I can't tell if you can see the spatial area and shadows in them as well.

Only the lower part of that jacket is moving and that's been done in games since the PSX/N64 era.

That is miles from what I was pointing out.
 
That doesn't even look close.

Comparing to current gen fighting games:


I've seen nothing like this in last gen games. Link's shirt neck, his tunic collar, the tunic's tail, his sleeves (self-shadowing :) ), his pants legs, boot openings, his belt: they are all individually rendered to make them pronounced. They are not simply textured on, as was done in most last gen games.

I don't see anything impressive about his model complexity at all. There's no self-shadowing, and they are not individually rendered either, as they are part of the character model. With that, I suppose you are saying that you could strip the character model naked and add the clothing pieces on him individually, like they do with CG; it's not so. "Simply textured" seems like a weird complaint about current gen, as seen in the comparison shots above. You must be confusing the PS3 and 360 with the PS2.

You are really really reaching.
 
How is that? Not only is the memory jump from the Wii to Wii U significantly greater than the one from GC to Wii, but the raw performance AND the ability to use programmable shaders are huge. Not to mention all the features that the GPU has over the Wii.

Go re-read my post. Gamecube to Wii was the smallest leap.
 
Bowser's teeth are composed of geometric primitives (tetrahedra).
Hey look, I made next gen.

Yeah, but most of those fighting games do not feature four players on the screen and aren't running at 1080p/60fps.
I think Tekken keeps 2 extra characters in memory (it is Tag, after all). I think DOA5 does the same (but they're 720p).
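
On the 1080p/60fps point, the raw pixel-throughput gap is easy to put a number on; a trivial sketch, assuming the resolutions claimed in this thread:

```python
# Raw pixel throughput: 1080p60 pushes 2.25x the pixels of 720p60,
# before overdraw or post-processing is counted.

def pixels_per_second(width, height, fps):
    return width * height * fps

p1080 = pixels_per_second(1920, 1080, 60)   # ~124.4M pixels/s
p720 = pixels_per_second(1280, 720, 60)     # ~55.3M pixels/s
print(f"1080p60 vs 720p60: {p1080 / p720:.2f}x")   # 2.25x
```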
 
Comparing to current gen fighting games:





I don't see anything impressive about his model complexity at all. There's no self-shadowing, and they are not individually rendered either, as they are part of the character model. With that, I suppose you are saying that you could strip the character model naked and add the clothing pieces on him individually, like they do with CG; it's not so. "Simply textured" seems like a weird complaint about current gen, as seen in the comparison shots above. You must be confusing the PS3 and 360 with the PS2.

You are really really reaching.

I'm still not seeing "any" of the specific things I addressed in the screenshots you posted. Not a single one. They aren't even wearing shirts in the Tekken shot. The closest you came to it is the pants leg shot, but it's still a mile off. His sleeves and collar are still drawn on his body, and the textures do not remotely look like real cloth. They aren't even high res. The pants legs aren't loose either; their motion is stiff and baked in that game. It's nowhere near close. I do not know what you are trying to dismiss with those screens.


I don't care what "impresses" you, as I doubt you would find anything released on Nintendo hardware impressive. Please take the console war stuff elsewhere. Too many people come in here for no other reason than to find angles to downplay the proposed gains demonstrated by the hardware, not to actually help the analysis. My goal is to help determine the capability of the GPU, not to see which system/game you find impressive.

I'm talking about specific details. I've yet to see anyone provide anything to the contrary of what I said. If you have some photos of character models on the 360/PS3 demonstrating the exact traits I pointed out, at the same scale, then that would be a different story.
 
Must say, the locations of creases and stuff on Link all remain in the same place regardless of pose, so it just looks like a higher-textured model (but it'd be pointless to make it impressively high for this scenario anyway; then, as others say, there's not much in the details of the level and background, so there are saved resources there).
[Note: not downplaying, just not getting as excited; at this point we might as well wait for TW101 to come out and see if there's anything neat in there.]

I imagine the place this game will look impressive is in motion, due to res+fps, both meaning it's not gonna be an outright showcase for screenshot graphics.
Edit: Just to clarify, I know nothing of graphic analysis; this was a 1 minute look between those pics posted.
 
I don't care what "impresses" you, as I doubt you would find anything released on Nintendo hardware impressive. Please take the console war stuff elsewhere. Too many people come in here for no other reason than to find angles to downplay the proposed gains demonstrated by the hardware, not to actually help the analysis. My goal is to help determine the capability of the GPU, not to see which system/game you find impressive.

Don't dare post unless you agree with Krizz!...

That post pretty much sums him up, along with the usual 'You won't find anything Nintendo do impressive'. /sigh
 

Wow

Nasty low poly stage, nasty grass, low res textures, low poly boots. There are current gen games that look quite a bit nicer, as have been posted above. Some effects are nicer, but that doesn't make up for all the low poly, low detail areas in the image. Wow, I mean wow.
 
I get the best of both worlds with my PC setup. I'll probably pick up a Wii U by the end of the year, though I might wait until next year so I can finally close the case on the ALU count; people care far too much what it is. Basically, if you don't have enough for GI, then you aren't doing anything that last gen wasn't already doing, especially if you are a PC gamer who has had "next gen" graphics for years. So whether it's 176GFLOPs at 720p with a solid, playable frame rate, or 1080p at 60fps with slightly better lighting, it's pretty much the same game; you can even do this with a PC, just by limiting your frame rate and resolution. The difference to the average gamer is on the unimportant side of noticeable, IMO. (There are certainly some people who can't stand the lower resolution; however, I wear glasses, so the difference is about as unimportant as could be.)

It's a matter of opinion, but I don't see PS4 as next gen, and when I look at XB1, I see a stopgap. When I look at Wii U, I see an improvement over last gen only in the things that really hurt the experience (again, that is sub-HD, screen tearing, frame rate issues and low resolution textures). That is what I see when I look at all 3 consoles. As for being satisfied with XB1's specs, considering that all Microsoft makes is hyper (fantasy) realism games, except for kid stuff on Kinect, I just don't think XB1 offers enough, even though its money hatting is very impressive... If it was smaller, it would probably make a good HTPC, since those lack real gaming performance to stay quiet anyway. I don't know, this is pretty far off topic now, I'm going to stop.

That's the first time I have ever read anyone say that, tbh, but I respect your opinion. Also remember you are comparing them to a $500 GPU and probably what? Another $500 worth of PC hardware on top of it; they were never going to compete, esp PS4 at $400.

I agree completely with your assessment of WiiU though; that's exactly what I think it is, last gen++, meaning native 720p, no tearing, more stable framerates, slightly better textures and really, really nice looking Nintendo games in motion, which is great!
 
Does anyone that posts in here have the required tool to take a power consumption reading from a WiiU while it's running?

It would be good to see how much it uses while W-101 is being played, as it looks to be the game that pushes WiiU the hardest so far (720p native / 60fps / more modern graphical effects).

If it's still only pushing 33W then that can be assumed to be the max while gaming; the extra 10W is most probably reserved for USB devices.

I loaded up the W101 demo with my Wii U plugged into a "Kill A Watt" meter. I'd say the accuracy of the meter is around +-1 watt, since it fluctuates between 0 and 1 watt without anything plugged in.

The max wattage I saw used for the length of the demo was 32 watts (I had about 86 on my team, and it was the scene rescuing citizens out of a fire). The minimum was during the credits and was 28 watts.

I was curious to see if the wattage was higher inside the hangar, where the game takes place on the GamePad and the TV screen shows the outside of the hangar (two different views). There was no difference, but to be fair the TV image at that point in the game is incredibly basic, no more complex than an inventory screen or map. On average it stayed around 30 or 31 watts.

Other notes on things people might ask: the demo is installed to the Wii U internal storage, so it wasn't reading from disc. My TV is HDMI (I didn't use the composite hookup), I had no sensor bar plugged in, and I had no USB drives plugged in.
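
For what it's worth, here's the arithmetic on those readings with the meter's tolerance folded in; the "typical" value is just the midpoint of the reported 30-31W:

```python
# Summary of the W101 demo readings above, with the Kill A Watt's
# roughly +/-1W tolerance applied to each figure.

readings_w = {"max": 32.0, "min": 28.0, "typical": 30.5}
METER_TOLERANCE_W = 1.0

for label, watts in readings_w.items():
    low, high = watts - METER_TOLERANCE_W, watts + METER_TOLERANCE_W
    print(f"{label}: {low:.1f}-{high:.1f} W")

# Even at the top of the tolerance band, gameplay draw is ~33W, in
# line with the 32-33W launch-game numbers discussed earlier.
```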
 

Thanks a lot for doing that. Although I don't think W-101 is by any means the max of what WiiU could output, it is a bit of a tech showcase, running at 720p native and 60fps with nice depth of field and some other effects.

32 watts seems to be the absolute maximum at present, then.
 
Didn't Iwata say a while back that the Wii U could draw up to 40W of power or am I remembering things wrong..?

Yes, he said that was a "realistic expectation". But his statement was so vague that we don't know if he meant with a bunch of things plugged into the console or if the console itself could use that much power.
 
I loaded up the W101 demo with my Wii U plugged into a "Kill A Watt" meter. I'd say the accuracy of the meter is around +-1 watt, since it fluctuates between 0 and 1 watt without anything plugged in.

The max wattage I saw used for the length of the demo was 32 watts (I had about 86 on my team, and it was the scene rescuing citizens out of a fire). The minimum was during the credits and was 28 watts.

I was curious to see if the wattage was higher inside the hangar, where the game takes place on the GamePad and the TV screen shows the outside of the hangar (two different views). There was no difference, but to be fair the TV image at that point in the game is incredibly basic, no more complex than an inventory screen or map. On average it stayed around 30 or 31 watts.

Other notes on things people might ask: the demo is installed to the Wii U internal storage, so it wasn't reading from disc. My TV is HDMI (I didn't use the composite hookup), I had no sensor bar plugged in, and I had no USB drives plugged in.

Great work. Did you have a disc in the drive at all? From what I've read, the drive seems to be active as long as any disc is in there, but it only accounts for an increase of ~1.6W.

Yes, he said that was a "realistic expectation". But his statement was so vague that we don't know if he meant with a bunch of things plugged into the console or if the console itself could use that much power.

Indeed; it seems to me he was allowing for one USB device plugged in when making that statement. But it was vague...
 
Great work. Did you have a disc in the drive at all? From what I've read, the drive seems to be active as long as any disc is in there, but it only accounts for an increase of ~1.6W.

Not anymore since the last (or the one before, dunno) update.


wilsoe2 said:
I loaded up the W101 demo with my Wii U plugged into a "Kill A Watt" meter. I'd say the accuracy of the meter is around +-1 watt, since it fluctuates between 0 and 1 watt without anything plugged in.

The max wattage I saw used for the length of the demo was 32 watts (I had about 86 on my team, and it was the scene rescuing citizens out of a fire). The minimum was during the credits and was 28 watts.

I was curious to see if the wattage was higher inside the hangar, where the game takes place on the GamePad and the TV screen shows the outside of the hangar (two different views). There was no difference, but to be fair the TV image at that point in the game is incredibly basic, no more complex than an inventory screen or map. On average it stayed around 30 or 31 watts.

Other notes on things people might ask: the demo is installed to the Wii U internal storage, so it wasn't reading from disc. My TV is HDMI (I didn't use the composite hookup), I had no sensor bar plugged in, and I had no USB drives plugged in.


Is this Kill A Watt meter some kind of professional device, or is it just some ~$20-50 device?
 
Great work. Did you have a disc in the drive at all? From what I've read, the drive seems to be active as long as any disc is in there, but it only accounts for an increase of ~1.6W.

Haha, certainly! I've thoroughly enjoyed reading yours and everyone else's analysis on here; it's my pleasure to make even a small contribution. To answer your question, yes, there was a disc in the drive.
 
Didn't Nintendo themselves say that the TDP of the Wii U was 75 watts? With games running at 32 watts at peak, it makes me wonder if they are preparing for an overclock with the next "speed" update, allowing more power to be drawn to accommodate the new overclocked speeds.
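
A back-of-the-envelope check on that headroom, assuming the oft-quoted 75W rating is the PSU ceiling and the four USB 2.0 ports each reserve the spec's 2.5W (500mA at 5V); those assumptions are mine, not Nintendo's numbers:

```python
# Headroom check: rated supply vs measured gameplay draw. The USB
# reserve (4 ports * 2.5W) is an assumption based on the USB 2.0 spec.

PSU_RATING_W = 75.0
MEASURED_GAMEPLAY_W = 32.0
USB_PORTS, USB_PORT_W = 4, 2.5

usb_reserve = USB_PORTS * USB_PORT_W
headroom = PSU_RATING_W - MEASURED_GAMEPLAY_W - usb_reserve
print(f"USB reserve: {usb_reserve:.0f}W")
print(f"Headroom:    {headroom:.0f}W")   # ~33W unaccounted for

# A big gap between rating and draw is normal (PSUs are rated for the
# worst case), so this alone isn't evidence of a planned overclock.
```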
 