WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

The GTX 780 is a more sophisticated chip. We know the Xbone and PS4 are more sophisticated chips too, even if we don't know the exact details; that's all I was saying. So just by judging clock speed we know the Wii U is less powerful.

The point being made is that core clock does not put any piece of silicon in a gen. No one is implying or suggesting it's comparable to X1 or PS4 in overall power.
 
You're the one making definitive statements to that effect, you tell me!

There is no point. The page clearly indicated that this is not a technique enjoyed by the majority of console games due to various hardware limitations.

Things are moving "toward" deferred rendering now that DX10-and-beyond hardware with more memory is available.

The answer is clear. "Most" games were forward rendered on the PS3/360. I see nothing that points otherwise.
 
There is no point. The page clearly indicated that this is not a technique enjoyed by the majority of console games due to various hardware limitations.

Things are moving "toward" deferred rendering now that DX10-and-beyond hardware with more memory is available.

The answer is clear. "Most" games were forward rendered on the PS3/360. I see nothing that points otherwise.

You're pissing into a hurricane of made up nonsense. Just go back through the last four or five pages for more examples.

It's best to just smile and move on.

:)
 
There is no point. The page clearly indicated that this is not a technique enjoyed by the majority of console games.

Things are moving "toward" deferred rendering now that DX10 and hardware with more memory is available.

The answer is clear. Most games were forward rendered on the PS3/360. I see nothing that points otherwise.

It has nothing to do with DX10. DX10 is just what made deferred rendering compatible with MSAA.
 
Edit to add more information: Yes. A lot of engines use deferred rendering, people didn't wait for next gen to start on it though. "Most" games might still be forward rendered on current gen but it's not like 99%/1%.

I'm not sure I see anything in the Wii U that makes me think it was built for deferred rendering. It's nice that they have 32MB of eDRAM vs. the 10MB in the 360, and maybe it can render to more render targets at once? I guess I see those as general GPU improvements and not "it's MADE for deferred rendering"...although it is true the technique became popular while the system was in R&D. If there are limitations that force all buffers to sit in eDRAM for the GPU to see them, that seems like it would be a drawback. We've never gotten much information about that, aside from the fact that devs do not have direct access to the eDRAM, right?
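For some rough numbers on why the eDRAM size matters here (back-of-the-envelope only, and assuming a fairly generic G-buffer layout rather than any specific engine's):

# Rough render-target sizing at 720p -- illustrative only, actual layouts vary per engine.
width, height = 1280, 720
pixels = width * height

# Forward rendering: one RGBA8 color target + 32-bit depth/stencil
forward_bytes = pixels * (4 + 4)

# Deferred shading: e.g. four RGBA8 targets (albedo, normals, specular, misc) + depth
deferred_bytes = pixels * (4 * 4 + 4)

MiB = 1024 * 1024
print(f"forward  ~{forward_bytes / MiB:.1f} MiB")   # ~7.0 MiB  -> fits in 10 MiB of eDRAM
print(f"deferred ~{deferred_bytes / MiB:.1f} MiB")  # ~17.6 MiB -> wants 32 MiB (or tiling)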

wsippel has some semblance of an explanation many pages back for his logic IIRC so perhaps he'll chime in. Just thought it was an interesting point that hasn't been discussed much.
 
Yes. Yes, they did. Ever since developers realized that they could.

Which was around 2008.

Is there any source that supports this claim?

Saying the technique has been used since then, and saying the majority of games (most) have used the technique since then, are two different things.

It has nothing to do with DX10. DX10 is just what made deferred rendering compatible with MSAA.

I know it has nothing to do with DX10 itself. I never said it did. I'm pretty sure I stated that DX10+ led to it becoming more feasible to use, not that it enabled its use. Otherwise it would have been completely impossible for the last gen hardware which clearly is not the case.


Passive aggressive personal attacks as opposed to addressing what has been said or the topic. How new.
 
The point being made is that core clock does not put any piece of silicon in a gen. No one is implying or suggesting it's comparable to X1 or PS4 in overall power.

Well, it's not like a 4 GHz i5 is going to output graphics like a custom GPU running at 800 MHz. Nor can we judge the relative merits of these three consoles given only GPU clock speed. But given what little we know about the Wii U, its clock speed is one of the big indicators that it is nowhere near as powerful or sophisticated as the other two, and a hint that it is closer to the 7th generation consoles than the 8th. The tech the 8th gen is based on is architecturally capable of running at much higher speeds.

Is there any source that supports this claim?

So wait, so every major AAA game enjoyed deferred rendering? What does that say about the hardware and its capabilities?
 
Well, it's not like a 4 GHz i5 is going to output graphics like a custom GPU running at 800 MHz. Nor can we judge the relative merits of these three consoles given only GPU clock speed. But given what little we know about the Wii U, its clock speed is one of the big indicators that it is nowhere near as powerful or sophisticated as the other two, and a hint that it is closer to the 7th generation consoles than the 8th. The tech the 8th gen is based on is architecturally capable of running at much higher speeds.

No. There's nothing stopping a VLIW product from running at 800MHz or close to it. What's stopping this VLIW product from reaching 800MHz is Nintendo's thermal design target. The GPU featureset is much closer to the 8th gen consoles, and the raw power is much closer to the 7th.
 
Is there any source that supports this claim?

Tons. If you google it. Or just follow graphics technology.

Gears of War was using deferred shading back in 2006. It's what forced PC users to use driver flags to enable MSAA in DX9 mode.

Link

Edit: And this is UE3 a.k.a. the most used graphics engine in history.
 
Well, it's not like a 4 GHz i5 is going to output graphics like a custom GPU running at 800 MHz. Nor can we judge the relative merits of these three consoles given only GPU clock speed. But given what little we know about the Wii U, its clock speed is one of the big indicators that it is nowhere near as powerful or sophisticated as the other two, and a hint that it is closer to the 7th generation consoles than the 8th. The tech the 8th gen is based on is architecturally capable of running at much higher speeds.

Let's not pretend that the parts in the upcoming consoles are anything above tablet CPUs and laptop GPUs. The Jaguar APU is still comparable to the crappy Intel Atom line of procs, and the PS4's GPU is still showing signs of being less capable than a 7970M.

There are finite limits to overclocking those two hardware configurations, especially in a chassis only slightly bigger than a laptop's.
 
Tons. If you google it. Or just follow graphics technology.

Gears of War was using deferred shading back in 2006. It's what forced PC users to use driver flags to enable MSAA in DX9 mode.

Link

Okay?

Gears of War is one game. There were hundreds of games released on the PS3/360 this gen.

How does Gears of War having deferred shading in 2006 prove that most PS3/360 games used it? Gears of War was a AAA exclusive. I would expect an outlier like that to use the best the console had to offer.

As for UE3. It supported a lot, but that doesn't mean that most games that used it, used deferred rendering. That is a leap in logic.

If most games used it, then that would likely have been stated somewhere, as it would then be the norm and not be worth bringing up to begin with. That would in turn invalidate the argument you initially made, where you tried to say that PS All Stars using it was something to take into account when comparing, or something that invalidates the comparison. If all games use it now, then that means nothing.

Remember, you are the one who made the initial claim suggesting that deferred rendering was something "special" to be taken into consideration when making the comparison.
 
No. There's nothing stopping a VLIW product from running at 800MHz or close to it. What's stopping this VLIW product from reaching 800MHz is Nintendo's thermal design target. The GPU featureset is much closer to the 8th gen consoles, and the raw power is much closer to the 7th.

You're right. I don't think the point I was making was correct. I think what I have in mind is that its raw power and its similarity to 7th generation performance come down to the chipset's clock speed. If it were closer to the Xbone or PS4, then it would be much harder to judge their relative performance.
 
Okay?

Gears of War is one game. There were hundreds of games released on the PS3/360 this gen.

How does Gears of War having deferred shading in 2006 prove that most PS3/360 games used it? Gears of War was a AAA exclusive. I would expect an outlier like that to use the best the console had to offer.

Is this some kind of sick joke?

I'm not going to do your homework for you. And I'm not going to name every single example of deferred rendering in modern gaming when it's an inherent part of modern rendering engines.

The fact that you don't know this means that you don't know anything about modern rendering engines.
 
Is this some kind of sick joke?

I'm not going to do your homework for you. And I'm not going to name every single example of deferred rendering in modern gaming when it's an inherent part of modern rendering engines.

The fact that you don't know this means that you don't know anything about modern rendering engines.

Once again, you are the one who started this argument about deferred rendering as an attempt to dismiss the comparison of Super Smash Bros U to PlayStation All-Stars Battle Royale.

The burden of proof is on you. It is your homework, for you made the claim, not I.

Going by what you have just tried to claim, your entire argument was invalid from the start. If all modern games are using it, then there was never a point to differentiate and the comparison is still a valid comparison.

I hope you do not expect me to just take one game with a massive budget and a skilled team and conclude that over 50% of all games were the same. Forgive me for not taking your word on the matter when available info has pointed to the contrary.
 
What a terrible topic this has become. Once, way, way in the past, it actually was a place where people talked about the chip. Those were good times.
 
What a terrible topic this has become. Once, way, way in the past, it actually was a place where people talked about the chip. Those were good times.

There has been much talk about the chip in spots in the last few pages. It just keeps getting sidetracked by individuals who like to make every effort they can to downplay anything good that comes out of it and impede advancement.

The biggest thing was the mention by a credible source that Bayonetta 2 was 1080p. That would suggest far more capability in the GPU than previously believed and allow for better estimates of performance.

The current estimated range is anywhere from 176 to 352 GFLOPS. It could help us narrow that down. The capability of the eDRAM would also come into play. Then there was the bit about the capability of the CPU that came up a little earlier. It has not been entirely fruitless.
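For reference, that 176-352 figure just falls out of the usual shader-count guesses multiplied by the reported clock (assuming 160 vs. 320 ALUs and the roughly 550 MHz clock that has been cited; neither ALU count is confirmed):

# Peak FLOPS = ALUs x 2 ops per cycle (multiply-add) x clock.
clock_hz = 550e6          # ~549.9 MHz reported for Latte; treat as approximate
for alus in (160, 320):   # the two shader counts people keep arguing over
    gflops = alus * 2 * clock_hz / 1e9
    print(f"{alus} ALUs -> ~{gflops:.0f} GFLOPS")
# 160 ALUs -> ~176 GFLOPS, 320 ALUs -> ~352 GFLOPS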

Everything went haywire with people trying to dismiss it, though. That is likely not going to stop. You can expect this every time a new achievement is shown or someone posts new media for analysis. So long as anyone suggests that the Wii U is more capable than the absolute lowest estimate, people will come in here to derail the thread.
 
In terms of performance AND efficiency, can we compare the Wii U to anything? Whatever it does, it appears to be doing it very, very efficiently. Did we ever get any hard answer on the wattage it's drawing?
 
When comparing identical parts, clock speed is great. Ultimately the GPUs in the Wii U, Xbone and PS4 are inherently quite similar. Having a low clock speed is a major detriment. We cannot use it to get an exact percentage of how much less powerful it is, but we can tell by the speed alone that the Wii U doesn't hold a candle to the more sophisticated chips.

One thing we all have to remember is that there's around a third of the silicon on the GPU that's a complete mystery...although it wouldn't surprise me if Nintendo have filled it full of silicon that does absolutely fuck all just to fuck with us lololol

*Iwata reads post*

(laughs)
 
In terms of performance AND efficiency, can we compare the Wii U to anything? Whatever it does, it appears to be doing it very, very efficiently. Did we ever get any hard answer on the wattage it's drawing?

Maybe a 7790, but that's a PC design; the only reason I mention it is that it has insane power-to-performance in benchmarks.
 
Maybe a 7790, but that's a PC design; the only reason I mention it is that it has insane power-to-performance in benchmarks.

The leading chip for comparison, I thought, was Brazos. An HD6XXX chip.

There seem to be a lot of similarities in the design of the components. Honestly, this chip is too custom made to compare to any one GPU.
 
You're right. I don't think the point I was making was correct. I think what I have in mind is that its raw power and its similarity to 7th generation performance come down to the chipset's clock speed. If it were closer to the Xbone or PS4, then it would be much harder to judge their relative performance.

Its performance isn't similar when you have two exclusives at 720p 60fps. Watch a couple of videos and compare.
 

Double Sigh.

Stevie, can a GPU perform above its expected performance (relative to hardware) if the memory subsystem is tailored to squeezing the most out of the GPU? The "punching above its weight" comments come to mind when I think about this.

I thought the punching above its weight comment pertained to the CPU, not the GPU.

You are probably right, but I think I heard this from another dev in relation to the whole system ..... checking....
 
I thought the punching above its weight comment pertained to the CPU, not the GPU?

EDIT: My mistake. That was about the overall hardware.

http://www.eurogamer.net/articles/digitalfoundry-need-for-speed-most-wanted-wii-u-behind-the-scenes

Honestly, the best we have to go on for performance is Shin'en's comment. They have told us the most.

I'm still interested in the GPU/CPU conjunctive speed boost they spoke of. No one seems interested in exploring that.

The sooner we can accept it, the better this thread will be.

I accept nothing one way or the other without sound proof.

If you wish to believe the hardware is not that much stronger and will never achieve anything much greater than the last gen consoles, I will not try to dissuade you, but why do you want everyone else to believe it with you? Why must everyone think this?
 
Double Sigh.

Stevie, can a GPU perform above its expected performance (relative to hardware) if the memory subsystem is tailored to squeezing the most out of the GPU? The "punching above its weight" comments come to mind when I think about this.

You're better off posing that kind of question to members who are well versed in not only console programming, but GPU programming in general (such as blu). I don't know much about data movement within a multilayered memory subsystem.

With that said, even if the console "punches above its weight" - it is what it is. A small, quiet, compact 8th gen Nintendo box capable of producing stuff that's a bit nicer looking than the previous generation when put to use correctly. But not by leaps and bounds, or any such exaggeration as presented in this thread.

The sooner we can accept it, the better this thread will be.
 
The leading chip for comparison, I thought, was Brazos. An HD6XXX chip.

There seem to be a lot of similarities in the design of the components. Honestly, this chip is too custom made to compare to any one GPU.

While I get your last bit, the only reason I mentioned that chip was because of its power and performance. I didn't mean to compare them in feature set; rather, considering what we know about the Wii U GPU (or the utter lack of it), we can't really make a comparison.
 
Passive aggressive personal attacks as opposed to addressing what has been said or the topic. How new.
Cool, let's discuss what you've actually said then.


There isn't one.

This is just what most hardware is geared toward in the current day. It would make the code the most portable. That doesn't matter in exclusives though, which is why the Wii U has so many 1080p 60fps retail exclusives.
Name them.

No, that is not remotely fact. Not unless you know something about the CPU and GPU capabilities that the rest of us don't.

DX11 features, GPGPU, Shader Model 4/5 and adaptive tessellation were not available on the last gen consoles in any form.

On top of that, we are seeing more than double the polygon count in a lot of games when compared to similar last gen games, and a higher number of individual effects, all while outputting to two screens at once. The raw power of the Wii U is no less than 2x that of the last gen consoles.
Give me the numbers you used to arrive at this definitive statement.

What I'm saying in a not so subtle way is that you're full of shit. You spout this stuff off with absolutely nothing to back it up and then deflect or ignore anyone that calls you on it. I'd rather be aggressive-aggressive than passive-aggressive and I guess I'll have to deal with the consequences.
 
Is this some kind of sick joke?

I'm not going to do your homework for you. And I'm not going to name every single example of deferred rendering in modern gaming when it's an inherent part of modern rendering engines.

The fact that you don't know this means that you don't know anything about modern rendering engines.
I'm most interested in the Wii U early ports with regard to this. For example, with a quick search I learned that AC3 uses deferred lighting but not deferred shading, and BO2 uses no deferred rendering at all.
 
I'm most interested in the Wii U early ports with regard to this. For example, with a quick search I learned that AC3 uses deferred lighting but not deferred shading, and BO2 uses no deferred rendering at all.

Exactly, and going by the list, it seems to have been used before, mostly deferred lighting.

Deferred shading seems to have the most impact on memory; I think the list is clear that this was not a standard on current gen due to the memory.

I am clear that the Wii U is not a beast, but I am particularly interested in little details like this, which when combined could bring performance and IQ boosts to Wii U games. Using deferred rendering, plus using the CPU and memory subsystem as they should be used, should produce some cool results IMO. I think we have two extremes in this thread: some people just look at ports and think the Wii U is maxed out, but when XB1 games are going back to 720p60 (KI), that must mean that devs need time with the hardware. I am calling that bullshit out too!!
 
You're better off posing that kind of question to members who are well versed in not only console programming, but GPU programming in general (such as blu). I don't know much about data movement within a multilayered memory subsystem.

With that said, even if the console "punches above its weight" - it is what it is. A small, quiet, compact 8th gen Nintendo box capable of producing stuff that's a bit nicer looking than the previous generation when put to use correctly. But not by leaps and bounds, or any such exaggeration as presented in this thread.

The sooner we can accept it, the better this thread will be.

Not gonna happen. Too many people want to be right and even have drawn their line in the sand. I've tried to follow the thread as a lurker for the most part, but people seem too invested in their own view (both for and against) that there's no way there will be any meaningful discussion. It's like having a discussion about religion. I have no desire to continue reading this kind of stuff. Seems to happen in just about every Wii U thread. I guess there's just no way around it, so time to bail out.
 
I'm most interested in the Wii U early ports with regard to this. For example, with a quick search I learned that AC3 uses deferred lighting but not deferred shading, and BO2 uses no deferred rendering at all.

Ya, the CoD engine still uses forward rendering. At 880x720 (1024x600 in earlier games) it fits nicely into the 360's eDRAM with 2xAA.
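Quick sanity check on the "fits nicely" part (back-of-the-envelope only, assuming 32-bit color plus 32-bit depth/stencil per sample):

# Xenos eDRAM budget check for a forward-rendered frame with 2x MSAA.
EDRAM = 10 * 1024 * 1024                 # 10 MiB of eDRAM on the 360

def framebuffer_bytes(w, h, samples):
    # 4 bytes color + 4 bytes depth/stencil per sample
    return w * h * samples * (4 + 4)

print(framebuffer_bytes(880, 720, 2) <= EDRAM)    # True  (~9.7 MiB, just squeezes in)
print(framebuffer_bytes(1280, 720, 2) <= EDRAM)   # False (~14.1 MiB, would need tiling)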

Darksiders 2, Arkham City, and Mass Effect 3 all use deferred techniques. Same as on 360/PS3/PC.

Edit: The main drawback of using deferred rendering is that it tends to kill MSAA support in DX9. It's also the chief reason we see post-process AA instead of MSAA in so many games nowadays. AMD released a demo last year showcasing the benefits of forward rendering while using compute shaders to retain some of the benefits of the deferred approach. The ability to use MSAA being the main benefit of going back to FR.
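That AMD demo is the tiled forward ("Forward+") idea. Here's a minimal toy sketch of just the light-culling step, in 2D screen space with made-up lights; this is an illustration of the concept, nothing like production shader code:

import math, random

TILE = 16                       # tile size in pixels
W, H = 1280, 720
random.seed(0)

# Hypothetical point lights: (screen_x, screen_y, radius_in_pixels)
lights = [(random.uniform(0, W), random.uniform(0, H), random.uniform(20, 120))
          for _ in range(256)]

def lights_for_tile(tx, ty):
    """Return indices of lights whose bounding circle touches this tile."""
    x0, y0 = tx * TILE, ty * TILE
    x1, y1 = x0 + TILE, y0 + TILE
    hits = []
    for i, (lx, ly, r) in enumerate(lights):
        # distance from the light center to the tile rectangle
        dx = max(x0 - lx, 0, lx - x1)
        dy = max(y0 - ly, 0, ly - y1)
        if dx * dx + dy * dy <= r * r:
            hits.append(i)
    return hits

tiles_x, tiles_y = math.ceil(W / TILE), math.ceil(H / TILE)
per_tile = [lights_for_tile(tx, ty) for ty in range(tiles_y) for tx in range(tiles_x)]
avg = sum(len(t) for t in per_tile) / len(per_tile)
print(f"avg lights per tile: {avg:.1f} (vs {len(lights)} total)")

# The forward shading pass then only loops over each tile's short list instead of all
# 256 lights, and MSAA still works because there's no fat G-buffer to resolve.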
 
I thought the punching above its weight comment pertained to the CPU, not the GPU?

EDIT: My mistake. That was about the overall hardware.

http://www.eurogamer.net/articles/digitalfoundry-need-for-speed-most-wanted-wii-u-behind-the-scenes

Honestly, the best we have to go on for performance is Shin'en's comment. They have told us the most.

I'm still interested in the GPU/CPU conjunctive speed boost they spoke of. No one seems interested in exploring that.

I accept nothing one way or the other without sound proof.

If you wish to believe the hardware is not that much stronger and will never achieve anything much greater than the last gen consoles, I will not try to dissuade you, but why do you want everyone else to believe it with you? Why must everyone think this?

I thought Eurogamer were 'biased' against the Wii U? Or is that only when they disagree with your view of the console?
 
The Wii U is like a perfected 7th gen console. The GPU is quite a piece of work for what, 15 or 20 watts at the max. Makes you wonder what they could have pulled off without the gamepad if they had opted for a beefier GPU....or even just designed a more robust cooling system (making the system larger) and run the GPU faster.
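Purely for scale, taking the thread's own guesses at face value (the 176-352 GFLOPS range from earlier and the 15-20 W GPU estimate above, none of which is confirmed):

# Perf-per-watt using the thread's own (unconfirmed) numbers.
for gflops in (176, 352):
    for watts in (15, 20):
        print(f"{gflops} GFLOPS @ {watts} W -> {gflops / watts:.0f} GFLOPS/W")
# Somewhere around 9-23 GFLOPS/W, depending on which guesses you believe.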

:(
 
The Wii U is like a perfected 7th gen console. The GPU is quite a piece of work for what, 15 or 20 watts at the max. Makes you wonder what they could have pulled off without the gamepad if they had opted for a beefier GPU....or even just designed a more robust cooling system (making the system larger) and run the GPU faster.

Naw, the CPU is still an oddball to program for, so they would have had to swap that out for it to be perfect. Nintendo has trouble with this system; what chance do 3rd parties have?

Wow. It seems like a bunch of you guys are talking to this giant talking wall.

This is not a discussion. This is just nonsense.

Yeah, Krizz is a piece of work or something.
 
I would much rather a system have a refined triple-core G3 than that piece of crap PPE that was in both of the previous generation HD consoles. (Edit: and yes, an old refined G3 isn't going to give results as good as a Jaguar - but we were talking previous generation comparisons).
 
I thought Eurogamer were 'biased' against the Wii U? Or is that only when they disagree with your view of the console?

Huh? The comment was made by Criterion, not Eurogamer, and even if it were, a single positive statement would not change my stance toward their methods.

Also, what is my supposed "console view" that you speak of? That I don't see a reason to bash a console because of perceived specs?

Yeah, Krizz is a piece of work or something.

Another personal attack? How many does that make today? Apparently, not agreeing with unsupported opinions is a terrible thing. I'm not the topic of this thread.

Notice that I don't attack, insult or bash people when they don't agree with me. That's because I actually came here to discuss the topic. Attacking a person in an argument when it doesn't go your way is a tactic of people with a negative agenda.

I always say it's fine if you don't accept what I say as fact, as I don't expect you to. It's an opinion. But if I don't agree with your opinion, then apparently I'm a "piece of work" or some other negative inference. How mature and progressive.
 
I think it would be better if you just ignore the personal attacks and keep on topic. You aren't going to please anyone saying the things you are saying. It's your opinion; just keep moving and forget the personal attacks.

That would be simple if it didn't happen so frequently, but you're right. At least I will maintain my maturity in the discussion.

On that note, back to the topic. We now have more footage of Sonic Lost World (it's a 1080p trailer, for what that's worth).

http://www.youtube.com/watch?v=RyeX6XG0k8w

I must say, those are some very round models. These high polygon counts at 60 FPS point to more than just being a little better than the last gen, in my opinion.
 
Name them.
I bet the number is lower than '30 games'. :P

Comparing Bayonetta to Bayonetta 2 is inherently flawed simply because Bayonetta came out four years ago, and was Platinum's first major iteration of the Platinum Engine. On the 360/PS3 they had the chance to refine the engine three times over with Vanquish, Max Anarchy, and Revengeance. And they've further refined it for Wii U hardware with The Wonderful 101. It will look better on the principle that the engine itself is more capable these many years later.
I've said that as well in this thread. Anything that isn't a multiplatform release is too flawed to draw any conclusion from. People should be looking at Splinter Cell and future releases that appear on the three systems to say something meaningful about the Wii U hardware.

Besides, the difference between Bayonetta 1 and 2 is about as big as the difference between Uncharted 1 and 2 and even smaller than the difference between Halo 3 and 4. Technically, the game doesn't seem to be doing anything beyond 'current-gen' hardware.
The argument that it does is based on nothing but the assumption that Bayonetta 1 is the maximum that the PS3 and Xbox 360 can do. We know this is a false assumption because, even by 2009 standards, Bayonetta 1 wasn't exactly a technical marvel.
 
Many current generation games already use deferred rendering. Note that the Killzone 2 presentation is from '07 - lots of game engines support it now.

http://www.guerrilla-games.com/publications/dr_kz2_rsx_dev07.pdf
http://dice.se/publications/spu-based-deferred-shading-in-battlefield-3-for-playstation-3/
Well, there was even at least one game for the original Xbox that used deferred rendering...

Yes, pretty much all common engines support deferred rendering. They also support forward rendering. Deferred is more often used on PC from what I can tell, forward is more common on consoles due to hardware limitations. As you probably know, Xenos only supports up to four render targets for example. On top of that, some serious voodoo is required to even reach those four targets on 360 thanks to its memory architecture.

So yes, it was used on PS3 and 360. It wasn't standard, and it had some pretty significant limitations.
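To make the "four render targets" constraint concrete, here's one illustrative way a deferred G-buffer gets packed into four RGBA8 targets; this is a generic example, not what any specific 360 game actually did:

# A common-style G-buffer packing into four RGBA8 render targets (illustrative only).
gbuffer = {
    "RT0": ("albedo.r", "albedo.g", "albedo.b", "spec_intensity"),
    "RT1": ("normal.x", "normal.y", "normal.z", "spec_power"),
    "RT2": ("emissive.r", "emissive.g", "emissive.b", "material_id"),
    "RT3": ("motion.x", "motion.y", "ao", "unused"),
}
bytes_per_pixel = 4 * len(gbuffer) + 4     # four RGBA8 targets + 32-bit depth/stencil
print(bytes_per_pixel, "bytes per pixel")  # 20 bytes before any MSAA -- which is why MRT
                                           # count and memory, not shader features, are the squeeze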
 
The fact that Megachips is the supplier of the video encoding/decoding LSI is old news (i.e. press release); however, the following, included in a patent application published today (filed Oct 13, 2011 - Nintendo/Megachips as assignees), raised some -possibly stupid- questions here.

[two figures from the patent application]


It talks about controlling/setting/sacrificing gamepad image quality to eliminate input lag, depending on the style of game (fast moving vs. more static image).

It seems to me (and please correct me if I'm wrong) that this just involves the encoding/decoding part and not the stuff the GPU has to process/render (i.e. same GPU usage).

Really, how does the WiiU (GPU) handle the TV and Gamepad screens? As two separate "monitors" similar to a dual monitor PC setup?

Do devs have control over other quality settings (AF,AA,etc) regarding what's outputted on the gamepad screen or just the above streaming-quality levels? (assuming what's proposed in the patent app. is real/being actually used)


Hope I make sense :P

Anyone guys? (especially the bolded part)
 
Really, how does the WiiU GPU handle the TV and Gamepad screens? As two separate "monitors" similar to a dual monitor PC setup?
That's supposed to be the case, via an Eyefinity implementation.
Do devs have control over other quality settings (AF,AA,etc) regarding what's outputted on the gamepad screen or just the above streaming-quality levels? (assuming what's proposed in the patent app. is real/being actually used)
They must have access to AF and AA for both virtual viewports.

I'm more unsure regarding them having access to the described bitrate option/presets.
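For a rough sense of what driving both outputs costs on the rendering side (separate from the encoder question), assuming the TV target is 1280x720 and the GamePad view is rendered at its native 854x480:

# Extra pixels per frame when rendering the TV and GamePad views separately.
tv = 1280 * 720
pad = 854 * 480
print(f"GamePad adds {pad / tv:.0%} more pixels on top of a 720p TV frame")  # ~44%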
 
M°°nblade said:
I've said that as well in this thread. Anything that isn't a multiplatform release is too flawed to draw any conclusion from. People should be looking at Splinter Cell and future releases that appear on the three systems to say something meaningful about the Wii U hardware.

Besides, the difference between Bayonetta 1 and 2 is about as big as the difference between Uncharted 1 and 2 and even smaller than the difference between Halo 3 and 4. Technically, the game doesn't seem to be doing anything beyond 'current-gen' hardware.
The argument that it does is based on nothing but the assumption that Bayonetta 1 is the maximum that the PS3 and Xbox 360 can do. We know this is a false assumption because, even by 2009 standards, Bayonetta 1 wasn't exactly a technical marvel.

I disagree with your logic. Bayonetta 1 was released in late 2009 / early 2010 depending on region. Clearly the Xbox 360 was the lead platform, and it came out in late 2005. That leaves at least 4 years (not sure when Platinum started playing with 360 dev kits) of documentation and experimentation with development. To your point, would a Bayo 2 for PS3/360 look better than Bayo 1? Yeah, probably. But Bayo 2 on Wii U is exceeding Bayo 1 with maybe 2 years of work and no prior efforts on the Wii U platform (I think Bayo 2 and W101 were developed in parallel?).

Your argument focuses heavily on experience (for example the differences between Uncharted 1 and 2, same platform, more experience). Yet you don't acknowledge at all the differences in experience between Wii U and PS/360 development.

So the question becomes to what extent does experience with PS/360 carry over to Wii U?

As for judging power and capabilities based on multi-plat games, that seems wrong. You must agree that exclusives often have better graphics (Halo, Uncharted, God of War, etc.) because devs have focus and don't need to worry about feature compatibility, etc.
 