WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

I doubt Wind Waker HD will be the game that shows off the Wii U's power; 3D Mario and Retro's game will most likely be the first. Super Smash Bros. 4 is probably not even close to being finished yet, so it will have improvements over time.

And why does everyone expect Zelda Wii U at E3 2013? Last thing I expect.

I don't expect a playable demo, I expect a gameplay trailer like with Twilight Princess.
 
3D Mario is around the corner, and maybe something else.

I don't know what people are expecting from this game or anything from Nintendo for that matter. Nintendo has never even attempted to max out any of their systems with their games. It was always a third party. Nintendo is more about design and aesthetics than power.

On the NES it was Square.
On the SNES it was Enix.
On the N64 it was Rare and Factor 5.
On the GC it was Factor 5.
We never saw the max capabilities of the Wii because nothing ever surpassed Rogue Squadron on the GC for the console. The closest I can think of is Zangeki no Reginleiv, which few people have ever heard of, much less seen.

Shin'en and Monolith Soft will be the ones to bank on for showing off the Wii U's capabilities to a decent level, though I doubt either of them will push it that hard. Shin'en intends to use its tessellation features in their next game.

Nintendo doesn't have devs like Quantic Dream and Naughty Dog anymore. Rare sold out and Factor 5 was one of the many devs that died trying to sell out to Sony in the early days of the PS3.
 
I disagree with you, big time.


First off, why Enix on the SNES? They did nothing that pushed the SNES; the Dragon Quest games were outsourced to Chunsoft (and looked subpar). They published Star Ocean, but didn't develop it. Comes off as odd.

And GC was Factor 5 and Capcom with RE4, for sure. I'd add Sega/Smilebit too, F-Zero GX was very impressive.


Sure, some companies go further than Nintendo on their own hardware, but going further is the keyword. For instance, Factor 5 went further on effects, tech, using the CPU for graphics... And that's fine, but Nintendo is always the one going for the all-rounder delivery; namely, Factor 5 sacrificed a lot to get those graphics out of the system, which is why they rarely delved outside their "flying battles" genre.

No game on GC did more than Zelda TP, for instance: all kinds of water variants, effects, lighting, a particle subsystem in the twilight world, polygon de-clipping when warping, detailed shadows for everyone (shadows deformed themselves to the geometry of the objects they were cast upon), a night and day cycle, dynamic music, clouds casting shadows over Hyrule Field... Similarly, no Wii game did more than Mario Galaxy (probably), hence why it looks so good. You couldn't pull Factor 5 tech in those scenarios without sacrificing something; there's way more game logic going on there.


In the end, hardware usage is going to be pretty damn similar if you look at load percentages and what they're doing per cycle; it's just that Nintendo didn't focus on one thing. They tried to do a little bit of everything, whereas a lot of the companies you're listing would throw a lot of things out of the window (AI, physics, proper animations, complementary/background tasks) to focus on one thing alone: graphics, from a technical standpoint.


I like the system pusher versus ambient pusher terms. Basically, on the N64 you had Factor 5 doing Battle for Naboo; that's an impressive game for the time, but it looks very barren and lifeless now. Then look at OoT, doing the whole Lost Woods with particles flying around, butterflies and insects flying around bushes, fish in the water, a day and night system... That's an impressive ambient pusher for 1998. Of course they did sacrifice a few polygons for it, but it's simply more vivid and impressive; that has way more impact than doubling the polygons and having a barren game nonetheless.

Instead of pushing "more polygons than anyone else" they pushed for an all-rounder package that looks and feels better today.

There's nothing wrong with Nintendo tech. They did chips that enabled scrolling on the NES before most; kept doing it on the SNES with shit like Star Fox 2 (which didn't come out officially, but we know how evolved it was); they did 60 FPS on the N64 (F-Zero X) before most developers even thought about it; and on the GC shit like Pikmin was unheard of. Of course, doing 100 little men on screen takes its toll on the hardware, but that's what it's there for.



Quantic Dream tends to be considered a waste of perfectly good time and money on something that ends up not being quite a game, and Nintendo would never put down the money for that; but they have similar approaches in titles like Hotel Dusk, cheaper and way better artistically. As for Naughty Dog, that's the equivalent of Retro Studios (especially since they went from platformers to Tomb Raider clones and Retro went from first-person adventures to... platformers), although Retro Studios is clearly the better one artistically.

Shin'en is as good as Factor 5 was; it's just that they're less fascinated by making it big (Factor 5's downfall) and tackling big projects that take years to do. That said, their origin is the same, seeing as they come from the Amiga demoscene. As for Monolith, time will tell; they fall largely within the ambient pusher category I just described above anyway (and that's better in my book than aiming to be king of the block on a boring part of the hardware).
 
Actually... that could have been a reason why the FB engine was having trouble running on the Wii U. The system was designed to let its DSP handle most sound tasks, while that engine was likely not optimized to use it.

I doubt it - I think the folks at DICE are good at what they do, and I'm sure they knew about the existence of the audio DSP.

One of the first things I would do if I were tasked to evaluate Frostbite's performance would be to completely disable sound, under the basic assumption that most of those CPU cycles would go onto the DSP. Disabling sound would get us in the CPU ballpark of "I completely rewrote the sound engine for the DSP" without actually spending the time to do it.

I'm guessing FB3 on PS3/X360 isn't a walk in the park either, but EA would force them to do it regardless of their opinion of it. (BF3 on PS3 has all that input lag; far from an optimal port.)

I would guess that FB3 running on PS3/X360 was a requirement, and since I think FB3 is a direct evolution of FB2 it was basically running on those platforms from the get go. Also, wasn't the PS3 input lag fixed?
 
I disagree with you, big time.

Sure, some companies go further than Nintendo, but going further is the keyword. For instance, Factor 5 went further on effects, tech, using the CPU for graphics... And that's fine, but Nintendo is always the one going for the all-rounder delivery.

No game on GC did more than Zelda TP, for instance, and no Wii game did more than Mario Galaxy (probably).

It's just that one did an expansive, varied world and the latter did a spectacular game (and sequel) at 60 frames per second.

In the end, hardware usage is going to be pretty damn similar; it's just that Nintendo didn't focus on one thing. They tried to do a little bit of everything, whereas a lot of the companies you're listing would throw a lot of things out of the window to focus on one thing alone: graphics.


I like the system pusher versus ambient pusher terms. Basically, on the N64 you had Factor 5 doing Battle for Naboo; that's an impressive game for the time, but it looks very barren and lifeless now. Then look at OoT, doing the whole forest particle thing, butterflies and insects flying around, fish in the water... That's an impressive ambient pusher. Of course they did sacrifice a few polygons for it, but it's simply more vivid; that has way more impact than doubling the polygons and having a barren game nonetheless.
I want to add to your comment that Metroid Prime 3, a 2007 game, is the second-best if not the best-looking Wii game, right there with Galaxy. Skyward Sword also looked really good.
 
I don't know what people are expecting from this game or anything from Nintendo for that matter. Nintendo has never even attempted to max out any of their systems with their games. It was always a third party. Nintendo is more about design and aesthetics than power.

On the NES it was Square.
On the SNES it was Enix.
On the N64 it was Rare and Factor 5.
On the GC it was Factor 5.
We never saw the max capabilities of the Wii because nothing ever surpassed Rogue Squadron on the GC for the console. The closest I can think of is Zangeki no Reginleiv, which few people have ever heard of, much less seen.

Shin'en and Monolith Soft will be the ones to bank on for showing off the Wii U's capabilities to a decent level, though I doubt either of them will push it that hard. Shin'en intends to use its tessellation features in their next game.

Nintendo doesn't have devs like Quantic Dream and Naughty Dog anymore. Rare sold out and Factor 5 was one of the many devs that died trying to sell out to Sony in the early days of the PS3.

Retro and EAD are no slouches in terms of graphical prowess on Nintendo consoles.

This might be wishful thinking, but I'm expecting a difference similar to or as large as Xenoblade vs. Metroid Prime 3 and the Galaxy games when we finally get a glimpse at the next 3D Mario and whatever Retro is working on.
 
I would guess that FB3 running on PS3/X360 was a requirement, and since I think FB3 is a direct evolution of FB2 it was basically running on those platforms from the get go.
It's still retrofitting it in there when they're obviously moving further.

In a way it's like realizing that CoD games and Valve's Source engine are Quake 3 derived... and yet they're not supposed to run on hardware that ran Quake 3; but they could if they had to, I guess. Battlefield 3 on PC is one of the few PC games in a league of its own; of course, looking back is always retrofitting it in there, but EA took the decision to rely on such an engine, so it has to look back. Not supporting the Wii U ends up being very silly; it's not about Battlefield 4, it's about the development environment. Even Square Enix ported Crystal Tools (the FFXIII engine) to the Wii and Capcom did the same with MT Framework, because in the advent of doing games for it, having the same development environment was important. And they did: Dragon Quest X is Crystal Tools and MT Framework got used in Sengoku Basara 3.

Supporting PS3/X360 is a requirement, no doubt; then again, running it in some capacity on the Wii U should be too, regardless of what they think of it. I'm saying they only get to make that decision because EA is going along with it (and they shouldn't, but that's for another thread).
I've seen that, but heard arguments that they just matched lag for every resolution (i.e. the 480p output is delayed now).

I dunno. Anywho, 400 days to fix that is a lot; so much for tech support from the dudes whose game uses their own tech (it should have been easier to pinpoint and solve the problem considering they knew the inner workings well).
I want to add to your comment that Metroid Prime 3, a 2007 game, is the second-best if not the best-looking Wii game, right there with Galaxy. Skyward Sword also looked really good.
True, played it the other day; that game is stunningly beautiful.
Would add Metroid: Other M and LOZ: Twilight Princess to that list, too. Both games have beautiful visuals on the Wii.
TP is a GC game; it doesn't really impress on the Wii (although it's more impressive than SS, because they weren't trying to fight the hardware then, whereas with SS they were).

As for Other M, I agree, Team Ninja knows their shit (when working exclusively for one platform).
 
I disagree with you, big time.


First off, why Enix on the SNES? They did nothing that pushed the SNES; the Dragon Quest games were outsourced to Chunsoft (and looked subpar). They published Star Ocean, but didn't develop it. Comes off as odd.

And GC was Factor 5 and Capcom with RE4, for sure.


Sure, some companies go further than Nintendo on their own hardware, but going further is the keyword. For instance, Factor 5 went further on effects, tech, using the CPU for graphics... And that's fine, but Nintendo is always the one going for the all-rounder delivery.

No game on GC did more than Zelda TP, for instance, and no Wii game did more than Mario Galaxy (probably). You couldn't pull Factor 5 tech on those scenarios without sacrificing something, way more gaming logic going on there.


In the end, hardware usage is going to be pretty damn similar if you look at load percentages; it's just that Nintendo didn't focus on one thing. They tried to do a little bit of everything, whereas a lot of the companies you're listing would throw a lot of things out of the window (AI, physics, complementary tasks) to focus on one thing alone: graphics, from a technical standpoint.


I like the system pusher versus ambient pusher terms. Basically, on the N64 you had Factor 5 doing Battle for Naboo; that's an impressive game for the time, but it looks very barren and lifeless now. Then look at OoT, doing the whole Lost Woods with particles flying around, butterflies and insects flying around bushes, fish in the water, a day and night system... That's an impressive ambient pusher for 1998. Of course they did sacrifice a few polygons for it, but it's simply more vivid and impressive; that has way more impact than doubling the polygons and having a barren game nonetheless.

Instead of pushing "more polygons than anyone else" they pushed for an all-rounder that looks and feels better.

There's nothing wrong with Nintendo tech. They did chips that enabled scrolling on the NES before most; kept doing it on the SNES with shit like Star Fox 2 (which didn't come out officially, but we know how evolved it was); they did 60 FPS on the N64 before most developers had even thought about it; and on the GC shit like Pikmin was unheard of.

Of course it took resources, but that's the whole point.


Quantic Dream tends to be considered a waste of perfectly good time and money on something that ends up not being quite a game, and Nintendo would never put down the money for that; but they have similar approaches in titles like Hotel Dusk, cheaper and way better artistically. As for Naughty Dog, that's the equivalent of Retro Studios, although Retro Studios is clearly the better one artistically.

Shin'en is as good as Factor 5 was; it's just that they're less fascinated by making it big (Factor 5's downfall) and tackling big projects that take years to do. That said, their origin is the same, seeing as they come from the Amiga demoscene. As for Monolith, time will tell; they fall largely within the ambient pusher category I just described anyway (and that's better in my book than aiming to be king of the block on a boring part of the hardware).

"Looking", "looking", "looking". I keep seeing this word "looking". I'm not about looks. I'm speaking in a technical sense, not a preferential (what I/you find most impressive) sense.

No game in the entire 6th generation of consoles pulled off what Rogue Leader and Rebel Strike did on the GC. No other game broke 14 million polygons per second in real time, much less 20 million at 60 FPS with bump mapping, dynamic lighting and other effects. That is what I call pushing a console.
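For context, those throughput figures imply a per-frame polygon budget; here's the back-of-the-envelope arithmetic (nothing game-specific, just dividing the claimed rate by the frame rate):

```python
def polys_per_frame(polys_per_second: float, fps: float) -> float:
    """Per-frame polygon budget implied by a sustained throughput figure."""
    return polys_per_second / fps

# The 20 million polys/sec at 60 FPS figure cited above works out to:
print(f"{polys_per_frame(20_000_000, 60):,.0f} polygons per frame")  # 333,333
```

That assumes the rate is sustained every frame with no drops, which real games rarely manage.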

RE4 was a great game that I loved, but it did not "push" the GC at all. Graphically, I would say RE4 was doing more, especially when it came to lighting, texture effects and animation. http://www.youtube.com/watch?v=h_5kwX0LZAc

For the SNES, I was talking about Star Ocean.

Zangeki no Reginleiv did far more than Mario Galaxy all around. It had various levels of texture effects and shading on the characters, enemies and water, destructible environments, enemy dismemberment, decent physics, 4-player online multiplayer, and hundreds of massive individual enemies on screen at once. I'm not sure if it ran at 60 FPS (it had a lot of slowdown issues in huge battles) but it was usually over 30. I didn't think what that game did was possible on the Wii before I saw it.



I can assure you that as far as "pushing the hardware goes" this game is doing far more than Mario Galaxy.
http://www.youtube.com/watch?v=l1dEEKBdkHc
http://www.youtube.com/watch?v=2rDoPXhSxBU
http://www.youtube.com/watch?v=--qoj2ktIdU

I'm not analyzing people's opinions of the games or "how they feel". I'm not talking about what people find visually appealing or a waste of time. What people like is irrelevant. I'm talking about the pure factual technical prowess that is achieved on the hardware and demonstrated. Pure math and science.
 
I'm going to have to be brutally honest and say that's not going to happen, at least not until E3 2014, that is.

 
But that game didn't sell very well if I remember correctly.

Either way, that has no bearing on Bayonetta 2 as that is being built from the ground up for the Wii U and we have visual confirmation of a polygon count increase by more than 500%.

No we don't. We have evidence that there is a 191k poly model of Bayonetta being produced: it could be for baking normals, for cutscenes etc... Have a look at some of the figures on this page: 20-30k polys for main character models, whether the game is on Wii or PS3/360. Why would that development practice suddenly change for a machine that is barely any more powerful?
 
No we don't. We have evidence that there is a 191k poly model of Bayonetta being produced: it could be for baking normals, for cutscenes etc... Have a look at some of the figures on this page: 20-30k polys for main character models, whether the game is on Wii or PS3/360. Why would that development practice suddenly change for a machine that is barely any more powerful?

The answer to that is simple. It is not "barely" more powerful. It is a lot more powerful, at least as far as the graphics processor is concerned. It is made of technology from 2008-2010 that didn't even exist when the PS3/360 were made.

As I've said many times, the Wii U is a next-gen console in release and power. It may not be as far into next gen as the PS4/Xbox3, but it is more capable than any last-gen console. Utilizing that capability properly is up to the developers' talent and resources.
 
The answer to that is simple. It is not "barely" more powerful. It is a lot more powerful, at least as far as the graphics processor is concerned.

Fourth Storm doesn't seem to think so. He thinks it does a lot more with a lot less than the 360 and is very efficient, but in the end it's not that much more powerful.
 
Fourth Storm doesn't seem to think so. He thinks it does a lot more with a lot less than the 360 and is very efficient, but in the end it's not that much more powerful.

I thought Fourth Storm's stance on it was that he was waiting to see and that he is still uncertain about a lot of it?

The GPU still has a lot of black boxes that people like to write off as just random objects that are there to fill up space and don't do anything at all, like Digital Foundry claimed. If they don't know what it is, then it's not important/doesn't exist.


There has been little that is "verified" on the GPU, and they are still questioning some of those verifications. Until all of the letters are filled in, we do not know the full potential of Latte.
 
"Looking", "looking", "looking" I keep seeing this word "looking". I'm not about looks. I'm speaking in a technical sense, not a preferential(what I/you find most impressive) sense.
I think you fail to see what I'm saying.

Factor 5 did some impressive manipulation of the hardware, true. But they did so by bypassing a lot of things that would decrease performance, like Minesweeper: avoiding mines and molding their software around them. Hell, urban legend says the dudes used the CPU to manipulate the GPU ISA (in ASM) in order to trick and inject more passes into the TEV pipeline, making it do more per cycle. They were putting every available resource after the GPU.

Flying Star Wars ships have no complex animations; in fact, they're not even on the ground, so they don't have to animate in a believable way. Meanwhile, Zelda games use skeletal animation, and that takes resources; facial animations too.

Rogue Squadron 3 walking animations speak for themselves.


They had good tech, but they were also making compromises; had those compromises not been made, you would get fewer polygons (and probably a better game in return, although people might call me biased for saying this; I was never their fan).

My point is, sure they pulled the most polygons. But that's not necessarily the best use of the hardware ever; Nintendo and other developers opted to pull less polygons and make games with more varied worlds, animations, AI, stuff going on. And that doesn't mean they were using the hardware in a lesser way or not pushing it.

Also bear in mind that Factor 5's mindset was that of the demoscene: being playable was a plus.

Anyone could write a game to try and pull the most of something you can think of, and they'll come across as the kings of doing it. It has merit, but it's also a pretty stupid thing to do most of the time, because you're limiting your own software in order to pull something off. Meanwhile, a lot of games opt for balancing, giving adequate resources to everything and balancing from there.

Zelda TP was doing so many things at once that Rogue Squadron didn't; the fact that it wasn't pulling the same polygon counts doesn't make it any less impressive.


Similarly, Shadow of the Colossus on the PS2 used a geometry simplification process for far-away objects/areas (effectively reducing them to images); it wasn't a 10-million-polygon game like Jak and Daxter or Virtua Fighter 4, yet it was more impressive than they ever were.
No game in the entire 6th generation of consoles pulled off what Rogue Leader and Rebel Strike did on the GC. No other game broke 14 million polygons per second in real time, much less 20 million at 60 FPS with bump mapping, dynamic lighting and other effects. That is what I call pushing a console.
Read above.
RE4 was a great game that I loved, but it did not "push" the GC at all. Graphically, I would say RE4 was doing more, especially when it came to lighting, texture effects and animation. http://www.youtube.com/watch?v=h_5kwX0LZAc
What? RE4 didn't push the Gamecube?

I'm seriously shocked you said that. I can't even start to argue with tech specifications and things it was doing so I'll just give it up.

RE4 was crazy graphically for the GC.
For the SNES, I was talking about Star Ocean.
That's tri-Ace, ex-Wolf Team/Telenet.

Enix didn't write a line of code there.
Zangeki no Reginleiv did far more than Mario Galaxy all around. It had various levels of texture effects and shading on the characters, enemies and water, destructible environments, enemy dismemberment, decent physics, 4-player online multiplayer, and hundreds of massive individual enemies on screen at once. I'm not sure if it ran at 60 FPS (it had a lot of slowdown issues in huge battles) but it was usually over 30. I didn't think what that game did was possible on the Wii before I saw it.
Very different games.
I can assure you that as far as "pushing the hardware goes" this game is doing far more than Mario Galaxy.
http://www.youtube.com/watch?v=l1dEEKBdkHc
http://www.youtube.com/watch?v=2rDoPXhSxBU
http://www.youtube.com/watch?v=--qoj2ktIdU
I know the game. Sorry it doesn't impress me as much.
I'm not analyzing people's opinions of the games or "how they feel". I'm not talking about what people find visually appealing or a waste of time. What people like is irrelevant. I'm talking about the pure factual technical prowess that is achieved on the hardware and demonstrated. Pure tech.
Me neither.

I'm saying I can write a game meant to push polygons, and I'll come up with a scenario where that's easier to do: namely using flying ships that barely animate (and animate in a simple, robotic way), or something like Quantic Dream does, barely a game, hence it doesn't have to deal with game logic. It's like how tech demos manage to have more resources available because there's no game logic, or how cutscenes pump the graphics up, again because game logic is simplified or gone. The Rogue Squadron games were only complex in the things Factor 5 was doing on top; on a barebones analysis they were awfully simple. It's precisely the argument you're making for Zangeki: it's not a polygon pusher of any sort because it's doing a lot of things, just like TP did on the GC, or Shadow of the Colossus did on the PS2. It's impressive because it's an all-rounder, if you will.

Nintendo will always go for the all-rounder; they're never ones to put all their bullets in one place (polygons or whatever).
 
The Wii U version seems to have more noticeable bloom and uses MSAA instead of FXAA.
Now, this is the original NG3 vs. the Wii U version; maybe they upgraded the visuals to Wii U standard when they released Razor's Edge on the PS3/360?

Eww, talk about some bad skinning on his shoulder (both photos). Seriously, TN, you can do better than that.
 
What are you basing this on?

The few games that aren't bad ports and the ones that have been shown to be built from the ground up...like Bayonetta 2.

The basic model showed 192,000 triangles, compared to around 20,000 for the PS3/360. Unless you have info that points one direction or the other, this is what I will accept as the Wii U model for now. There is also the monster at the end that people insisted was CG until the dev himself said it was rendered in real time, leading to the same people who called it CG immediately jumping to denouncing it as unimpressive and claiming the PS3/360 can do better, like always (sigh).
 
I'm with lwilliams. If you have some info, spill it! :D And hey, when did you lose your tag?



In essence, it seems like they are squeezing out ~Xbox360/PS3 performance using less but more sophisticated logic in combination with larger/faster memory pools.

Here is hoping that 3D Mario/Mario Kart/X/Smash make dealing with it easier, but I still think Nintendo deserve a smackdown for aiming so low.



It certainly is. It seems somewhat odd that they still needed to add some additional logic, but it probably does end up amounting to less silicon/complexity in the end than just including a shrunk down Flipper on die. I'd like to try and identify where that 8-bit CPU is out of sheer curiosity. Shame that there doesn't seem to be more interest in Wii U homebrew.

I thought Fourth Storm's stance on it was that he was waiting to see and that he is still uncertain about a lot of it?

The GPU still has a lot of black boxes that people like to write off as just random objects that are there to fill up space and don't do anything at all, like Digital Foundry claimed. If they don't know what it is, then it's not important/doesn't exist.



There has been little that is "verified" on the GPU, and they are still questioning some of those verifications. Until all of the letters are filled in, we do not know the full potential of Latte.

Let's see what he responds, but it's all very much speculation anyway.
 
The few games that aren't bad ports and the ones that have been shown to be built from the ground up...like Bayonetta 2.

The basic model showed 192,000 triangles, compared to 20,000 for the PS3/360. Unless you have evidence that points one direction or the other, this is what I accept as the Wii U model now. There is also the monster at the end that people insisted was CG until the dev himself said it was rendered in real time.
Is the 192000 triangle model a gameplay model?
 
Is the 192000 triangle model a gameplay model?

You tell me.

It is what we have, and it is what I base my analysis on until something that shows otherwise appears.

Someone could simply ask Kamiya. He did answer when people asked if the monster at the end was CG or real.
 
Is the 192000 triangle model a gameplay model?
It's possible.
Bayonetta's polygon count for the PS3/360 was not low. If you add up the four guns that she wields (8k each), her polygon model was around ~52k, and the model for gameplay and cinema scenes was apparently the same model.
 
It's possible.
Bayonetta's polygon count for the PS3/360 was not low. If you add up the four guns that she wields (8k each), her polygon model was around ~52k, and the model for gameplay and cinema scenes was apparently the same model.

http://platinumgames.com/2009/12/04/the-secrets-of-bayonettas-models/
http://wiki.polycount.com/BodyTopology

This is interesting, though her model (guns included) apparently was nowhere near 52k in the original.

Ah, found the real time confirmation, though.
http://www.nintengen.com/2013/02/bayonetta-2-wii-u-confirmed-real-time.html
 
The few games that aren't bad ports and the ones that have been shown to be built from the ground up...like Bayonetta 2.

The basic model showed 192,000 triangles, compared to around 20,000 for the PS3/360. Unless you have info that points one direction or the other, this is what I will accept as the Wii U model for now. There is also the monster at the end that people insisted was CG until the dev himself said it was rendered in real time, leading to the same people who called it CG immediately jumping to denouncing it as unimpressive and claiming the PS3/360 can do better, like always (sigh).
Well, then you have nothing. You are setting yourself up for a letdown.

What they found in that video is what all games do to make normal maps. Normal maps are used to give models a high-res look and smooth out the jagged appearance of low-poly models.
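To illustrate the point about normal maps: the baked map stores per-texel normals captured from the high-poly model, and the shader lights the low-poly mesh with those instead of its coarse geometry normals. A minimal sketch of the decode-and-light step (generic tangent-space math, not any particular engine's code):

```python
# A normal-map texel encodes a unit normal: each 0..255 RGB channel maps
# to the -1..1 range (so the "flat" color (128, 128, 255) means roughly
# (0, 0, 1), i.e. pointing straight out of the surface).
def decode_normal(rgb):
    return tuple(c / 255.0 * 2.0 - 1.0 for c in rgb)

# Simple Lambert diffuse term using the decoded per-texel normal, which is
# how the baked high-poly detail ends up affecting shading.
def lambert(normal, light_dir):
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

flat = decode_normal((128, 128, 255))     # ~(0, 0, 1)
print(lambert(flat, (0.0, 0.0, 1.0)))     # 1.0: fully lit, light head-on
```

The low-poly silhouette stays jagged, but the lighting responds as if the detailed geometry were there, which is why the 191k model being used as a bake source is the unremarkable explanation.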
 
You tell me.

It is what we have, and it is what I base my analysis on until something that shows otherwise appears.

Someone could simply ask Kamiya. He did answer when people asked if the monster at the end was CG or real.
I doubt the PS4 will even have models (outside of racing games) with that many triangles.
 
Well, then you have nothing. You are setting yourself up for a letdown.

What they found in that video is what all games do to make normal maps. Normal maps are used to give models a high-res look and smooth out the jagged appearance of low-poly models.

The Wii U could simply use tessellation for that, though.
 
http://platinumgames.com/2009/12/04/the-secrets-of-bayonettas-models/
http://wiki.polycount.com/BodyTopology

This is interesting, though her model(gun included) apparently was no where near 52k in the original.

Ah, found the real time confirmation, though.
http://www.nintengen.com/2013/02/bayonetta-2-wii-u-confirmed-real-time.html
I don't see polygon counts listed in the links you provided. Furthermore, the first link only showed pics of a few sections of Bayonetta's body (like her butt and bust-line), while the second link did not include her guns.

I got my info from the Beyond3D thread I posted earlier.
 
What are you basing that on?

I base that on the Froblins demo. It was a tech demo designed for the GPUs that Latte is primarily derived from to show off their new features. I posted it earlier.
http://www.youtube.com/watch?v=p7PlQ-q17tM

I don't see polygon counts listed in the links you provided. Furthermore, the first link only showed pics of a few sections of Bayonetta's body (like her butt and bust-line), while the second link did not include her guns.

I got my info from the Beyond3D thread I posted earlier.

In the comments on the first link, someone states that the polygon count for her basic model was 7,000. I also remember reading somewhere that Bayonetta's basic model (with guns) was in the 20k range.

I could not find the link, but I remember reading it. I will have to keep looking.

EDIT: Oh, it was this post. http://www.neogaf.com/forum/showpost.php?p=47294548&postcount=371
 
I base that on the Froblins demo. It was a tech demo designed for the GPUs that Latte is primarily derived from to show off their new features. I posted it earlier.
http://www.youtube.com/watch?v=p7PlQ-q17tM



In the comments on the first link, someone states that the polygon count for her basic model was 7,000. I also remember reading somewhere that Bayonetta's basic model (with guns) was in the 20k range.

I could not find the link, but I remember reading it. I will have to keep looking.
Are you talking about the Bayonetta 2 developer trailer where it shows them working on the model?
 
So uh, has anyone questioned how the Wii U is rendering a character 10x more complex than most PS3/360 games?

Also, what does that say about the rest of the characters and environments in Bayo2? Will they too be on the same level as Bayonetta?

Something isn't right here...
 
GPUs that are many times more powerful than the Wii U GPU?

As I stated before, you are setting yourself up for a huge letdown...

Actually, quite the opposite. The working hypothesis for the Wii U lists most of its capabilities as higher than the GPUs the demo was made for. The baseline rumor for the Wii U GPU is that it "was" one of those GPUs until it was discovered how heavily customized it was.
 
So uh, has anyone questioned how the Wii U is rendering a character 10x more complex than most PS3/360 games?

Something isn't right here...

It's simple: it's not.

http://www.neogaf.com/forum/showthread.php?p=47294548

Actually, quite the opposite. The working hypothesis for the Wii U lists most of its specs as higher than the GPUs the demo was made for. The baseline rumor for the Wii U GPU is that it "was" one of those GPUs until all of the customizations were discovered.
The lowest 4800-series card is 736 GFLOPS / 95 W TDP, and some have more power than the X720 GPU on paper...

The dual-GPU 4800 has more power than even the PS4 GPU on paper.
 
It's simple: it's not.


The lowest 4800-series card is 736 GFLOPS, and some have more power than the X720 GPU on paper...

The dual-GPU 4800 has more power than even the PS4 GPU on paper.

Yeah, on paper, but if that were true, then why would AMD go backwards in power with their next three generations of GPUs? You are not accounting for architectural enhancements and more efficient use of parts at all.

The Wii U GPU is heavily customized, though. It is confirmed DX10.1, with some devs claiming DX11 minimum, and the tessellation in the Froblins demo was there to demonstrate DX10.1. It also has 32MB of fast eDRAM, which would allow performance beyond those GPUs to certain degrees. There is more to performance than flops.

The chip the Wii U GPU matches the most is actually a generation ahead of those. I did a small write-up about it a while back. I'm still leaning heavily toward it being derived from that GPU. http://www.neogaf.com/forum/showpost.php?p=47955116&postcount=2487

Ah, and here's a nice explanation from lost in blue. http://www.neogaf.com/forum/showpost.php?p=49098816&postcount=2795
 
I base that on the Froblins demo. It was a tech demo designed for the GPUs that Latte is primarily derived from to show off their new features. I posted it earlier.
http://www.youtube.com/watch?v=p7PlQ-q17tM



In the comments on the first link, someone states that the polygon count for her basic model was 7,000. I also remember reading somewhere that Bayonetta's basic model (with guns) was in the 20k range.

I could not find the link, but I remember reading it. I will have to keep looking.

EDIT: Oh, it was this post. http://www.neogaf.com/forum/showpost.php?p=47294548&postcount=371
They were off. Bayonetta's base model is 23k without guns. Here is the link.

http://beyond3d.com/showpost.php?p=1699978&postcount=1174
 
Yeah, on paper, but if that were true, then why would AMD go backwards in power with their next three generations of GPUs? You are not accounting for architectural enhancements and more efficient use of parts at all.

The Wii U GPU is heavily customized, though. It is confirmed DX10.1, with some devs claiming DX11 minimum, and the tessellation in the Froblins demo was there to demonstrate DX10.1. It also has 32MB of fast eDRAM, which would allow performance beyond those GPUs to certain degrees. There is more to performance than flops.

The chip the Wii U GPU matches the most is actually a generation ahead of those. I did a small write-up about it a while back. I'm still leaning heavily toward it being derived from that GPU. http://www.neogaf.com/forum/showpost.php?p=47955116&postcount=2487
You completely lost me. You understand that just because a GPU is newer doesn't mean it outperforms every GPU before it?


The linked write-up is completely wrong. First off, the R700 [Radeon HD 4770] is already at 40nm. The leaked feature set completely matches the R700 series.

The "fast" eDRAM is there to make up for the slow main RAM. Also, there just isn't any way the Wii U GPU is 352 GFLOPS; there just isn't enough room. Most likely it is half of that, so 176 GFLOPS.
 
You completely lost me. You understand that just because a GPU is newer doesn't mean it outperforms every GPU before it?


The linked write-up is completely wrong. First off, the R700 [Radeon HD 4770] is already at 40nm. The leaked feature set completely matches the R700 series.

The "fast" eDRAM is there to make up for the slow main RAM. Also, there just isn't any way the Wii U GPU is 352 GFLOPS; there just isn't enough room. Most likely it is half of that, so 176 GFLOPS.

You are overextending my statement. I said nothing of the sort. I'm saying that your posting some higher raw numbers does not mean the Wii U GPU cannot do what is listed. I was pointing out that tech with lower numbers can be more capable.

All of the original analyses pointed to the Wii U being 352 GFLOPS exactly. If it were 176, it wouldn't even be able to run 360/PS3 games, much less enhance them the way it has.

That was one of the first things that they concluded about the GPU.

http://www.neogaf.com/forum/showthread.php?p=47326597&highlight=320#post47326597
http://www.neogaf.com/forum/showthread.php?p=47302399&highlight=320#post47302399
http://www.neogaf.com/forum/showthread.php?p=47305563&highlight=320#post47305563
http://www.neogaf.com/forum/showthread.php?p=47309098&highlight=320#post47309098
http://www.neogaf.com/forum/showthread.php?p=47314378&highlight=320#post47314378
http://www.neogaf.com/forum/showthread.php?p=47308793&highlight=320#post47308793


Here is the original comparison I did. From the earlier posts, the most probable setup was determined to be 40nm, running at 550 MHz, with 320 SPUs/16 TMUs/8 ROPs, as I just posted. Those stats matched the HD 5550 perfectly for me. I'm sure the Wii U GPU can run the Froblins demo either way. How well it will run it is debatable, but I'm certain it can.
http://www.neogaf.com/forum/showthread.php?p=49751859&highlight=320#post49751859
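For what it's worth, both the 352 and 176 GFLOPS figures in this thread drop out of the same peak-throughput formula for VLIW-era AMD parts: shader count x 2 FLOPs per clock (one multiply-add) x clock speed. A quick sketch (the function name is mine):

```python
# Theoretical peak throughput for a unified-shader AMD GPU of this era:
# each SPU retires one multiply-add (2 FLOPs) per clock cycle.
def peak_gflops(spus: int, clock_mhz: float, flops_per_clock: int = 2) -> float:
    return spus * flops_per_clock * clock_mhz / 1000.0

# The two competing Latte configurations from the thread, both at 550 MHz:
print(peak_gflops(320, 550))  # 352.0 (320 SPUs -> the 352 GFLOPS estimate)
print(peak_gflops(160, 550))  # 176.0 (160 SPUs -> exactly half)
```

This is also why "not all GFLOPS are equal": the formula only counts theoretical peak issue rate and says nothing about caches, eDRAM bandwidth, or how efficiently real shaders keep the units busy.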
 
Yeah, on paper, but if that were true, then why would AMD go backwards in power with their next three generations of GPUs? You are not accounting for architectural enhancements and more efficient use of parts at all.

The Wii U GPU is heavily customized, though. It is confirmed DX10.1, with some devs claiming DX11 minimum, and the tessellation in the Froblins demo was there to demonstrate DX10.1. It also has 32MB of fast eDRAM, which would allow performance beyond those GPUs to certain degrees. There is more to performance than flops.

The chip the Wii U GPU matches the most is actually a generation ahead of those. I did a small write-up about it a while back. I'm still leaning heavily toward it being derived from that GPU. http://www.neogaf.com/forum/showpost...postcount=2487

Ah, and here's a nice explanation from lost in blue. http://www.neogaf.com/forum/showpost...postcount=2795

It could be DX13, but it would still be a low-powered chip. API compatibility is not indicative of power. A 5550 is also much weaker than the chips the Froblins demo was aimed at (4850/70). My earlier comment about the Bayonetta model wasn't necessarily about the Wii U's processing power (though the evidence is that it is indeed barely more powerful) but that poly counts on characters like that sit in the same ballpark across a lot of systems, from Wii to PC. There's no reason or evidence to suggest that Platinum would suddenly throw away best practice and blow a minimum of 200k polys out of the budget of every scene in their game. Finally, tessellation and normal mapping are not mutually exclusive; tessellating every small detail on a character model instead of mapping it would be another colossal waste of resources.
 
It could be DX13, but it would still be a low-powered chip. API compatibility is not indicative of power. A 5550 is also much weaker than the chips the Froblins demo was aimed at (4850/70). My earlier comment about the Bayonetta model wasn't necessarily about the Wii U's processing power (though the evidence is that it is indeed barely more powerful) but that poly counts on characters like that sit in the same ballpark across a lot of systems, from Wii to PC. There's no reason or evidence to suggest that Platinum would suddenly throw away best practice and blow a minimum of 200k polys out of the budget of every scene in their game. Finally, tessellation and normal mapping are not mutually exclusive; tessellating every small detail on a character model instead of mapping it would be another colossal waste of resources.

I'm well aware of the difference between tessellation, and the various forms of texture mapping.

Whether you believe it or not, those are the numbers that have been shown for the Wii U version in all clarity, and as lwilliams3 showed, the model for Bayonetta 1 already threw away "best practice," as you call it.
 
You are overextending my statement. I said nothing of the sort. I'm saying that your posting some higher raw numbers does not mean the Wii U GPU cannot do what is listed. I was pointing out that tech with lower numbers can be more capable.

All of the original analyses pointed to the Wii U being 352 GFLOPS exactly. If it were 176, it wouldn't even be able to run 360/PS3 games, much less enhance them the way it has.

That was one of the first things that they concluded about the GPU.

http://www.neogaf.com/forum/showthread.php?p=47326597&highlight=320#post47326597
http://www.neogaf.com/forum/showthread.php?p=47302399&highlight=320#post47302399
http://www.neogaf.com/forum/showthread.php?p=47305563&highlight=320#post47305563
http://www.neogaf.com/forum/showthread.php?p=47309098&highlight=320#post47309098
http://www.neogaf.com/forum/showthread.php?p=47314378&highlight=320#post47314378
http://www.neogaf.com/forum/showthread.php?p=47308793&highlight=320#post47308793


Here is the original comparison I did.
http://www.neogaf.com/forum/showthread.php?p=49751859&highlight=320#post49751859

Not all GFLOPS are equal, meaning GFLOPS do not equal performance. Even at 176 GFLOPS it would outperform the PS3/360 GPUs.

The original analysis was 160 and moved to 320. After more analysis, it was found that there was just not enough room for 40 SPUs per block. The latest and most likely analysis is that we are looking at a GPU below 320; 20 per block is what is normal for the R700 line, and it's likely that's what this is.

Here is the latest analysis from fourth-storm.

http://forum.beyond3d.com/showpost.php?p=1727786&postcount=4986
 
Not all GFLOPS are equal, meaning GFLOPS do not equal performance. Even at 176 GFLOPS it would outperform the PS3/360 GPUs.

The original analysis was 160 and moved to 320. After more analysis, it was found that there was just not enough room for 40 SPUs per block. The latest and most likely analysis is that we are looking at a GPU below 320; 20 per block is what is normal for the R700 line, and it's likely that's what this is.

Here is the latest analysis from fourth-storm.

http://forum.beyond3d.com/showpost.php?p=1727786&postcount=4986

Are you sure you read that right?

I saw Fourth Storm's theory when he first made it. That was his latest hypothesis, which I believe was shot down not long ago. The key there was that the shaders had fixed-function components, used primarily to explain the Wii backwards compatibility and the DX11 level stated by devs, I believe, but it has been found that they aren't. An 8-bit CPU has been found to be responsible for the TEV conversion, so that would make the Wii U GPU less powerful than the 360/PS3 GPUs if you kept on that route. That would in turn make Trine 2: DC, NFS: Most Wanted U, and Deus Ex Director's Cut impossible. Even he called it outlandish at the end.

To this day, nothing has been shown that reasonably rules out the original claims, to my knowledge.
 