WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

As to judging power and capabilities based on multiplatform games, that seems wrong. You must agree that exclusives often have better graphics (Halo, Uncharted, God of War, etc.) because devs have focus and don't need to worry about feature compatibility, etc.

...which is another reason that comparing Bayonetta (multiplat) to Bayonetta 2 (exclusive) is a bit misleading.
 
So the question becomes to what extent does experience with PS/360 carry over to Wii U?

I think the point was that it's not that much about experience with specific hardware (though that varies depending on how complex the hardware is) and more about general engine improvements that can apply globally.

Just as a simple example, the reason why post process anti-aliasing didn't appear before 2010 was not that developers needed to know the hardware better first (it's quite simple to apply on all "modern" GPUs), but simply that it wasn't developed earlier. If the Xbox 360 came out just now, games could use it on day one.
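To make the point concrete, here is a toy sketch of the core idea behind post-process AA (the FXAA family): it operates purely on the finished image, so it runs on any modern GPU with no engine changes. The contrast threshold and the simple 4-neighbour blend below are illustrative stand-ins, not FXAA's actual kernel.

```python
# Toy post-process anti-aliasing: blend each pixel with its neighbours
# wherever local contrast is high (i.e. at a probable aliased edge).
# Works on the final image only - no geometry knowledge needed.

def smooth_edges(img, threshold=0.25):
    """Blend high-contrast pixels with their 4 neighbours."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            n = [img[y - 1][x], img[y + 1][x], img[y][x - 1], img[y][x + 1]]
            contrast = max(n + [img[y][x]]) - min(n + [img[y][x]])
            if contrast > threshold:  # treat as an aliased edge
                out[y][x] = 0.5 * img[y][x] + 0.5 * sum(n) / 4
    return out

# A hard vertical edge between black (0.0) and white (1.0):
img = [[0.0, 0.0, 1.0, 1.0] for _ in range(4)]
smoothed = smooth_edges(img)  # the edge columns are pulled toward grey
```

The filter reads only from the original image, so the pass is order-independent, which is exactly what makes this family of techniques so easy to bolt onto any renderer.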
 
I'd love to know how they came to the 800 million transistor number. Even without the eDRAM, that would still make it a reasonably sized GPU at about 500 to 600 million.
 
It seems Toki Tori has received a big graphical update:

“On WiiWare, we were never allowed to keep our games up to date. This meant that our early adopters would end up with the least polished version of the game available,” said Collin van Ginkel, co-founder of Two Tribes. “With the Wii U eShop this is a thing of the past, so we are very excited to be able to share today’s massive update for free!”

- New song which shows you exactly where you are on the world map.
- In-level warp system to reduce backtracking
- Additional and remixed puzzles based on Miiverse feedback. Thanks!
- More graphics and scenes that support the story.
- World map now automatically tracks where you’ve been
- Graphical overhaul of every single level
- New landmarks and themes (dragon skeletons, windmill park and more!)
- Expanded soundtrack, now comes with dubstep and happy hardcore disco tracks!
- Drawings from our Miiverse contest are now inside the game

Source

Can anyone confirm the differences?
 
An update from the developers of CANDLE:

We've successfully ported several shaders to Wii U's system, and now only a few camera effects (vignetting, DoF, noise&grain...) are left to be OpenGL'd. The process is being surprisingly quick and simple, so we expect to boost up Wii U's version development to be fully implemented soon, and this way aim for a less distant release date for it.

Good to know that the development is easy for them.
 
Well, there was even at least one game for the original Xbox that used deferred rendering...

Yes, pretty much all common engines support deferred rendering. They also support forward rendering. Deferred is more often used on PC from what I can tell, forward is more common on consoles due to hardware limitations. As you probably know, Xenos only supports up to four render targets for example. On top of that, some serious voodoo is required to even reach those four targets on 360 thanks to its memory architecture.

So yes, it was used on PS3 and 360. It wasn't standard, and it had some pretty significant limitations.

Well, of course they support forward rendering; not only is that the "traditional" way to render anything, but you have to forward-render certain things like translucency, so even an engine that supports deferred rendering of some kind supports it...

My hunch is that today the majority of AAA games released are using a deferred renderer of some type (deferred lighting or deferred shading). It is now standard in just about all multiplatform engines.

I know that I saw the transition happen on stuff I worked on in about 2008.
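The forward/deferred tradeoff being discussed can be put in rough numbers with a back-of-the-envelope cost model. All the figures below (overdraw factor, light count, G-buffer size) are illustrative assumptions, not measurements from any console:

```python
# Crude shading-cost model, counted in "shading operations".
# "overdraw" means each screen pixel gets rasterized that many times.

def forward_cost(pixels, lights, overdraw):
    # Classic multi-light forward: every rasterized fragment
    # evaluates every light.
    return pixels * overdraw * lights

def deferred_cost(pixels, lights, overdraw, gbuffer_targets):
    # Geometry pass writes material data to the G-buffer once per
    # fragment; the lighting pass then shades only the visible pixel
    # once per light.
    return pixels * overdraw * gbuffer_targets + pixels * lights

pixels = 1280 * 720
fwd = forward_cost(pixels, lights=20, overdraw=3)
dfd = deferred_cost(pixels, lights=20, overdraw=3, gbuffer_targets=4)
```

With many lights the deferred total comes out well under the forward one, which is the usual argument for deferred once MRT support and memory bandwidth allow it; with few lights or a fat G-buffer the comparison flips.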
 
Not really, as Nintendo really did hit the jackpot then.

The GC managed to pull more polygons per second by a great margin while also being a texturing beast, something the Xbox really wasn't (it could only apply 4 textures per pass vs. the GC's 8).

Because that was something developers really wanted and needed, it meant that throughout the generation the Xbox was stuck doing the "polygon trick": rendering the scene twice just so it could match the 8 textures per pass the GC did for free.

That meant halving the already lower attainable polycount, which is why Halo 1 was a 10 million polygon game and Halo 2 actually had fewer polygons going on, just so it could have bump mapping on every surface.

The Xbox also didn't have dedicated memory for the framebuffer and texture cache, so it had to "fish" for them in the main memory bank; the framebuffer alone would take up 8 MB at all times. In the end it still had more memory to work with, but with much higher latency and more bottlenecks. The GC's whole clock-multiplier balancing act really paid off in making it efficient, so it didn't have to stall or lose cycles waiting for stuff.

Of course both platforms had advantages and disadvantages. For instance, the Gamecube did EMBM basically for free (it's used on every surface in Mario Galaxy), so that was preferable on it to DOT3 (which I reckon was an incomplete implementation too); with the Xbox it was the contrary: the type of bump mapping that was cheap on the GC was hefty on it, and vice versa, so you had to take that into account. The Xbox also had problems with things like transparencies; they were pricey on it.

Then again, the Xbox's GPU was a home PC part, so it was actually designed to go over 480p (and it did); AA was also easier to pull off since, despite borrowing RAM from the main bank, it had more of it. Its feature set was also more modern and better documented, but the GC was hardwired to match a lot of those features on the cheap anyway, for anyone who wanted to spend the time on it, that is.


I definitely wouldn't say the Xbox was the most powerful. That was the notion back then, helped along by Microsoft itself; it has been said that they gave developers incentives just so their version of a multiplatform game was the better one, which would make sense (otherwise why would they bother? They don't bother today, and developers certainly didn't bother much on the GC, despite there being scenarios where the PS2 version ran at 30 fps, the GC version based on it ran at 60 fps with better water, and the Xbox version was the prettiest but stuck at 30 frames per second again). The Xbox certainly had more expensive parts in it, yes, but the Gamecube was doing more with less, to the point that whatever was on paper for the Xbox didn't really apply to real-world scenarios, or surpass the Gamecube on a steady basis.

I know this is a Wii U discussion thread, but I have to ask: wasn't texturing one of the areas the Xbox surpassed the GC in? Most GC games, even the good ones, seemed to have relatively poor textures, whereas many Xbox games had pretty sweet, high-res textures along with stuff like bump mapping and normal mapping.
 
With that said, even if the console "punches above its weight" - it is what it is. A small, quiet, compact 8th gen Nintendo box capable of producing stuff that's a bit nicer looking than the previous generation when put to use correctly. But not by leaps and bounds, or any such exaggeration as presented in this thread.

The sooner we can accept it, the better this thread will be.


/thread. Let's hope this penny finally drops for some.


...which is another reason that comparing Bayonetta (multiplat) to Bayonetta 2 (exclusive) is a bit misleading.


Or comparing Bayonetta (4 years old, first game using the P* engine) to Bayonetta 2 (not out yet, 5th[?] game using the P* engine). It's pointless. It doesn't prove or disprove anything.
 
...which is another reason that comparing Bayonetta (multiplat) to Bayonetta 2 (exclusive) is a bit misleading.
I agree that Bayonetta 1 and 2 shouldn't be compared but not because the first was multiplat, but because Platinum was getting used to their new engine. Platinum only developed the 360 version which was then ported by Sega to PS3 (correct?).
 
I know this is a Wii U discussion thread, but I have to ask: wasn't texturing one of the areas the Xbox surpassed the GC in? Most GC games, even the good ones, seemed to have relatively poor textures, whereas many Xbox games had pretty sweet, high-res textures along with stuff like bump mapping and normal mapping.

If you mean quality, yes: the Xbox supported hi-res formats and had much better filtering. If you're talking about the hardware itself, no; you only need Rebel Strike to know the GC could easily do what you mentioned about Xbox titles, and in a higher-poly environment. Most devs just never bothered with advanced GC techniques the way Factor 5, Ubisoft or Capcom did, hence why we saw what we did.
 
I always say it's fine if you don't accept what I say as fact, as I don't expect you to. It's an opinion.
The problem is, not everything you say is an opinion. You make claims of fact, and those should be open to questioning, the exact same way you question claims other people make.

For example, how about your claim that WiiU has "so many" 1080p60 retail exclusives? That one could be a matter of interpretation, I guess. But you've also claimed that God of War III was 30fps. That claim is wrong.
 
As to judging power and capabilities based on multiplatform games, that seems wrong. You must agree that exclusives often have better graphics (Halo, Uncharted, God of War, etc.) because devs have focus and don't need to worry about feature compatibility, etc.

This. Pretty much any game that's been ported to another platform will be lacking something in one way or another. Sure, there are a few games with bumped-up details if the port is between generations.
 
The problem is, not everything you say is an opinion. You make claims of fact, and those should be open to questioning, the exact same way you question claims other people make.

For example, how about your claim that WiiU has "so many" 1080p60 retail exclusives? That one could be a matter of interpretation, I guess. But you've also claimed that God of War III was 30fps. That claim is wrong.

Well, according to the DF analysis, GOW3 did vary from 30fps right up to 60fps based on what was happening on screen.
 
I know this is a Wii U discussion thread, but I have to ask: wasn't texturing one of the areas the Xbox surpassed the GC in? Most GC games, even the good ones, seemed to have relatively poor textures, whereas many Xbox games had pretty sweet, high-res textures along with stuff like bump mapping and normal mapping.
That was due to RAM and storage constraints. The GC had 24 MB of useful RAM (the other 16 MB were slow as hell; they were mostly used for sound and light tasks) and 1.4 GB of storage per game disc. The Xbox had 64 MB, albeit not all of it accessible, and up to 8.5 GB on DVD for storage. In the end, it would still have roughly double the graphics memory going for it, if need be, although that memory was not as good and was part of the bottleneck compared to how the GC was engineered.

Stuff the GC did, like Rogue Squadron 2, was never surpassed by the Xbox, and that entails things like bump mapping on every surface, with 15 million polygons @ 60 frames per second. The best the Xbox did was 15 million polygons @ 30 frames per second in RalliSport Challenge 2.

It's really like this: the Xbox's GeForce 3 was technically better on paper. It certainly pushed more raw polygons (although Microsoft's figures were inflated), and the Gamecube's raw polygons (untextured and unlit) weren't anything to write home about. Thing is, the GC took very little hit texturing them and applying effects, and on top of that it did double the work per pass. It's the story of predictable performance and how much Nintendo loves that; they hit the nail on the head regarding what would be the best thing to have to make a difference that gen.

In the end, it would trump the Xbox easily on attainable polygon count even without the polygon trick going on over on the Xbox, and it had an easy time texturing and going insane on EMBM-based effects (environment-mapped bump mapping). Problem being... at the same time it didn't have much memory for the maps, so you could end up using low-res ones, like in the Pikmin games: lots of shiny surfaces, but crude.

Then there's documentation, the Xbox being standard and the GC being pretty non-standard; each had its quirks. I read some stuff a few years back and forget the specifics, but I know the GC took less of a hit doing one specific kind of bump mapping and the Xbox took less of a hit doing the other; that means if you ported a game from either system and didn't tweak those specifics, both platforms could suffer, so you had to take it into account. I also reckon the GC's DOT3 implementation was somewhat incomplete. It worked in a basic form (and was more costly than on the Xbox), but stuff like normal mapping wasn't in. That doesn't mean it couldn't do normal mapping (in fact the Wii did it in a few games, and that's a GC with 50% more clock), but it certainly wasn't on spec, so developers didn't rely on it like they did on the Xbox (they also relied on it on the Xbox because they were pulling fewer polygons than they wanted). Anyway, going back to standard versus non-standard: the Xbox was widely accepted to have a work pipeline you ought to master, pixel shader knowledge being relevant to this day. With the GC you had to master it through comparatively limited means; you couldn't take a PS1.1 shader and simply drop it in like so many games did on the Xbox. Some developers didn't take that as a challenge but simply as a useless exercise, and that was the wrong mindset.

As lazy as that sounds, though, on the GC if you wanted to preview a TEV pipeline "shader" you had to build the game out and put it on a console; I reckon only in 2008/2009 did Nintendo add a TEV pipeline previewer to the Wii SDK. It really did take more time and was less WYSIWYG to work with, which somewhat excuses a lot of "lazy" devs; perhaps they really didn't have the time and resources.


In the end, it's basically this: you couldn't pull Halo 2 on the GC without significantly downgrading textures, but it could pull 60 fps no problemo (on the Xbox it was a 30-frames-per-second endeavor); then again, Metroid Prime 2 on the Xbox could never hope to go past 30 frames per second. Choice software ended up chasing each platform's strengths and trying to hide its weak points, but the GC wasn't really at a disadvantage, no; it was crippled on RAM, disc storage and framebuffer RAM, though. (I don't understand to this day why they didn't beef up the embedded 1T-SRAM on the GPU for the Wii; we would have seen AA a lot more frequently that way, and less dithering too.)
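One caveat on figures like "15 million polygons @ 60 fps": the meaning changes a lot depending on whether that's a per-second or a per-frame number. Assuming per-second throughput figures, a quick conversion shows a 30 fps title actually draws more geometry per frame, just half as often:

```python
# Convert a polygons-per-second throughput figure into per-frame scene
# complexity at a given framerate. Illustrative arithmetic only.

def polys_per_frame(polys_per_second, fps):
    return polys_per_second // fps

at_60 = polys_per_frame(15_000_000, 60)  # polygons available per 60fps frame
at_30 = polys_per_frame(15_000_000, 30)  # polygons available per 30fps frame
```

That's the whole crux of the 60-vs-30 comparisons in this thread: at the same sustained rate, the 60 fps game has half the per-frame polygon budget of the 30 fps one.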
 
Now that is interesting...especially that they list Shader Model 5.0. I thought it was 4.0. And this
Memory Clock: 800 MHz
1600 MHz effective

Why is no one looking into this?

Why don't you look into it and tell us? :P

They still list it as 12.8GB/s BW which is pretty low.
 
Why don't you look into it and tell us? :P

They still list it as 12.8GB/s BW which is pretty low.

This doesn't really add anything to the discussion without context, nor was it in any way pertinent to what I was asking. What is the purpose of restating this now, where it was not being discussed, debated or correlated with anything?
 
This doesn't really add anything to the discussion without context, nor was it in any way pertinent to what I was asking. What is the purpose of restating this now, where it was not being discussed, debated or correlated with anything?

Geez, SORRY? I thought the speed of the RAM affected the bandwidth of the memory. My bad, ok? I'm still curious what you would find out though.
 
Now that is interesting...especially that they list Shader Model 5.0. I thought it was 4.0. And this
Memory Clock: 800 MHz
1600 MHz effective

Why is no one looking into this?
It seems unlikely that they know anything more than we do. Re: the memory clock - that's one of the few known metrics we have: it's 800MHz DDR3, with an effective IO rate of 1600MHz, and a BW of 12.8GB/s. End of story.
 
It seems unlikely that they know anything more than we do. Re: the memory clock - that's one of the few known metrics we have: it's 800MHz DDR3, with an effective IO rate of 1600MHz, and a BW of 12.8GB/s. End of story.

I wasn't speaking about the RAM specifically, but the entire link. Thanks for the info though.

They show Shader Model 5 as opposed to 4 which would be a significant piece of new information.
 
Iwata himself said the TDP for the Wii U was 45 watts. That is the only thing about the energy use that was confirmed.
 
They still list it as 12.8GB/s BW which is pretty low.

The memory bandwidth is one of the few things we do know for sure, from the module markings we calculated the total bandwidth of all the modules available a long time ago. 12.8 is correct for mem2 (the ddr).

The memory clock is consistent with the bandwidth we've known about for ages, what's the confusion Krizzx?
 
Iwata himself said the TDP for the Wii U was 45 watts. That is the only thing about the energy use that was confirmed.

Didn't he say it drew 45w? That's different from chip TDP, as that includes power supply efficiency, the disk drive, USB, wifi, etc etc. Besides, no one seems to be able to make it bump above 33w at the wall. 45W for just the chips alone doesn't seem right.
 
In case we've somehow forgotten

http://en.wikipedia.org/wiki/DDR3_SDRAM#JEDEC_standard_modules

1600MHz equivalent is what we've always known, and that's 12.8GB/s. Why did you ask why we didn't look at that, krizzx? With DDR, 800MHz is 1600 effective; this is pretty basic, and pretty sure that was first-10-pages stuff.
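For anyone following along, the 12.8GB/s figure falls straight out of the DDR arithmetic, assuming the commonly reported 64-bit MEM2 interface:

```python
# Peak DDR bandwidth: clock x 2 transfers/clock (the "effective" rate)
# x bus width in bytes. Assumes a 64-bit interface for MEM2.

def ddr_bandwidth_gbs(clock_mhz, bus_bits, transfers_per_clock=2):
    transfers_per_sec = clock_mhz * 1e6 * transfers_per_clock  # "1600 MHz effective"
    return transfers_per_sec * (bus_bits / 8) / 1e9            # bytes/s -> GB/s

bw = ddr_bandwidth_gbs(clock_mhz=800, bus_bits=64)  # 12.8 GB/s
```

So the 800MHz clock, the 1600MHz "effective" rate and the 12.8GB/s figure are all the same piece of information expressed three ways; none of them implies anything new.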

Now that is interesting...especially that they list Shader Model 5.0. I thought it was 4.0. And this
Memory Clock: 800 MHz
1600 MHz effective

Why is no one looking into this?


And that site is likely guessing anyways, as above, the shader model documented is 4.
 
The memory bandwidth is one of the few things we do know for sure, from the module markings we calculated the total bandwidth of all the modules available a long time ago. 12.8 is correct for mem2 (the ddr).

The memory clock is consistent with the bandwidth we've known about for ages, what's the confusion?
Me confused? It was Krizz who said why hasn't anyone questioned the 800MHz/1600MHz RAM.
 
Me confused? It was Krizz who said why hasn't anyone questioned the 800MHz/1600MHz RAM.

Sorry, misread. Edited. Yeah, 1600MHz equiv = 12.8GB/s, there is nothing to look into.

Frustrating that after all this discussion we've looped back to page one basic material, the memory was the simple stuff :P
 
Me confused? It was Krizz who said why hasn't anyone questioned the 800MHz/1600MHz RAM.

No, that wasn't what krizz said. Krizz said " Why hasn't anyone looked into this?" with "this" referring to the link itself as a whole.

I posted the mhz "and" the Shader Model because those stood out to me. I wasn't specifically asking about them.
 
You just truncated half my post...

Do you know what taking words out of context means?


Sorry, let me correct that.

Now that is interesting...especially that they list Shader Model 5.0. I thought it was 4.0. And this
Memory Clock: 800 MHz
1600 MHz effective

Why is no one looking into this?


Now, what did you want to say with the RAM thing?

You didn't bring it up for no reason. Saying "whoops, I got confused" is ok, you know.
 
No, that wasn't what krizz said. Krizz said " Why hasn't anyone looked into this?" with "this" referring to the link itself as a whole.

I posted the mhz "and" the Shader Model because those stood out to me. I wasn't specifically asking about them.
Using "this" in a post with three separate things you could be referring to, it's no wonder there was confusion. It wasn't a personal affront or a misrepresentation of what you said; it was a misunderstanding based on an unclear demonstrative word. Particularly when your sentence referring to "this" came after the MHz and Shader Model bits, not the link you quoted. If you don't see why putting a proximity-based word not near the thing you're referring to could cause confusion, then I'm not sure how else to explain it.
 
Using "this" in a post with three separate things you could be referring to, it's no wonder there was confusion. It wasn't a personal affront or a misrepresentation of what you said; it was a misunderstanding based on an unclear demonstrative word. Particularly when your sentence referring to "this" came after the MHz and Shader Model bits, not the link you quoted. If you don't see why putting a proximity-based word not near the thing you're referring to could cause confusion, then I'm not sure how else to explain it.

The MHz thing was in the post for a reason. He's just covering up now and won't say it. Pretty sure it was also him that kept trying to find ways around the measured bandwidth way back when we discovered it too, trying to use things like dual channel in ways that make no sense (you can't go over the combined bandwidth for each module).
 
It's sad what's going on in this thread...

Yeah, it's pretty sad.
When you have some guy dominating the conversation just spouting nonsense and claiming stuff like "that's why WiiU has so many 1080p60fps retail exclusives" (i guess he forgot to reply to smokydave) and a bunch of wishful thinking like "hey look, this site talks about shader model 5 and WiiU!", you can't really have a discussion.
You can't have a technical thread when people are as heavily invested as he is.
 
Using "this" in a post with three separate things you could be referring to, it's no wonder there was confusion. It wasn't a personal affront or a misrepresentation of what you said; it was a misunderstanding based on an unclear demonstrative word. Particularly when your sentence referring to "this" came after the MHz and Shader Model bits, not the link you quoted. If you don't see why putting a proximity-based word not near the thing you're referring to could cause confusion, then I'm not sure how else to explain it.

No mistakes were made on their part, at least after I explained what I was asking the first time, or else they would have mentioned Shader Model 4 in that response as well, since I singled out both of those aspects. Then you have the post right above yours that insisted on it even after I clarified even more. That isn't confusion. They are intentionally trying to twist and make issues out of my statements on purpose. This has been going on for a long time. For some reason, they seem to especially single me out for attack. I guess it's because I make the most positive suggestions.

I can understand how they could be mistaken about what I was asking when I say "this". That is why I cleared it up in the response, but when they insist on their preferred version even after I clarify and explain it, there is no more point in trying to hold a discussion. They'll just believe what they prefer to be true.

I try to make headway on the discussion using any new media or details that surface (hence why I was asking why no one was looking into the link). They, in turn, make every effort they can to downplay, dismiss, spin, mitigate and convolute anything that may look good for the hardware. I'd rather just ignore it and continue trying to have an actual on-topic discussion. Making posts like this just fuels it, and this thread gets derailed more than enough by people with negative intentions. I really hate these pointless off-topic arguments.

I just want to make progress on the analysis. I cannot speak for the rest, though.

Yeah, it's pretty sad.
When you have some guy dominating the conversation just spouting nonsense and claiming stuff like "that's why WiiU has so many 1080p60fps retail exclusives" (i guess he forgot to reply to smokydave) and a bunch of wishful thinking like "hey look, this site talks about shader model 5 and WiiU!", you can't really have a discussion.
You can't have a technical thread when people are as heavily invested as he is.

That's not the claim, nor is it nonsense; it's a statement. I even posted the list on the page before I stated it.

It is a fact that at the moment, most Wii U games that aren't launch games or ports are 1080p.

Full Retail Wii U games
Angry Birds Trilogy
Dragon Quest X
Disney's Planes
Donkey Kong: Tropical Freeze (unreleased but confirmed 1080 60 FPS)
Epic Mickey 2
Fast & Furious Showdown
Family Party: 30 Great Games Obstacle Arcade
Game and Wario
Game Party Champions
Jeopardy
LEGO Batman 2: DC Super Heroes
Monster Hunter Tri: Ultimate
Ryu ga Gotoku 1&2
Pokemon Rumble 2
Rapala Pro Bass Fishing
San Goku Shi 12
Scribblenauts: Unlimited
Skylanders: Giants
The Smurfs 2
Super Smash Bros. for Wii U(unreleased but confirmed 1080p 60FPS)
Transformers Prime
Wind Waker HD(unreleased but confirmed 1080p 30FPS)
Wheel of Fortune


Eshop

Art Academy: SketchPad
Bit Trip Runner 2
DuckTales Remastered
Dungeons & Dragons: Chronicles of Mystara
Funky Barn
Giana sisters
Kung Fu Rabbit
Mighty Switch Force HD
Pure Chess
Chasing Aurora
Little Inferno
Mutant Mudds Deluxe
The Cave
Cloudberry Kingdom
Spot the Difference Party


Were 1080p but lowered in favor of more noticeable details
Toki Tori 2
Nano Assault Neo

How is stating the existence of this information nonsense?
 
Didn't he say it drew 45w? That's different from chip TDP, as that includes power supply efficiency, the disk drive, USB, wifi, etc etc. Besides, no one seems to be able to make it bump above 33w at the wall. 45W for just the chips alone doesn't seem right.

Yup, he said it drew 45W on average and a maximum of 75W if I remember correctly. Once we see games pushing the hardware more that power draw should increase.
 
Didn't he say it drew 45w? That's different from chip TDP, as that includes power supply efficiency, the disk drive, USB, wifi, etc etc. Besides, no one seems to be able to make it bump above 33w at the wall. 45W for just the chips alone doesn't seem right.
How did you come to that conclusion?

When a device is quoted to draw N Watts that's what it means - that the device is drawing N Watts from its source. Not from the wall, or any other place between the source and the nearby power plant, for that matter ; )
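A rough way to see why 45W for the chips alone seems high, given the ~33W wall measurements mentioned above. The PSU efficiency and the budget for the drive/WiFi/USB below are assumed round numbers for the sake of the arithmetic, not measurements:

```python
# Wall power vs. what's left for the main chips, under assumed numbers.

def dc_power_from_wall(wall_watts, psu_efficiency):
    # The power supply wastes some of the wall draw as heat.
    return wall_watts * psu_efficiency

def chip_budget(dc_watts, other_components_watts):
    # Subtract the disc drive, WiFi, USB, fan, etc.
    return dc_watts - other_components_watts

dc = dc_power_from_wall(wall_watts=33, psu_efficiency=0.85)  # ~28 W of DC power
chips = chip_budget(dc, other_components_watts=8)            # ~20 W for CPU+GPU
```

Under those assumptions the CPU and GPU together would be drawing roughly 20W, nowhere near 45W, which supports reading the 45W figure as a whole-console number rather than a chip TDP.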
 
That's not the claim, nor is it nonsense; it's a statement. I even posted the list on the page before I stated it.

It is a fact that at the moment, most Wii U games that aren't launch games or ports are 1080p.

How is this nonsense?

Yeah, it's nonsense. Out of 110 retail games, 20 (3 unreleased) games being 1080p60fps is hardly "many wiiU retail exclusives", which you're already spinning into "if they're not launch games or ports".
 
Yeah, it's nonsense. Out of 110 retail games, 20 (3 unreleased) games being 1080p60fps is hardly "many wiiU retail exclusives", which you're already spinning into "if they're not launch games or ports".

You just made a strawman argument. You twisted the context of what I said.

I said specifically "most" (not "many", which you word-swapped it to; they don't mean the same thing) Wii U games that aren't launch games or ports (you skipped this entire comment). I said "so many" in a later statement in this thread as a general reference to this fact.

Stop twisting my words to fabricate fictional statements from me to argue against. Using fallacies in an argument is an underhanded tactic by those who don't really have an argument. If you disagree with what I say, then show me the flaws in what I actually said, not your reinterpretation of it.

And it's 40 games (24 full retail to my knowledge), not 20. Note that there may be more, but I didn't put the ones I couldn't find confirmation for on the list.
 
You just made a strawman argument. You twisted the context of what I said.

I said specifically "most" (not "many", which you word-swapped it to; they don't mean the same thing) Wii U games that aren't launch games or ports (you skipped this entire comment). I said "so many" in a later statement in this thread as a general reference to this fact.

Stop twisting my words to fabricate fictional statements from me to argue against. Using fallacies in an argument is an underhanded tactic by those who don't really have an argument. If you disagree with what I say, then show me the flaws in what I actually said, not your reinterpretation of it.

And it's 40 games (24 full retail to my knowledge), not 20. Note that there may be more, but I didn't put the ones I couldn't find confirmation for on the list.

No, you're the one twisting words yet again. You started the 1080p60fps argument stating "many retail games".

There isn't one.

This is just what most hardware is geared toward in the current day. It would make the code the most portable. That doesn't matter in exclusives though, which is why the Wii U has so many 1080p 60fps retail exclusives.

It's not 40 retail games and it's not "many", out of 110. I'm not the one twisting your words, you're the one doing it.
Now you're changing your words from "many retail titles" to "most titles that are not ports or launch games". You talked about retail games (which are 20) initially and you're now including eshop titles (other ~20).
You're like theKayle of Nintendo fans, it's impossible to argue with you.
But hey, I'm gonna stop this discussion right here, because this isn't the purpose of this topic.
 
That was due to RAM and storage constraints. The GC had 24 MB of useful RAM (the other 16 MB were slow as hell; they were mostly used for sound and light tasks) and 1.4 GB of storage per game disc. The Xbox had 64 MB, albeit not all of it accessible, and up to 8.5 GB on DVD for storage. In the end, it would still have roughly double the graphics memory going for it, if need be, although that memory was not as good and was part of the bottleneck compared to how the GC was engineered.

Stuff the GC did, like Rogue Squadron 2, was never surpassed by the Xbox, and that entails things like bump mapping on every surface, with 15 million polygons @ 60 frames per second. The best the Xbox did was 15 million polygons @ 30 frames per second in RalliSport Challenge 2.

It's really like this: the Xbox's GeForce 3 was technically better on paper. It certainly pushed more raw polygons (although Microsoft's figures were inflated), and the Gamecube's raw polygons (untextured and unlit) weren't anything to write home about. Thing is, the GC took very little hit texturing them and applying effects, and on top of that it did double the work per pass. It's the story of predictable performance and how much Nintendo loves that; they hit the nail on the head regarding what would be the best thing to have to make a difference that gen.

In the end, it would trump the Xbox easily on attainable polygon count even without the polygon trick going on over on the Xbox, and it had an easy time texturing and going insane on EMBM-based effects (environment-mapped bump mapping). Problem being... at the same time it didn't have much memory for the maps, so you could end up using low-res ones, like in the Pikmin games: lots of shiny surfaces, but crude.

Then there's documentation, the Xbox being standard and the GC being pretty non-standard; each had its quirks. I read some stuff a few years back and forget the specifics, but I know the GC took less of a hit doing one specific kind of bump mapping and the Xbox took less of a hit doing the other; that means if you ported a game from either system and didn't tweak those specifics, both platforms could suffer, so you had to take it into account. I also reckon the GC's DOT3 implementation was somewhat incomplete. It worked in a basic form (and was more costly than on the Xbox), but stuff like normal mapping wasn't in. That doesn't mean it couldn't do normal mapping (in fact the Wii did it in a few games, and that's a GC with 50% more clock), but it certainly wasn't on spec, so developers didn't rely on it like they did on the Xbox (they also relied on it on the Xbox because they were pulling fewer polygons than they wanted). Anyway, going back to standard versus non-standard: the Xbox was widely accepted to have a work pipeline you ought to master, pixel shader knowledge being relevant to this day. With the GC you had to master it through comparatively limited means; you couldn't take a PS1.1 shader and simply drop it in like so many games did on the Xbox. Some developers didn't take that as a challenge but simply as a useless exercise, and that was the wrong mindset.

As lazy as that sounds, though, on the GC if you wanted to preview a TEV pipeline "shader" you had to build the game out and put it on a console; I reckon only in 2008/2009 did Nintendo add a TEV pipeline previewer to the Wii SDK. It really did take more time and was less WYSIWYG to work with, which somewhat excuses a lot of "lazy" devs; perhaps they really didn't have the time and resources.


In the end, it's basically this: you couldn't pull Halo 2 on the GC without significantly downgrading textures, but it could pull 60 fps no problemo (on the Xbox it was a 30-frames-per-second endeavor); then again, Metroid Prime 2 on the Xbox could never hope to go past 30 frames per second. Choice software ended up chasing each platform's strengths and trying to hide its weak points, but the GC wasn't really at a disadvantage, no; it was crippled on RAM, disc storage and framebuffer RAM, though. (I don't understand to this day why they didn't beef up the embedded 1T-SRAM on the GPU for the Wii; we would have seen AA a lot more frequently that way, and less dithering too.)

I didn't ask for your life story, yo.

:P

Thanks for the explanation. Had no idea that was the case.
 
No, you're the one twisting words yet again. You started the 1080p60fps argument stating "many retail games".



It's not 40 retail games and it's not "many", out of 110. I'm not the one twisting your words, you're the one doing it.
Now you're changing your words from "many retail titles" to "most titles that are not ports or launch games". You talked about retail games (which are 20) initially and you're now including eshop titles (other ~20).
You're like theKayle of Nintendo fans, it's impossible to argue with you.
But hey, I'm gonna stop this discussion right here, because this isn't the purpose of this topic.

That is a lie.

http://www.neogaf.com/forum/showpost.php?p=81066277&postcount=9421 (The original statement. I was addressing the fact that most games that weren't ports or launch games (for reasons I explained later) were 1080p on the Wii U, and it pertained to retail and eshop titles.)
http://www.neogaf.com/forum/showpost.php?p=81098173&postcount=9453
http://www.neogaf.com/forum/showpost.php?p=81068425&postcount=9423
http://www.neogaf.com/forum/showpost.php?p=81083177&postcount=9438


I stated it multiple times. As I just told you, when I said "so many" (not simply "many", another augmentation on your part that alters context), it was a statement made only in reference to where I said it before, yet here you are still insisting on this fabrication you created and twisting my words even after I explained it. I told you clearly, and yet here you are going on and on attacking this made-up version of my statement that never happened and refusing to verify anything you are claiming.

"My" statement never changed, only your revisions to it.
 
That's not the claim, nor is it nonsense; it's a statement. I even posted the list on the page before I stated it.

It is a fact that at the moment, most Wii U games that aren't launch games or ports are 1080p.



How is stating the existence of this information nonsense?

It's a safe bet that many of those games are NOT 1080p. Where is it from?
 