WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Isn't one of the few things Nintendo has said specifically about the CPU/GPU that no silicon is being wasted on Wii mode? In other words, there's nothing specifically dedicated to BC that isn't being used in Wii U mode, or did I misinterpret that?

No, I distinctly remember reading that.
 
Just wanted to ask if there is someone with good knowledge of the DS and 3DS GPU architectures. I know they are handhelds and obviously use different architectures, but with the Wii U borrowing the second-screen idea from the DS and 3DS, maybe Nintendo applied GPU stuff from those systems. I have been following the thread but may have missed it if someone mentioned something about this.

Good read, by the way. I can always rely on the good ol' Latte thread for some free entertainment.
 
Just wanted to ask if there is someone with good knowledge of the DS and 3DS GPU architectures. I know they are handhelds and obviously use different architectures, but with the Wii U borrowing the second-screen idea from the DS and 3DS, maybe Nintendo applied GPU stuff from those systems. I have been following the thread but may have missed it if someone mentioned something about this.

Good read, by the way. I can always rely on the good ol' Latte thread for some free entertainment.

I have bits and pieces for the DS. It has two 2D cores that are basically enhanced GBA GPUs, and a 3D core that is unique in that there is a hard vertex limit per frame. For 3D games, the main screen usually uses the 3D core, while the secondary screen uses one of the 2D cores. For games that display 3D on both screens, the 3D core renders both screens and the max framerate is halved from 60 to 30fps (similar to what happens when you use two Wii U controllers).

For the 3DS, I don't know as much about it, but it does have the Pica200 for its GPU. I'm unsure how it handles BC with the DS.
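
To picture why that halving happens, here's a toy C frame loop (purely illustrative, not actual DS SDK code): with a single 3D core, each 60Hz vblank can deliver a new 3D frame to only one screen, so each screen effectively updates at 30fps.

```c
/* Toy sketch of single-3D-core, dual-screen scheduling (hypothetical,
   not DS SDK code): each vblank, the lone 3D core renders for one
   screen while the other keeps displaying its previous frame. */
#include <stdbool.h>
#include <stdio.h>

static void render_3d_scene(const char *screen) {
    printf("3D core renders a frame for the %s screen\n", screen);
}

int main(void) {
    bool top = true;
    for (int vblank = 0; vblank < 6; vblank++) { /* 6 of 60 vblanks/sec */
        render_3d_scene(top ? "top" : "bottom");
        top = !top; /* alternate target screens each vblank */
    }
    /* Each screen received 3 new frames in 6 vblanks: 30fps per screen. */
    return 0;
}
```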
 
Just wanted to ask if there is someone with good knowledge of the DS and 3DS GPU architectures. I know they are handhelds and obviously use different architectures, but with the Wii U borrowing the second-screen idea from the DS and 3DS, maybe Nintendo applied GPU stuff from those systems. I have been following the thread but may have missed it if someone mentioned something about this.

Good read, by the way. I can always rely on the good ol' Latte thread for some free entertainment.

Highly unlikely. Different manufacturers for a start, and worlds apart in terms of performance and architecture.
 
Is BG the guy with no tech knowledge himself, but who has info and analysis from his insider friend?

He came back but admitted he had no engineering or tech background to speak of, and that he was just reiterating stuff from his friend, who may or may not have an engineering background themselves.

Which is why I'm confused as to why, after he's been factually wrong so many times, he's still held in high regard. Is it because he writes huge posts, and too often people just believe whatever little point he bolded (regardless of the unlikelihood of the rest of the material)?

I stopped going to these threads, mostly because I was chased out by witch hunters who refused to be realistic in their expectations. Fourth Storm is pretty realistic, and pretty much started these 'investigations' into the hardware, so it's almost hilarious that some of the rambling users have snapped back and begun to reject him, also.

I was hoping you would come back and ask me what the facts were once I showed up; that way you could get a proper understanding, unless you didn't want that to begin with. Anyway, I'll go ahead and make it easy for you.

The things I stated as fact that I recall were that Wii U games were most likely going to target 720p, that the Wii U would have 2GB of memory, and that (at the time) it was highly likely, based on all the info I got, that 1GB of memory would go to games and 1GB to the OS. As for not having a tech background: I went to school for Computer Engineering Technology, but I was not able to finish. I used the upcoming round of consoles to help re-acclimate myself with current tech. And if there was anything above my head, I made sure to point that out. So no background? No. Rusty background? Definitely.

Now, from there, performance was speculation based on what little tidbits I could scrape up, and I tried to make that known. If Wii U games are supposed to look noticeably better than PS360 games, then Wii U's performance should be 2-3x theirs, based on BS multipliers. I talked both about a low-power version with changes to gain extra performance elsewhere and about a version with more raw power. I chose to focus on the raw-power angle due to what we could learn about the dev kit GPU.

Now, if there's anything I missed, feel free to ask me. But don't go misremembering things, stating them as fact, and not bothering to try and learn/remember the truth. :)

If people liked me, that's probably because, first and foremost, I tried to have fun with the whole process leading up to launch, and I tried to answer any and all questions asked of me. When I know I'm wrong, I have no problem admitting it. But I'm not going to say I'm wrong just because someone else thinks I am. The thing is, these are just consoles. I don't take this as seriously as some do. I don't need the consoles to be something just to justify an educated guess. I just want to learn and discuss the unknown till it becomes known... or till my current job doesn't make the money I like and I need to devote myself to improving my situation. ;)

Indeed, it works out for Brazos, since that only has 4 ROPs. Llano is tricky because there aren't any two identical blocks with the size/shape/SRAM requirements to be ROPs. What you've labeled as W seems a pretty good candidate, though (and it looks enough like the RV770 ROPs as well, even though that photo is so fuzzy it's hard to be certain). It might be that, as happened on the latest 360 processors, the single W block holds 8 ROPs on Llano. It is a 32nm chip compared to the others, so perhaps that allowed for it.

That seems to be asking a lot for a standard GPU, since Brazos is 40nm and has virtually the same characteristics. However, like I mentioned originally, Xenos shows us that even on a larger process, you don't need two blocks for one ROP when it's customized. I don't see them just duplicating a relatively unchanged standard block like that. Also, W in Llano is on the complete opposite side of the DDR3 interface, according to this. And we know there's no eDRAM.

http://cdn.wccftech.com/wp-content/uploads/2011/04/amd_llano_die_block_diagram.png

If you noticed, I was able to find blocks in Llano similar to all five blocks I consider to be the graphics engine. On top of that, they were all placed near each other, like in Latte. The block in Llano most similar to U, which you consider to be L2 cache in Latte, is not close to W in Llano.


There's definitely something extra going on in J1. My theory is that there is extra logic sprinkled throughout many of the blocks which allows them to be used for Wii mode. Hence, fatter shaders and the one oversized J block. I don't know what I would equate the interpolators to on Flipper, though. The texture coordinate generator, maybe?

Other than size (J1 is almost as big as the block you labeled as the CP), efficiency is another reason why I don't see them being interpolators. I think I mentioned this in our discussion a while back, but here is an article dated 9-23-09.

http://techreport.com/review/17618/amd-radeon-hd-5870-graphics-processor/5

Notable by their absence are the interpolation units traditionally found in the setup engine. These fixed-function interpolators have given way to a long-term trend in graphics processors; they've been replaced by the shader processors. AMD has added interpolation instructions to its shader cores as a means of implementing a new DirectX 11 feature called pull-model interpolation, which gives developers more direct control over interpolation (and thus over texture and shader filtering.) The shader core offers higher mathematical precision than the old fixed-function hardware, and it has many times the compute power for linear interpolation, as well. AMD CTO Eric Demers pointed out in his introduction to the Cypress architecture that the RV770's interpolation hardware had become a performance-limiting step in some texture filtering tests, and using the SIMDs for interpolation should bypass that bottleneck.

That's not too long after development on Latte supposedly started. I have a very tough time seeing AMD let Nintendo keep something in their design that was already considered obsolete, essentially making those blocks unnecessary space consumers when we can see that space is at a premium.
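
For anyone curious what those interpolators actually computed: per pixel, the fixed-function hardware blended each vertex attribute using the pixel's barycentric weights. Here's a toy C sketch of that math (my own illustration, not AMD's hardware logic); on Cypress-class chips this same lerp runs as a few shader ALU instructions instead.

```c
/* Toy model of the work fixed-function interpolators did: blend a
   per-vertex attribute at a pixel using barycentric weights.
   On Cypress-class GPUs this became ordinary shader ALU work. */
#include <stdio.h>

static float interpolate(float a0, float a1, float a2,
                         float w0, float w1, float w2) {
    return a0 * w0 + a1 * w1 + a2 * w2; /* weights sum to 1 */
}

int main(void) {
    /* Hypothetical texture coordinate u at the three triangle
       vertices, sampled at a pixel with weights 0.2 / 0.3 / 0.5. */
    float u = interpolate(0.0f, 1.0f, 0.5f, 0.2f, 0.3f, 0.5f);
    printf("interpolated u = %f\n", u); /* prints 0.550000 */
    return 0;
}
```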

As for your question... that's a good question, haha. I'm not familiar enough with the functions each block in Flipper/Hollywood handles to properly assess that.

Isn't one of the few things Nintendo has said specifically about the CPU/GPU that no silicon is being wasted on Wii mode? In other words, there's nothing specifically dedicated to BC that isn't being used in Wii U mode, or did I misinterpret that?

What they said was that when they thought they would need to add Wii components 1:1 alongside Wii U components, the AMD/IBM designers working with them were already familiar with Broadway and Hollywood, which allowed them to modify certain Wii U components for BC instead.
 
What they said was that when they thought they would need to add Wii components 1:1 alongside Wii U components, the AMD/IBM designers working with them were already familiar with Broadway and Hollywood, which allowed them to modify certain Wii U components for BC instead.
So would that make the consensus opinion that Latte has fixed-function shaders that replicate TEV functions in both Wii U and Wii modes? I've read both theories floating around: the first that it's fixed function, and the second that TEV is being emulated with modern shaders on the GPU, and both seem to have merit.

Fixed function would make things a lot easier for Nintendo's internal teams, but I can't imagine that 3rd-party devs would be happy with a split programmable/fixed-function GPU. Although, since 3rd-party devs actually don't seem to be happy, maybe that's exactly what it is.
 
What they said was that when they thought they would need to add Wii components 1:1 alongside Wii U components, the AMD/IBM designers working with them were already familiar with Broadway and Hollywood, which allowed them to modify certain Wii U components for BC instead.


Thanks for that. If that's what Nintendo actually said (that they designed the Wii U, then modified Wii U parts in order to achieve fluid backwards compatibility, as opposed to gutting/sacrificing for BC's sake), then I think the idea that the core Wii U design is held back because of BC is negated. They built the machine they wanted to build, and the console does not make sacrifices in order for people to play their Wii game collections.
 
So would that make the consensus opinion that Latte has fixed-function shaders that replicate TEV functions in both Wii U and Wii modes? I've read both theories floating around: the first that it's fixed function, and the second that TEV is being emulated with modern shaders on the GPU, and both seem to have merit.

Fixed function would make things a lot easier for Nintendo's internal teams, but I can't imagine that 3rd-party devs would be happy with a split programmable/fixed-function GPU. Although, since 3rd-party devs actually don't seem to be happy, maybe that's exactly what it is.
That was a theory earlier, but we now know that there is some logic and a tiny 8-bit chip translating TEV codes into something the Radeon architecture would understand.
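
A translation layer is plausible because the TEV's per-stage combiner math is well documented from GameCube/Wii homebrew: each stage computes roughly (d ± lerp(a, b, c) + bias) × scale per channel, which maps onto a handful of shader ALU operations. A toy C sketch of one stage (my own illustration, not Nintendo's actual translation logic):

```c
/* Toy model of one TEV color-combiner stage as documented by GameCube
   homebrew: out = (d +/- lerp(a, b, c) + bias) * scale. Math this
   simple maps naturally onto a few shader ALU instructions, which is
   why translating TEV setups for a Radeon-style core is believable. */
#include <stdio.h>

typedef struct { float r, g, b; } Color;

static float lerp(float a, float b, float t) { return a + (b - a) * t; }

static Color tev_stage(Color a, Color b, Color c, Color d,
                       float bias, float scale, int subtract) {
    float sign = subtract ? -1.0f : 1.0f;
    Color out = {
        (d.r + sign * lerp(a.r, b.r, c.r) + bias) * scale,
        (d.g + sign * lerp(a.g, b.g, c.g) + bias) * scale,
        (d.b + sign * lerp(a.b, b.b, c.b) + bias) * scale,
    };
    return out;
}

int main(void) {
    Color zero = {0, 0, 0};
    Color tex = {0.8f, 0.4f, 0.2f}, vtx = {1.0f, 1.0f, 1.0f};
    /* Classic "modulate" setup: lerp(0, tex, vertex_color) + 0. */
    Color out = tev_stage(zero, tex, vtx, zero, 0.0f, 1.0f, 0);
    printf("%.2f %.2f %.2f\n", out.r, out.g, out.b); /* 0.80 0.40 0.20 */
    return 0;
}
```

The open question would be performance and exact precision/rounding behavior, not whether the math maps.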
 
That was a theory earlier, but we now know that there is some logic and a tiny 8-bit chip translating TEV codes into something the Radeon architecture would understand.
I wonder if that means TEV could still be used by Nintendo's 1st-party devs for Wii U development, or if it's restricted to BC only?
 
A few pages back, there was a topic about game polygon counts. Now that we have some solid data about the PS4's Shadow Fall, let's go back to that for a bit.

The polygon counts for the character models and the environment are in range of, or a little beyond, what current-gen games have shown.

Character models: 40k max, with 7 LODs (for example, it will switch to a lower 20k model 6 feet away from the camera).

The city landscape, as pretty as it is, is only 500k polygons for the whole visible geometry; with lighting [so shadows] it's 670k. It also uses only 1,700 draw calls on the GPU in the city landscape scene.

While the game is still in development, the devs decided not to throw a ton of polygons on the screen, but to use them cleverly (and they did a good job). The major difference between current-gen and Killzone Shadow Fall is the good use of lighting and more access to RAM.

The Wii U is supposed to have many of the graphical features that the PS4 has, so we may be able to see some amazing games running at or a little beyond "current-gen" polygon levels. The Wii U doesn't have nearly as much RAM as the PS4, though, so the RAM it does have would have to be used more conservatively and cleverly.
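
To make the LOD part concrete, here's a hypothetical distance-based LOD picker in C, in the spirit of the numbers above. The thresholds and polygon counts are made up for illustration, not taken from the game.

```c
/* Hypothetical LOD selection: a 40k-polygon hero model that drops to
   a 20k version a few feet from the camera, with coarser levels
   further out. All numbers here are illustrative placeholders. */
#include <stdio.h>

typedef struct { float max_distance; int polygons; } LodLevel;

static int pick_lod(const LodLevel *lods, int count, float distance) {
    for (int i = 0; i < count; i++)
        if (distance <= lods[i].max_distance)
            return i;
    return count - 1; /* beyond every threshold: coarsest level */
}

int main(void) {
    const LodLevel lods[] = {
        {6.0f, 40000}, {15.0f, 20000}, {40.0f, 8000}, {1e9f, 1500}
    };
    const float distances[] = {2.0f, 10.0f, 100.0f};
    for (int i = 0; i < 3; i++) {
        int lod = pick_lod(lods, 4, distances[i]);
        printf("distance %.0f -> LOD %d (%d polys)\n",
               distances[i], lod, lods[lod].polygons);
    }
    return 0;
}
```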
 
A few pages back, there was a topic about game polygon counts. Now that we have some solid data about the PS4's Shadow Fall, let's go back to that for a bit.

The polygon counts for the character models and the environment are in range of, or a little beyond, what current-gen games have shown.

Character models: 40k max, with 7 LODs (for example, it will switch to a lower 20k model 6 feet away from the camera).

The city landscape, as pretty as it is, is only 500k polygons for the whole visible geometry; with lighting [so shadows] it's 670k. It also uses only 1,700 draw calls on the GPU in the city landscape scene.

While the game is still in development, the devs decided not to throw a ton of polygons on the screen, but to use them cleverly (and they did a good job). The major difference between current-gen and Killzone Shadow Fall is the good use of lighting and more access to RAM.

The Wii U is supposed to have many of the graphical features that the PS4 has, so we may be able to see some amazing games running at or a little beyond "current-gen" polygon levels. The Wii U doesn't have nearly as much RAM as the PS4, though, so the RAM it does have would have to be used more conservatively and cleverly.

If developers really utilize the Wii U's graphical strengths, then yeah, we should have no problem. There are some aspects that the Wii U inarguably has over current gen, period. Beautiful, great games are in the eyes and hands of the developers when it's all said and done.
 
The Nintendo Direct for the Wii U tomorrow should yield some interesting data to analyze. Maybe they'll have an even more optimized X for the showing.

Then there is the one announced for around E3 next month as well. Now we can see what the next level of Nintendo's character art will look like.
 
"The Wii U is suppose to have many of the graphical features that the PS4"

Dont hold your breath and ill be waiting to see these games. As it stands the games they have now dont even compare visually to three year old games on the ps3, nor anything mentioned or shown in the future. A monolith game shown in a bland open world with no enemies on screen doesnt do anything for me.

One step at a time. First lets see if they can surpass the visual fidelity and graphics of games currently on ps3 like GOW, KZ and Uc. That is yet to be seen.
 
"The Wii U is suppose to have many of the graphical features that the PS4"

Dont hold your breath and ill be waiting to see these games. As it stands the games they have now dont even compare visually to three year old games on the ps3, nor anything mentioned or shown in the future. A monolith game shown in a bland open world with no enemies on screen doesnt do anything for me.

One step at a time. First lets see if they can surpass the visual fidelity and graphics of games currently on ps3 like GOW, KZ and Uc. That is yet to be seen.

Funny, the screens people were bragging about for Killzone 4 showed the exact same thing.

Only it was 80% cutscene graphics, whereas Monolith's X was 100% gameplay.

No one is arguing that the PS4 isn't stronger, but still, every feature it can do, the Wii U should be able to do as well. It may not do them as well, but they are there.
 
"The Wii U is suppose to have many of the graphical features that the PS4"

Dont hold your breath and ill be waiting to see these games. As it stands the games they have now dont even compare visually to three year old games on the ps3, nor anything mentioned or shown in the future. A monolith game shown in a bland open world with no enemies on screen doesnt do anything for me.

One step at a time. First lets see if they can surpass the visual fidelity and graphics of games currently on ps3 like GOW, KZ and Uc. That is yet to be seen.
Why are you posting this in this thread?
 
What is "80% cutscene graphics"?

And what are you guys defining as bland?

Guys? To me, neither looks bland (of course, I'm not a "fan person"). I'm just countering his statement using his own logic.

Some people take great offense at anyone claiming any aspect of Nintendo hardware beats or even compares to Sony hardware. I wouldn't be surprised to see people "honestly" believe the graphics of the Wii U are beneath the PS2 and actually try to prove it. I've seen people try to prove that the PS1 had better graphics than the Wii.

If developers really utilize the Wii U's graphical strengths, then yeah, we should have no problem. There are some aspects that the Wii U inarguably has over current gen, period. Beautiful, great games are in the eyes and hands of the developers when it's all said and done.

If it has multiple tessellators, it will definitely be interesting.

Has anyone tried to contact some of the devs that are more open to discussing their use of the hardware? I don't use Twitter, so I can't ask. We honestly should be sending Kamiya more questions; he seems to give the most responses. The main thing I want to ask him is whether the Bayonetta model shown in the trailer is the player model.
 
Funny, the screens people were bragging about for Killzone 4 showed the exact same thing.

Only it was 80% cutscene graphics, whereas Monolith's X was 100% gameplay.

No one is arguing that the PS4 isn't stronger, but still, every feature it can do, the Wii U should be able to do as well. It may not do them as well, but they are there.

Gameplay of what? What Monolith gameplay? They just showed a guy running by water, and the other shots were of a sole character. As I stated, that does nothing for me. I'm not disputing that it'll look good, but I doubt it'll look better than most of the PS3 heavy hitters out there from a few years ago.

Funny, the gameplay of KZ:SF I saw during the Jimmy Fallon show sure fooled me! That gameplay footage with all those enemies was eye candy to me and definitely was not cutscenes. Wrong game; maybe it was Deep Down you were referring to.
 
Guys? To me, neither looks bland (of course, I'm not a "fan person"). I'm just countering his statement using his own logic.

Some people take great offense at anyone claiming any aspect of Nintendo hardware beats Sony hardware. I wouldn't be surprised to see people "honestly" believe the graphics of the Wii U are beneath the PS2 and actually try to prove it. I've seen people try to prove that the PS1 had better graphics than the Wii. Though this is all off-topic.
I'm confused as to what point you're making. Never mind.
 
You know, there's a question I've been meaning to ask that hasn't come up: what are the Wii U's anti-aliasing capabilities, and does outputting at 1080p or at 720p with 4xMSAA produce less aliased characters?

I think one of the biggest problems with the Wii was that people didn't understand the difference between the aliasing present in a game and the graphical fidelity demonstrated, much like when you show a lower-quality video of something and people attack the video quality as if it's how the game actually looks. They ignored all the graphical gains and just slammed it with "look at the jaggies!" I noticed this often when I saw people try to say that Smash Bros. Melee had better graphics than Smash Bros. Brawl. The increases in polygon count, texture resolution, texture detail and texture effects just got completely ignored.

In my understanding, most people would prefer a low-detail, simplistic character model with no aliasing over an extremely detailed character with 3x the polygon count and a lot of jagged edges.

I've also seen that the Wii U version of Deus Ex will have added anti-aliasing on top of a completely new and enhanced lighting system as part of its improvements over the last-gen console versions. Where would the Wii U's capabilities stand for this?
 
We really should keep comparisons to other next-gen consoles out of here. While the Wii U may punch above its weight, let's be real for a minute: the "tablet or netbook" CPU in the PS720 that people joke about on here uses more power than the entire Wii U system.
 
We really should keep comparisons to other next-gen consoles out of here. While the Wii U may punch above its weight, let's be real for a minute: the "tablet or netbook" CPU in the PS720 that people joke about on here uses more power than the entire Wii U system.

Anything can use power, but how much use it makes of that power is another story.

Also, what you say makes no sense. Regardless of what people want to be true, the PS4 and Xbox3 will be the primary competition for the Wii U in the long run. It will be compared to them until the next system from Nintendo is announced in 5 years.

As per my first comment, there is one thing I can definitely say the Wii U will beat the other two next-gen consoles in: efficiency. Per watt, no other console does quite as much. It may not be the F1 of next gen, but your electric bill certainly won't be as taxed by it.
 
That is easily answered. As I just stated, and as you referenced from Factor 5, it was never maxed out.

There is no question that 99% of the devs on the Wii didn't go in 100%. No one attempted to "push" the hardware. Over half the games looked like subpar PS2 games and ran worse. Hardly anyone used normal mapping or bump mapping, and no one used its anti-aliasing capabilities. The best looking games were the ones you could play from GameCube discs, aside from a handful of Wii titles. The best looking game was SMG2, and Nintendo has never "pushed" their hardware. Their aim has always been to be creative rather than technical, in both their hardware and their games.

We know the Wii was capable of really good physics because of Elebits and Boom Blox, but those were rarities. We know the Wii was capable of massive use of normal mapping thanks to that homebrewer from B3D. We know the Wii was capable of HDR because of the devs of Cursed Mountain. We never quite saw someone go all the way and push the Wii for all it could give under optimal conditions, though.

The simplest answer is that we never saw all that the Wii can do and now we probably never will.

I'm sure the Wii U still has miles to go, though. Even now, they are still optimizing the OS and hardware, so increased performance and subsequently better looking games are guaranteed. Heck, we haven't even seen what its tessellator can do yet. It'll be nice when Shin'en drops some screens from their next Wii U game.
I think Nintendo themselves pushed the Wii pretty well. I think we saw about the top level of what the Wii could do in games like Super Mario Galaxy 2 and Zelda: Skyward Sword.
 
From the tech breakdowns, we can agree that it's realtime, and GG have been nice enough to shed light on that. On lesser platforms it would have to be baked. So I guess it could be aesthetically matched on other systems, but it's not the same thing, and it's a bad argument.
 
I think Nintendo themselves pushed the Wii pretty well. I think we saw about the top level of what the Wii could do in games like Super Mario Galaxy 2 and Zelda: Skyward Sword.

Nintendo never "pushes" their hardware. What we saw from Nintendo is what "should have been" the middle ground for what Wii games looked like. Shin'en were the only people who matched it.

When it came to really showing off its tech, such as normal mapping, parallax mapping, HDR and so on, Nintendo didn't push any of that. When it came to pushing polygon detail, Nintendo didn't push that either. They made visually appealing games, but the Wii could do a lot more on a technical level.

If people expect Nintendo to be the ones to show off what the Wii U can do, then we will never see it.
 
Anything can use power, but how much use it makes of that power is another story.

Also, what you say makes no sense. Regardless of what people want to be true, the PS4 and Xbox3 will be the primary competition for the Wii U in the long run. It will be compared to them until the next system from Nintendo is announced in 5 years.

As per my first comment, there is one thing I can definitely say the Wii U will beat the other two next-gen consoles in: efficiency. Per watt, no other console does quite as much. It may not be the F1 of next gen, but your electric bill certainly won't be as taxed by it.
It's using less power because it's an order of magnitude less powerful.

It's impossible to say it's more efficient at this point.
 
It's using less power because it's an order of magnitude less powerful.

It's impossible to say it's more efficient at this point.

That is a complete lie. We know exactly how much power the parts use and what they are minimally capable of. The Wii U doesn't even exceed 40 watts yet. I still have the old chart showing the comparison with last gen.

[chart: next-gen console power consumption comparison]

The Wii U itself is doing more than the last-gen consoles, and it only requires 1/4 the power. I think 37 watts was the highest I've seen stated, and it will never exceed 75, so even at the worst the Wii U can do, it will still be the most efficient.
 
I've also seen that the Wii U version of Deus Ex will have added anti-aliasing on top of a completely new and enhanced lighting system as part of its improvements over the last-gen console versions. Where would the Wii U's capabilities stand for this?
Well, the eDRAM is large enough to hold the big backbuffer for a 720p MSAA render. Is the render output bandwidth high enough to make it worth it? Who knows.

does outputting at 1080p or at 720p with 4xMSAA produce less aliased characters?
If geometric sampling is your issue, 4xMSAA does; you're effectively sampling polygon edges at 1440p.

Obviously 1080p will win with stuff like per-pixel shader aliasing.
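
The arithmetic is easy to verify; the counts below are just standard resolution math, nothing Wii U-specific.

```c
/* Edge-sample counting: 4x MSAA at 720p takes four coverage samples
   per pixel, matching the pixel count of a 2560x1440 ("1440p") grid,
   while 1080p without MSAA tests polygon edges at fewer positions. */
#include <stdio.h>

int main(void) {
    long s720_msaa4 = 1280L * 720 * 4; /* 3,686,400 edge samples */
    long s1440      = 2560L * 1440;    /* 3,686,400 pixels        */
    long s1080      = 1920L * 1080;    /* 2,073,600 pixels        */
    printf("720p + 4xMSAA edge samples: %ld\n", s720_msaa4);
    printf("1440p pixel count:          %ld\n", s1440);
    printf("1080p pixel count:          %ld\n", s1080);
    return 0;
}
```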

//=================

It's using less power because it's an order of magnitude less powerful.

It's impossible to say it's more efficient at this point.
Not really. Nintendo's love for low clocks means that their performance/watt is probably actually really good.

The Wii U itself is doing more than the last-gen consoles, and it only requires 1/4 the power.
Well, that's not a fair comparison either, given how the silicon industry has advanced. The Wii U is using parts on an even smaller node than the 360 slim.
 
From the tech breakdowns, we can agree that it's realtime, and GG have been nice enough to shed light on that. On lesser platforms they would have to be baked. So i guess it could be aesthetically done on other systems, but Its not the same thing and is a bad argument.

Indeed, seeing their "Matrix" (vertices) mode is really awesome.

The buildings in the distance aren't all that impressive, but it still shows quite a bit.
 
Well, that's not a fair comparison either, given how the silicon industry has advanced. The Wii U is using parts on an even smaller node than the 360 slim.

I believe it's on a smaller node than the Wii as well, so it's even more efficient than the Wii, but I doubt the same can be said for the HD 7XXXs in the other two next-gen consoles and the Jaguar CPUs. I'd say we can rest assured the Wii U hardware will be the most efficient of the lot.

Indeed, seeing their "Matrix" (vertices) mode is really awesome.

The buildings in the distance aren't all that impressive, but it still shows quite a bit.

Matrix mode? It just looks like a wireframe with green illuminated vertices. Is that supposed to be a technical feat, and why are you advertising it in here? That has nothing to do with the Wii U GPU.
 
That is a complete lie. We know exactly how much power the parts use and what they are minimally capable of. The Wii U doesn't even exceed 40 watts yet. I still

The Wii U itself is doing more than the last-gen consoles, and it only requires 1/4 the power. I think 37 watts was the highest I've seen stated, and it will never exceed 75, so even at the worst the Wii U can do, it will still be the most efficient.
When people say efficient, that means performance per watt, not just using less power.

Also, I was speaking of the next-gen consoles. You have two at 28nm and the other at 40nm.
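
In other words, the metric is a simple ratio. The GFLOPS and wattage figures below are placeholders, since none of these numbers are confirmed; the point is only how the comparison would be made.

```c
/* Efficiency as performance per watt. The inputs are hypothetical
   placeholder numbers, not confirmed specs for any console. */
#include <stdio.h>

static double gflops_per_watt(double gflops, double watts) {
    return gflops / watts;
}

int main(void) {
    printf("console A: %.1f GFLOPS/W\n", gflops_per_watt(350.0, 33.0));
    printf("console B: %.1f GFLOPS/W\n", gflops_per_watt(1800.0, 130.0));
    return 0;
}
```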
 
That is a complete lie. We know exactly how much power the parts use and what they are minimally capable of. The Wii U doesn't even exceed 40 watts yet. I still have the old chart showing the comparison with last gen.

[chart: next-gen console power consumption comparison]

The Wii U itself is doing more than the last-gen consoles, and it only requires 1/4 the power. I think 37 watts was the highest I've seen stated, and it will never exceed 75, so even at the worst the Wii U can do, it will still be the most efficient.

Of course the Wii U is more efficient than the PS3 and 360. But the PS4 and the next Xbox are not out yet, so we can't make comparisons.

If geometric sampling is your issue, 4xMSAA does; you're effectively sampling polygon edges at 1440p.

Obviously 1080p will win with stuff like per-pixel shader aliasing.

This is true as long as MSAA works as perfectly as it should. In many modern engines it unfortunately doesn't, especially when deferred rendering/shading is involved.
I'm not deep enough into the matter to explain it myself, but from what I've heard it could get better with DX10+ compliant hardware.

Not really. Nintendo's love for low clocks means that their performance/watt is probably actually really good.

True, but the PS4's and Nextbox's clocks are not rumored to be very high either. And they will be manufactured on a smaller process from the start, which should help the performance/watt ratio.
 
Matrix mode? It just looks like a wireframe with green illuminated vertices. Is that supposed to be a technical feat, and why are you advertising it in here? That has nothing to do with the Wii U GPU.

They called it "matrix mode" because of the end resulting look. It's part of their debug process. But whatever.
 
Still, it's difficult to understand how the Wii was essentially maxed out as quickly as it was. SMG1/2 were pretty much the high point of the system graphically, and they were released very early in the Wii's lifespan. Every console prior to the Wii saw gradual improvements over time, so in my opinion there was something other than hardware strength holding it back from its potential, and some miscalculation creating an artificial bottleneck that Nintendo didn't intend seems like a good guess.

I always figured nobody cared about "maxing" the Wii. It was far behind anyway, and few FPSes, which are the most graphically demanding games, came out for it.

Anyway, on a totally separate topic: I used to often bookmark posts I thought were particularly egregious or where the poster would eat crow later. A lot of them dealt with the Wii U and what a monster it was going to be in the early days. The other day I was perusing a few of these and they were pretty funny. There was some pretty damning stuff by BG. I will probably post a few later, as I don't have time to mess with it now.

Just remember there was a time when more or less a ton of people thought the Wii U was going to be more powerful than Durango. Nobody came out and said it, but that's basically what they were getting at.

It's just kind of sad to look back at how the Wii U undershot our worst expectations by a large amount. Even mine, and I was its #1 detractor. I really expected it to at least be easily better than PS360. So far I haven't seen that.

I remember one guy's post specifically. It was in response to that news about Crytek working with the Wii U or whatever, and he said something like "Crytek? What kind of monster are you building here, Nintendo?" That was the attitude back then. The Wii U was a monster.

Oh, another thing was power. I think BG had it using 100 watts or something in one of the bookmarked posts. The reality was 33. 33.

I think the rumored Durango specs were everything the Wii U should have been from an engineering standpoint. The whole "good enough" thing.

ooh, here's one lol

http://www.neogaf.com/forum/showpost.php?p=34561383&postcount=115

ooh here's a good one

http://www.neogaf.com/forum/showpost.php?p=34897079&postcount=5331

more

http://www.neogaf.com/forum/showpost.php?p=35065775&postcount=480

BG 100 watts

http://www.neogaf.com/forum/showpost.php?p=34726131&postcount=3759

What are you making Nintendo

http://www.neogaf.com/forum/showpost.php?p=34725551&postcount=3755

Nintendo's generation to lose

http://www.neogaf.com/forum/showpost.php?p=34627143&postcount=3012

Here's a good one. "special guy knows this", RV770, >3ghz CPU, etc

http://www.neogaf.com/forum/showpost.php?p=34555060&postcount=2318

"I'm gonna laugh my ass off and bump this thread when PS4/720 games dont look better than Wii U"

http://www.neogaf.com/forum/showpost.php?p=42217305&postcount=944

Here's BG predicting e6760 and 600-800 gflops

http://www.neogaf.com/forum/showpost.php?p=41597923&postcount=2367

Here's a good reggie article "faster processors and prettier pictures wont motivate consumers, they need to react to us with Wii U"

http://news.cnet.com/8301-10797_3-5...fils-aime-microsoft-sony-need-to-react-to-us/



Anyway, it's OT for this thread, so I guess I won't belabor it. It's just interesting how attitudes have changed.
 
Good times. My hope was seeing any 7th gen game, which we would inevitably see ports of, at 1080p.

Some proponents like to crow about the Wii U's power and how obviously more powerful it is, and they point to NFS and Trine as significantly improved ports, yet we don't see them run above 720p.
 
A note on the power consumption.

I keep seeing people talk about how they "technically" have room to overclock because it uses ~33 watts when in use.

Has anyone actually measured the power consumption with all the USB ports in use as well as the disc spinning?
 
I always figured nobody cared about "maxing" the Wii. It was far behind anyway, and few FPSes, which are the most graphically demanding games, came out for it.

Anyway, on a totally separate topic: I used to often bookmark posts I thought were particularly egregious or where the poster would eat crow later. A lot of them dealt with the Wii U and what a monster it was going to be in the early days. The other day I was perusing a few of these and they were pretty funny. There was some pretty damning stuff by BG. I will probably post a few later, as I don't have time to mess with it now.

Just remember there was a time when more or less a ton of people thought the Wii U was going to be more powerful than Durango. Nobody came out and said it, but that's basically what they were getting at.

It's just kind of sad to look back at how the Wii U undershot our worst expectations by a large amount. Even mine, and I was its #1 detractor. I really expected it to at least be easily better than PS360. So far I haven't seen that.

I remember one guy's post specifically. It was in response to that news about Crytek working with the Wii U or whatever, and he said something like "Crytek? What kind of monster are you building here, Nintendo?" That was the attitude back then. The Wii U was a monster.

Oh, another thing was power. I think BG had it using 100 watts or something in one of the bookmarked posts. The reality was 33. 33.

I think the rumored Durango specs were everything the Wii U should have been from an engineering standpoint. The whole "good enough" thing.

BG had 'called' me out a few posts ago, so I took a look at his post history, and the backpedaling was amazing.

Do a Google search of him and 'dev kit': he was going around on GameSpot and GameFAQs claiming he had all this access to the dev kit and surprises, but he's been called out enough, and he's too busy "gettin' money" playing for the Money Team, with an inside basketball court AND an outside basketball court, to reply.

http://www.ign.com/boards/threads/wiiu-devkit-pics-wiiu-early-target-specs.452522923/
http://www.gamefaqs.com/boards/631516-wii-u/61019975?page=2
http://www.neogaf.com/forum/showthread.php?p=41938848&highlight=#post41938848
http://www.neogaf.com/forum/showpost.php?p=38549553&postcount=1
http://www.neogaf.com/forum/showthread.php?p=42351669&highlight=#post42351669
 
"The Wii U is suppose to have many of the graphical features that the PS4"

Dont hold your breath and ill be waiting to see these games. As it stands the games they have now dont even compare visually to three year old games on the ps3, nor anything mentioned or shown in the future. A monolith game shown in a bland open world with no enemies on screen doesnt do anything for me.

One step at a time. First lets see if they can surpass the visual fidelity and graphics of games currently on ps3 like GOW, KZ and Uc. That is yet to be seen.

Seems that my topic has been derailed. I specifically meant that Latte is based on a DX10.1-equivalent processor that has been further enhanced to include features seen in DX11. This is mostly due to Latte being designed for OpenGL rather than DirectX.

There are several reasons why we have not seen games that take full advantage of the Wii U's full feature set, including devs still being focused on DX9 machines as a base until recently.
 
specialguy said:

Most of our best guesses for how powerful the Wii U GPU is came from the initial round of leaks calling it an R700 equivalent. We're still largely going off the fumes of those rumors, so seeing people speak up about how powerful people thought it was back in the day is weird. We don't know a whole lot more than we did a long time ago, and I'm under the same impression now that I was when I used to browse those threads.
 
Most of our best guesses for how powerful the Wii U GPU is came from the initial round of leaks calling it an R700 equivalent. We're still largely going off the fumes of those rumors, so seeing people speak up about how powerful people thought it was back in the day is weird. We don't know a whole lot more than we did a long time ago, and I'm under the same impression now that I was when I used to browse those threads.

We do have an understanding of what it "should" be capable of, and why, now. That is a huge jump from back then.
 
A note on the power consumption.

I keep seeing people talk about how they "technically" have room to overclock because it uses ~33 watts when in use.

Has anyone actually measured the power consumption with all the USB ports in use as well as the disc spinning?

I believe the 33-watt test was with the disc drive in use. An external hard drive would probably push it upwards, and so would any other USB accessories. I think Iwata said the power draw could go up to 45W or so.
 
Seems that my topic has been derailed. I specifically meant that Latte is based on a DX10.1-equivalent processor that has been further enhanced to include features seen in DX11. This is mostly due to Latte being designed for OpenGL rather than DirectX.

There are several reasons why we have not seen games that take full advantage of the Wii U's full feature set, including devs still being focused on DX9 machines as a base until recently.

Can you explain this further? I don't think it matters like you think it does.
 
Can you explain this further? I don't think it matters like you think it does.

The GX2 API is based on OpenGL with some Nintendo-specific features. It's rumored to be based on OpenGL 3.3, except we know it has more compute and tessellation features added in, as well as whatever functions Nintendo has customized themselves. I'm actually not entirely certain where it fits in the DirectX equivalency framework, but it's far more modern than what is on the 360 or PS3, for instance.
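
GX2 itself isn't public, so as a rough analogy, here's how a desktop OpenGL 3.3 app might detect the DX11-ish extras (tessellation, compute) as extensions on top of the core version. The extension string here is hard-coded stand-in data; a real app would query the driver (e.g. via glGetStringi) after creating a context.

```c
/* Analogy for a "GL 3.3 core plus DX11-style extras" feature set:
   check for the relevant ARB extensions. The extension string below
   is a hard-coded stand-in, not queried from a real driver. */
#include <stdio.h>
#include <string.h>

int main(void) {
    const char *extensions =
        "GL_ARB_tessellation_shader GL_ARB_compute_shader";

    int has_tess    = strstr(extensions, "GL_ARB_tessellation_shader") != NULL;
    int has_compute = strstr(extensions, "GL_ARB_compute_shader") != NULL;

    printf("base: OpenGL 3.3 core (roughly DX10.1-class)\n");
    printf("tessellation (DX11-style): %s\n", has_tess ? "yes" : "no");
    printf("compute shaders (DX11-style): %s\n", has_compute ? "yes" : "no");
    return 0;
}
```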
 
Nintendo never "pushes" their hardware. What we saw from Nintendo is what "should have been" the middle ground for what Wii games looked like. Shin'en were the only people who matched it.

When it came to really showing off its tech, such as normal mapping, parallax mapping, HDR and so on, Nintendo didn't push any of that. When it came to pushing polygon detail, Nintendo didn't push that either. They made visually appealing games, but the Wii could do a lot more on a technical level.

If people expect Nintendo to be the ones to show off what the Wii U can do, then we will never see it.
How do you know that the Wii is capable of that much more than what Nintendo did?
 
Most of our best guesses for how powerful the Wii U GPU is came from the initial round of leaks calling it an R700 equivalent. We're still largely going off the fumes of those rumors, so seeing people speak up about how powerful people thought it was back in the day is weird. We don't know a whole lot more than we did a long time ago, and I'm under the same impression now that I was when I used to browse those threads.

It was also fueled by hype posts from "insiders" like BG and Ideaman. When Arkam and lherre (probably misspelled the names, heh) said not to get too excited about the specs, they were pounced on in the WUST threads (lherre not so much, but then he wasn't as blunt about it).
 
Most of our best guesses for how powerful the Wii U GPU is came from the initial round of leaks calling it an R700 equivalent. We're still largely going off the fumes of those rumors, so seeing people speak up about how powerful people thought it was back in the day is weird. We don't know a whole lot more than we did a long time ago, and I'm under the same impression now that I was when I used to browse those threads.

Remember when some AMD rep said it was capable of 1 teraflop?
 