WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

What's with the assumption occasionally thrown around here that the WiiU is massively more efficient when compared to anything else out there?

I'm sure it's very well designed and all, but it seems as if some people here expect it to outperform its (assumed) specs by a factor of five.

The notion that Nintendo has access to technology drawing less than 40w that outperforms or matches 200w systems, I don't get it.
Is this a serious post?
 
What's with the assumption occasionally thrown around here that the WiiU is massively more efficient when compared to anything else out there?

I'm sure it's very well designed and all, but it seems as if some people here expect it to outperform its (assumed) specs by a factor of five.

The notion that Nintendo has access to technology drawing less than 40w that outperforms or matches 200w systems, I don't get it.

Not even the most hopeful here are suggesting that. But I get your point, especially at 45 and 40nm processes rather than 28 like the other two, that's two fabrication process generations away.

Regarding the Tori Tori dev on texture compression, like we already discussed dozens of pages ago, every modern Radeon (and separately geforce) has something like that. The Wii U may well have such a feature, but that developer didn't say if the other two don't (and I think that team only works on wii u anyways). Really, Nintendo would be in the GPU business if half of this wishing was true :P
 
I was agreeing with you. I said the business case isn't there for ports (in general). If there was a business case, it wouldn't be impossible.

Gotcha. I thought you were claiming it's a fact that PS4/xbone ports would be way more expensive than PS360 ports.

Re-reading it, you weren't necessarily saying that; I misread the line about the 360 I think.

Carry on..
 
What's with the assumption occasionally thrown around here that the WiiU is massively more efficient when compared to anything else out there?

I'm sure it's very well designed and all, but it seems as if some people here expect it to outperform its (assumed) specs by a factor of five.

The notion that Nintendo has access to technology drawing less than 40w that outperforms or matches 200w systems, I don't get it.

The Xbox Slim IIRC is down to like 90 watts, Ps3 slim is on a similar level.
 
Not even the most hopeful here are suggesting that. But I get your point, especially at 45 and 40nm processes rather than 28 like the other two, that's two fabrication process generations away.

Regarding the Tori Tori dev on texture compression, like we already discussed dozens of pages ago, every modern Radeon (and separately geforce) has something like that. The Wii U may well have such a feature, but that developer didn't say if the other two don't (and I think that team only works on wii u anyways). Really, Nintendo would be in the GPU business if half of this wishing was true :P

Well yes, I exaggerated for effect, but it was suggested by the poster above me that development tricks might negate much of the PS4/XBONE's RAM advantage. Both these consoles are likely to have at least 5 GB of available RAM compared to the WiiU's 2GB total.

Your post is pretty much what I was getting at.

EDIT: And by "getting at" I mean "This is what i was trying to say".
 
The Xbox Slim IIRC is down to like 90 watts, Ps3 slim is on a similar level.

Those are both on 45nm, if I recall correctly? But obviously, those are high-clocked, long-pipeline, in-order architectures from when IBM favored clock speed over all else (incidentally during the terrible Pentium 4 days), so even on 45nm they're doubling the Wii U's draw due to the clock speeds (3.06GHz vs 1.2GHz, a far bigger difference than the 1.2 vs 1.6 of the Wii U vs the other 8th-geners).


What I want to know is when did "watts" start to equal performance?


They don't. But if you know what fabrication process it's coming from, you can start to get a good ballpark. It's not exact, but say, if we're looking at Intel CPUs of the 45nm generation, we can guesstimate the performance of the 17W parts compared to the 35W ones; it doesn't scale directly, but it gives a good overview.

Only with separate process technologies are the numbers thrown off, i.e. a 17W Haswell part on 22nm blows the pants off 35W 45nm Core 2 parts.

No one is saying watts = performance. But knowing the total system draws 33W, that the CPU has far less than that at its disposal (if the 5:1 transistor ratio of GPU to CPU is an indication, even 10W to the CPU and 20W to the GPU is overly optimistic, since the disc drive and everything else still need power), and that it's on a two-generation-old 45nm process for the CPU and 40nm for the GPU, that tells us something at least.

tl;dr People trying to pinpoint exact performance from that are silly, but people ignoring it entirely are just as much or even more so. It gives us a ballpark, no more no less.
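
For what it's worth, here's the kind of back-of-envelope this enables. Every number below except the 33W wall figure is an assumption invented for illustration (the PSU efficiency, the "everything else" budget, the 5:1 split), so treat it as a sketch of the method, not real figures:

# Back-of-envelope power-budget split, illustrative only.
WALL_DRAW_W = 33.0          # commonly cited in-game wall draw
PSU_EFFICIENCY = 0.85       # assumed AC-to-DC conversion efficiency
dc_budget = WALL_DRAW_W * PSU_EFFICIENCY

OTHER_COMPONENTS_W = 8.0    # assumed: RAM, disc drive, WiFi, fan, flash...
soc_budget = dc_budget - OTHER_COMPONENTS_W

GPU_TO_CPU_RATIO = 5.0      # the rough transistor ratio mentioned above
cpu_w = soc_budget / (1 + GPU_TO_CPU_RATIO)
gpu_w = soc_budget - cpu_w
print(f"DC budget: {dc_budget:.1f} W, CPU guess: {cpu_w:.1f} W, GPU guess: {gpu_w:.1f} W")

It only pins down a ballpark, which is exactly the point being made above.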
 
What I want to know is when did "watts" start to equal performance?

Well I guess the implication is that power draw increases with performance.
While you can make a very well thought out and efficient design, there is a minimum amount of power needed per unit of performance.

The WiiU is probably very efficient, but it is also very small. So there are limits to how much power you can pack into such a small package; heat concerns and all that jazz.
 
Not even the most hopeful here are suggesting that. But I get your point, especially at 45 and 40nm processes rather than 28 like the other two, that's two fabrication process generations away.

Regarding the Tori Tori dev on texture compression, like we already discussed dozens of pages ago, every modern Radeon (and separately geforce) has something like that. The Wii U may well have such a feature, but that developer didn't say if the other two don't (and I think that team only works on wii u anyways). Really, Nintendo would be in the GPU business if half of this wishing was true :P

The dev doesn't really need to mention it; I wouldn't doubt the XB1 or PS4 has texture compression. But that doesn't mean it isn't different tech or that it's exclusive tech.

I mean hardware compression technology isn't limited to what AMD or Nvidia provides.
 
The dev doesn't really need to mention it; I wouldn't doubt the XB1 or PS4 has texture compression. But that doesn't mean it isn't different tech or that it's exclusive tech.

I don't know, it's like people already forgot how the HD twins shared multiplats with PCs with 8GB+ memory. Crysis is mostly a PC franchise and it was made for the 512MB HD twins. I am sure there are a lot of ways around the RAM difference; most devs should be experts by now, going by the long-running HD twins. I am not implying it will look the same, simply that the memory difference does not seem to be a barrier for Wii U next gen multiplats. Also, the Wii U has more memory than the HD twins and could free up more in the future.

I was not implying it was secret sauce, just pointing at the dev comment and acknowledging the feature.
 
I don't know, it's like people already forgot how the HD twins shared multiplats with PCs with 8GB+ memory. Crysis is mostly a PC franchise and it was made for the 512MB HD twins. I am sure there are a lot of ways around the RAM difference; most devs should be experts by now, going by the long-running HD twins. I am not implying it will look the same, simply that the memory difference does not seem to be a barrier for Wii U next gen multiplats. Also, the Wii U has more memory than the HD twins and could free up more in the future.

I was not implying it was secret sauce, just pointing at the dev comment and acknowledging the feature.

I think the biggest issue won't be the lack of memory but rather the lack of bandwidth. Because of this I doubt you'll get anything like like-for-like next gen ports; you might get a similar-ish downport, but it all depends on what is stressed in which system.
 
The dev doesn't really need to mention it; I wouldn't doubt the XB1 or PS4 has texture compression. But that doesn't mean it isn't different tech or that it's exclusive tech.

I mean hardware compression technology isn't limited to what AMD or Nvidia provides.

Oh, for sure. But that doesn't tell us whether it's better, worse, equivalent, etc. AMD and Nvidia are good examples again, since they both came up with memory compression independently but ended up near the same ratios. Now we don't even think about it, since the effective memory use by both ends up nearly the same under the same loads. Nintendo may have made a different solution, but radically better than two competing GPU companies? I suspect not; I suspect they're pretty close to the same.
 
Does a cold CPU perform more instructions per second per watt than a hot CPU?
If it does, then running the CPU at a low clock would be more efficient!
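
Under the usual first-order model for dynamic power (P roughly proportional to C*V^2*f), lowering the clock and the voltage together does buy you more work per joule; temperature mainly shows up through leakage, which this toy model ignores. The operating points below are invented purely to show the shape of the curve, not taken from any real chip:

def efficiency(freq_ghz, volts, capacitance=1.0, ipc=2.0):
    power = capacitance * volts ** 2 * freq_ghz        # arbitrary units
    ops_per_second = ipc * freq_ghz * 1e9
    return ops_per_second / power                      # "ops per joule"

# (GHz, V) pairs are assumptions; real silicon needs less voltage at lower clocks.
for freq, volts in [(1.6, 1.00), (1.2, 0.90), (0.8, 0.80)]:
    print(f"{freq:.1f} GHz @ {volts:.2f} V -> {efficiency(freq, volts):.2e} ops/J")

The lower points come out ahead in ops per joule, which is why low clocks and wide designs tend to win on efficiency.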
 
No, I'm absolutely 100% certain that I said you twisting what is being said and making assumptions without attempting to verify is what's messing up the thread, and not the bolded, which you have twisted it to be (once again) and assumed while refusing to verify if true (coincidence?).

Coming in here with negative intentions is also a reason after that.


Not trying to get caught in the debate here, but it appeared to me Stevie was supporting you in some ways

And he also knows about the Wii U's power pretty well and is someone who consistently corrects misperceptions regarding the Wii U

My two cents; I will now back away slowly
 
Not trying to get caught in the debate here, but it appeared to me Stevie was supporting you in some ways

And he also knows about the Wii U's power pretty well and is someone who consistently corrects misperceptions regarding the Wii U

My two cents; I will now back away slowly

I'm not trying to cause any rift (more so than this thread already has; it's a bloody mess), but simply saying that extreme viewpoints are why topics like religion or politics never go anywhere. Reality is almost always somewhere in the middle.
 
Not even the most hopeful here are suggesting that. But I get your point, especially at 45 and 40nm processes rather than 28 like the other two, that's two fabrication process generations away.

Regarding the Tori Tori dev on texture compression, like we already discussed dozens of pages ago, every modern Radeon (and separately geforce) has something like that. The Wii U may well have such a feature, but that developer didn't say if the other two don't (and I think that team only works on wii u anyways). Really, Nintendo would be in the GPU business if half of this wishing was true :P
Well, there's also something that has to be pointed out. When people talk about "Nintendo-specific features" they don't mean graphical features only known or discovered by Nintendo, but techniques PRIORITIZED by Nintendo.
In other words, when AMD or Nvidia launch a line of GPUs they have to be VERY cautious about what is expected by the industry, and the features that will be implemented in the next DirectX iteration are something that nearly determines the whole feature set of a given GPU (only in some cases). AMD knows this very well; they were the ones pushing for a tessellation unit on a GPU, and they spent 3 generations of GPUs with a tessellation unit on them that wasn't used beyond some demos made by ATI themselves.

Nintendo, as someone that owns the whole "production chain" (in the sense that both the hardware and the software APIs are made by them), can ask for customizations that no one would make in the PC space, for fear of wasting silicon on something not implemented in the general graphics API used by everyone.
They may want to prioritize some graphical effects that they feel are the most important, thus making a "hardware unit" specialized for them, and that goes from texture compression algorithms to some shaders implemented directly by the GPU.

The N3DS, for example, is a pretty clear case of that. In order to gain efficiency they ditched the programmability of the fragment shaders entirely and coupled the vertex shaders with a number of fixed functions that they considered to be the most important and good enough to produce good graphics. By doing that they ended up with a much more limited solution in terms of programmability, but a much better one in terms of graphical power per transistor used.

Nintendo loves to customize their hardware, but on the other hand, you have all the 3rd parties that want the most PC-like solution possible, in order to reduce the difficulty of porting their games between platforms.

Regarding the dual-engine issue, I don't know if the WiiU has it, but what I have read is that the GC had 3 rasterizers (which would match its higher polygonal output compared to the original Xbox, if that one only had one rasterizer to work with), and the same goes for the Wii.
If the WiiU is emulating the Wii at the same clocks (and this has been confirmed), and if there was an optimal case where the GC rasterizers could rasterize 1 polygon per clock each, then there is no way there are fewer than 3 rasterizers on the WiiU (modern rasterizers are better at approaching the 1 tri/clk ideal, but if you want more than 1 tri/clk you need more than one rasterizer).

This could be something interesting and worth a more in-depth discussion.

Regards!
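
For what it's worth, the peak-setup arithmetic behind that argument is simple. The rasterizer counts below are the claim from the post (not something confirmed from the die shot), and the clocks are the commonly reported ones for Flipper/Hollywood/Latte:

def peak_tris_per_second(rasterizers, clock_mhz):
    # "1 triangle per rasterizer per clock" ideal case
    return rasterizers * clock_mhz * 1e6

configs = [
    ("GameCube, claimed 3 rasterizers @ 162 MHz", 3, 162),
    ("Wii, claimed 3 rasterizers @ 243 MHz",      3, 243),
    ("Wii U if 1 rasterizer @ 550 MHz",           1, 550),
    ("Wii U if 2 rasterizers @ 550 MHz",          2, 550),
    ("Wii U if 3 rasterizers @ 550 MHz",          3, 550),
]
for name, n, mhz in configs:
    print(f"{name}: {peak_tris_per_second(n, mhz) / 1e6:.0f} M tris/s peak")

The Wii-emulation argument hinges on Latte dropping to Hollywood clocks in Wii mode, so the relevant comparison there is rasterizer count at the same clock, not the 550 MHz numbers.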
 
Well I guess the implication is that power draw increases with performance.
While you can make a very well thought out and efficient design, there is a minimum amount of power needed per unit of performance.

The WiiU is probably very efficient, but it is also very small. So there are limits to how much power you can pack into such a small package; heat concerns and all that jazz.

You won't get an 800MHz GPU in the WiiU's case, but a 550MHz GPU in a case smaller than the X360 Slim at 40nm is nothing to sneeze at. Volts x Amps = Watts, so an increase in voltage to the GPU for an increased clock rate would increase the watts used, but shaders, TMUs, CUs, ROPs and memory have their own separate set of parameters that affect performance. They do add to power requirements.
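
A toy illustration of that relationship; every voltage, current and wattage below is invented for the example and none of them are real Latte figures:

def watts(volts, amps):
    return volts * amps

# Hypothetical operating points, purely for illustration:
core_550 = watts(volts=0.95, amps=10.0)    # a 550 MHz-class point
core_800 = watts(volts=1.10, amps=14.5)    # a pushed 800 MHz-class point
fixed_blocks = {"eDRAM": 2.0, "TMUs/ROPs": 3.0, "memory interface": 1.5}  # assumed watts

print(f"550 MHz-class core: {core_550:.1f} W, 800 MHz-class core: {core_800:.1f} W")
print(f"Total at 550 MHz with the extra blocks: {core_550 + sum(fixed_blocks.values()):.1f} W")

The point being that the clock bump costs you both voltage and current, while the surrounding blocks add their own draw regardless.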
 
I'm not trying to cause any rift (more so than this thread already has; it's a bloody mess), but simply saying that extreme viewpoints are why topics like religion or politics never go anywhere. Reality is almost always somewhere in the middle.

Well said. I personally enjoy your posts and perspective, since you do have knowledge behind the curtain. Most of the level-headed discussion that made this thread excellent in the beginning has unfortunately disappeared.

On the flip side, I do feel some people are too "extreme" on the other side of the Wii U debate here and do turn down krixx's points of value where they emerge. He is interested in discussion, though I wish he was more open to other perspectives rather than writing them off.

(Continues to back away slowly)
 
Well, there's also something that has to be pointed out. When people talk about "Nintendo-specific features" they don't mean graphical features only known or discovered by Nintendo, but techniques PRIORITIZED by Nintendo.
In other words, when AMD or Nvidia launch a line of GPUs they have to be VERY cautious about what is expected by the industry, and the features that will be implemented in the next DirectX iteration are something that nearly determines the whole feature set of a given GPU (only in some cases). AMD knows this very well; they were the ones pushing for a tessellation unit on a GPU, and they spent 3 generations of GPUs with a tessellation unit on them that wasn't used beyond some demos made by ATI themselves.

Nintendo, as someone that owns the whole "production chain" (in the sense that both the hardware and the software APIs are made by them), can ask for customizations that no one would make in the PC space, for fear of wasting silicon on something not implemented in the general graphics API used by everyone.
They may want to prioritize some graphical effects that they feel are the most important, thus making a "hardware unit" specialized for them, and that goes from texture compression algorithms to some shaders implemented directly by the GPU.

The N3DS, for example, is a pretty clear case of that. In order to gain efficiency they ditched the programmability of the fragment shaders entirely and coupled the vertex shaders with a number of fixed functions that they considered to be the most important and good enough to produce good graphics. By doing that they ended up with a much more limited solution in terms of programmability, but a much better one in terms of graphical power per transistor used.

Nintendo loves to customize their hardware, but on the other hand, you have all the 3rd parties that want the most PC-like solution possible, in order to reduce the difficulty of porting their games between platforms.

Regarding the dual-engine issue, I don't know if the WiiU has it, but what I have read is that the GC had 3 rasterizers (which would match its higher polygonal output compared to the original Xbox, if that one only had one rasterizer to work with), and the same goes for the Wii.
If the WiiU is emulating the Wii at the same clocks (and this has been confirmed), and if there was an optimal case where the GC rasterizers could rasterize 1 polygon per clock each, then there is no way there are fewer than 3 rasterizers on the WiiU (modern rasterizers are better at approaching the 1 tri/clk ideal, but if you want more than 1 tri/clk you need more than one rasterizer).

This could be something interesting and worth a more in-depth discussion.

Regards!

I actually brought up the 3 rasterizer thing long ago when the dual engine discussion was still fresh, but it went completely ignored.

One thing that is absolutely certain about the GPU is that it has a lot of duplicate components visible on the die. Seeing what appears to be a 2x increase in polygon count at framerates/resolutions higher than the mean of last gen is hard to explain without the Wii U at least having the ability to draw 2 polygons per cycle. It could also be tessellation, but that is unverifiable at the moment.

Another thing I would like to explore is the Broadway CPU family's ability to calculate vertices in conjunction with the GPU.

Well said. I personally enjoy your posts and perspective, since you do have knowledge behind the curtain. Most of the level-headed discussion that made this thread excellent in the beginning has unfortunately disappeared.

On the flip side, I do feel some people are too "extreme" on the other side of the Wii U debate here and do turn down krixx's points of value where they emerge. He is interested in discussion, though I wish he was more open to other perspectives rather than writing them off.

(Continues to back away slowly)

I'm open to all perspectives, so long as they are grounded and substantiated with corroborating media, verifiable professional claims, or documentation (which is rarely the case in recent times).
 
I think the biggest issue won't be the lack of memory but rather the lack of bandwidth. Because of this I doubt you'll get anything like like-for-like next gen ports; you might get a similar-ish downport, but it all depends on what is stressed in which system.

The same example applies: multiplat games have lived on the PC and the HD twins, with the PC having more bandwidth. Maybe some games will not scale back well, but for most, if the dev is willing to put in the effort, IMO it is possible.
 
Oh, for sure. But that doesn't tell us whether it's better, worse, equivalent, etc. AMD and Nvidia are good examples again, since they both came up with memory compression independently but ended up near the same ratios. Now we don't even think about it, since the effective memory use by both ends up nearly the same under the same loads. Nintendo may have made a different solution, but radically better than two competing GPU companies? I suspect not; I suspect they're pretty close to the same.

I highly doubt that Nintendo made a better solution, but Nintendo's R&D department would definitely strike a contract with a company specializing in hardware compression technology. I mean, who else would have chosen MoSys 1T-SRAM for embedded memory? I believe the GC's texture compression tech, S3TC, is 8:1, but it's not AMD (or, at the time, ATI) tech.
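
For reference, the ratio falls straight out of the block layout: S3TC/DXT1 (which, as far as I know, is essentially what the GC's CMPR format is) packs a 4x4 pixel block into 8 bytes, so it works out to 6:1 against 24-bit RGB and 8:1 against 32-bit RGBA. Quick check:

BLOCK_PIXELS = 4 * 4
# DXT1: two 16-bit endpoint colours + 2 index bits per pixel = 8 bytes per block
dxt1_block_bytes = 2 * 2 + (BLOCK_PIXELS * 2) // 8

rgb888_bytes = BLOCK_PIXELS * 3     # uncompressed 24-bit RGB
rgba8888_bytes = BLOCK_PIXELS * 4   # uncompressed 32-bit RGBA

print(f"DXT1 block: {dxt1_block_bytes} bytes")
print(f"vs RGB888:   {rgb888_bytes / dxt1_block_bytes:.0f}:1")
print(f"vs RGBA8888: {rgba8888_bytes / dxt1_block_bytes:.0f}:1")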
 
I'm open to all perspectives, so long as they are grounded and substantiated with correlating media, verifiable professional claims, or documentation(which is rarely the case in recent times).

Unfortunately, the hardware is still under extremely strict NDA and I don't see that going away any time soon.

You might get the occasional folk willing to drop a hint here and there, but not if the discussion is a mess of extreme viewpoints (and not if they'll get yelled out of the room, as per the past). ;)
 
That sounds awfully like texture compression ...

Why would an established developer point out such a feature if the Gamecube, the Wii, and the 360 used texture compression?

Nintendo really placed their bets on eDRAM, and what we don't know is what technologies were developed or brought in to support that.
 
You won't get an 800MHz GPU in the WiiU's case, but a 550MHz GPU in a case smaller than the X360 Slim at 40nm is nothing to sneeze at. Volts x Amps = Watts, so an increase in voltage to the GPU for an increased clock rate would increase the watts used, but shaders, TMUs, CUs, ROPs and memory have their own separate set of parameters that affect performance. They do add to power requirements.

Oh, I'm not saying it's bad for what it is. I assume that an increase in shader units etc. would also require more die space, upping power draw and heat output.

Nintendo deliberately designed an HD home console to be as small and silent as possible, and they succeeded.
 
M°°nblade said:
Oh, I do acknowledge the fact that there's a difference in experience between Wii U and PS360 hardware.
But I think the significance of that was greatly exaggerated when I see how, almost a year on and with developers using final dev kits, the Wii U versions of multiplatform games have barely improved relative to their PS360 versions.
You have people in this thread saying how they believe the Wii U is definitely more than twice as powerful as PS360 hardware, yet the releases it gets, one year after launch, are still on par or marginally better. I know nothing of game development, but I doubt optimising your engine for the Wii U will someday give you a 100% framerate boost.

Multiplatform games are not the best tool to judge the maximum power or capabilities of a console, but they don't have to be. It doesn't disqualify them at all as a meaningful way to compare hardware. Comparing how the same game looks during the exact same scene, and at how many frames per second it runs on different hardware, tells you something about performance relative to each other. Battlefield 3 and Bioshock Infinite don't exactly show the maximum power of a 7990 or a Titan either. But these games are used in benchmarks because they do tell something about the difference between graphics cards. Based on these benchmarks it's safe to say they are more powerful than, e.g., an AMD HD 6950.

Comparing different games (and this includes exclusives) is entirely pointless. Sure, Halo 4 tells more of the maximum power of the XBox360 than a multiplatform title. But what can Halo 4 versus Uncharted 3 tell you about the hardware difference between the Xbox 360 and the PS3? Are you comparing art design or polygons on screen? Nobody knows because you can't measure them or compare them directly. It's all subjective.

Thanks for the reply. I understand better what you mean about using multiplatform titles as a reference, but I still think it's disingenuous to do so, for a couple of reasons. Please allow me to explain.

First off, you say you acknowledge the difference in experience between Wii U and PS360 hardware. In doing so we are looking at two different architectural methodologies, GPU-centric vs CPU-centric respectively. That might not be a problem since they can accomplish a similar end, but it leads me to the second point of why it may not be an apples-to-apples comparison.

Consider this snippet of an excellent post a couple pages back comparing GC and Xbox
I definitely wouldn't say Xbox was the most powerful; that was the notion back then (helped by Microsoft itself, as it has been said that they gave developers incentives just so their version of a multiplatform game was better; which would make sense, otherwise why would they bother?)

Combine the architectural differences (which likely result in greater effort for porting) with the comparatively minuscule Wii U install base and the arguably lower buying power of the Wii U's demographic, and I can't think of a single reason why devs would do anything above and beyond for a multiplatform Wii U release. As I quoted above, in the past it was the same with Xbox. The largest incentive (even if that incentive was Microsoft paying out) got the prettiest version of a multiplatform game. Critical to my point here, the prettiest version was NOT on the most powerful hardware.

Finally, as to comparing Halo 4 to Uncharted 3, I agree it would be meaningless. As you say, different design etc. But that's not a good analogy. We were originally comparing Bayo 1 to Bayo 2: similar design, art style, developer; not sure what engine it is on, but they may share a lot there too.

Now I don't mean to come across as pugnacious here or anything, but I'd say while Bayo 1 to Bayo 2 may not be a perfect comparison, for the reasons I mention above it's probably better than a PS360 multiplat or port. I still think the question is deciding to what extent Platinum can take knowledge of PS360 and Bayo 1 and apply it to Wii U Bayo 2. Sort of to your point, I still would be curious to see a PS4/XBone multiplat (without a PS3/360 version) ported to Wii U. I reckon the similar feature sets and architectures would result in a prettier Wii U version than if the game had originally targeted PS360. How do you feel about that hypothetical scenario?
 
I think they did more than succeed... They nailed it. I know the Wii U isn't what some people wanted in a next gen console, but for what it was designed to do they hit a home run.

Well, I would argue that having as small a console as possible isn't all that useful by itself. Having one that is too big can of course be a liability, but as long as it fits under most people's home theater setups it really does not matter.

The small size also limits total hardware power, giving us something that by most accounts is somewhat more powerful than a current-gen console, in a noticeably smaller box.

Of course if you for whatever reason need it to be very small and silent then it did hit the mark with you.
 
Totally agree with you. Go all the way or don't go at all. Funny to me, the recent indie game confirmed to use DX11-type "effects" for Wii U... the developers said it was worth the "EFFORT" on Wii U. People seem to think all developers are created equal; they are not. Some are passionate enough to put their all into a game and some aren't. I believe Nintendo developed this console as a middle-of-the-road console, but powerful and feature-rich enough to get next gen ports. The reality is Nintendo was aiming to make sure what happened last gen (Wii vs PS360) doesn't happen this one, and it shouldn't, as long as developers give a damn, which is not looking good. The console is modern enough to have multiplats with XB1 and PS4.

Many big "AAA" western devs simply don't want to work on lower end hardware, however. Their publishers seem to be taking the same stance, unfortunately, for (sometimes) business reasons.

Waaghals said:
Of course if you for whatever reason need it to be very small and silent then it did hit the mark with you.

Here on this forum we tend to be very myopic in this regard. You and I might not care about such things (and believe me, I have a 750W dual-GPU high-end gaming PC sitting in my home theatre).

But to some people, a small diminutive and quiet console is a very good thing. A lot of the mass market doesn't like jet airplanes in their living room. A lot of Japan tends to agree with this way of thinking. Certainly it's important to Nintendo as well. There will probably be future reasons why they like smaller hardware in their consoles as well.
 
Why would an established developer point out such a feature if the Gamecube, the Wii, and the 360 used texture compression?

Nintendo really placed their bets on eDRAM, and what we don't know is what technologies were developed or brought in to support that.

Because the Wii U supports formats that neither the Wii, the GameCube nor the XBOX360 supported, that's why. The Wii U is leagues ahead of the GameCube and Wii in this regard, and ahead of the XBOX360 too, though not by as much.
 
Because the Wii U supports formats that neither the Wii, the GameCube nor the XBOX360 supported, that's why. The Wii U is leagues ahead of the GameCube and Wii in this regard, and ahead of the XBOX360 too, though not by as much.

So what format is that, in regards to compression?
You think Nintendo opted for something else besides S3 Texture Compression (S3TC)?
 
So what format is that, in regards to compression?
You think Nintendo opted for something else besides S3 Texture Compression (S3TC)?

Probably the newer BCn formats, which are standard on all ATI/NV (even Intel?) cards these days; they allow for much higher texture compression and better quality than the older formats the Wii / Gamecube supported.
 

And Chess too! don't forget about the 1080p/60fps Chess game...

... anyway, this thread has been very silly for a while. Unless techy folk like Blu, Thraktor etc. have anything of relevance to the thread's title to post, I'm not sure it's worth posting anything at all.

All I've read for the last several days is mostly a childish to-and-fro 'debate' based on fabricated lists/screenshots/youtube videos/wild assertions... i.e. bloody nonsense.

I didn't chuck $50 into this project to read juvenilia.

JB
 
Haven't been part of this discussion, but here's my two cents.

I don't blame Nintendo for trying to make a small and quiet console, but I figure they sacrificed just a little too much for this goal. I tend to figure that the Wii U needed to at least be able to match PS4/X1 visuals at 720p. This is just a guestimate, but they probably could have done this had they been willing to go up to 100 watts or so. It would've been a bit larger and louder, but probably still would've been the smallest and quietest of the three machines.

Granted, it might've cost a bit more if they went this route, and there's no guarantee a bit of extra horsepower would've made a difference in how things have played out.
 
But to some people, a small diminutive and quiet console is a very good thing. A lot of the mass market doesn't like jet airplanes in their living room. A lot of Japan tends to agree with this way of thinking. Certainly it's important to Nintendo as well. There will probably be future reasons why they like smaller hardware in their consoles as well.


As I said, there are limits to how big and noisy a console can get and still see widespread acceptance. I'm just not sure the limit sits at a 40W featherweight like the WiiU.

At no point did I ever insinuate that a massive gaming pc would work as a home console; too big and too noisy.

Both the X360 and the PS3 sold pretty damn well while being big (and in the 360's case, noisy). The sound-wary mass market hasn't exactly clutched the WiiU to its chest.

So my argument (without derailing this thread too much) is that there is a point of diminishing returns for size and noise when it comes to a gaming console, (with the possibility of a bonus if the box is so unbelievably small and slick as to make it all seem like magic).
 
I tend to figure that the Wii U needed to at least be able to match PS4/X1 visuals at 720p. This is just a guestimate, but they probably could have done this had they been willing to go up to 100 watts or so. It would've been a bit larger and louder, but probably still would've been the smallest and quietest of the three machines.

Granted, it might've cost a bit more if they went this route, and there's no guarantee a bit of extra horsepower would've made a difference in how things have played out.

1) A lot of the PS4One's more demanding games are going to be 720p throughout the generation (and certainly many of them less than 1080, even at launch)
2) Going up to 100w would put it as high as the PS4One
3) It would've been a LOT larger/louder.
4) It would've been more expensive than the PS4One, and Nintendo doesn't traditionally eat $100 or $200 per console. Hence it would be bombing even worse than it is now.

People don't seem to look past their own personal and (again) myopic reasons for wanting Nintendo to make an identical console. Sony and MS have other revenue sources and have less of a problem eating costs. Nintendo only has gaming. They can't put out a $500 identi-box. And they can't sell it for $350 to make it even remotely palatable for their general audience. Core gamers would never buy a Nintendo box to play Call of Duty with their friends, and nobody wins this way.

Yes, a good chunk of the cost was the controller, but Nintendo always tries new ideas in this regard, whether it succeeds or fails. Many of their successes end up in their competitors' products in one way or another. Hell, even with this sales dud in the Wii U, "second screen gaming" is going to be a big thing this generation. Everyone's gearing up for it. If they put out a console with a bog-standard dual analog pad and no new ideas, it wouldn't really be standard Nintendo practice, Gamecube notwithstanding.

Waaghals said:
Both the X360 and the PS3 sold pretty damn well while being big (and in the 360's case, noisy).

A case of mistaken memory. The PS3 and 360 sold awfully for a very long time after launch. They were hardcore-only platforms for the beginning of their lives and their sales reflected that. Being expensive, large and loud monsters doesn't win you points with the wives, as an example. Luckily, they both released smaller, cheaper and quieter units (with some mass market software/hardware), not coincidentally, before their sales started being not so shit.

But these discussions have already been made many times...
 
1) A lot of the PS4One's games are going to be 720p
2) Going up to 100w would put it as high as the PS4One
3) It would've been a LOT larger/louder.
4) It would've been more expensive than the PS4One, and Nintendo doesn't traditionally eat $100 or $200 per console. Hence it would be bombing even worse than it is now.

People don't seem to look past their own personal and (again) myopic reasons for wanting Nintendo to make an identical console. Sony and MS have other revenue sources and have less of a problem eating costs. Nintendo only has gaming. They can't put out a $500 identi-box. And they can't sell it for $350 to make it even remotely palatable for their general audience. Core gamers would never buy a Nintendo box to play Call of Duty with their friends, and nobody wins this way.

Also to note when reading this: this was when the Wii dried up and when the Wii U came out, so we're talking about the cost to make it a year ago.
 
They look great, and the game itself looks phenomenal (my favourite artistic direction for the 3D sonics by far). However those are bullshots, unless Sega's found a way to do 1080p/60 with 8xMSAA. Which, obviously, is a suspect expectation.

I thought this Sonic was being developed by a Nintendo dev.
 
A case of mistaken memory. The PS3 and 360 sold awfully for a very long time after launch. They were hardcore-only platforms for the beginning of their lives and their sales reflected that. Being expensive, large and loud monsters doesn't win you points with the wives, as an example. Luckily, they both released smaller, cheaper and quieter units, not coincidentally, before their sales started being not so shit.

Did they sell worse than the WiiU?
Call of Duty took off before either console got a slim/quiet version. The X360 was in fact very noisy at the time, and still did well beyond the people posting on GAF.

I don't disagree that being really noisy hurts your chances with the mass market, but I maintain that it happens far above the WiiU's size/noise level. The X360 was unbelievably noisy.

In addition I am not particularly pleased that you focused only on that single sentence in order to dismiss my entire argument.
 
Did they sell worse than the WiiU?
Call of Duty took off before either console got a slim/quiet version. The X360 was in fact very noisy at the time, and still did well beyond the people posting on GAF.

I don't disagree that being really noisy hurts your chances with the mass market, but I maintain that it happens far above the WiiU's size/noise level. The X360 was unbelievably noisy.

In addition I am not particularly pleased that you focused only on that single sentence in order to dismiss my entire argument.

I focused on that sentence because I didn't disagree with your points. The Wii U is definitely well clear of "this is annoyingly large/loud", by a large degree. But that was Nintendo's priority in making it (small, quiet, etc). The 360 sold "ok" while it was still loud and noisy, but it didn't really take off until the slim/Kinect.
 
I focused on that sentence because I didn't disagree with your points. The Wii U is definitely well clear of "this is annoyingly large/loud", by a large degree. But that was Nintendo's priority in making it (small, quiet, etc). The 360 sold "ok" while it was still loud and noisy, but it didn't really take off until the slim/Kinect.

Thanks for the clarification.

Then we more or less agree, maybe I did not express myself clearly enough.
 
I focused on that sentence because I didn't disagree with your points. The Wii U is definitely well clear of "this is annoyingly large/loud", by a large degree. But that was Nintendo's priority in making it (small, quiet, etc). The 360 sold "ok" while it was still loud and noisy, but it didn't really take off until the slim/Kinect.

The Wii and Wii U are small so that they can sit alongside your PS3/4 or XB360/1; a complementary console offering something different. Or to sit under your TV and not look overpowering as it is used for casual gaming.
 
That sounds awfully like texture compression ...

Yup, but you've got to remember that Nintendo must have a dog's bollocks compression algorithm going by what developers managed to squeeze into 40MB for WiiWare games last gen. I don't think that the developer would have mentioned it if it was just your bog standard texture compression found with your average GPU.

One thing that people might also remember is Ancel praising the RAM setup, he said that the Wii U had almost unlimited RAM if I remember correctly, so that combined with the above leads me to believe that Nintendo have something going on (or maybe plural somethings) to stop what was then believed to be a 2GB difference in RAM for the PS4 and a 6GB difference in RAM for the One being a huge problem for developers porting between the different machines.

I did toy with the idea of Nintendo using onboard flash as swapspace which would explain where some of the 5GB of flash disappeared to after that first update but can't see that happening because a) it wouldn't be fast enough to be of much use and b) constant writing, deleting and writing would degrade the flash too quickly, give it a year or two and it would be pretty much buggered.

I know everyone rolls their eyes whenever someone mentions 'secret sauce' but there has to be something going on with the GPU that we're not aware of because Bayonetta 2 particularly the Gomorrah boss fight, shouldn't be possible on a GPU pushing the lower amount of flops that some people are attributing to it.
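
On the swap-wear point, a rough sanity check; the endurance rating, reserved capacity and paging traffic below are all assumptions pulled out of the air for illustration:

RESERVED_GB = 5                 # hypothetical swap partition
PE_CYCLES = 3000                # assumed consumer MLC endurance rating
SWAP_TRAFFIC_GB_PER_HOUR = 10   # assumed paging writes under memory pressure
HOURS_PER_DAY = 3               # assumed daily play time

total_writable_gb = RESERVED_GB * PE_CYCLES
days = total_writable_gb / (SWAP_TRAFFIC_GB_PER_HOUR * HOURS_PER_DAY)
print(f"~{days:.0f} days (~{days / 365:.1f} years) to burn through the rated P/E cycles")

Which, under those made-up numbers, lands right around the "year or two" mentioned above, so the objection holds up.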
 
Yup, but you've got to remember that Nintendo must have a dog's bollocks compression algorithm going by what developers managed to squeeze into 40MB for WiiWare games last gen. I don't think that the developer would have mentioned it if it was just your bog standard texture compression found with your average GPU.

One thing that people might also remember is Ancel praising the RAM setup, he said that the Wii U had almost unlimited RAM if I remember correctly, so that combined with the above leads me to believe that Nintendo have something going on (or maybe plural somethings) to stop what was then believed to be a 2GB difference in RAM for the PS4 and a 6GB difference in RAM for the One being a huge problem for developers porting between the different machines.

I did toy with the idea of Nintendo using onboard flash as swapspace which would explain where some of the 5GB of flash disappeared to after that first update but can't see that happening because a) it wouldn't be fast enough to be of much use and b) constant writing, deleting and writing would degrade the flash too quickly, give it a year or two and it would be pretty much buggered.

I know everyone rolls their eyes whenever someone mentions 'secret sauce' but there has to be something going on with the GPU that we're not aware of because Bayonetta 2 particularly the Gomorrah boss fight, shouldn't be possible on a GPU pushing the lower amount of flops that some people are attributing to it.

A lot more than that wouldn't be possible at those numbers with the other specs backing it, and the devs have spoken only praise for the Wii U memory. I've yet to see any developer here complain about it.

I think the poorly written OS is the reason for the huge amount of internal storage being eaten. Using my knowledge of programming, they probably developed all of the facets of the OS independently, as opposed to conjunctively, and combined them all at the release date, which is a terrible idea. Every time a new program is clicked, it must load up from start to finish and completely close out the previous one. This would explain the long load times at launch as well. The system seems to be bottlenecked on the software side of things. I'd imagine that the OS footprint will drop with the next major performance update.

This is why I don't put much stock in claims that limit the Wii U's capabilities. I doubt that it's even possible to reach the hardware's limit with the current firmware. Let the Wii U firmware hit 4.0 or 5.0 and then we will be able to see all that it has to offer. The subsystems are too inefficient now.

Take, for example, the info we acquired that the CPU doesn't auto-delegate tasks to idle cores (which was backed by the Project C.A.R.S. dev logs and one of Criterion's comments). I'd imagine the GPU has some similar firmware issues.
 
Yup, but you've got to remember that Nintendo must have a dog's bollocks compression algorithm going by what developers managed to squeeze into 40MB for WiiWare games last gen. I don't think that the developer would have mentioned it if it was just your bog standard texture compression found with your average GPU.

One thing that people might also remember is Ancel praising the RAM setup, he said that the Wii U had almost unlimited RAM if I remember correctly, so that combined with the above leads me to believe that Nintendo have something going on (or maybe plural somethings) to stop what was then believed to be a 2GB difference in RAM for the PS4 and a 6GB difference in RAM for the One being a huge problem for developers porting between the different machines.

I did toy with the idea of Nintendo using onboard flash as swapspace which would explain where some of the 5GB of flash disappeared to after that first update but can't see that happening because a) it wouldn't be fast enough to be of much use and b) constant writing, deleting and writing would degrade the flash too quickly, give it a year or two and it would be pretty much buggered.

I know everyone rolls their eyes whenever someone mentions 'secret sauce' but there has to be something going on with the GPU that we're not aware of because Bayonetta 2 particularly the Gomorrah boss fight, shouldn't be possible on a GPU pushing the lower amount of flops that some people are attributing to it.

My thoughts lead me to this: the point of texture compression is to save bandwidth, which affects performance. Now, the speed at which decompression happens is important, because if it's slow then you get texture pop-in or you'll see textures slowly being rendered. The eDRAM bandwidth is important under that circumstance; it seems logical to me that if you want your GPU fed fast, you need very high-speed bandwidth.

There's a chance WiiU has texture compression similar to ASTC.

https://en.m.wikipedia.org/wiki/Adaptive_Scalable_Texture_Compression
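
The bandwidth argument is easy to put rough numbers on. The texel fetch rate and format list below are assumptions for illustration only (and a real GPU's texture caches mean only a fraction of this ever hits external memory), not measured Wii U figures:

TEXEL_FETCHES_PER_FRAME = 50e6   # assumed
FPS = 60

def fetch_gb_per_s(bits_per_texel):
    return TEXEL_FETCHES_PER_FRAME * FPS * bits_per_texel / 8 / 1e9

formats = {
    "RGBA8 (uncompressed)":    32,
    "DXT1 / BC1":               4,
    "DXT5 / BC3":               8,
    "ASTC 6x6 (for contrast)":  128 / 36,   # 128-bit block covering 6x6 texels
}
for name, bpp in formats.items():
    print(f"{name:24s} {fetch_gb_per_s(bpp):5.2f} GB/s of texel data")

The compressed formats cut the fetch footprint by 4x-9x, which is exactly why the format choice matters when the main memory bandwidth is the tight resource.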
 
1) A lot of the PS4One's more demanding games are going to be 720p throughout the generation (and certainly many of them less than 1080, even at launch)
2) Going up to 100w would put it as high as the PS4One
3) It would've been a LOT larger/louder.
4) It would've been more expensive than the PS4One, and Nintendo doesn't traditionally eat $100 or $200 per console. Hence it would be bombing even worse than it is now.

People don't seem to look past their own personal and (again) myopic reasons for wanting Nintendo to make an identical console. Sony and MS have other revenue sources and have less of a problem eating costs. Nintendo only has gaming. They can't put out a $500 identi-box. And they can't sell it for $350 to make it even remotely palatable for their general audience. Core gamers would never buy a Nintendo box to play Call of Duty with their friends, and nobody wins this way.

Yes, a good chunk of the cost was the controller, but Nintendo always tries new ideas in this regard, whether it succeeds or fails. Many of their successes end up in their competitors' products in one way or another. Hell, even with this sales dud in the Wii U, "second screen gaming" is going to be a big thing this generation. Everyone's gearing up for it. If they put out a console with a bog-standard dual analog pad and no new ideas, it wouldn't really be standard Nintendo practice, Gamecube notwithstanding.



A case of mistaken memory. The PS3 and 360 sold awfully for a very long time after launch. They were hardcore-only platforms for the beginning of their lives and their sales reflected that. Being expensive, large and loud monsters doesn't win you points with the wives, as an example. Luckily, they both released smaller, cheaper and quieter units (with some mass market software/hardware), not coincidentally, before their sales started being not so shit.

But these discussions have already been made many times...

1: I've heard of no launch titles so far that are going to be running at 720p. Eventually there will be, yes; however, I was referring to matching 1080p launch games at 720p.

2: From what I've read of the specs, no, the PS4 isn't going to be a hundred watt machine. It'll certainly have considerably lower wattage than the launch PS3/360, but that much horsepower in a hundred-watt package would be an engineering marvel.

3: Larger and louder, sure. It wouldn't have been as large or as loud as the PS4/X1 however, which themselves won't be as loud as launch PS360's.

4: Cost is a legitimate issue, but it's one Nintendo has to find a way to deal with.

The fact of the matter is, you cannot expect multi-platform support for a console that isn't capable of playing the same games with only fairly minimal work. The Wii U is a very interesting design, but even with a ton of optimizing, you couldn't expect many PS4/X1 games to run on it. While it's true that Nintendo doesn't need to rely as heavily on 3rd party support as Microsoft or Sony, I think they're past the point where they alone can guarantee the success of a platform.

Basically, Nintendo needs to do what's necessary to secure 3rd party support in the future. If that means making their consoles a bit more like their competitors, then that's what needs to happen.
 
I don't know it is like people already forgot how the HD twins shared multiplats with PCs with 8GB+ memory. Crysis is mostly a PC franchise and it was made for the 512MB HD twins. I am sure there are a lot of ways around the RAM difference, most devs should be experts by now going by the long running HD twins. I am not implying it will look the same, simply that the memory difference does not seem to be a barrier for Wii U next gen multiplats. Also the Wii U has more memory than the HD twins and could free up more in the future.

I was not implying it was secret sauce, just pointing at the dev comment and acknowledging the feature.

A couple of things of note:

PC games are almost always 32-bit applications which have a hard limit of ~2gb of memory usage. DX10/11 reduce memory usage significantly (compared to DX9) to the point where a game like Crysis would be bumping against the 2gb ceiling in DX9 mode vs. ~1.3gb in DX10 mode with everything maxed out. Even where the PCs have 8gb of RAM in them, the games themselves haven't been using that much memory. The more realistic comparison is of ~450mb to ~2gb of memory while disregarding overheads.

It otherwise just depends on how the game allocates its memory. A game that allocates 512mb to gameworld content and 4.5gb to textures could probably be cut down for the Wii U while a game that allocates 3gb to the gameworld and 2gb to textures would most likely not. The former might be a Call of Duty while the latter could be a Grand Theft Auto.

Regarding texture compression, it has been a requirement for DirectX compliancy since DirectX 6 or 7.

Edit: I also highly doubt that the Latte is a 176gflop part. I've seen at least one case where a Wii U version runs at 720p while both 360/PS3 variants are sub-HD. 352gflops seems like a shoo-in for an HD4600-class part.
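
That allocation argument can be framed as a simple budget check. The category splits below are invented examples in the spirit of the post (as is the ~1GB "available to games" figure people usually cite for the Wii U), not real game budgets:

WIIU_GAME_RAM_GB = 1.0    # roughly the commonly cited app-available figure

def fits_after_cuts(world_gb, textures_gb, texture_scale=0.1):
    # Textures can usually be shrunk a lot (smaller mips, stronger compression);
    # gameworld/simulation data mostly cannot. texture_scale is an assumption.
    total = world_gb + textures_gb * texture_scale
    return total, total <= WIIU_GAME_RAM_GB

for name, world, textures in [("corridor shooter", 0.5, 4.5), ("open-world game", 3.0, 2.0)]:
    total, ok = fits_after_cuts(world, textures)
    print(f"{name}: {total:.2f} GB after cuts -> {'plausible' if ok else 'not happening'}")

Same conclusion as above: the texture-heavy game can plausibly be squeezed down, the gameworld-heavy one cannot.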
 
What makes you think it's a 4600 part..? Nothing from the die shot represents that possibility if I remember correctly..?
 