Wii U Speculation Thread of Brains Beware: Wii U Re-Unveiling At E3 2012

:lol

Sorry, but there's not even remotely a chance of those happening.

Hooray, spec discussion!

I'll admit I chose the 700MHz largely because it's a simple 1/4 of the CPU clock speed, but I wouldn't say it's completely outside of the realm of possibility. The R700 line that the Wii U's GPU is apparently based on topped out at 850MHz, and a possible die-shrink combined with the fact that the Wii U is in a considerably larger casing than the Wii (i.e. more heat to dissipate) would indicate that 700MHz is in the range of possible values there.

The 1T-SRAM was a bit of a guess, but Nintendo have a penchant for fast RAM, and with Nintendo being Mosys's biggest customer, only Nintendo and Mosys themselves really know how much 512MB of 1T-SRAM would cost to produce at those quantities these days. That said, I'd say the total RAM will be 1.5GB, perhaps with a smaller amount of 1T-SRAM.
 
Wait. Your game of the year is decided by how the graphics look, not how the game plays?
What a time we live in.
Dude, did you see that footage? It looks exceedingly amazing, and the monsters and environment look exceedingly charming. It adds a lot to a game like that. And a game with potential to be old-school arcadey fun that also looks that amazing? How could it not shoot up when most of what we get these days is the same shit over and over?
 
Hmm, here's a "making of" trailer for the killer freaks trailer. Pretty interesting stuff, though it tells us nothing about the game itself.

So are the images from sec5 to sec15 what the game is supposed to look like (target renders)? I can't remember the original footage showing anything cel-shaded.
 
DQ X better have a massive graphics upgrade on the Wii-U

Won't happen.
FFXI, to this day, is 100% identical to the PS2 version on PC and 360 in all but render resolution.
 
What are you talking about? It's already been stated it looks better.

Doesn't conflict with what he said. Dolphin games look vastly superior to their native counterparts.

I'm still expecting more than a resolution bump, though.
 
Doesn't conflict with what he said. Dolphin games look vastly superior to their native counterparts.

I'm still expecting more than a resolution bump, though.

I'm not talking about a resolution bump:

The Wii U version of Dragon Quest X will have better graphics. Both versions - Wii and Wii U - will be compatible with each other.

The Wii U version is said to be graphically superior, though both versions can work together.

Saito stated that “The Wii version is totally fun! The Wii U version is totally fun! And it’s also extremely beautiful.”

These statements don't sound like this title will have a simple resolution enhancement over the Wii.
 
I'll admit I chose the 700MHz largely because it's a simple 1/4 of the CPU clock speed, but I wouldn't say it's completely outside of the realm of possibility. The R700 line that the Wii U's GPU is apparently based on topped out at 850MHz, and a possible die-shrink combined with the fact that the Wii U is in a considerably larger casing than the Wii (i.e. more heat to dissipate) would indicate that 700MHz is in the range of possible values there.
I think the Wii U GPU will clock 600-750 MHz. It's more probable that Nintendo will choose to up the clock speed to increase performance instead of shipping with a larger chip.

Wikipedia lists AMD's 7750M as a 768 SPU/48 TMU/16 ROP GPU at 650 MHz on a 28 nm node. It has a TDP of 36 watts. I'm just mentioning that here as it indicates what AMD can currently get done within a power envelope that seems suitable to the Wii U. A chip like that is also quite comparable to the RV770 line, although this chip seems a bit more powerful than the RV770LE. I expect the Wii U GPU to be slightly less well equipped than the 7750M (probably 512 SPU/32 TMU/16 ROP) and possibly made on 32 nm instead of 28 nm.
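
Quick back-of-the-envelope comparison, just my own numbers assuming the usual 2 FLOPs per shader unit per clock:

def peak_gflops(spus, mhz):
    return spus * 2 * mhz / 1000.0   # one multiply-add = 2 FLOPs per SP per clock

print(peak_gflops(768, 650))   # ~998 GFLOPS for the 7750M as listed on Wikipedia
print(peak_gflops(512, 650))   # ~666 GFLOPS for the speculated 512 SPU part
print(peak_gflops(240, 500))   # ~240 GFLOPS for Xenos (360), for scale

So even the cut-down guess would be roughly 2.5-3x the 360 on paper.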

The 1T-SRAM was a bit of a guess, but Nintendo have a penchant for fast RAM, and with Nintendo being Mosys's biggest customer, only Nintendo and Mosys themselves really know how much 512MB of 1T-SRAM would cost to produce at those quantities these days. That said, I'd say the total RAM will be 1.5GB, perhaps with a smaller amount of 1T-SRAM.
If there's any 1T-SRAM in there at all (which is doubtful), they can do about 330 MB of the stuff for the same chip size as in the GameCube. So if there's anything in there it's more likely to be 192 or 256MB. If Nintendo goes that route, they need to find a way to solve the problems the PS3 has with 256MB of non-video RAM (it's something Bethesda couldn't handle, and therefore PS3 Skyrim is shit).
 
Here's a thought: The restriction on the number of controllers the Wii U will be able to support is due to technical limitations of the streaming tech. Streaming an 800x480 image, lag-free, to a moving controller with perfect reliability is a pretty tricky technical problem. Doing the same for two controllers is a very difficult problem, while three and four controllers could well enter the realm of physical impossibility.

So, why not just use wireless for the first Wii U pad, and limit further controllers to wired operation? Put three Wii U pad ports on the console, each outputting video and power to each controller and receiving control data from them. This has the added advantage of making additional controllers cheaper to make, as they don't need either the streaming tech or internal batteries. Bundled Wii U Pads would have cables to hook up to the Wii U ports, both for charging and so that you can bring your Wii U pad over to a friend's house for multiplayer.

It would certainly be a little awkward for secondary controllers to have to be wired, but compared to the alternative of not supporting more than one, or possibly two, controllers at all, it wouldn't be a bad option.
 
Streaming tech is only part of the problem. 4x 854x480 is lower than 1x 1920x1080, and standards already exist that transmit that resolution just fine, in more complex settings than the Upad.
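
A quick pixel count to illustrate (just my arithmetic):

upad = 854 * 480      # 409,920 pixels per controller frame
hdtv = 1920 * 1080    # 2,073,600 pixels per 1080p frame
print(4 * upad)                 # 1,639,680 pixels for four Upads combined
print(4 * upad / float(hdtv))   # ~0.79, still about 21% fewer pixels than one 1080p frame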

The thing is that Nintendo probably doesn't intend to sell/market the Upad separately as it is probably too expensive to be successful for a second controller. In that case it's better to not market it at all (and only sell/provide them to owners whose Upad has broken) so that game designers don't build games around multiple Upads. Furthermore, having to draw one HD framebuffer and multiple SD framebuffers on the same GPU may also degrade performance too much.

Limiting the number of Upads that the console needs to provide images to also simplifies the streaming technology, and could bring down costs. Streaming so many pixels is a solved problem, however.
 
Here's a thought: The restriction on the number of controllers the Wii U will be able to support is due to technical limitations of the streaming tech. Streaming an 800x480 image, lag-free, to a moving controller with perfect reliability is a pretty tricky technical problem. Doing the same for two controllers is a very difficult problem, while three and four controllers could well enter the realm of physical impossibility.

So, why not just use wireless for the first Wii U pad, and limit further controllers to wired operation? Put three Wii U pad ports on the console, each outputting video and power to each controller and receiving control data from them. This has the added advantage of making additional controllers cheaper to make, as they don't need either the streaming tech or internal batteries. Bundled Wii U Pads would have cables to hook up to the Wii U ports, both for charging and so that you can bring your Wii U pad over to a friend's house for multiplayer.

It would certainly be a little awkward for secondary controllers to have to be wired, but compared to the alternative of not supporting more than one, or possibly two, controllers at all, it wouldn't be a bad option.

I really like this solution, but people are going to hate wires making a comeback.

I think Nintendo should have gone with a stronger CPU/GPU or a chip that is dedicated to streaming 4 uPads.

What if they could work it out so that each uPad can stream different things, even play different games that are stored on the console at the same time?

A family with 4 kids could buy one console and buy 3 extra uPads.

So 4 kids playing different games being streamed by the hardware while mom and dad watch TV.

When mom and dad leave the room, the kids take over the TV for a 4-player Mario Kart / Smash Bros battle.

This tech may be costly, but this is the direction streaming to multiple screens is headed.
Like the Wiimote, Nintendo may have come up with it, but someone else can easily outmatch their efforts because they had to be conservative to keep the price low.

Sony or Microsoft can easily take this idea and run with it.
 
Streaming tech is only part of the problem. 4x 854x480 is lower than 1x 1920x1080, and standards already exist that transmit that resolution just fine, in more complex settings than the Upad.

The thing is that Nintendo probably doesn't intend to sell/market the Upad separately as it is probably too expensive to be successful for a second controller. In that case it's better to not market it at all (and only sell/provide them to owners whose Upad has broken) so that game designers don't build games around multiple Upads. Furthermore, having to draw one HD framebuffer and multiple SD framebuffers on the same GPU may also degrade performance too much.

Limiting the number of Upads that the console needs to provide images to also simplifies the streaming technology, and could bring down costs. Streaming so many pixels is a solved problem, however.

If they can get this tech right and each uPad can play a different game or use a different app at the same time from one WiiU box, they could sell the uPad at $99-$129.
 
Well there is still the I/O controller, hardware codec, and a dedicated module for the wireless stream (the latter two according to the patent). So that should take a good deal of the burden off of what would otherwise have been on the CPU and/or GPU.
 
I think the Wii U GPU will clock 600-750 MHz. It's more probable that Nintendo will choose to up the clock speed to increase performance instead of shipping with a larger chip.

Wikipedia lists AMD's 7750M as a 768 SPU/48 TMU/16 ROP GPU at 650 MHz on a 28 nm node. It has a TDP of 36 watts. I'm just mentioning that here as it indicates what AMD can currently get done within a power envelope that seems suitable to the Wii U. A chip like that is also quite comparable to the RV770 line, although this chip seems a bit more powerful than the RV770LE. I expect the Wii U GPU to be slightly less well equipped than the 7750M (probably 512 SPU/32 TMU/16 ROP) and possibly made on 32 nm instead of 28 nm.

I definitely expect something around these specs, probably more likely around 600 MHz than 700 MHz. Together with the 32MB of eDRAM/1T-SRAM bgassassin mentioned a number of times I think this would be quite powerful and probably closer to the next Xbox than to PS360.

If there's any 1T-SRAM in there at all (which is doubtful), they can do about 330 MB of the stuff for the same chip size as in the GameCube. So if there's anything in there it's more likely to be 192 or 256MB. If Nintendo goes that route, they need to find a way to solve the problems the PS3 has with 256MB of non-video RAM (it's something Bethesda couldn't handle, and therefore PS3 Skyrim is shit).

I don't believe in such high amounts of 1T-SRAM; GDDR5 would probably be much cheaper and has more than enough bandwidth. The 96MB/128MB numbers of the Wiiudaily rumors are the maximum I would expect (and this amount would actually make sense as a framebuffer for deferred rendering).
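
For what it's worth, a rough G-buffer sizing sketch; the layout (four RGBA8 render targets) is purely my assumption:

pixels = 1920 * 1080      # 1080p
bytes_per_pixel = 4 * 4   # four hypothetical RGBA8 render targets
print(pixels * bytes_per_pixel / 2.0**20)        # ~32 MB without MSAA
print(pixels * 4 * bytes_per_pixel / 2.0**20)    # ~127 MB with 4x MSAA

Which lines up roughly with both the 32MB eDRAM figure and the 128MB number from those rumors.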
 
Well there is still the I/O controller, hardware codec, and a dedicated module for the wireless stream (the latter two according to the patent). So that should take a good deal of the burden off of what would otherwise have been on the CPU and/or GPU.
Sure, but the fact remains that rendering on Wii U by its very design requires more CPU/GPU because it has multiple screens to render to. This is pretty much always going to mean the main screen will never be at "full potential".
 
If there's any 1T-SRAM in there at all (which is doubtful), they can do about 330 MB of the stuff for the same chip size as in the GameCube. So if there's anything in there it's more likely to be 192 or 256MB. If Nintendo goes that route, they need to find a way to solve the problems the PS3 has with 256MB of non-video RAM (it's something Bethesda couldn't handle, and therefore PS3 Skyrim is shit).

There definitely will be some 1T-SRAM, as Mosys have confirmed as much. It could simply be 24MB to provide hardware BC with Wii, though, or a similar amount as a framebuffer for the GPU, perhaps. I'm not an expert on GPU technology, but I think it could even be both; serving as VRAM for Wii BC and a framebuffer in Wii U mode.

Also, in my post above, I was assuming that both the GDDR5 and 1T-SRAM were accessible by both the CPU and GPU, so it wouldn't have quite the same limitations as the PS3.

Edit: Actually, just to confirm, where did you get the 330MB from? I did a quick calculation using the assumption that transistor density increases by a factor of 2 every 2 years (don't know if that's actually the case for RAM) and I got an increase of about 45 times from the GC launch to the Wii U launch (2 to the power of 5.5), which would give 1080MB in the same chip size as the GC's 24MB.
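
Just to show my working (a rough sketch, and the doubling-every-two-years assumption may well not hold for this kind of RAM):

years = 2012 - 2001           # GameCube launch to Wii U launch
factor = 2 ** (years / 2.0)   # ~45x density increase
print(24 * factor)            # ~1086 MB (the ~1080MB figure above) in the same die area as the GC's 24MB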

Streaming tech is only part of the problem. 4x 854x480 is lower than 1x 1920x1080, and standards already exist that transmit that resolution just fine, in more complex settings than the Upad.

[...]

Streaming so many pixels is a solved problem, however.

Streaming is far from a solved problem, at least in the form Nintendo needs it. Current wireless streaming tech is designed for transfer between stationary devices, and even then is far from reliable in the real world. Nintendo not only need to stream to a device that will be constantly moving, but the stream needs to be 100% reliable even in worst-case conditions for background interference (i.e. wireless routers, other wireless-enabled games consoles, AppleTVs, etc. right next to the Wii U, and mobile phones, tablets, etc. right beside the controller, all transmitting/receiving on the 2.4/5GHz bands simultaneously). I'm not even talking 99.9% reliability here; 100% is absolutely essential for Nintendo to not have a RRoD-esque debacle on their hands. This is just about doable for a single 854x480 stream (~300Mbps without even including the ample ECC/other overhead needed). It's a big stretch for two screens, and borderline impossible for three or four screens given Nintendo's need for 100% reliability.
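
For reference, the raw figure I'm working from (assuming 24-bit colour at 30fps, before compression or ECC overhead):

width, height, bpp, fps = 854, 480, 24, 30
print(width * height * bpp * fps / 1e6)   # ~295 Mbps raw for one stream; double that at 60fps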

The thing is that Nintendo probably doesn't intend to sell/market the Upad separately as it is probably too expensive to be successful for a second controller. In that case it's better to not market it at all (and only sell/provide them to owners whose Upad has broken) so that game designers don't build games around multiple Upads. Furthermore, having to draw one HD framebuffer and multiple SD framebuffers on the same GPU may also degrade performance too much.

I'd agree with you that Nintendo didn't intend to support more than one Wii U pad, both for technical and cost reasons. This is fairly clear from the fact that all the multiplayer games they showed at E3 last year were asynchronous, with only one Wii U pad. However, there were quite a lot of people (both in the press and fans) asking why they couldn't play with two or more Wii U pads, and Nintendo eventually relented and gave a non-committal "we're looking into it" response. My guess is that Nintendo went back and looked at ways they might be able to support two or more controllers (and sell them at a reasonable price) to prevent large swathes of potential customers thinking to themselves "Wow, that'd be great with Madden for each player to have his own screen!" only to be extremely disappointed when they find out that it can't do that. If they do support two or more controllers, my guess is they either do it wired, or reduce the framerate and/or colour depth of the controllers in multiplayer mode to fit within the available bandwidth of whatever streaming tech they're using.

More screens will require a larger framebuffer, but that's not too difficult to add, particularly as AMD have so much experience in multi-display technology.

I don't believe in such high amounts of 1T-SRAM, GDDR5 would be much cheaper probably and has more than enough bandwidth. The 96MB/128MB numbers of the Wiiudaily rumors are the maximum I would expect (and this amount would actually make sense as framebuffer for deffered rendering).

RAM is another thing I'm not an expert in, but I think the appeal of 1T-SRAM is not just the bandwidth (which is quite high), but also a much lower latency than DRAM. It also has an incredibly stable transfer rate, which makes code easier to optimise.

Sure, but the fact remains that rendering on Wii U by its very design requires more CPU/GPU because it has multiple screens to render to. This is pretty much always going to mean the main screen will never be at "full potential".

Not necessarily. Of course, if you want to pump out separate high quality 3D visuals to the TV and four controllers there's going to be a significant performance hit, but provided the hardware is appropriately designed it should be able to output 2D maps/inventories/etc. to a couple of Wii U pads without a noticeable effect on the main graphical output.
 
There definitely will be some 1T-SRAM, as Mosys have confirmed as much. It could simply be 24MB to provide hardware BC with Wii, though, or a similar amount as a framebuffer for the GPU, perhaps. I'm not an expert on GPU technology, but I think it could even be both; serving as VRAM for Wii BC and a framebuffer in Wii U mode.
There's no need to use 1T-SRAM if they want backwards compatibility, because if they have 32MB of EDRAM they can use that to emulate the low latency of 1T-SRAM as well.

I never found that MoSys rumour too convincing. Could anybody link to the source? lherre felt that it was most probable there was going to be 1 GB in the final unit so that probably meant there wasn't a significant amount of 1T-SRAM (or any other stuff) in the devkit. Wonder if that's going to stay the same.

Also, in my post above, I was assuming that both the GDDR5 and 1T-SRAM were accessible by both the CPU and GPU, so it wouldn't have quite the same limitations as the PS3.
From what I've read, the PS3 CPU can actually access the VRAM. It's just not very fast, but it would be quite an expense to make it work fast.

Streaming is far from a solved problem, at least in the form Nintendo needs it. Current wireless streaming tech is designed for transfer between stationary devices, and even then is far from reliable in the real world. Nintendo not only need to stream to a device that will be constantly moving, but the stream needs to be 100% reliable even in worst-case conditions for background interference (i.e. wireless routers, other wireless-enabled games consoles, AppleTVs, etc. right next to the Wii U, and mobile phones, tablets, etc. right beside the controller, all transmitting/receiving on the 2.4/5GHz bands simultaneously). I'm not even talking 99.9% reliability here; 100% is absolutely essential for Nintendo to not have a RRoD-esque debacle on their hands. This is just about doable for a single 854x480 stream (~300Mbps without even including the ample ECC/other overhead needed). It's a big stretch for two screens, and borderline impossible for three or four screens given Nintendo's need for 100% reliability.
There's no such thing as 100% reliability in wireless technology. Nintendo doesn't even need it. When transmitting what is video for practical purposes, odd pixels, out-of-order frames or lost data are gone from the screen within a blink of an eye. Nintendo may have high standards, but it doesn't take 100% reliability for this to work. Far from it, even.

The WirelessHD standard pretty much defines everything that's needed for Nintendo here. It uses about 7 GHz of spectrum in the 60 GHz band (which doesn't interfere with other common radios), includes a video codec (although Nintendo may well specialize that) and it even works at 10 meters without a line of sight. The specification was finalized in 2008, so that's why I called it a solved problem.

More screens will require a larger framebuffer, but that's not too difficult to add, particularly as AMD have so much experience in multi-display technology.
With the Wii U, this isn't as trivial as it is with normal graphics cards. The Wii U's framebuffer will probably be of fixed size (32MB of EDRAM we've heard), with a special range dedicated to the tablet screen. In that scenario you can't 'just' allocate more room like you do on a normal graphics card, as the location of each pixel on each screen is predetermined in hardware. It's not impossible to get this to work, but fitting the extra screen(s) in EDRAM will cause a performance hit.

Azak said:
Sure, but the fact remains that rendering on Wii U by its very design requires more CPU/GPU because it has multiple screens to render to. This is pretty much always going to mean the main screen will never be at "full potential".
Nah. Because you can't focus your attention at both screens at the same time, there will be few cases when really complex stuff is happening on both screens. For nearly all games the Upad will probably feature some fairly simple 2D graphics, maybe with the occasional FX thrown in. This won't raise the load on the CPU much (in most cases not at all, I suspect) as it's the same logic that needs to be done for stuff you'd otherwise show on the TV, and it won't have a significant load on the GPU's polygon, texture or shading performance either. The screen probably eats away 2 MB of framebuffer space, but that's about the only way it affects the main screen graphics at all.
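
For what it's worth, the maths behind that 2 MB figure (assuming a single 32-bit colour buffer for the controller screen, no depth or double buffering):

print(854 * 480 * 4 / 2.0**20)   # ~1.56 MB for one 854x480 RGBA8 buffer, so call it 2 MB with padding/alignment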
 
I'm not talking about a resolution bump:

These statements don't sound like this title will have a simple resolution enhancement over the Wii.
It may not sound like a simple resolution enhancement, but it's exactly what's going to happen: the Wii U version will be the exact same game, just running in HD, and it will be held back by the Wii so badly it's going to be sad, just like FFXI.
 
How about some speculation on third party support.

How many units of the WiiU must Nintendo sell, in your opinion, for it to become the de facto console to develop on (in other words, the 360 of this gen, or the PS2 of the last gen)? Assuming that Microsoft launches its new console mid to late 2013.
 
I'm not talking about a resolution bump:

[...]

These statements don't sound like this title will have a simple resolution enhancement over the Wii.

Smells like PR bullshit to me. I'll be pleasantly surprised if there are any significant improvements to the Wii U version beyond HD resolution.
 
It may not sound like a simple resolution enhancement, but it's exactly what's going to happen: the Wii U version will be the exact same game, just running in HD, and it will be held back by the Wii so badly it's going to be sad, just like FFXI.

 
How about some speculation on third party support.

How many units of the WiiU must Nintendo sell, in your opinion, for it to become the de facto console to develop on (in other words, the 360 of this gen, or the PS2 of the last gen)? Assuming that Microsoft launches its new console mid to late 2013.

Judging by this gen, double the sellthrough of the PS2.
 
Nah. Because you can't focus your attention at both screens at the same time, there will be few cases when really complex stuff is happening on both screens. For nearly all games the Upad will probably feature some fairly simple 2D graphics, maybe with the occasional FX thrown in. This won't raise the load on the CPU much (in most cases not at all, I suspect) as it's the same logic that needs to be done for stuff you'd otherwise show on the TV, and it won't have a significant load on the GPU's polygon, texture or shading performance either. The screen probably eats away 2 MB of framebuffer space, but that's about the only way it affects the main screen graphics at all.
Nothing's free though, so it will have some impact, albeit small if it's just a static screen. So technically I still stand by my statement but yes in a lot of situations it will have little practical impact if you're just viewing something like an inventory.

However, I think Nintendo will want to show off the AR type "window into the screen" games. In this case it's dual rendering, but I guess if the game is as simple as what they were showing at E3, it wouldn't impact it.
 
From what I've read, the PS3 CPU can actually access the VRAM. It's just not very fast, but it would be quite an expense to make it work fast.
'Not very fast' is surely one way of putting it.
 
How many units of the WiiU must Nintendo sell, in your opinion, for it to become the de facto console to develop on (in other words, the 360 of this gen, or the PS2 of the last gen)? Assuming that Microsoft launches its new console mid to late 2013.
I doubt numbers from the past make sense for next gen. Nintendo needs to have feature parity with the other console - being capable of doing the same graphics (in principle) and having an 'ecosystem' for patches, DLC and online gaming. That's what they need if they want any ports at all. Both seem to be worked on according to rumours, so that's alright.

If it is to become the default console, it has to sell better than the Wii, and it has to be sold to different people. 3rd party games have had a tendency to be outcompeted by 1st party games on Nintendo consoles.
 
I think the Wii U GPU will clock 600-750 MHz. It's more probable that Nintendo will choose to up the clock speed to increase performance instead of shipping with a larger chip.

Wikipedia lists AMD's 7750M as a 768 SPU/48 TMU/16 ROP GPU at 650 MHz on a 28 nm node. It has a TDP of 36 watts. I'm just mentioning that here as it indicates what AMD can currently get done within a power envelope that seems suitable to the Wii U. A chip like that is also quite comparable to the RV770 line, although this chip seems a bit more powerful than the RV770LE. I expect the Wii U GPU to be slightly less well equipped than the 7750M (probably 512 SPU/32 TMU/16 ROP) and possibly made on 32 nm instead of 28 nm.

If there's any 1T-SRAM in there at all (which is doubtful), they can do about 330 MB of the stuff for the same chip size as in the GameCube. So if there's anything in there it's more likely to be 192 or 256MB. If Nintendo goes that route, they need to find a way to solve the problems the PS3 has with 256MB of non-video RAM (it's something Bethesda couldn't handle, and therefore PS3 Skyrim is shit).

7750M specs on Wikipedia are total BS. If those specs are true then I'll be jumping for joy. 2 TFLOP card with 65W TDP. :D
 
Nothing's free though, so it will have some impact, albeit small if it's just a static screen. So technically I still stand by my statement but yes in a lot of situations it will have little practical impact if you're just viewing something like an inventory.
Let me put it this way - you will never hear a developer saying "we wanted to use a fancy shader effect there, but we couldn't because the tablet took away that tiny bit of processing power from the GPU" or "we wanted to put 100 more enemies on screen, but we couldn't because the Upad logic was too intensive". The computing demands for the Upad in most graphically intensive games will probably be on the scale of what the Nintendo DS could do, which is incredibly insignificant in a Wii U.
However, I think Nintendo will want to show off the AR type "window into the screen" games. In this case it's dual rendering, but I guess if the game is as simple as what they were showing at E3, it wouldn't impact it.
Yeah, this AR type of game could make an impact. I can see it used in graphically intense games like Metroid as well. There's probably a lot of stuff they can compensate for though, because when you use the AR to focus on something in particular, you don't have to render that particular part of the scene twice from scratch. What do you think blu?
 
I think the Wii U GPU will clock 600-750 MHz. It's more probable that Nintendo will choose to up the clock speed to increase performance instead of shipping with a larger chip.

Wikipedia lists AMD's 7750M as a 768 SPU/48 TMU/16 ROP GPU at 650 MHz on a 28 nm node. It has a TDP of 36 watts. I'm just mentioning that here as it indicates what AMD can currently get done within a power envelope that seems suitable to the Wii U. A chip like that is also quite comparable to the RV770 line, although this chip seems a bit more powerful than the RV770LE.

Oh wow, someone else who saw that design and said the same thing I did. Granted I was thinking desktop parts with the Cape Verde Pro/HD 7750, but this laptop part is certainly very interesting.

I had put the thought of something that modern (7000 series) being designed for Wii U out of my head before seeing this and thinking Nintendo's customization for console parts could look something like that when all was said and done. The notes about its size, efficiency and TDP only furthered the idea in my head.
 
Oh wow, someone else who saw that design and said the same thing I did. Granted I was thinking desktop parts with the Cape Verde Pro/HD 7750, but this laptop part is certainly very interesting.

I had put the thought of something that modern (7000 series) being designed for Wii U out of my head before seeing this and thinking Nintendo's customization for console parts could look something like that when all was said and done. The notes about its size, efficiency and TDP only furthered the idea in my head.

Beware of any GFLOPS or TDP numbers for the card; those are probably fake.
 
Let me put it this way - you will never hear a developer saying "we wanted to use a fancy shader effect there, but we couldn't because the tablet took away that tiny bit of processing power from the GPU" or "we wanted to put 100 more enemies on screen, but we couldn't because the Upad logic was too intensive".


And let's not forget the WiiU does not need the WiiU controller to play games.
We still have the option of using the Wiimote and chuck,
which many still love to use for their FPSs.
 
A little hyperbolic, but true to some extent :/

Hyperbole? Ha!

"Umm.. Your hardware sales are great, you're innovative, you've offered us a very generous {moneyhat + developer support} package, and you've resolved a lot of issues.. but.. Reggie looked at us a bit funny that one time. And you were a bit mean twenty years ago. So we're gonna go with every other platform instead. They have issues too, no doubt, but we're willing to forgive those issues. Thanks. Buh-bye."
 
And let's not forget the WiiU does not need the WiiU controller to play games.
We still have the option of using the Wiimote and chuck,
which many still love to use for their FPSs.

Using IR in FPSs? *raises hand*

Not sure how they'll deal with that, but it's something I'm gonna ask about at E3 this year.
 
7750M specs on Wikipedia are total BS. If those specs are true then I'll be jumping for joy. 2 TFLOP card with 65W TDP. :D
Hmm. I thought the mobile parts were based on actual leaks or information, but I can't find them anymore :/ They don't sound unrealistic though. Neither does 2TFLOPS on 65W - it's a new computing power focused architecture on a smaller production node.
 
Hmm. I thought the mobile parts were based on actual leaks or information, but I can't find them anymore :/ They don't sound unrealistic though. Neither does 2TFLOPS on 65W - it's a new computing power focused architecture on a smaller production node.

The wiki states that the 6970M has a 75W TDP, but in the real world it is 100W.
I wouldn't be surprised if the 7970M is a 90W part in the real world.

With that said, a 7750M part with a TDP of 50W is still realistic for the Wii-U, but the cooling solution will probably be more expensive than what Nintendo is looking for.
 
Hell, IR control for FPSs is awesome, but Nintendo will have a hard time convincing the core audience.

Honestly, they really don't need to. As long as the option exists, the "core" guys can play whatever way they want. Many Wii games do this already, so nothing changes. Even Treyarch can simply port over their Wii control option and everyone is happy.
 
Hooray, spec discussion!

I'll admit I chose the 700MHz largely because it's a simple 1/4 of the CPU clock speed, but I wouldn't say it's completely outside of the realm of possibility. The R700 line that the Wii U's GPU is apparently based on topped out at 850MHz, and a possible die-shrink combined with the fact that the Wii U is in a considerably larger casing than the Wii (i.e. more heat to dissipate) would indicate that 700MHz is in the range of possible values there.

The 1T-SRAM was a bit of a guess, but Nintendo have a penchant for fast RAM, and with Nintendo being Mosys's biggest customer, only Nintendo and Mosys themselves really know how much 512MB of 1T-SRAM would cost to produce at those quantities these days. That said, I'd say the total RAM will be 1.5GB, perhaps with a smaller amount of 1T-SRAM.

Unless the size increases, 500MHz is the max for 800 SPUs. This is a TINY console.

There will probably only be one pool of RAM, and 512MB of 1T-SRAM sounds too expensive.
 
Also, the Wii U isn't using a laptop GPU...
Laptop GPUs are an indication of what the state of technology is when it comes to performance per watt. The Wii U won't have an off-the-shelf GPU, but if an off-the-shelf GPU does 35W for some level of raw power, then we know that the Wii U GPU can also get near that raw power for around 35W.

Too bad that Wikipedia information isn't curated well enough.
BurntPork said:
Unless the size increases, 500MHz is the max for 800 SPUs. This is a TINY console.
There isn't such a solid relation between SPUs, clock speed and power usage that any of us could really claim that. Mobile Juniper (800 SPU VLIW5) parts did ~50W at 600+ MHz, so who knows what they can achieve on 28 or 32 nm.
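
Rough numbers on that, using the same 2 FLOPs per SP per clock assumption as earlier in the thread:

gflops = 800 * 2 * 600 / 1000.0   # mobile Juniper: ~960 GFLOPS peak
print(gflops / 50)                # ~19 GFLOPS per watt on 40nm, before any node shrink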
 
I do wonder if Nintendo will release a wireless Classic Controller Pro for the WiiU that isn't dependent on the Wii controller.
If so, then I will regret buying my CC Pro a couple of months ago for the purpose of practicing Smash with it. I gotta get used to it since Wii U unfortunately doesn't use GCN controllers.
 
Laptop GPUs are an indication of what the state of technology is when it comes to performance per watt. The Wii U won't have an off-the-shelf GPU, but if an off-the-shelf GPU does 35W for some level of raw power, then we know that the Wii U GPU can also get near that raw power for around 35W.

Too bad that Wikipedia information isn't curated well enough.

If Nintendo's willing to only have a couple hundred thousand Wii Us made every month, sure, why not?
 