If there's any 1T-SRAM in there at all (which is doubtful), they can do about 330 MB of the stuff in the same chip area as the GameCube's. So if there's anything in there it's more likely to be 192 or 256MB. If Nintendo goes that route, they need to find a way to solve the problems the PS3 has with 256MB of non-video RAM (something Bethesda couldn't handle, which is why PS3 Skyrim is shit).
There will definitely be some 1T-SRAM; Mosys have confirmed as much. It could simply be 24MB to provide hardware BC with Wii, though, or a similar amount as a framebuffer for the GPU. I'm not an expert on GPU technology, but I think it could even be both, serving as VRAM for Wii BC and as a framebuffer in Wii U mode.
Also, in my post above, I was assuming that both the GDDR5 and 1T-SRAM were accessible by both the CPU and GPU, so it wouldn't have quite the same limitations as the PS3.
Edit: Actually, just to confirm, where did you get the 330MB from? I did a quick calculation using the assumption that transistor density doubles every 2 years (I don't know if that actually holds for RAM) and got an increase of about 45 times over the 11 years from the GC launch to the Wii U launch (2 to the power of 5.5), which would give roughly 1080MB in the same chip size as the GC's 24MB.
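If anyone wants to check that back-of-the-envelope arithmetic, here it is as a quick Python snippet (the 2001/2012 launch years and the 2-year doubling period are my assumptions):

```
# Assumptions: GC launch 2001, Wii U launch 2012, density doubling every 2 years
# (which may well not hold for embedded RAM processes).
gc_edram_mb = 24               # GameCube's 1T-SRAM pool
doublings = (2012 - 2001) / 2  # 11 years -> 5.5 doublings
scale = 2 ** doublings         # ~45x density increase
print(round(scale, 1))             # 45.3
print(round(gc_edram_mb * scale))  # ~1086, i.e. roughly 1080MB in the same die area
```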
Streaming tech is only part of the problem. 4x 854x480 is lower than 1x 1920x1080, and standards already exist that transmit that resolution just fine, in more complex settings than the Upad.
[...]
Streaming so many pixels is a solved problem, however.
Streaming is far from a solved problem, at least in the form Nintendo needs it. Current wireless streaming tech is designed for transfer between stationary devices, and even then it's far from reliable in the real world. Nintendo not only needs to stream to a device that will be constantly moving, but needs that stream to be 100% reliable even in worst-case conditions for background interference (i.e. wireless routers, other wireless-enabled games consoles, AppleTVs, etc. right next to the Wii U, and mobile phones, tablets, etc. right beside the controller, all transmitting/receiving on the 2.4/5GHz bands simultaneously). I'm not even talking 99.9% reliability here; 100% is absolutely essential if Nintendo are to avoid a RRoD-esque debacle on their hands.

This is just about doable for a single 854x480 stream (~300Mbps, without even including the ample ECC/other overhead needed). It's a big stretch for two screens, and borderline impossible for three or four, given Nintendo's need for 100% reliability.
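To put a rough number on that ~300Mbps figure, here's the raw-bitrate calculation as a quick Python snippet (my assumptions: 60fps and 4:2:0 chroma subsampling, i.e. 12 bits per pixel, with no ECC or protocol overhead included):

```
def raw_bitrate_mbps(width, height, bits_per_pixel, fps):
    # Uncompressed video bitrate in megabits per second
    return width * height * bits_per_pixel * fps / 1e6

one_pad = raw_bitrate_mbps(854, 480, 12, 60)
print(round(one_pad))      # ~295 Mbps for a single Upad stream
print(round(4 * one_pad))  # ~1.2 Gbps raw for four streams, before any overhead
```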
The thing is that Nintendo probably doesn't intend to sell/market the Upad separately, as it's likely too expensive to be viable as a second controller. In that case it's better not to market it at all (and only sell/provide them to owners whose Upad has broken), so that game designers don't build games around multiple Upads. Furthermore, having to draw one HD framebuffer and multiple SD framebuffers on the same GPU may also degrade performance too much.
I'd agree with you that Nintendo didn't intend to support more than one Wii U pad, for both technical and cost reasons. This is fairly clear from the fact that the multiplayer games they showed at E3 last year were all asynchronous, with only one Wii U pad. However, quite a lot of people (both press and fans) asked why they couldn't play with two or more Wii U pads, and Nintendo eventually relented and gave a non-committal "we're looking into it" response.

My guess is that Nintendo went back and looked at ways they might be able to support two or more controllers (and sell them at a reasonable price), to prevent large swathes of potential customers thinking to themselves "Wow, it'd be great with Madden for each player to have his own screen!" only to be extremely disappointed when they find out it can't do that. If they do support two or more controllers, my guess is they either do it wired, or reduce the framerate and/or colour depth of the controllers in multiplayer mode to fit within the available bandwidth of whatever streaming tech they're using.
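Just to illustrate the kind of trade-off I mean, using the same raw-bitrate assumptions as above (854x480, 12 bits per pixel, Python again): halving the framerate halves the per-pad requirement, so two pads at 30fps fit in roughly the same budget as one pad at 60fps.

```
def raw_bitrate_mbps(width, height, bits_per_pixel, fps):
    # Uncompressed video bitrate in megabits per second
    return width * height * bits_per_pixel * fps / 1e6

print(round(raw_bitrate_mbps(854, 480, 12, 60)))      # ~295 Mbps: one pad at 60fps
print(round(2 * raw_bitrate_mbps(854, 480, 12, 30)))  # ~295 Mbps: two pads at 30fps
```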
More screens will require a larger framebuffer, but that's not too difficult to add, particularly as AMD have so much experience in multi-display technology.
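For what it's worth, the framebuffer memory itself is tiny. A quick Python sanity check (assuming plain 32-bit colour targets, which is my assumption):

```
def target_mb(width, height, bytes_per_pixel=4):
    # Size of one 32-bit colour render target, in MB
    return width * height * bytes_per_pixel / (1024 * 1024)

print(round(target_mb(1920, 1080), 1))    # ~7.9 MB for the TV's 1080p target
print(round(4 * target_mb(854, 480), 1))  # ~6.3 MB for four 854x480 pad targets
```

In other words, four pad framebuffers together take less memory than the single 1080p one; the real cost is the extra rendering work, not the storage.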
I don't believe in such high amounts of 1T-SRAM; GDDR5 would probably be much cheaper and has more than enough bandwidth. The 96MB/128MB numbers from the Wiiudaily rumors are the maximum I would expect (and that amount would actually make sense as a framebuffer for deferred rendering).
RAM is another thing I'm not an expert in, but I think the appeal of 1T-SRAM is not just the bandwidth (which is quite high), but also a much lower latency than DRAM. It also has an incredibly stable transfer rate, which makes code easier to optimise.
Sure, but the fact remains that rendering on Wii U, by its very design, requires more CPU/GPU resources because it has multiple screens to render to. That's pretty much always going to mean the main screen will never be at "full potential".
Not necessarily. Of course, if you want to pump out separate high quality 3D visuals to the TV and four controllers there's going to be a significant performance hit, but provided the hardware is appropriately designed it should be able to output 2D maps/inventories/etc. to a couple of Wii U pads without a noticeable effect on the main graphical output.