I was going to chime in on that as well, as I don't see it being used as a texture cache. IMO, if anything, it will be a pseudo-L3 cache if devs choose not to use it exclusively as a framebuffer.
You can still be very creative with it. A big framebuffer is never too much, so the kind of offscreen framebuffer rendering Donkey Kong Country Returns uses becomes more viable, along with stuff like rendering extra image/light masks, which makes pseudo-HDR and the like very cheap (thinking SotC here), layered rendering, and shadow sampling and blurring. It could possibly even accelerate effects like fur shading, since that effect basically repeats a very simple texture over several passes through polygon shells; it's bandwidth- and fillrate-dependent, so doing it in the embedded framebuffer could be cheap and spare main RAM bandwidth compared to doing it out of the main RAM bank. Also, with sub-HD games becoming common, a big framebuffer could allow the game to be upscaled and the HUD/fonts applied on top as an extra pass, instead of being rescaled along with the whole image, preventing image degradation and making sub-HD less obvious.
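To make that sub-HD point concrete, this is roughly the order I mean; every type and function name below is a placeholder I made up for the sake of the sketch, not any real Wii U API:

```c
/* Sketch of the "upscale first, HUD last" order described above.
 * Everything here is a hypothetical placeholder, not a real API. */

typedef struct surface surface_t;              /* opaque render target */

surface_t *edram_alloc_surface(int w, int h);  /* offscreen target in fast embedded memory */
surface_t *get_backbuffer(int w, int h);       /* native-resolution output target          */
void       draw_scene(surface_t *target);
void       blit_upscale(const surface_t *src, surface_t *dst);
void       draw_hud(surface_t *target);
void       present(surface_t *backbuffer);

void render_frame(void)
{
    /* 1. Render the 3D scene at a sub-HD resolution into the fast
     *    embedded memory (the numbers are just an example). */
    surface_t *scene = edram_alloc_surface(1152, 648);
    draw_scene(scene);

    /* 2. Upscale only the 3D image to the native 1280x720 target,
     *    so any filtering blur is confined to the scene itself. */
    surface_t *backbuffer = get_backbuffer(1280, 720);
    blit_upscale(scene, backbuffer);

    /* 3. Draw HUD and fonts at full 720p on top as an extra pass,
     *    so text stays sharp and the sub-HD scene is less obvious. */
    draw_hud(backbuffer);
    present(backbuffer);
}
```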
And I'm sure there are other clever uses possible.
But it's certainly there for assisting features and as an alternative channel for feeding the GPU smaller things (like you said: a cache) rather than for streaming.
Also, FS's post made me think back to when I brought up S3TC and its usage, but I didn't get enough info to settle what I was trying to figure out, so maybe you can chime in on this. Do you see it as a viable inclusion in the Wii U's GPU, and if so, how would it benefit a modern hardware environment? Nintendo renewed their license a couple of years ago.
Nintendo probably licensed S3TC themselves back in 2010 because the 3DS GPU, the PICA200, originally didn't support S3TC texture compression. I don't know if the final chip does, but it's regarded as a custom implementation, and adding it made a lot of sense (both for cartridge and RAM economy).
As for the Wii U, it's definitely in, since every ATi GPU has it, along with later enhancements like 3Dc (BC4 and BC5). Hopefully BC6 and BC7 are in as well; those were introduced more recently with the DirectX 11 standard and would go a long way toward making it future-proof.
This is a good article to catch up on.
S3TC itself covers BC1, BC2 and BC3; it hasn't been updated since 1999, but it's still the basis for everything related to texture compression. 3Dc (BC4 and BC5) is mainly used for normal maps, and BC6 and BC7 are still pretty much unknown to most developers (BC6 being for HDR and BC7 being the new high-quality, state-of-the-art compression format).
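To put rough numbers on what those formats buy you, here's the standard block-size math for a 1024x1024 texture (the block sizes are part of the BC/S3TC formats themselves; the texture size is just an example):

```c
#include <stdio.h>

/* Rough numbers for a 1024x1024 texture, to show why S3TC/BC matters
 * for RAM economy. Block sizes: BC1/BC4 = 8 bytes per 4x4 block,
 * BC2/BC3/BC5/BC6/BC7 = 16 bytes per 4x4 block. */
int main(void)
{
    const int w = 1024, h = 1024;
    const int blocks = (w / 4) * (h / 4);

    printf("uncompressed RGBA8: %d KB\n", w * h * 4 / 1024);   /* 4096 KB */
    printf("BC1 (DXT1):         %d KB\n", blocks * 8 / 1024);  /*  512 KB */
    printf("BC3 (DXT5) / BC7:   %d KB\n", blocks * 16 / 1024); /* 1024 KB */
    return 0;
}
```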
Fair enough on the 1 gigabit GDDR3 chips (was it you who I debated this with before?). However, even if they did go w/ GDDR3, that's still 12 chips for 1.5 GB or 16 chips for 2 GB. It's completely unfeasible.
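Just to spell out the arithmetic behind those chip counts (1 Gbit works out to 128 MB per chip):

```c
#include <stdio.h>

int main(void)
{
    const int chip_mb = 1024 / 8;                   /* 1 Gbit chip = 128 MB */
    printf("1.5 GB -> %d chips\n", 1536 / chip_mb); /* 12 chips */
    printf("2 GB   -> %d chips\n", 2048 / chip_mb); /* 16 chips */
    return 0;
}
```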
No, wasn't me
Fairly sure they won't go higher than 8 chips though.
I appreciated your framebuffer analysis. I'm not knowledgeable enough about how texture caching works to say any more about it. All I know is that Gamecube and Wii used it to store commonly used textures (ground textures, for example).
Technically, not the framebuffer.
Gamecube and Wii had 3 MB of embedded 1T-SRAM: a 2 MB framebuffer and a 1 MB texture buffer. The framebuffer was pretty tight as it was, hence the dreadful amounts of dithering we had to deal with (rendering 640x480 with a Z-buffer takes around 2.3 MB per frame); that's also one of the reasons anti-aliasing wasn't widespread at all on the GC/Wii. No sane developer would take that 2 MB pool and use it for textures.
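For reference, a figure like 2.3 MB falls out if you assume 32 bits of colour plus a 32-bit Z-buffer per pixel (that per-pixel layout is my assumption for the sake of the math, not a statement about the exact GC formats):

```c
#include <stdio.h>

int main(void)
{
    const int w = 640, h = 480;
    const int bytes_per_pixel = 4 /* colour */ + 4 /* depth */;   /* assumed layout */
    double mb = (double)(w * h * bytes_per_pixel) / (1024.0 * 1024.0);
    printf("640x480 colour + Z: %.2f MB per frame\n", mb);        /* ~2.34 MB */
    return 0;
}
```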
But they had a 1 MB buffer for textures, and since it was there, they might as well use it. Speaking of which, I'm pretty sure the fur shading we've seen fairly extensively on the GC/Wii (Starfox Adventures, Donkey Konga, Mario Galaxy 1/2, Madden, Metroid Other M) was done by storing the repeating fur texture in that 1 MB buffer.
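For anyone curious, shell fur in that style boils down to something like the sketch below; the API is entirely made up, it's just meant to show why a tiny repeating texture kept in fast memory is all the technique really needs:

```c
/* Very rough sketch of shell-based fur as described above; every type
 * and function here is a made-up placeholder, not actual GC/Wii code. */

typedef struct mesh    mesh_t;     /* opaque placeholder types */
typedef struct texture texture_t;

enum { ALPHA_GREATER };            /* placeholder alpha-test compare mode */

/* Hypothetical renderer calls, declared only so the sketch hangs together. */
void set_vertex_offset_along_normals(float offset);
void set_texture(const texture_t *tex);
void set_alpha_test(int compare, float ref);
void draw_mesh(const mesh_t *mesh);

#define SHELL_COUNT    8        /* how many times the mesh is redrawn     */
#define MAX_FUR_LENGTH 0.02f    /* how far the outermost shell extrudes   */

/* Assumes the base, un-extruded mesh has already been drawn normally.
 * Each pass redraws it pushed a little further out along the vertex
 * normals, sampling the same small tiling fur texture every time;
 * the alpha test thins the strands toward the tips. */
void draw_fur(const mesh_t *mesh, const texture_t *fur_tex)
{
    for (int i = 1; i <= SHELL_COUNT; i++)
    {
        float offset = (float)i / SHELL_COUNT * MAX_FUR_LENGTH;

        set_vertex_offset_along_normals(offset);
        set_texture(fur_tex);                 /* same tiny texture every pass */
        set_alpha_test(ALPHA_GREATER, 0.5f);  /* discard gaps between hairs   */
        draw_mesh(mesh);
    }
}
```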