AMD's tessellation performance pre-GCN was horrible, so I wouldn't put much faith in the Wii U's tessellator being used much, if at all.
This has been discussed to hell and back before.
Tessellation performance before GCN wasn't horrible; the thing is, it went through various generations/revisions. R5xx and Xenos had the first hardware tessellation unit (well, we could talk TruForm on R2xx, but I'll pass), and yes, it sucked: it didn't support vertex compression, so you had to decompress whatever you were pulling (provided it was compressed) and ended up with bloated geometry due to the added detail. That, with limited RAM, effectively defeated the purpose.
No guarantee that would have been the case had the console had double the RAM or so. Anywho, the tradeoffs weren't worth it.
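To put rough numbers on why that defeated the purpose, here's a back-of-the-envelope sketch; every figure in it (mesh size, vertex sizes, tessellation level) is an assumption for illustration, not an actual Xenos spec:

```c
#include <stdio.h>

/* Rough illustration: integer tessellation at level N splits one
 * triangle patch into roughly N*N triangles. All figures below are
 * assumed for illustration, not real Xenos numbers. */
int main(void) {
    const long base_tris       = 100000; /* hypothetical base mesh      */
    const long tess_level      = 4;      /* modest tessellation factor  */
    const long packed_vertex   = 16;     /* bytes/vertex, compressed    */
    const long unpacked_vertex = 32;     /* bytes/vertex, decompressed  */

    long tess_tris = base_tris * tess_level * tess_level;

    /* ~3 vertices per triangle, ignoring vertex sharing (worst case) */
    long packed_mb   = base_tris * 3 * packed_vertex   / (1024 * 1024);
    long unpacked_mb = tess_tris * 3 * unpacked_vertex / (1024 * 1024);

    printf("base mesh, packed:     ~%ld MB\n", packed_mb);   /* ~4 MB   */
    printf("tessellated, unpacked: ~%ld MB\n", unpacked_mb); /* ~146 MB */
    /* On a console with 512 MB total, burning well north of 100 MB of
     * vertex data on one mesh is exactly the tradeoff that stops
     * making sense. */
    return 0;
}
```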
I'll skip ahead, but if I remember correctly GCN/HD7xxx is considered Gen 3+; HD6xxx was Gen 3, and those were still VLIW4/VLIW5 (depending on the chip). It was a DirectX 11/OpenGL 4.1 implementation at that point and thus had to comply with all the mandated standards; no, they didn't suck.
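For context on what "complying with the standards" means: DX11 and GL 4.x mandate programmable tessellation stages wrapped around a fixed-function tessellator. A minimal sketch of the GL 4.x API side, assuming a current 4.x context, a loader like glad for the entry points, and an already-linked program with tessellation control/evaluation shaders (all names here are illustrative):

```c
#include <glad/glad.h>  /* assumed loader for the GL 4.x entry points */

/* Sketch of issuing a tessellated draw on a compliant GL 4.x part;
 * 'program' is assumed to already have tessellation control +
 * evaluation shaders compiled and linked into it. */
void draw_tessellated(GLuint program, GLuint vao, GLsizei patch_count) {
    glUseProgram(program);
    glBindVertexArray(vao);

    /* Each input patch is a bare triangle; the fixed-function
     * tessellator sits between the control and evaluation stages. */
    glPatchParameteri(GL_PATCH_VERTICES, 3);

    /* GL_PATCHES is the only legal primitive type while a
     * tessellation control shader is active. */
    glDrawArrays(GL_PATCHES, 0, patch_count * 3);
}
```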
As for Wii U, we know it fished above R700 for stuff like the Eyefinity implementation, so they could have updated the tessellator or not; anywho, worst-case scenario it's a Gen 2 part, at best it's Gen 2+ or even Gen 3. Either way it's infinitely more useful than what Xenos had.
Shin'en's next game uses tessellation; they already announced that on Twitter. Nintendo doesn't implement stuff in their chips if it isn't usable. If it has tessellation, it will be used in games.
Haha, yes they do. And this part is derived from a "mass" product not originally meant for Nintendo; they surely didn't go out of their way to nuke everything they thought they wouldn't use. The GameCube could output stereoscopic 3D, for instance; the SNES could pull 512×448, and only one game used it (not a Nintendo-published one, and it looked horrible). I could go on.
And in the end, even if they document it, it's up to the developers whether a feature is attractive enough to use.