Given all that I've seen, I'm leaning heavily towards Latte being derived from the HD 5550. The numbers just fall right into place.
I don't understand why people are so hung up on insisting that it's a 4XXX chip. It's like they don't even want to humor the idea of it being a more capable chip than the low-end guess; that entire line of options just gets dismissed without any real study.
For this basic comparison I'll be using the HD 46XX, since Digital Foundry was so "certain" that this lower-end chip is the basis, and since most people I've encountered treat that as the 100% unquestionable, end-of-story truth... It's as if people have some stake in Latte being as bad as possible.
http://www.amd.com/uk/products/desk...hd-5000/hd-5550/Pages/hd-5550-overview.aspx#2
http://www.amd.com/us/products/desk.../Pages/ati-radeon-hd-4600-specifications.aspx
Just a short comparison.
1. Latte is 40 nm
46XX - 55 nm
5550 - 40 nm
2. Latte uses DX11-equivalent graphics
46XX - DX10.1
5550 - DX11
3. Latte is clocked at 550 MHz
46XX - 600/725 MHz
5550 - 550 MHz
4. Latte is theorized at 352 GFLOPS
46XX - 384 GFLOPS
5550 - 352 GFLOPS
5. While neither chip gets as low on energy consumption as the Wii U, the HD 5550 has the lower wattage draw, and we know Nintendo was targeting energy efficiency and lower cost.
6. The HD 5550 has more extensive and more efficient multi-display support.
I also remember some mention of 320 stream processors, which the HD 5550 has as well, but I can't find the source.
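As a quick sanity check on those GFLOPS figures, 320 stream processors at 550 MHz line up exactly with 352 GFLOPS if you assume the usual 2 FLOPs per stream processor per clock for these VLIW5 parts (that multiplier is the standard rule of thumb, not a confirmed Latte spec):

# Back-of-the-envelope GFLOPS check.
# flops_per_sp_per_cycle = 2 is the usual "one multiply-add per SP per clock"
# assumption for AMD's VLIW5 parts, not a confirmed Latte figure.
stream_processors = 320        # HD 5550 count / rumored Latte count
clock_hz = 550e6               # reported 550 MHz clock
flops_per_sp_per_cycle = 2

gflops = stream_processors * clock_hz * flops_per_sp_per_cycle / 1e9
print(f"{gflops:.0f} GFLOPS")  # prints 352

The same math with 320 SPs at 600 MHz gives 384 GFLOPS, which is exactly the 46XX figure above, so both numbers fall out of the same formula.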
The first post lists "Trisetup and rasterizer (R800 dropped that and delegated the workload to SPs)", but we know this is a heavily customized chip, so that could very well be the result of customization to lower cost and improve compatibility with the Wii.
Now this is a long shot in the dark, but another thing of note is the RAM supported: the HD 46XX took GDDR3/DDR3/DDR, while the HD 5550 takes GDDR5/DDR3. GDDR3 was used in the Wii, making it the more natural choice for the Wii U, but there is no GDDR3 HD 5550 model. DDR3 would have been the cheaper and more energy-efficient of the two RAM types the HD 5550 supports, which may explain why they went with DDR3 over GDDR3. The DDR3 version of the HD 4650 seems to have the worst energy consumption and the highest clock.
http://www.cnet.com/graphics-cards/ati-radeon-hd-4650/4507-8902_7-33780428.html
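On a side note about the memory, raw bandwidth is just the effective transfer rate times the bus width. Applied to the DDR3 setup the Wii U reportedly ended up with, and assuming the DDR3-1600 speed grade and 64-bit combined bus that the commonly reported ~12.8 GB/s figure implies (those two specifics are my assumptions):

# Memory bandwidth = effective transfer rate (MT/s) x bus width (bytes).
# DDR3-1600 and a 64-bit combined bus are assumptions inferred from the
# commonly reported ~12.8 GB/s figure for the Wii U's main memory.
transfers_per_second = 1600e6
bus_width_bytes = 64 / 8

bandwidth_gb_per_s = transfers_per_second * bus_width_bytes / 1e9
print(f"{bandwidth_gb_per_s:.1f} GB/s")  # prints 12.8

GDDR3 or GDDR5 on the same bus width would generally buy higher transfer rates at higher cost and power, which fits the cheaper/more-efficient argument above.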
The biggest thing that makes me lean towards the HD 5550 is performance per watt; the HD 5550 is all-around superior, while the HD 46XX is just too power hungry. Finally, the actual price of the HD 5550 is far lower than the HD 46XX: the lowest I saw an HD 4650 for was $95, while the highest I saw an HD 5550 for was $59. Why would Nintendo go for a higher-cost, less energy-efficient card with inferior multi-display tech and lower real-world visual performance? Combine this with the specs and it's just common sense to me.
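To put rough numbers on the performance-per-watt point, using the commonly cited board TDPs of about 48 W for the HD 4650 and about 39 W for the HD 5550 (treat both as my approximations, not measured draws):

# Rough GFLOPS-per-watt comparison. The TDP values are the commonly cited
# board figures and are approximations/assumptions, not measured power draw.
cards = {
    "HD 46XX": {"gflops": 384, "tdp_watts": 48},
    "HD 5550": {"gflops": 352, "tdp_watts": 39},
}
for name, card in cards.items():
    print(f"{name}: {card['gflops'] / card['tdp_watts']:.1f} GFLOPS/W")
# HD 46XX: 8.0 GFLOPS/W, HD 5550: 9.0 GFLOPS/W on these assumptions

So even on these rough numbers the HD 5550 comes out ahead per watt, before any of the further customization Nintendo would have done.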
Well, that's my very non-professional analysis.