Now, the latest Xbox revision is a good, reliable design - but it can still get very warm to the touch. So the question is simple: how can Wii U be twice as powerful as the Xbox 360 when it has to cram more advanced silicon, with millions more transistors, into an enclosure that's tiny by comparison? Won't it overheat horribly? Where's the room for the substantial cooling assembly it would require?
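To put some rough numbers on that worry, here's a quick back-of-the-envelope power-density calculation. The wattages and case volumes below are illustrative assumptions rather than measured specs, but they show why squeezing more performance into a much smaller box is first and foremost a cooling problem:

```python
# Back-of-the-envelope power-density comparison. All figures below are
# assumptions for illustration only - they are not official specs.

def power_density(heat_watts: float, volume_litres: float) -> float:
    """Watts of heat the cooling system must shed per litre of enclosure."""
    return heat_watts / volume_litres

# Hypothetical numbers: a current-generation console in a roomy case versus
# a notionally more powerful chip in a case a fraction of the size.
big_console   = power_density(heat_watts=90.0, volume_litres=7.5)
small_console = power_density(heat_watts=75.0, volume_litres=2.5)

print(f"Large enclosure: {big_console:.0f}W per litre")
print(f"Small enclosure: {small_console:.0f}W per litre")
# Even with a lower total power draw, the smaller box has to dissipate
# roughly two and a half times the heat per unit of volume - which is why
# the question of the cooling assembly matters so much.
```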
In theory, we could look at laptops as an example of getting powerful chips working in small spaces. The problem is that high-power mobile GPUs are heavily 'binned' - they're the pick of a production run of processors destined for a broad range of different graphics cards. Mobile parts are typically the very best silicon, capable of great performance at low voltages. Nintendo would not have that luxury on a mass-production item with a single design, where high yields are the key to keeping costs down as much as possible.
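For readers unfamiliar with the term, the sketch below shows the basic idea behind binning. The quality scores and cut-off thresholds are invented for illustration - this isn't a description of AMD's actual test process - but it captures why mobile-grade chips are such a small slice of any production run:

```python
import random

# Toy model of GPU binning - the single "quality" score and the thresholds
# are invented for illustration and don't reflect any real test process.
random.seed(1)

# Pretend each die coming off the line gets a quality score: higher means it
# reaches target clocks at lower voltage, with less leakage and less heat.
dies = [random.gauss(1.0, 0.15) for _ in range(10_000)]

mobile_bin  = [d for d in dies if d > 1.25]         # the rare, best silicon
desktop_bin = [d for d in dies if 0.9 < d <= 1.25]  # the bulk of production
salvage_bin = [d for d in dies if 0.75 < d <= 0.9]  # cut-down budget parts

print(f"Mobile-grade dies:  {len(mobile_bin) / len(dies):.1%} of production")
print(f"Desktop-grade dies: {len(desktop_bin) / len(dies):.1%}")
print(f"Salvage parts:      {len(salvage_bin) / len(dies):.1%}")
# A console built around one fixed design can't cherry-pick that top few per
# cent: it has to ship whatever the line produces, so clocks and voltages are
# set for the typical die, not the best one.
```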
Realistically, short of a major architectural shift to components based on smartphone tech - and lots of it - the idea of Wii U possessing next-gen rendering capabilities doesn't make a lot of sense. We know that there's no transition to mobile tech because the IBM CPU is an offshoot of an existing line and the firm doesn't make mobile CPUs. Similarly, while AMD has produced smartphone GPUs, none of them gets close to the performance of the Xbox 360's Xenos GPU. That being the case, the chances are that it's a customised variant of an existing PC Radeon part: Japanese sources have previously hinted at a connection to the Radeon HD 4000 series - and a lower-end chip from that range would be a good fit.