I don't quite understand why Cell would be so hard to emulate on powerful multi-core CPUs these days. Why is having Cell a requirement for backwards compatibility?
Cell is fundamentally different from other CPUs in many ways. Where other CPUs tend to be 'good at everything', Cell has only one core that is general purpose in that sense (the PPE); the other cores (the SPEs) are specialized for heavily parallel number crunching. For gaming, that number crunching has proven helpful to a degree, but not enough to pay back its associated costs. Its number-crunching throughput, however, is still so high that it cannot be properly emulated, and the peculiar, highly parallel way it has to be programmed makes it very hard for a more conventional CPU to reproduce exactly. Even though a new CPU will probably exceed Cell in raw number-crunching capability, a conventional CPU cannot replicate the way Cell does that work closely enough to make emulation possible. There really is no easy fix for this: to run Cell code, you still need Cell.
You are right that including Cell does not solve backwards compatibility completely. Considering they used a DX9 Nvidia chip last time and will very likely use a DX11 AMD part next time, GPU emulation might be so difficult that they won't even bother.
Again, where is this info coming from? AMD can't even design their own flagship CPU correctly.
Even ignoring the fact that the last part is simple FUD (losing to the best CPU maker in the world doesn't mean they suck), AMD's CPU and GPU teams are very much distinct. Without question, AMD's GPU team has made the most effective GPUs since 2007. That might change with the new GeForces, but AMD still very much has the upper hand this time.
Nvidia has screwed up twice, once with Microsoft and once with Sony. Microsoft is not going back to Nvidia after getting a bad deal on royalties for the NV2A chip in the original Xbox. Sony can't be satisfied with the RSX design either: it was outdated compared to a competing console GPU that launched a year earlier (at a time when Nvidia was even ahead of AMD), and it has other hardware shortcomings (no proper hardware scaler). Rumour has it that an AMD GPU is in all three next-gen consoles, and engineers at Nvidia have indicated that they have no idea what's going to be in the next-gen consoles either.
Nvidia's upcoming lineup of cards seems to be even more powerful than AMD's, and they're not at a competitive disadvantage that would cause Sony to switch to AMD.
These decisions aren't made based on fancy roadmaps for PC gamers to drool over. When two companies are as close in competition as AMD and Nvidia, the contract isn't decided on performance but on the best deal, and Nvidia has twice shown itself to be a bad deal.