Looking through this thread, I see that there was a lot of concern about the performance of the CPU in the new consoles, based on comparisons of FLOPS and clock speeds and so forth. I think, however, that there may be a more useful way of thinking about CPU performance (which is: don't think about it).
First, forget about CPUs and GPUs. What really matters is the total performance of the system on different workloads. Workloads can generally (if you're willing to make a gross oversimplification) be classified as one of two types: "embarrassingly" parallel floating-point, or branch-heavy integer. "Embarrassingly" parallel floating-point workloads are the sort that GPUs are well suited for, and you can measure performance on them in FLOPS (if you like mostly meaningless synthetic numbers). Examples of these in games would be things like graphics and physics. Branch-heavy integer workloads are what CPUs are typically good at. Examples from games would be things like AI and basic gameplay logic. Integer performance is typically measured in MIPS (again, a highly meaningless synthetic number), but people don't talk about it much any more, since FLOPS are way sexier.
So, looking at the numbers for the PS3 and the PS4, we see that the PS3 has roughly 356 GFLOPS total performance on floating-point workloads (176 from the RSX and 180 from the CELL) and 6400 MIPS on integer workloads. In comparison, the PS4 has about 1900 GFLOPS total performance on floating-point workloads (1800 from the GPU, 100 from the CPU) and 25600 MIPS on integer workloads.
That means that by these arbitrary synthetic measurements, the PS4 is roughly 5x as fast as the PS3 on floating-point workloads, and 4x as fast on integer workloads. That is to say, despite the "weak" CPU in the PS4, the gains over the PS3 on both traditionally GPU-friendly tasks and CPU-friendly tasks are very close.
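If you want to sanity-check that arithmetic, here's a minimal sketch. It just reproduces the synthetic totals above, using peak MIPS = issue width × clock × core count (the same dual-issue, 2-instructions-per-cycle assumption discussed below); nothing here is a measurement.

```python
# Peak-throughput arithmetic behind the comparison above.
# MIPS here = issue width (2 instructions/cycle) x clock (MHz) x core count,
# i.e. the same synthetic peak the post uses, not measured throughput.

def peak_mips(issue_width, clock_mhz, cores):
    """Synthetic peak integer throughput in MIPS."""
    return issue_width * clock_mhz * cores

# PS3: one 3.2 GHz dual-issue PPE; PS4: eight 1.6 GHz dual-issue Jaguar cores.
ps3_mips = peak_mips(2, 3200, 1)   # 6400
ps4_mips = peak_mips(2, 1600, 8)   # 25600

# Floating point: GPU + CPU contributions from the post.
ps3_gflops = 176 + 180   # RSX + CELL = 356
ps4_gflops = 1800 + 100  # GPU + CPU = 1900

print(f"FP gain:  {ps4_gflops / ps3_gflops:.1f}x")   # ~5.3x
print(f"Int gain: {ps4_mips / ps3_mips:.1f}x")       # 4.0x
```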
It's also important to note that for various reasons these synthetic numbers dramatically under-represent the performance gain of the PS4 over the PS3 when running real code. Specifically, the PS4's advantage on integer workloads will be considerably larger than these numbers indicate. That's because these calculations assume an IPC (instructions per cycle) of 2 for both the Cell's PPE and the Jaguar cores in the PS4. This is fair for synthetic peak numbers, since both are dual-issue, but in real life the much, much stronger front-end on the Jaguar will give it a significant edge in sustained IPC.
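To see how much the IPC assumption matters, here's an illustrative sketch. The "real-world" IPC values in it are made-up placeholders (I'm not claiming measured numbers for either chip); the point is only the shape of the calculation: once the in-order PPE sustains less of its peak than the Jaguar cores do, the integer gap widens well past 4x.

```python
# Illustrative only: how the integer-performance ratio moves once you account
# for sustained IPC. The 1.0 and 0.5 IPC values below are hypothetical
# placeholders, not measurements.

def sustained_mips(ipc, clock_mhz, cores):
    """Integer throughput in MIPS for a given sustained IPC."""
    return ipc * clock_mhz * cores

# Synthetic peak: IPC = 2 for both chips (both are dual-issue), as in the post.
peak_ratio = sustained_mips(2, 1600, 8) / sustained_mips(2, 3200, 1)

# Hypothetical real-world case: the PPE sustains only half an instruction per
# cycle on branchy code, while Jaguar's stronger front-end sustains 1.0.
real_ratio = sustained_mips(1.0, 1600, 8) / sustained_mips(0.5, 3200, 1)

print(peak_ratio)  # 4.0
print(real_ratio)  # 8.0 -- the gap widens once IPC is accounted for
```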
tl;dr the PS4 CPU is fine, since the Jaguar cores in the PS4 kick the snot out of the PS3's one measly PPE, and anything you would do on the SPEs you'll now be doing on the GPU.