The main idea behind cloud computing is improved utilization of computing resources. This works because resources are shared between all users. Home computing devices (tablets, game consoles, PCs, laptops) are offline most of the time, and when a device is not in use, its compute cycles are completely wasted. In a cloud server farm, the computers are running all the time, 24/7, at high utilization.
Let's assume for a moment that the average console owner plays 10 hours per week (this also includes casual players and those who bought the console mainly for Blu-ray/Netflix/TV purposes). That is 10 hours / (7*24 hours) ≈ 0.06, or about 6% of the total time. Inverting that, on average 16.8 consoles could share the same cloud computing unit.
Let's assume our cloud servers are built with 8-core Sandy Bridge E server CPUs. Sandy Bridge has around 1.5x the IPC of Jaguar (according to various recent Jaguar benchmarks) and around 2x the clocks. In total, a single Sandy Bridge E has around 6x the CPU processing power of a 4-core Jaguar (or approximately 3x an 8-core Jaguar). A single 1U rack server could be equipped with four of these CPUs, and an industry-standard rack is 42U tall, so one rack holds 168 of these Sandy Bridge E processors. A single processor can serve 16.8 consoles (average 6% online time per user), so a single rack can serve around 2822 sold consoles (if the target is 3x the console's CPU performance).
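For clarity, here is the same back-of-envelope math as a short Python sketch. All of the inputs (10 h/week playtime, 1.5x IPC, 2x clocks, 4 CPUs per 1U server, 42U rack) are the assumptions stated above, not measured data:

```python
# Back-of-envelope cloud capacity estimate (all inputs are assumptions from the text).

hours_per_week = 10                       # assumed average play time per console
utilization = hours_per_week / (7 * 24)   # ~0.06, i.e. ~6% of the week
consoles_per_cpu = 1 / utilization        # ~16.8 consoles time-sharing one CPU

# One 8-core Sandy Bridge E vs. an 8-core Jaguar (assumed ~1.5x IPC, ~2x clock)
sandy_vs_jaguar8 = 1.5 * 2.0              # ~3x an 8-core Jaguar

cpus_per_1u = 4                           # assumed 4-socket 1U server
rack_units = 42                           # standard full-height rack
cpus_per_rack = cpus_per_1u * rack_units  # 168 CPUs per rack

# If the target is ~3x extra CPU per online console, one Sandy Bridge E serves
# one console at a time, so time-sharing by utilization gives:
consoles_per_rack = cpus_per_rack * consoles_per_cpu

print(f"utilization        ~{utilization:.1%}")
print(f"consoles per CPU   ~{consoles_per_cpu:.1f}")
print(f"consoles per rack  ~{consoles_per_rack:.0f}")   # ~2822
```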
Result: for every 2822 consoles sold, the console manufacturer needs to add a single new server rack to one of its existing server farms. That's not impossible or insane by any means, especially if they sell the spare cloud processing time to research facilities, universities and big companies (finance, etc.). That would balance cloud utilization around the clock.
Latency is of course another matter. The cloud is not good for tasks that require single-frame latency, but games have lots of tasks that are latency tolerant. It's true that the cloud has too much latency to render graphics directly, but it could be used to bake lighting for static scenery (lightmaps & probes). This is especially good for dynamic game worlds (and for user-created content). Partially baked lighting reduces the lighting cost a lot, so the GPU doesn't need to process as many dynamic lights per frame. Cloud will be a big thing in the next 10 years, and Xbox One is a pioneer in this area when it comes to gaming. Is it too early? Only time will tell. The average speed of internet connections has been improving a lot every year, and we have to assume that these consoles will be around for many, many years, with lots happening during that time.
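As a rough illustration of that latency-tolerant split, here is a minimal Python sketch: the per-frame loop keeps rendering with whatever baked lighting it already has, while a bake request runs asynchronously and only gets swapped in when it eventually completes. The request_cloud_bake function and the timings are made up for the example; it just stands in for a slow network round trip to a cloud baking service, not any real API:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def request_cloud_bake(scene_version):
    """Stand-in for a cloud lightmap/probe bake; in reality this would be a
    network request that takes many frames (or seconds) to come back."""
    time.sleep(0.2)  # pretend the bake takes ~12 frames at 60 fps
    return f"lightmap_v{scene_version}"

executor = ThreadPoolExecutor(max_workers=1)
current_lightmap = "lightmap_v0"                   # whatever was baked last
pending = executor.submit(request_cloud_bake, 1)   # scenery changed -> re-bake

for frame in range(30):
    # Per-frame, latency-critical work stays local (input, animation, rendering).
    # The renderer simply uses the most recent baked data it has.
    if pending is not None and pending.done():
        current_lightmap = pending.result()        # swap in the new bake on arrival
        pending = None
    print(f"frame {frame:02d}: rendering with {current_lightmap}")
    time.sleep(1 / 60)                             # simulate a 60 fps frame

executor.shutdown()
```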