Remember that "low settings screenshot thread" for PC games? This is drastically simplified, but imagine that 360/PS3 games run at 720p on the lowest settings, while Xbox One games run at 720p with High/Very High settings. That could affect:
-resolution of the source textures (are you stuck with 512x512 textures due to memory reasons, or can everything now be 2048x2048?)
-number and quality of enemies/particles/other objects on screen (does everything need to fade out instantly, or can you have more persistent "carnage"?)
-lighting and shadow quality
-anti-aliasing
-draw distance, and streaming of large environments
-*insert other fancy graphics buzzword here*
-and of course, how this all performs, framerate-wise, in the end.
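To put rough numbers on the texture point above, here's a back-of-the-envelope sketch. It assumes uncompressed 32-bit RGBA textures (real games use compressed formats like BCn/DXT and mipmaps, but the ratio between sizes is the same):

```python
def texture_mib(size, bytes_per_pixel=4):
    """Memory for one square texture, in MiB (uncompressed RGBA assumed)."""
    return size * size * bytes_per_pixel / (1024 * 1024)

for size in (512, 2048):
    print(f"{size}x{size}: {texture_mib(size):.0f} MiB")
# 512x512:   1 MiB
# 2048x2048: 16 MiB
```

Going from 512x512 to 2048x2048 is a 16x jump in memory per texture, which is why texture resolution is one of the first things to scale with a bigger memory budget.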
And that's just the graphics side, which doesn't include user interface tweaks, built-in OS support for extra features that would have to be implemented manually on the 360/PS3 version, and so on.
So even if the "native resolution" stayed at 720p, there are still plenty of other things that could make someone want to upgrade.
Again, I'm not saying that 1080p would somehow be useless or whatever, and of course, all else being equal, native 1080p graphics are nice as well. And I'm not saying "durrr the Xbox One is balanced and the PS4 is not durrr" or anything like that (I completely recognize that the PS4 can likely do all the same visual tricks, and with likely better resolution/performance). All I was saying (which you seem to agree with!) is that resolution by itself is not the only factor to take into account.
The post that started this conversation asked why someone would buy the new-generation version of a game over the 360 version. The Xbox One can still do far better graphics than a 360 (plus plenty of other non-graphics features, obviously), even if the resolution is exactly the same. That was the only point I was responding to.