MightyHedgehog,
Seeing the difference between 720p on a native 720p display and a 720p signal upscaled to 1080i... is pretty much non-existent, IMO. Granted, I've only seen it run that way a few times.
I haven't seen enough examples either, so I can't really make a determination. The following details some of my worries, though.
Technically, the difference between the display's native resolution and the resolution the signal actually is before conversion should produce a bit more detail. Still, the scaling chip inside the X360 supposedly handles the conversion on its own if you choose to output 1080i from a native internal resolution of 720p.
In the case of converting from a higher native resolution to a lower display resolution, you get some anti-aliasing (I think this is super-sampling). The output hardware interpolates to approximate the picture, which basically softens (blurs) it and thereby reduces aliasing. I think it's a relatively inexpensive computation, and it's what the DC did? Normally you'd want to go from a resolution that's higher in both directions, but I think the DC went from 480p to 480i, so it was only done horizontally? Also, I think the DC had special output hardware for this rather than using the GPU?
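To illustrate the super-sampling point, here's a toy sketch in Python. It has nothing to do with the actual DC or 360 hardware (I'm just assuming a plain box filter); the idea is simply that when you downscale, every output pixel is an average of several rendered pixels, so hard edges get blended instead of stair-stepping.

```python
# Toy box-filter downscale of a grayscale image (list of rows).
# Assumed/simplified: real scalers use better filters than this.
def box_downscale(src, factor):
    """Average factor x factor blocks into one output pixel each."""
    h, w = len(src), len(src[0])
    out = []
    for y in range(0, h - h % factor, factor):
        row = []
        for x in range(0, w - w % factor, factor):
            block = [src[y + dy][x + dx]
                     for dy in range(factor)
                     for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard diagonal edge rendered at 4x4 ...
hi_res = [[  0,   0,   0, 255],
          [  0,   0, 255, 255],
          [  0, 255, 255, 255],
          [255, 255, 255, 255]]
# ... comes out at 2x2 with intermediate greys along the edge,
# which is exactly the softening/AA effect.
print(box_downscale(hi_res, 2))  # [[0.0, 191.25], [191.25, 255.0]]
```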
In the case of going from a lower native resolution to a higher output resolution, however (which is what XBOX 360 would be doing when going from 720p to 1080i, either internally or in the display itself), things get a little tricky. In this case you aren't averaging away real samples; you have to make up (guess) pixel data to fill in the extra resolution. I'm under the impression that there are many ways to do this, with varying results. At a minimum, I believe it's more computationally expensive and more likely to result in artifacts (mostly aliasing). I don't know if the 360 GPU does this conversion itself or if there is dedicated hardware (I assume there is)... but that brings up a few potential problems. Either there is potentially more aliasing, or, if there is no dedicated output hardware, more aliasing and extra resource use. No matter what, the level of detail is going to be less than native 1080p/i.
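Again, just to show what I mean by guessing pixels, here's a toy bilinear upscale in Python (my assumption of the simplest common filter; real scalers use bicubic, Lanczos, etc., and I have no idea what the 360's scaler actually uses). Every "new" pixel is just a weighted mix of existing neighbours, so no real detail is added and edges just get stretched and smeared.

```python
# Toy bilinear upscale of a grayscale image (list of rows).
def bilinear_upscale(src, new_h, new_w):
    h, w = len(src), len(src[0])
    out = []
    for y in range(new_h):
        # Map the output pixel back into source coordinates.
        sy = y * (h - 1) / (new_h - 1)
        y0, fy = int(sy), sy - int(sy)
        y1 = min(y0 + 1, h - 1)
        row = []
        for x in range(new_w):
            sx = x * (w - 1) / (new_w - 1)
            x0, fx = int(sx), sx - int(sx)
            x1 = min(x0 + 1, w - 1)
            # Blend the four surrounding source pixels.
            top = src[y0][x0] * (1 - fx) + src[y0][x1] * fx
            bot = src[y1][x0] * (1 - fx) + src[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

# A sharp 2x2 edge blown up to 3x3: the middle column is pure guesswork.
print(bilinear_upscale([[0, 255], [0, 255]], 3, 3))
# [[0.0, 127.5, 255.0], [0.0, 127.5, 255.0], [0.0, 127.5, 255.0]]
```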
In the case of PS3, there would be higher actual detail, less aliasing (due to the higher res and the lack of conversion artifacts), and less extra resource usage? Also, if I'm right about the DC's horizontal super-sampling and PS3 does the same, some cheap AA.
There's also no reason that devs could not create all of their assets for 1080i/p and then have the conversion process take it from there. As I understand it, the embedded DRAM on Xenos can deal with framebuffers that exceed 720p by splitting the data up and handling it internally, piece by piece.
While they could, it would probably be the exception, not the rule, since it's not mandated and would be more expensive due to the multiple tiling passes needed for frame rendering (rough numbers below), plus the extra conversion afterwards? Granted, the downconversion might not matter since I assume it's handled by dedicated output hardware, but the extra passes don't sound all that promising.
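Just to put rough numbers on the tiling worry: Xenos has 10 MB of eDRAM, and if I assume 32-bit colour plus 32-bit depth/stencil per sample (my assumption, not an official figure), a quick back-of-the-envelope calculation gives the number of tiles (and therefore extra geometry passes) different framebuffers would need.

```python
# Back-of-the-envelope tile counts for Xenos's 10 MB eDRAM.
# Assumes 4 bytes colour + 4 bytes depth/stencil per sample;
# MSAA multiplies the sample count. My own rough math only.
import math

EDRAM_BYTES = 10 * 1024 * 1024
BYTES_PER_SAMPLE = 4 + 4  # colour + depth/stencil (assumed)

def tiles_needed(width, height, msaa=1):
    fb_bytes = width * height * msaa * BYTES_PER_SAMPLE
    return math.ceil(fb_bytes / EDRAM_BYTES)

for w, h, aa in [(1280, 720, 1), (1280, 720, 2), (1280, 720, 4), (1920, 1080, 1)]:
    print(f"{w}x{h} at {aa}xAA -> {tiles_needed(w, h, aa)} tile(s)")
# 1280x720 at 1xAA -> 1, at 2xAA -> 2, at 4xAA -> 3; 1920x1080 at 1xAA -> 2
```

So a 1080-sized framebuffer means at least two tiles even without AA, which is where the extra passes come from.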
The only thing that looks like it would stop anything on a comparative scale with RSX is that RSX's video display output can put out 1080p, while the MS-designed display chip/scaler in the X360 would need to be altered to allow output at 1080p.
I think my replies above cover my worries.