That's a difference on paper, but what's the tangible (visual) difference between two modern architectures with a similar feature set? Is it the same game assets at 900p vs. 1080p? And if the resolution is equal, how far do the assets have to be dialed back? Is it 4xAA vs. 2xAA, lower shadow quality, or SSAO? I believe it's too complicated to quantify.
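Just to put rough numbers on the resolution side of that question, here's a quick sketch (assuming 900p means 1600x900 and 1080p means 1920x1080) of how the raw pixel counts compare:

```python
# Rough pixel-count comparison between the two common resolution targets.
# Assumes 900p = 1600x900 and 1080p = 1920x1080 (standard 16:9 framebuffers).

def pixel_count(width: int, height: int) -> int:
    return width * height

p900 = pixel_count(1600, 900)    # 1,440,000 pixels
p1080 = pixel_count(1920, 1080)  # 2,073,600 pixels

# 1080p pushes roughly 44% more pixels per frame than 900p.
print(f"900p:  {p900:,} pixels")
print(f"1080p: {p1080:,} pixels")
print(f"1080p / 900p = {p1080 / p900:.2f}x")
```

That ~44% extra pixel load is only one axis, though; it says nothing about AA, shadows, or SSAO, which is why the overall visual gap is so hard to pin down.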
I truly believe that Bungie set one target for the current-gen version of Destiny and another for the last-gen version. They designed the assets to run at 1080p on PS4 and 900p on Xbox One, which seems reasonable given the power gap and the drivers Bungie had at hand when that assessment was made. Then, later on, MS made some drastic changes to the SDK by dropping the Kinect requirements and optimizing ESRAM utilization.
The bottom line is that, without any changes, the original assessment had a 1080p PS4 version vs. a 900p XBO version with equal assets, but thanks to those last-minute changes plus help from MS, Bungie was able to raise the Xbox One version's resolution. I don't think this had anything to do with, or any effect on, the PS4 version, which would have looked the same whether MS made those changes or not. The only difference is that with MS doing nothing, Destiny would have shipped at 1080p on PS4 and 900p on Xbox One, and everyone would have been happy, vs. what we have now.