It's awesome. Just know that. Heck, probably 2/3 of the technical stuff goes over my head too. I sometimes feel bad asking questions whose answers I don't fully grasp, but I figure someone has to.
Well, I figured the surface has no set resolution. It's like asking what resolution the TV's wireframe has; the question makes no sense, of course. I guess what I'm really wondering is the highest resolution among current VR goggles that this simulation could run on.
RGB color space is more than just fine. Euro TVs supported it anyway, and consumer NTSC TVs upsample back to RGB out of necessity. Downsampling to YUV is just extra work that throws away chroma data across pixel pairs. It's almost as pointless as simulating the dithering and/or reduced pixel depth that came from limited memory, IMO, from a purely technical standpoint. Not that you have to be concerned with my standards in your personal project, though. Whatever it is you're after, I still find this fascinating for its potential.
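To put numbers on that chroma loss, here's a quick standalone C++ sketch (nothing to do with your project, just textbook BT.601 math): under 4:2:2 subsampling, two adjacent pixels keep their own luma but share one averaged chroma sample, so neither color survives the round trip exactly.

```cpp
// Standalone sketch: BT.601 RGB -> YUV with 4:2:2 chroma subsampling,
// showing how a pixel pair sharing one chroma sample loses color detail.
#include <cstdio>

struct YUV { float Y, U, V; };

// BT.601 full-range forward conversion.
YUV RgbToYuv(float R, float G, float B)
{
    YUV Out;
    Out.Y =  0.299f * R + 0.587f * G + 0.114f * B;
    Out.U = -0.147f * R - 0.289f * G + 0.436f * B;
    Out.V =  0.615f * R - 0.515f * G - 0.100f * B;
    return Out;
}

void YuvToRgb(const YUV& In, float& R, float& G, float& B)
{
    R = In.Y + 1.140f * In.V;
    G = In.Y - 0.395f * In.U - 0.581f * In.V;
    B = In.Y + 2.032f * In.U;
}

int main()
{
    // Two adjacent pixels with very different colors: pure red, pure blue.
    YUV A = RgbToYuv(1.f, 0.f, 0.f);
    YUV B = RgbToYuv(0.f, 0.f, 1.f);

    // 4:2:2: the pair keeps both luma values but shares one averaged
    // chroma sample, so the original hues can't be recovered exactly.
    float U = 0.5f * (A.U + B.U);
    float V = 0.5f * (A.V + B.V);

    float R0, G0, B0, R1, G1, B1;
    YuvToRgb({A.Y, U, V}, R0, G0, B0);
    YuvToRgb({B.Y, U, V}, R1, G1, B1);

    printf("pixel 0: %.3f %.3f %.3f (was 1 0 0)\n", R0, G0, B0);
    printf("pixel 1: %.3f %.3f %.3f (was 0 0 1)\n", R1, G1, B1);
    return 0;
}
```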
Question: Just for the sake of versatility, would it be possible to capture game footage from a real console on the fly and impose that into your simulation? Not that I demand to see you do it, but just wondering if that's possible in any practical sense.
I definitely would love to see how much more you progress with this. Thanks for sharing!
Right now I'm just using native-res PNGs grabbed from Google Images. I haven't messed with video textures in UE4 yet, but that should be the quickest way to get some motion, and I'll try it later. Hooking up an actual emulator would take a lot more work, like porting something like RetroArch into a UE4 plugin. It's possible, but I won't have time for something that big for a while.
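For the video texture route, this is roughly what I'd expect the runtime setup to look like with UE4's Media Framework. I haven't tested it yet, and the actor class, file path, and "ScreenTex" parameter name are all placeholders, not my actual setup:

```cpp
// Rough sketch: play a video file onto the TV screen mesh via UE4's
// Media Framework. ATVSimActor, the path, and "ScreenTex" are placeholders.
#include "MediaPlayer.h"
#include "MediaTexture.h"
#include "FileMediaSource.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Components/StaticMeshComponent.h"

void ATVSimActor::SetupScreenVideo(UStaticMeshComponent* ScreenMesh,
                                   UMaterialInterface* ScreenMaterial)
{
    // Create the player and a media texture that samples its output.
    UMediaPlayer* Player = NewObject<UMediaPlayer>(this);
    UMediaTexture* ScreenTexture = NewObject<UMediaTexture>(this);
    ScreenTexture->SetMediaPlayer(Player);
    ScreenTexture->UpdateResource();

    // Point the player at a local video file (placeholder path).
    UFileMediaSource* Source = NewObject<UFileMediaSource>(this);
    Source->SetFilePath(TEXT("D:/Captures/gameplay.mp4"));

    // Feed the media texture into the screen material's texture parameter.
    UMaterialInstanceDynamic* MID =
        UMaterialInstanceDynamic::Create(ScreenMaterial, this);
    MID->SetTextureParameterValue(FName("ScreenTex"), ScreenTexture);
    ScreenMesh->SetMaterial(0, MID);

    if (Player->OpenSource(Source))
    {
        Player->Play();
    }
}
```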
Hooking a real console could work with any video capture plugin, which I'm pretty sure already exists.
About resolution, it would depend on the VR device. The screenshots I posted aren't downsampled or anything like that; I grabbed them directly from my desktop.