So the Pro is pretty much going to be a scam, and this gen is doomed to stay as shitty as it is now.
It's really hard to beat DLSS at absurdly low internal resolutions like 480p/720p/1080p.
People who mock DLSS as unusable at 1080p, or call the 1440p/4K Ultra Performance modes bad, honestly just sound entitled. 1440p DLSS Performance (720p internal) will often look better than regular upscalers working from 1080p. It's rather insane how good DLSS gets the lower you go. If Sony can somehow match that, all good. But then PS5 users will feel scammed, and I'll feel bad for suggesting that one of my friends get a PS5 (the first console of his life). If they can't make it software-based and backwards compatible, that will be a big kick in the gut to an entire userbase, because DLSS has been present since the RTX 2070 back in 2018 and keeps being a performance and image quality enhancer at the same time for such cards.
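For reference, a quick sketch of where those internal resolutions come from. The per-axis scale factors below are the commonly cited DLSS preset defaults (games can override them), so treat the exact values as assumptions:

```python
# Commonly cited per-axis DLSS scale factors (preset defaults; games can override).
DLSS_SCALE = {
    "Quality": 2 / 3,            # ~0.667 per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for out in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    for mode in DLSS_SCALE:
        w, h = internal_res(*out, mode)
        print(f"{out[1]}p {mode}: renders at {w}x{h}")
```

This is why 1440p Performance and 1080p Quality both land on 1280x720 internally, and 4K Performance lands on 1920x1080.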
I can safely say 4K DLSS Performance in The Last of Us Part I, in motion or not, looks better than native 1440p on PS5. Even Ratchet & Clank at 1440p DLSS Performance looks equal to the PS5's ray-traced performance mode, which supposedly renders around 1000-1200p and upscales to 1440p. And that was with IGTI, probably the best upscaler Sony's studios have.
It was even funnier when I tried 1080p DLSS Quality against the 1440p ray tracing performance mode on PS5: 1080p DLSS Quality still ended up with better temporal stability and presentation than the PS5, lol (though of course the 1080p output was noticeably blurrier). IGTI, FSR and all of that are a joke. Only XeSS is decent, and it's still a far cry from DLSS.
People mock 1440p DLSS Performance or 1080p DLSS Quality because of their 720p internals, but most end users are actually perfectly fine with them. These modes being "presentably" good gives insane longevity to older cards and opens up possibilities that would otherwise be impossible (such as running ray-traced Alan Wake 2 above 30 fps at 1440p output).
This 1440p DLSS Quality output looks damn near native 1440p, and the GPU retains 30+ fps with ray tracing.
PS5 and Xbox Series X in Quality Mode render at a resolution of approximately 2259x1270 and appear to use FSR 2 to reconstruct a 3840x2160 resolution.
With regular FSR, a ~1270p input doesn't really end up looking that much above 1440p. So while 1440p DLSS Quality presents a 1440p-like image, FSR is the opposite: it barely improves image quality over its internal resolution.
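That ~2259x1270 figure lines up with FSR 2's Balanced preset. A quick sanity check using FSR 2's documented per-axis scale ratios:

```python
# Which FSR 2 preset matches the consoles' reported ~2259x1270 -> 4K?
# Ratios are FSR 2's documented per-axis scale ratios.
FSR2_RATIO = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

out_w, out_h = 3840, 2160
in_w, in_h = 2259, 1270
print(f"measured per-axis ratio: {out_w / in_w:.2f}")   # ~1.70

for preset, r in FSR2_RATIO.items():
    print(f"{preset}: {out_w / r:.0f}x{out_h / r:.0f}")
# Balanced (1.7x) gives 2259x1271 -- effectively the consoles' Quality Mode.
```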
This creates a situation where a 3070 can get similar image quality, but with ray tracing, at similar framerate targets. That potentially makes the effective power difference between them more than 100% (considering how heavy ray tracing is in this game compared to raster), even though pure rasterization results would show the 3070 as barely 30-40% above the PS5. Factor in how good a job DLSS does at lower resolutions, add ray tracing into the mix, keep the same 30-40 fps target, and voila: you get an insanely effective ray-traced render over the consoles.
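To make the arithmetic behind that ">100%" claim concrete, here's a back-of-the-envelope sketch. Every number in it is an illustrative assumption (the rough 30-40% raster figure from above plus hypothetical RT framerates), not a measurement:

```python
# Illustrative arithmetic only -- hypothetical framerates, not benchmarks.
raster_advantage = 1.35      # assumed 3070 vs PS5 gap in pure raster (~30-40%)
fps_target = 35              # shared 30-40 fps target

ps5_fps_rt = 20              # hypothetical: console misses the target with RT on
gpu3070_fps_rt = 40          # hypothetical: 3070 holds the target with RT on

effective_gap = gpu3070_fps_rt / ps5_fps_rt - 1
print(f"raster gap: {raster_advantage - 1:.0%}, effective RT gap: {effective_gap:.0%}")
# -> raster gap: 35%, effective RT gap: 100%
```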
With Cyberpunk you can easily shoot above 40 fps at 1440p DLSS Performance, and it looks good (but not with ray reconstruction). It also saves a lot of VRAM, which helps these VRAM-limited GPUs immensely. But that's another topic... (and a clarification: Dogtown is much harder to run than the base game with path tracing).
(Another clarification: my encodes are low quality, the bitrate isn't that high, and YouTube isn't helping either, so take the image quality in these videos with a grain of salt. It looks much better in person.)
Now, you may question why I'm obsessed with low DLSS input resolutions. Because to me, that's where the payoff is: it makes it more worthwhile to squeeze out extra performance and use higher-fidelity settings. At 4K DLSS Quality versus 4K FSR Quality or regular upscalers, they all look presentable/decent, with DLSS having nice advantages. The REAL big advantages start when you go down to 1080p internal and below. So I feel like I get more mileage out of DLSS by targeting 720-1080p internal as much as possible. But that's just me.