
PS5 Pro/PSSR Appears to Provide Better Image Reconstruction than DLSS Running on a 4090 GPU

ap_puff

Neo Member
Whether or not it's better than DLSS can't be shown in stills. Everyone knows the problems with temporal upscaling: ghosting and image stability. If PSSR is close to DLSS or, very improbably, better, then that's a win in and of itself. 4K60 is a great thing for consoles to have, and it'll be better than 90% of gaming PCs.

Actually, I wouldn't be surprised if it matches or beats DLSS; Sony has some of the best image processing on the planet in their TVs.
 
Last edited:

Sentenza

Member
Yes PS5 pro is gonna be better than 4090 😂

Please think before making threads 😂
The hilarious part is this idea that AMD and Sony had a secret sauce for some type of image reconstruction far better than the industry-leading one (Nvidia), but chose to keep it a secret so far, only to use it for the PS5 Pro rather than making it available on the same hardware on the PC side.
 

PaintTinJr

Member
I mean, it's worth noting that DLSS wasn't developed by some massive team either. Last I remember there were like five authors on the paper.
And likely PSSR, as well as XeSS, AutoSR and AppleSauce or whatever it's called, were all similarly staffed.
I expect similar for the non-ML alternatives in UE, Unity, FSR... too.


PS5 compute throughput that you could throw at AI would be about 20.5 TOP/s (41T if it had Int8 ops - but while officially never confirmed, some insiders say it doesn't).
Pro has 300 - and the leaked document stated around 2ms for the upscale. If it scaled linearly - PS5 would take up to 30ms just to upscale a frame (even with 41T - it'd still be around 15ms) - so prohibitively expensive in either case.
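
A quick back-of-the-envelope sketch of that scaling (purely illustrative; the only inputs are the TOPS figures and the ~2ms Pro upscale time quoted above, and it assumes cost scales inversely with throughput):

```python
# Naive linear scaling of the PSSR upscale cost by available AI throughput.
# Inputs from the post: Pro has ~300 TOPS and the leaked document cites
# ~2ms per upscale; base PS5 has ~20.5 TOPS FP16 (41 TOPS if it had INT8,
# which insiders say it doesn't).
PRO_TOPS = 300.0
PRO_UPSCALE_MS = 2.0

def upscale_ms(tops: float) -> float:
    """Estimated upscale time if cost scales inversely with throughput."""
    return PRO_UPSCALE_MS * PRO_TOPS / tops

print(f"PS5 FP16 (20.5 TOPS): ~{upscale_ms(20.5):.1f} ms")              # ~29.3 ms
print(f"PS5 INT8 (41 TOPS, hypothetical): ~{upscale_ms(41.0):.1f} ms")  # ~14.6 ms
```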
Going by the GoW Ragnarok paper, which uses ML AI (RPM FP16) to upscale 2K (2048x2048) textures to 4K (4096x4096): fully optimised by the end, it runs in just 9.5ms.

GoW Ragnarok Paper said:
When processing 2 blocks at a time, it upsamples one 2k texture to 4k in around 9.5 milliseconds, shaving off over a whole millisecond.

So in theory, if PS5 games were dropped to a native resolution of 1280x720, which is ~4.5x fewer pixels than a 2K texture, that would get the runtime down to ~2.5ms and produce an upscaled image of just below 1920x1080 (1920x993), while also leaving ~14ms per frame for rendering in a 60fps game.

Scaling the numbers for 30fps would double the time allowed for upscaling to 5ms, which would then allow for 1440p to 4K if I'm not mistaken. But AFAIK PSSR grades sections of the picture and only uses ML AI inference mostly in high-frequency data areas, using cheaper FSR-or-equivalent techniques for the remaining low-frequency areas, so the relative cost of inference in PSSR might be much higher than in the Ragnarok paper, making all that theorizing I just did completely pointless.
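
Sketching that same pixel-count scaling in code (a rough estimate only, assuming the Ragnarok upscaler's cost is linear in input pixels):

```python
# Rough pixel-count scaling of the GoW Ragnarok ML upscaler timing.
# Paper figure: one 2K (2048x2048) texture upsampled to 4K in ~9.5 ms.
REF_PIXELS = 2048 * 2048
REF_MS = 9.5

def est_upscale_ms(width: int, height: int) -> float:
    """Estimate upscale time, assuming cost is linear in input pixels."""
    return REF_MS * (width * height) / REF_PIXELS

ms_720p = est_upscale_ms(1280, 720)   # ~2.1 ms (the post rounds to ~2.5)
budget_60fps = 1000 / 60              # ~16.7 ms per frame at 60fps
print(f"720p upscale: ~{ms_720p:.1f} ms")
print(f"Left for rendering at 60fps: ~{budget_60fps - ms_720p:.1f} ms")
```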

//edit
Actually, I missed the last part where they got it even lower, to 9.1ms:
As I was preparing this presentation, I realized that it's possible to bump this number up to 3 and get yet another small performance gain of around 0.4 milliseconds. But as shown in the stats, we're starting to get diminishing returns.
 
Last edited:

Ovech-King

Member
Not sure how it compares, but knowing how good the resolution boost option (Reality Creation?) is in their X1 Ultimate TV chip, I wouldn't be surprised if they used the same kind of tech, and it could be even better black magic than DLSS.
 
Last edited:
Will need to see what it looks like in motion. DLSS is already better than FSR/XeSS, but that lead widens a lot in motion. FSR, for example, looks like trash in motion most of the time.
 

Gaiff

SBI’s Resident Gaslighter
You have absolutely no idea how good Nvidia engineers are huh lol
Remains to be seen. While NVIDIA's people are certainly far better engineers when it comes to AI development, Sony is also a movie company.
 

Senua

Member
Remains to be seen. While NVIDIA's people are certainly far better engineers when it comes to AI development, Sony is also a movie company.
We goin 24fps bois

Nah I'm sure it'll be a great solution but his post writing off Nvidia engineers because of Microsoft's failures is just weird
 
I was curious and went looking back at DF comparisons of PS5 resolution/fidelity modes with a 4090 running DLAA at max settings, and the preliminary comparison suggests to me that PSSR is potentially a vastly superior solution. I gathered the limited samples from today's Cerny tech talk where base PS5 fidelity mode was compared to PS5 Pro, which doubled the frame rate in addition to improving effective resolution. For the DF comparisons, I have Spider-Man Miles Morales and Horizon Forbidden West where, again, Alex compares 4K DLAA max settings image quality to PS5 fidelity modes (unfortunately, DF didn't use Fidelity mode for their Ratchet PC analysis).

I won't point out the differences; instead I'll let you be the judge and see which ML upscaling technique appears to provide the more impressive uplift. Of course it's early days and I'm not claiming PSSR is definitively better, as 3rd party games might be a completely different scenario. But I still think the possibility of even first party games on PS5 Pro besting top-of-the-line GPUs in image quality is a big deal.

PS5 Pro PSSR vs PS5 Fidelity Mode (Spider-Man 2/Ratchet):

[screenshots]

4090 DLAA Max Settings vs PS5 Fidelity Mode (Spider-Man Miles Morales):

[screenshots]


4090 DLAA Max Settings vs PS5 Fidelity Mode (Horizon Forbidden West):

[screenshots]
 

Senua

Member
You seem to have no idea how good SIE's ICE teams are, then, or their hardware engineers, considering they made the best-designed console of the generation four years ago :)
Nvidia is THE AI computing company; their deep learning upscaler has been second to none since they released 2.0 over 4 years ago. I do not underestimate Sony at all, but they have a huge task to even match DLSS, let alone surpass it on their first go at it. I hope it's good for you guys, because FSR2 just isn't cutting it.
 

scydrex

Member
Wow. Absolutely delusional. PS5 Pro is running on AMD hardware. To date, AMD has failed to match DLSS in almost every scenario. FSR3 is still no match for DLSS. Yet somehow you think Cerny has magically made PSSR better than DLSS? Something AMD's hardware engineers have failed to do with even more powerful chips? Deluded
I think PSSR has dedicated hardware; FSR doesn't, and DLSS does. If that's correct, then you are saying that PSSR with dedicated hardware will be equal to or worse than FSR without dedicated hardware.
 
Last edited:
I wouldn't think ICE has anything to do with it; if anything, it's help from accumulated knowledge from Sony's TV and camera divisions. They are pretty much the leaders in both for image quality.

Yeah you're probably right. I can still see the ICE teams having some members chipping in to help out though.

Nvidia is THE AI computing company; their deep learning upscaler has been second to none since they released 2.0 over 4 years ago. I do not underestimate Sony at all, but they have a huge task to even match DLSS, let alone surpass it on their first go at it. I hope it's good for you guys, because FSR2 just isn't cutting it.

Nvidia being first out of the gate with dedicated hardware for image upscaling doesn't necessarily mean they have the best offerings in that space once you include technologies currently in R&D. Yes, PSSR has a tough standard to match or exceed, but SIE has over three decades of experience working specifically in the gaming industry, and a vast network of 3P partners that Nvidia could never match.

That's the kind of experience which'll really help push technologies like PSSR to their potential, provided other factors come together. It could very well fall short of ambitions, sure, but I'm more bullish on them at least making something on par with DLSS 2 or 3. And in any case, they'll iterate on it with future hardware like the (rumored) PS portable and PS6.
 

MrJangles

Neo Member
I was waiting for this. I remember in the first year of the PS5, its fanboys were claiming as fact that it offered graphical performance superior to a 3080 Ti.

How did that work out?
 

RespawnX

Member
The hilarious part is this idea that AMD and Sony had a secret sauce for some type of image reconstruction far better than the industry-leading one (Nvidia), but chose to keep it a secret so far, only to use it for the PS5 Pro rather than making it available on the same hardware on the PC side.

Wait until they realize that Sony is using some enhanced version of FSR 3(.X) with its own branding. Full-resolution real-time motion will show how good it is. In any case, no upscaler will sell a console at such a price point.
 
Last edited:

Senua

Member
Yeah you're probably right. I can still see the ICE teams having some members chipping in to help out though.



Nvidia being first out of the gate with dedicated hardware for image upscaling doesn't necessarily mean they have the best offerings in that space once you include technologies currently in R&D. Yes, PSSR has a tough standard to match or exceed, but SIE has over three decades of experience working specifically in the gaming industry, and a vast network of 3P partners that Nvidia could never match.

That's the kind of experience which'll really help push technologies like PSSR to their potential, provided other factors come together. It could very well fall short of ambitions, sure, but I'm more bullish on them at least making something on par with DLSS 2 or 3. And in any case, they'll iterate on it with future hardware like the (rumored) PS portable and PS6.
Just seems like a load of hopium to me, but let's see.
 

Little Mac

Gold Member
Sorry for the off topic, but Returnal (til the 26th) and Death Stranding (til tmrw the 12th) are both on sale for under $29.99 on PSN. I have a bunch of games already that I need to play through, but are these staples every PS5 owner should get eventually? If so, I'll buy at least one of them. Let me know, brothers!
 

saintjules

Gold Member
Sorry for the off topic, but Returnal (til the 26th) and Death Stranding (til tmrw the 12th) are both on sale for under $29.99 on PSN. I have a bunch of games already that I need to play through, but are these staples every PS5 owner should get eventually? If so, I'll buy at least one of them. Let me know, brothers!

Returnal is my GOTG, but if you're not into roguelite/roguelike games, or ones where high difficulty causes you frustration, it may not be for you.

As for DS, if you like how the sequel is shaping up, you might want to try the first game.

Can't go wrong with both, especially Returnal though.
 
Last edited:

diffusionx

Gold Member
We've known about PS5 Pro for years, we've known that PSSR was a core part of this system for years, and we know that Sony has been investigating upscaling since CBR on PS4 Pro and anti-aliasing algorithms since the PS3 (MLAA). This isn't like AMD farting out FSR to quickly have something to put next to Nvidia. They've been working on these problems for a long time, longer than Nvidia, and their work has influenced developments elsewhere.

There’s no reason not to think PSSR will be an excellent solution, outside of being a Nvidia fanboy.
 

Vick

Gold Member
Pro not handling those shadows very well in SM in that first shot
Isn't that how they should look? I'm playing Star Wars Outlaws now, and with RTXGI some shadows look like that; without it, they look like they do on base PS5.
I don't think so, maybe?🤔
[screenshot]
What are people saying is the problem? The red-arrow-marked shadows in the left shot on PS5 are too hard for their distance, given the depth cueing and the openness of the area, where indirect secondary lighting should illuminate and soften the shadow, IMO.

It might look less pleasing in digital imagery to have such weakened shadows, but I'm pretty sure it is more coherent with the scene's actual lighting simulation.
In the video it doesn't look like that, not sure what happened in the pic.
I think it's fair to say those could indeed be RT shadows.

[screenshot]


It's more noticeable in the video as it's clear they are more defined near the trunk and progressively softer:



So on Pro, SM2 apparently runs the existing Fidelity Mode IQ and existing RT at 60fps, with additional RT on top.

However, I don't own SM2, so I can't check how shadows normally appear. Days Gone on PS4 featured PCSS-like shadows, so as far as I know that could always have been the case for the OG Spider-Man 2 as well. It's hard to tell from that clip.
It's either RT shadows or a very peculiar artifact.

PSSR is much better than the standard quality mode.

[screenshots]
Pretty sure that's simply 30fps Mode motion blur. lol
 
Last edited:

Vick

Gold Member
I find it weird people doubt Sony's image processing skills.
It's interesting because, aside from producing the highest-end professional cameras on the planet and their TV tech, when it comes to mass-produced cheap electronics, everyone in the Home Theatre community remembers Sony disrupting the high-end BD player market in 2013 when they released their BDP-S790 at $229.99 with better image quality than the highest-end Oppo BDP-103D.

Edit:

Who the hell is this Same ol G guy laughing at our facts-driven posts?
 
Last edited:
Going by the GoW Ragnarok paper, which uses ML AI (RPM FP16) to upscale 2K (2048x2048) textures to 4K (4096x4096): fully optimised by the end, it runs in just 9.5ms.



So in theory, if PS5 games were dropped to a native resolution of 1280x720, which is ~4.5x fewer pixels than a 2K texture, that would get the runtime down to ~2.5ms and produce an upscaled image of just below 1920x1080 (1920x993), while also leaving ~14ms per frame for rendering in a 60fps game.

Scaling the numbers for 30fps would double the time allowed for upscaling to 5ms, which would then allow for 1440p to 4K if I'm not mistaken. But AFAIK PSSR grades sections of the picture and only uses ML AI inference mostly in high-frequency data areas, using cheaper FSR-or-equivalent techniques for the remaining low-frequency areas, so the relative cost of inference in PSSR might be much higher than in the Ragnarok paper, making all that theorizing I just did completely pointless.

//edit
Actually, I missed the last part where they got it even lower, to 9.1ms.
In GoW they are upscaling only the textures (and not even all the textures, if you read the paper), not the whole image, which would require much more processing time, likely >30ms as fafalada said. Not practical at all.
 
Last edited:

TGO

Hype Train conductor. Works harder than it steams.
It's interesting because, aside from producing the highest-end professional cameras on the planet and their TV tech, when it comes to mass-produced cheap electronics, everyone in the Home Theatre community remembers Sony disrupting the high-end BD player market in 2013 when they released their BDP-S790 at $229.99 with better image quality than the highest-end Oppo BDP-103D.
Not to mention their Bravia range is the best for SD content because of their image processing skills.
It really wouldn't surprise me if they beat out the competition in this regard, because they do it in every other sector.
 

PaintTinJr

Member
In GoW they are upscaling only the textures (and not even all the textures if you read the paper), not the whole image which would require much more processing time, likely >30ms like fafalada said. Not practical at all.
That's not the point; you only need to ML AI upscale one full native image per frame, as all the textures would be Kraken-compressed and in the final render already, so it would still be 2-5ms per frame.

But what might be the case is that the latency to BC7 (block) compress the framebuffer each frame - which is 8-bit quantizing with an ASIC - before doing an upscale could be too much for a 16.6ms frame when already losing 2ms to AI upscaling. Or it might just not produce good enough results, diminishing image quality with BC7 compression first, while rewriting the solution to run inference on raw framebuffer data might lose performance.
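
A toy frame-budget check of that concern; note the BC7 compression cost below is a made-up placeholder, not a measured number:

```python
# Toy 60fps frame-budget check for the BC7-compress-then-upscale idea.
# The 16.6ms frame and ~2ms AI upscale come from the thread; the BC7
# ASIC compression cost is a hypothetical placeholder.
FRAME_MS = 1000 / 60       # ~16.7 ms per frame at 60fps
UPSCALE_MS = 2.0           # PSSR upscale cost per the leaked document
BC7_COMPRESS_MS = 1.5      # hypothetical per-frame framebuffer BC7 pass

remaining = FRAME_MS - UPSCALE_MS - BC7_COMPRESS_MS
print(f"Left for rendering: ~{remaining:.1f} ms")   # ~13.2 ms with these guesses
```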
 

Vick

Gold Member
Not to mention their Bravia range is the best for SD content because of their image processing skills.
It really wouldn't surprise me if they beat out the competition in this regard, because they do it in every other sector.
Yeah, Bravia isn't surprising, as Sony produces the highest-end professional mastering monitors that various industries, including obviously the movie industry, use to calibrate their content.

Vincent from HDTVTest bought one of their 30" professional monitors for £30,000 and has been using it for years to compare and determine which high-end TV manufacturer gets closest. And none does, obviously.
 

Zathalus

Member
Yeah, Bravia isn't surprising, as Sony produces the highest-end professional mastering monitors that various industries, including obviously the movie industry, use to calibrate their content.

Vincent from HDTVTest bought one of their 30" professional monitors for £30,000 and has been using it for years to compare and determine which high-end TV manufacturer gets closest. And none does, obviously.
Bravia at the high end is great; I own the 85-inch Bravia 9 myself. Best mini-LED on the market. I wouldn't touch Sony if I was budget-limited though; the Bravia 3 is, by all accounts, terrible vs the competition. Pretty hard to compete with TCL and Hisense in the cheap TV market.
 

Same ol G

Member
It's interesting because, aside from producing the highest-end professional cameras on the planet and their TV tech, when it comes to mass-produced cheap electronics, everyone in the Home Theatre community remembers Sony disrupting the high-end BD player market in 2013 when they released their BDP-S790 at $229.99 with better image quality than the highest-end Oppo BDP-103D.

Edit:

Who the hell is this Same ol G guy laughing at our facts-driven posts?
'Cause you guys are funny; I've been laughing a lot since the reveal of this console.
 

sachos

Member
Yeah, I noticed in the comparisons they showed that the fidelity modes looked weirdly blurrier during movement, like they turned off per-object motion blur for the PS5 Pro. It's hard to tell what is YT compression and what isn't. Also, you are comparing to DLAA, which renders at native resolution; I don't think it's an apples-to-apples comparison. Also, the fact that it is running on a 4090 shouldn't matter; if it were a 4060, the reconstruction would be the same, no? Just slower.
 
Last edited: