
PS5 Pro/PSSR Appears to Provide Better Image Reconstruction than DLSS Running on a 4090 GPU

Gaiff

SBI’s Resident Gaslighter
We will see once games like Wolverine and Death Stranding 2 release on both. Of course, the PC upgrade treadmill is never-ending, so eventually you will brute-force 100+ frames, but the spending is endless too lol
Don't need to wait. 45% better rendering and 28% more bandwidth won't allow anything to claw back 200% better rendering and 125% more bandwidth. Even now, you have games like TLOU Part I where the 4090 blows past 90fps at native 4K despite being horribly optimized on PC whereas the Pro needs to upscale from 1440p to hit 60.

The PS5 Pro will do very well, possibly even better than something like an RTX 4070 in some titles and even better than an RTX 4070S in first-party ones, but the 4090 is twice the power of the 4070, which the Pro is often compared to (and the 4070 on paper at least is a chunk above). The Pro beating the 4090 would be akin to the regular PS5 beating the 3090, which is wholly impossible.

This talk is seriously asinine. There's no point in comparing a monster $1600-2000 GPU to a console. Yes, it's the shiny new toy, and yes, it does cool stuff, but let's not turn into idiots and believe in fantasies.
 
Last edited:

Fafalada

Fafracer forever
Definitely looks to be doing what the patent we looked at described, meaning if DF are pixel counting, then it is the cheaper non-AI-inference hole filling they are counting at those resolution numbers.
Not saying it's impossible, but we're talking 300 INT8 TOPS hardware, which is double what the Turing series did, double the likes of the 3070, and well above the 3080. It doesn't look like it would need inference optimization on that scale given what Nvidia got out of those ops.
Besides, I still say that patent was describing reconstruction of remote-rendered video streams, not locally rendered content, i.e. it would be aimed at very compute-constrained devices (not just AI, but also general compute).
 

Gaiff

SBI’s Resident Gaslighter
And yet the monster fails the very simple time to crate metric popularized by Old Man Murray.
What can I say? That bespoke crate generation chip on the Pro was no exaggeration. Cerny meant business.
 
Last edited:

Bitstream

Member
Another predictably dishonest take from you. Now 120fps and 60fps are a tie because of bullshit reasons you made up.
Pay attention Gaiff, because you're not thinking about this logically. Everyone knows a solid 120 is better than 60. HITTING 120 CONSISTENTLY is the problem. Without hitting 120 consistently, the spikes in frame pacing can make it feel like Bloodborne's stutter. It's better to aim for a perfectly solid 60 than feel the struggle reaching for 120.
 
Last edited:

diffusionx

Gold Member
Don't need to wait. 45% better rendering and 28% more bandwidth won't allow anything to claw back 200% better rendering and 125% more bandwidth. Even now, you have games like TLOU Part I where the 4090 blows past 90fps at native 4K despite being horribly optimized on PC whereas the Pro needs to upscale from 1440p to hit 60.

The PS5 Pro will do very well, possibly even better than something like an RTX 4070 in some titles and even better than an RTX 4070S in first-party ones, but the 4090 is twice the power of the 4070, which the Pro is often compared to (and the 4070 on paper at least is a chunk above). The Pro beating the 4090 would be akin to the regular PS5 beating the 3090, which is wholly impossible.

This talk is seriously asinine. There's no point in comparing a monster $1600-2000 GPU to a console. Yes, it's the shiny new toy, yes, it does cool stuff, but let's not turn into idiots and believe in fantasies.
I don't get it. I'm getting the PS5 Pro because I like the PS5 and want to play it in the best light, not because I seriously think it is going to compare to a GPU in a PC build that costs a good four to five times more.
 

Bojji

Member
Pay attention Gaiff, because you're not thinking about this logically. Everyone knows a solid 120 is better than 60. HITTING 120 CONSISTENTLY is the problem. Without hitting 120 consistently, the spikes in frame pacing can make it feel like Bloodborne's stutter. It's better to aim for a perfectly solid 60 than feel the struggle reaching for 120.

With VRR there is no problem with frame rates between 60-120fps; it's better than locked 60.

Check out Sony games like GoW Ragnarok or Uncharted that have unlocked frame rates above 60. Less input lag, and it's smoother.
 

ap_puff

Member
With VRR there is no problem with frame rates between 60-120fps; it's better than locked 60.

Check out Sony games like GoW Ragnarok or Uncharted that have unlocked frame rates above 60. Less input lag, and it's smoother.
Depends on how stable the framerate is; wild swings still feel bad even with VRR, since you can detect the unevenness of the frame pacing. It feels really bad to drop from 120 to 80, for example.
 

Gaiff

SBI’s Resident Gaslighter
Pay attention Gaiff, because you're not thinking about this logically. Everyone knows a solid 120 is better than 60. HITTING 120 CONSISTENTLY is the problem. Without hitting 120 consistently, the spikes in frame pacing can make it feel like Bloodborne's stutter. It's better to aim for a perfectly solid 60 than feel the struggle reaching for 120.
This entirely depends on the game and has nothing to do with not being able to hit 120fps consistently. You can have an fps cap and hit 60 consistently without having even frame pacing. Frame rate is merely the number of frames per second, but our eyes perceive much more than that. You can have 60 frames every second, but it doesn't mean they're delivered every 16.67ms. You could have one frame at 8.33ms and the next at 25ms, and it could be like that for the whole second, repeating consistently. Your frame rate would still be 60fps, but it would be stuttery as hell. With VRR, this is massively mitigated. For instance, GOWR on PS5 has an unlocked high frame rate mode that generally hovers around 70-90fps. It doesn't come close to 120, but it's still incredibly smooth, with very even frame times and no massive spikes.

Bloodborne hits 30 consistently but has awful frame pacing, so it feels terrible anyway. Not being able to hit 120 consistently in no way means you will have bad frame pacing, and it's certainly not better to lock your fps to 60 if you have a VRR display.

Edit: Hell, Rift Apart actually has an unlocked frame rate mode where it uses DRS and sits between 1080p-1440p and generally around 80fps. Still plays well without stutters or frame time spikes.



Most PS5 exclusives have a high frame rate mode on VRR displays at 120Hz that hits above 60fps but way below 120, and it is often the smoothest experience.
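The frame-pacing point above can be sketched with two hypothetical frame-time traces: both report the same average fps, yet one delivers frames unevenly. The millisecond values below are illustrative, chosen so that each trace averages 60fps.

```python
# Two hypothetical frame-time traces (milliseconds per frame) over one second.
even   = [16.67] * 60         # a new frame every ~16.67 ms
uneven = [8.33, 25.0] * 30    # alternating fast/slow frames

def avg_fps(frame_times_ms):
    """Frames delivered divided by total time: the headline fps number."""
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def worst_gap_ms(frame_times_ms):
    """The longest wait between frames: what pacing spikes feel like."""
    return max(frame_times_ms)

print(round(avg_fps(even)), worst_gap_ms(even))      # 60, 16.67
print(round(avg_fps(uneven)), worst_gap_ms(uneven))  # 60, 25.0
```

Both traces report "60fps", but the second one makes you wait up to 25ms between frames, which is the stutter being described.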
 
Last edited:

Bitstream

Member
This entirely depends on the game and has nothing to do with not being able to hit 120fps consistently. You can have an fps cap and hit 60 consistently without having even frame pacing. Frame rate is merely the number of frames per second, but our eyes perceive much more than that. You can have 60 frames every second, but it doesn't mean they're delivered every 16.67ms. You could have one frame at 8.33ms and the next at 25ms, and it could be like that for the whole second, repeating consistently. Your frame rate would still be 60fps, but it would be stuttery as hell. With VRR, this is massively mitigated. For instance, GOWR on PS5 has an unlocked high frame rate mode that generally hovers around 70-90fps. It doesn't come close to 120, but it's still incredibly smooth, with very even frame times and no massive spikes.

Bloodborne hits 30 consistently but has awful frame pacing, so it feels terrible anyway. Not being able to hit 120 consistently in no way means you will have bad frame pacing, and it's certainly not better to lock your fps to 60 if you have a VRR display.

Edit: Hell, Rift Apart actually has an unlocked frame rate mode where it uses DRS and sits between 1080p-1440p and generally around 80fps. Still plays well without stutters or frame time spikes.


Wrong again. Here's DF on why VRR doesn't save the day in the exact scenario I'm describing: inconsistently delivered frames.



You keep taking things I say and making the dumbest generalizations about them. When I say 60fps solid, OF FUCKING COURSE I mean frames perfectly timed apart from one another. Why would I mean anything different?

Do I look like a bitch to you?

pulp fiction does he look like a bitch GIF


Then stop trying to fuck me like one.

Pulp Fiction GIF
 

kevboard

Member
Pay attention Gaiff, because you're not thinking about this logically. Everyone knows a solid 120 is better than 60. HITTING 120 CONSISTENTLY is the problem. Without hitting 120 consistently, the spikes in frame pacing can make it feel like Bloodborne's stutter. It's better to aim for a perfectly solid 60 than feel the struggle reaching for 120.

A framerate from 90 to 120fps with VRR is perceptibly near-perfectly smooth.
 

Gaiff

SBI’s Resident Gaslighter
Wrong again. Here's DF on why VRR doesn't save the day in the exact scenario I'm describing: inconsistently delivered frames.



You keep taking things I say and making the dumbest generalizations about them. When I say 60fps solid, OF FUCKING COURSE I mean frames perfectly timed apart from one another. Why would I mean anything different?

Oh, please. Stop: It's better to aim for a perfectly solid 60 than feel the struggle reaching for 120.

That's what you say and it's false.
Do I look like a bitch to you?

pulp fiction does he look like a bitch GIF


Then stop trying to fuck me like one.

Pulp Fiction GIF
Then stop saying nonsense. The video you linked specifically mentions that if the game has bad frame pacing to begin with, VRR won't save you from that, which is exactly what I described. If the game has bad frame pacing, a 60fps cap won't save you either. In this specific debate, Rift Apart doesn't have bad frame pacing, so there is absolutely no issue with going above 60fps without hitting 120. However, if you try that without VRR, you will get screen tearing and stutters.
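A minimal sketch of why VRR removes tearing in that 60-120 window: a VRR display simply refreshes when the frame arrives, so any frame time inside the panel's supported range is presented cleanly. The 48-120Hz range below is an assumption for illustration; real panels vary, and many use low-framerate compensation below their minimum.

```python
def presented_cleanly(frame_time_ms, vrr_min_hz=48, vrr_max_hz=120):
    """True if a VRR panel can refresh exactly when this frame arrives."""
    fastest = 1000 / vrr_max_hz   # ~8.33 ms
    slowest = 1000 / vrr_min_hz   # ~20.83 ms
    return fastest <= frame_time_ms <= slowest

print(presented_cleanly(12.5))   # an 80 fps frame: inside the window -> True
print(presented_cleanly(33.3))   # a 30 fps spike: outside the window -> False
```

This is why an unlocked 70-90fps mode feels smooth on a VRR display, while a big pacing spike still reads as a stutter: the spike falls outside what the panel can absorb.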
 
Last edited:

kevboard

Member
If the frame pacing is good, yes. In a game like Jedi Survivor, it would still suck.

I mean, that goes without saying. As long as a game isn't fundamentally broken like Jedi Survivor, and as long as your GPU or CPU aren't acting up, 90 to 120 will be nearly perfectly smooth.
 

ap_puff

Member
This entirely depends on the game and has nothing to do with not being able to hit 120fps consistently. You can have an fps cap and hit 60 consistently without having even frame pacing. Frame rate is merely the number of frames per second, but our eyes perceive much more than that. You can have 60 frames every second, but it doesn't mean they're delivered every 16.67ms. You could have one frame at 8.33ms and the next at 25ms, and it could be like that for the whole second, repeating consistently. Your frame rate would still be 60fps, but it would be stuttery as hell. With VRR, this is massively mitigated. For instance, GOWR on PS5 has an unlocked high frame rate mode that generally hovers around 70-90fps. It doesn't come close to 120, but it's still incredibly smooth, with very even frame times and no massive spikes.

Bloodborne hits 30 consistently but has awful frame pacing, so it feels terrible anyway. Not being able to hit 120 consistently in no way means you will have bad frame pacing, and it's certainly not better to lock your fps to 60 if you have a VRR display.

Edit: Hell, Rift Apart actually has an unlocked frame rate mode where it uses DRS and sits between 1080p-1440p and generally around 80fps. Still plays well without stutters or frame time spikes.



Most PS5 exclusives have a high frame rate mode on VRR displays at 120Hz that hits above 60fps but way below 120, and it is often the smoothest experience.

VRR does not help alleviate frame time spikes. It only eliminates screen tearing, which reduces the perceptual window where inconsistent frame times become bothersome. A steady 60fps is more pleasant than constantly bouncing between 80-120fps, and a steady 120fps is better than a steady 60fps.
 
Last edited:

Kappa

Member
Man, between the PS5 Pro and Switch 2 threads, you can really tell who has no idea whatsoever how hardware and software work.
 
Last edited:

Bitstream

Member
VRR does not help alleviate frame time spikes. It only eliminates screen tearing, which reduces the perceptual window where inconsistent frame times become bothersome. A steady 60fps is more pleasant than constantly bouncing between 80-120fps, and a steady 120fps is better than a steady 60fps.
This.
 

DrDryRub

Member
The problem with these stupid comparisons is that the games aren't being developed for the 4090 or anything in its class, so no game will truly utilize what a 4070, 4080, or 4090 can do. The games have a set of textures and assets. A more powerful GPU won't magically make the assets better than they were originally designed to be.
 

Justin9mm

Member
Don't need to wait. 45% better rendering and 28% more bandwidth won't allow anything to claw back 200% better rendering and 125% more bandwidth. Even now, you have games like TLOU Part I where the 4090 blows past 90fps at native 4K despite being horribly optimized on PC whereas the Pro needs to upscale from 1440p to hit 60.

The PS5 Pro will do very well, possibly even better than something like an RTX 4070 in some titles and even better than an RTX 4070S in first-party ones, but the 4090 is twice the power of the 4070, which the Pro is often compared to (and the 4070 on paper at least is a chunk above). The Pro beating the 4090 would be akin to the regular PS5 beating the 3090, which is wholly impossible.

This talk is seriously asinine. There's no point in comparing a monster $1600-2000 GPU to a console. Yes, it's the shiny new toy, yes, it does cool stuff, but let's not turn into idiots and believe in fantasies.
I think the one thing PS5 Pro might have going for it is a better optimised experience in some games but yeah, comparing a Pro to a 4090 on any level is ridiculous.
 

Kangx

Member
Taking a closer look at this comparison shot, the PS5 Pro version has an additional ray-traced shadow to the right of the frame (green arrow), as well as additional foliage in the foreground (yellow arrow). The anti-aliasing on the 4090 (purple arrows) is also of lower quality in comparison to the Pro.

1oU9CFJ.jpeg
Comparing graphics with the 4090 is a fool's errand. Most posters here largely mix up graphics settings with upscaling characteristics. Fur, AF, ray tracing, crowd size, etc. are graphics settings.

Comparing upscaling methods is good because, depending on the scene, one can look better than another.

Obviously DLSS looks better overall, but not one person has pointed out the most obvious characteristic that Alex always uses to show motion artifacts. And there are obvious motion artifacts in the crowds in the DLSS image, especially to the left of the red circle. Also, the tiles in the middle between the plant pots seem to exhibit more detail on the Pro.

Further analysis will require people with knowledge and zoomed-in shots. I suspect that, in the end, DLSS is still the superior image reconstruction overall. I don't think we need any more discussion on which upscaler is better.
 
Last edited:
So Nvidia, who are powering the data centres of the world, whose engineers, software, and tech carry deep learning expertise, who kicked off the whole AI upscaling boom, and who have been solely about graphics since their inception, have somehow been upstaged by hitherto unseen, untested, and (at the moment) technically unknown AMD magic sauce for Sony.

Give your head a shake. Nvidia and their technology, software, and engineers are powering the AI revolution, you muppet; what algorithms and deep learning models don't use tensors? You know as much about how PSSR works and performs as Kermit the Frog and everyone else here on the forum, including me. Nada. But we do know Nvidia are out there with proven tech we can test and see now, and which they've been updating and advancing for several years.

Will PSSR be better than DLSS? We don't know, but it's highly unlikely, given DLSS's time lead, powerful help from the tensor cores, and several years of their engineers improving the tech.

Motion flow and checkerboard knowledge and some wizard engineers will help, but you seriously think they will usurp a technology that's several iterations deep and has the most powerful AI upscaling technology behind it? Lol.

I have a PS5 and will be buying a Pro. I also have a 4090. I love games and hope PSSR is fantastic, but I don't go in for childish warring. PSSR will not beat DLSS on all the evidence available to you and me.

You know nothing about me and it's now clear you know nothing about graphics technology.

I won't waste my time with puerile platform warriors, high on their own Dunning-Kruger syndrome.
 
Here, there is a clear difference.



DLSS is better here. The lines on the floor around the feet of the Goon-4-Less soldier are just gone on the Pro, but this could be due to the much higher AF on PC (4x on PS5). Otherwise, DLSS is simply way better at reconstructing details. Ratchet's body is much more defined, the strands of fur on his tail are more visible, and everything just looks cleaner and higher res with DLSS. I honestly thought this was DLAA, but given the context of the video, it's more likely DLSS Quality.
What base resolution are they using here on the 4090? We know AI upscaling quality is determined by the native input resolution (like the other upscaling techniques). We'll have definite answers when we know for sure both are using the same native resolution, if we want to judge the quality of their AI upscaling objectively. If you want to judge the power of the GPU, just use native resolution on PC with settings set to max, and a 4090 will obviously beat a 4070-class GPU.

But one thing is sure: if we start needing 4x zooms to see a difference, it means PSSR is doing pretty well, and likely much better than DLSS 1.

OK, I just looked, and as expected, the 4090 is running DLAA at native 4K resolution in Ratchet (which runs at about 1440p native on the Pro). And this is how they want to compare DLSS vs PSSR and say DLSS is still "king"?
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
VRR does not help alleviate frame time spikes. It only eliminates screen tearing which reduces the perceptual window where inconsistent frametimes become bothersome.
Yeah, so exactly like I said? However, if you try that without VRR, you will get screen tearing and stutters.
A steady 60fps is more pleasant than constantly bouncing between 80-120 fps, and a steady 120fps is better than a steady 60fps.
No, this is completely false. PlayStation has many games with unlocked fps; they often bounce around between 70-90 and are often described as the best performing and feeling modes. What year is this, 2008? You have people on this very site playing at high frame rates in tons of games, and almost none of them lock the fps to 60. They let it go above without necessarily hitting 120. Not hitting 120fps does NOT mean you will get massive frame time spikes that result in terrible stutters and a bad experience. As I said, bad frame times will happen regardless of whether or not a given cap is consistently hit. Jedi Survivor and Bloodborne are prime examples of this.

This is how John describes the high frame rate mode in GOWR

The reality, however, is that we're mostly looking at a performance between 80 to 90 frames per second in most scenarios. It can occasionally jump above and below this point, of course, but by and large, this is what you'll get during gameplay. Compared to 60fps, it's a significant jump, allowing more responsive and fluid gameplay. However, there's a catch - while you can enable this mode on any 120hz capable display, if you cannot utilize VRR, I would strongly suggest sticking with 60fps instead due to judder. Of course, if you can use VRR, this quickly becomes my preferred graphical mode. It really shines at such a high frame-rate and it's unlikely we'd have had access to this option if not for the fact that it's a cross-gen release.

So, no. What you're saying is blatantly incorrect.
What base resolution are they using here on the 4090? We know AI upscaling quality is determined by the native input resolution (like the other upscaling techniques). We'll have definite answers when we know for sure both are using the same native resolution, if we want to judge the quality of their AI upscaling objectively. If you want to judge the power of the GPU, just use native resolution on PC with settings set to max, and a 4090 will obviously beat a 4070-class GPU.

But one thing is sure: if we start needing 4x zooms to see a difference, it means PSSR is doing pretty well, and likely much better than DLSS 1.
The 4090 would have a base resolution of 1440p and upscale to 4K. And of course, it's better than DLSS1 and current FSR for that matter.
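For reference, the 1440p base figure follows from DLSS's published per-axis render scales. A quick sketch of the arithmetic (preset fractions as Nvidia documents them publicly; the rounding behavior is an assumption):

```python
# DLSS preset render scales (fraction of the output resolution per axis).
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def dlss_input_resolution(out_w, out_h, mode):
    """Internal render resolution DLSS upscales from for a given output."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

print(dlss_input_resolution(3840, 2160, "Quality"))  # (2560, 1440)
```

So DLSS Quality at a 4K output renders internally at 2560x1440, which is why it is the natural point of comparison against the Pro's ~1440p internal resolution.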
 
...

The 4090 would have a base resolution of 1440p and upscale to 4K. And of course, it's better than DLSS1 and current FSR for that matter.
I just checked the Ratchet comparisons. They are comparing DLAA at native 4K vs PSSR at about native 1440p and said DLSS is still "king". Oh, you think they were not going to do that kind of dishonest comparison?
 
Last edited:

Zathalus

Member
No, this is completely false. PlayStation has many games with unlocked fps; they often bounce around between 70-90 and are often described as the best performing and feeling modes. What year is this, 2008? You have people on this very site playing at high frame rates in tons of games, and almost none of them lock the fps to 60. They let it go above without necessarily hitting 120. Not hitting 120fps does NOT mean you will get massive frame time spikes that result in terrible stutters and a bad experience. As I said, bad frame times will happen regardless of whether or not a given cap is consistently hit. Jedi Survivor and Bloodborne are prime examples of this.

This is how John describes the high frame rate mode in GOWR
I think (with the poster's later clarification) that 80-120fps doesn't feel great if the fps is bouncing between those two extremes with regularity, as in within seconds of each other. But something like that is quite rare; usually the fps shouldn't fluctuate that badly, and the difference in fps should be far more gradual.

I just checked the Ratchet comparisons. They are comparing DLAA at native 4K vs PSSR at about native 1440p and said DLSS is still "king". Oh, you think they were not going to do that kind of dishonest comparison?
The original DF comparison was DLSS Quality vs PSSR, and PSSR was actually rendering at the higher resolution thanks to DRS. I think the comparison there was done to death, with the conclusion that DLSS is slightly better but the image quality is close overall.
 

Gaiff

SBI’s Resident Gaslighter
I just checked the Ratchet comparisons. They are comparing DLAA at native 4K vs PSSR at about native 1440p and said DLSS is still "king". Oh, you think they were not going to do that kind of dishonest comparison?
In the IGN video? It’s definitely DLSS. If it’s DLAA, they say as much. 22:35.



There was another video where they indeed compared DLAA with PSSR in the OP.
I think (with the poster's later clarification) that 80-120fps doesn't feel great if the fps is bouncing between those two extremes with regularity, as in within seconds of each other. But something like that is quite rare; usually the fps shouldn't fluctuate that badly, and the difference in fps should be far more gradual.
This would make more sense, but most games do not have such massive load shifts where the frame rate suddenly and constantly goes up to 120 and then down to 80. Your frame rate will usually fall within a certain window.
 
Last edited:
DLSS is an algorithm. How it works is not hidden to anyone. All the major technologists in this field working on image reconstruction tech know how it works. Just as they know how FSR works and they'll eventually know how PSSR works once it's out in the wild and the documentation is in the hands of devs.

There's nothing specifically special about looking at an algorithm, understanding how it works and therefore how to make it better. Any engineer worth their socks is able to do this. What has held back FSR up until now hasn't been the software algorithm part of things, rather the lack of dedicated hardware support in silicon, e.g. tensor cores.

With PS5 Pro, AMD has closed the gap with dedicated hardware support for AI computation (i.e. low-precision matrix math arrays with large registers and a reasonable amount of on-die cache). Now that they have the hardware, writing a software image reconstruction algorithm that performs better than DLSS is trivial.

It would not surprise me if DLSS is beaten by PSSR, and then the next iteration of DLSS comes out soon after and beats PSSR. Technology is always evolving, and what is implemented in actual processors in the wild is nothing close to the cutting edge in the research domain. It ALWAYS lags behind.
That is one of the most Dunning-Kruger-screaming pieces of text I've come across in quite a while...
Gordon Ramsay Facepalm GIF by Masterchef
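For what it's worth, the "low-precision matrix math" in the quoted post refers to operations like this toy INT8 quantized dot product, the basic building block that tensor-style hardware accelerates. This is a pure-Python sketch; real hardware applies it across whole matrix tiles with wide-integer accumulation.

```python
def quantize_int8(values, scale):
    """Map floats to INT8 with a per-tensor scale, clamping to [-128, 127]."""
    return [max(-128, min(127, round(v / scale))) for v in values]

def int8_dot(a_q, b_q, a_scale, b_scale):
    """Multiply-accumulate in a wide integer, then rescale back to float."""
    acc = sum(x * y for x, y in zip(a_q, b_q))  # INT32-style accumulator
    return acc * a_scale * b_scale

a = [0.5, -1.0, 0.25]
b = [2.0, 0.5, -4.0]
a_q = quantize_int8(a, 0.01)   # -> [50, -100, 25]
b_q = quantize_int8(b, 0.05)   # -> [40, 10, -80]
print(int8_dot(a_q, b_q, 0.01, 0.05))  # ~ 0.5*2 - 1*0.5 - 0.25*4 = -0.5
```

The point of INT8 is throughput: each multiply is far cheaper than an FP32 one, which is why "TOPS" figures for this kind of math are so much larger than a GPU's FP32 TFLOPS.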
 
Last edited:
In the IGN video? It’s definitely DLSS. If it’s DLAA, they say as much. 22:35.



There was another video where they indeed compared DLAA with PSSR in the OP.

This would make more sense, but most games do not have such massive load shifts where the frame rate suddenly and constantly goes up to 120 and then down to 80. Your frame rate will usually fall within a certain window.

Not sure about the difference between DLAA at 4K and DLSS using a native 4K. But Ratchet is running at native 4K on the 4090 in the DF x IGN video (while it's about 1440p native on PS5 Pro), which easily explains why there is more detail on the leaves, on Ratchet, and elsewhere. Native 4K is 125% more native pixels than 1440p. This is the 4090's sheer power talking, not the quality of the AI upscaling. And I bet they also used DLSS with a native resolution of 4K in all the other comparisons, like in Horizon. They are dishonest because, in the same segment, Oliver is saying DLSS is still king while showing native 4K DLSS vs 1440p PSSR.
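The 125% figure checks out; a one-liner for the pixel arithmetic:

```python
native_4k = 3840 * 2160    # 8,294,400 pixels
native_qhd = 2560 * 1440   # 3,686,400 pixels
extra = native_4k / native_qhd - 1
print(f"{extra:.0%} more native pixels at 4K")  # 125% more
```

That 2.25x pixel-count gap is why a native-4K frame resolves fine detail that no upscaler working from 1440p can fully recover.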

 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Not sure about the difference between DLAA at 4K and DLSS using a native 4K. But Ratchet is running at native 4K on the 4090 in the DF x IGN video (while it's about 1440p native on PS5 Pro), which easily explains why there is more detail on the leaves, on Ratchet, and elsewhere. Native 4K is 125% more native pixels than 1440p. This is the 4090's sheer power talking, not the quality of the AI upscaling. And I bet they also used DLSS with a native resolution of 4K in all the other comparisons, like in Horizon.


He repeatedly says DLSS though and the tag says DLSS. They never use DLAA and DLSS interchangeably. DLAA is native resolution with AA applied on top, it’s not upscaling, which is what DLSS is. They’re comparing the different solutions and DLAA and DLSS aren’t the same thing.

It would be a first if they tried to pass off DLAA as DLSS.
 
He repeatedly says DLSS though and the tag says DLSS. They never use DLAA and DLSS interchangeably. DLAA is native resolution with AA applied on top, it’s not upscaling, which is what DLSS is. They’re comparing the different solutions and DLAA and DLSS aren’t the same thing.

It would be a first if they tried to pass off DLAA as DLSS.
Forget DLAA. I see you have been tricked too by all their editorial ploys.

I just pixel counted the Ratchet footage, and they somehow used a native resolution of 4K with DLSS, based on the Ratchet comparison I timestamped. Do you understand the trickery? They cleverly never say DLSS Quality (so from 1440p) here, of course. They set DLSS with a native resolution of 4K so they could say "DLSS" and make people believe the comparison was honest. But it obviously is not.
 

Gaiff

SBI’s Resident Gaslighter
Forget DLAA. I see you have been tricked too by all their editorial ploys.

I just pixel counted the Ratchet footage, and they somehow used a native resolution of 4K with DLSS, based on the Ratchet comparison I timestamped. Do you understand the trickery? They cleverly never say DLSS Quality (so from 1440p) here, of course. They set DLSS with a native resolution of 4K so they could say "DLSS" and make people believe the comparison was honest. But it obviously is not.
Then it’s not DLSS and is a blatant lie, assuming you are correct. DLAA is DLSS without the upscaling part, y’know, the part where they reduce the input resolution. That’d be quite shocking, as DF often uses DLAA in comparison shots and always says it’s DLAA.

Edit: What about Horizon?
 
Last edited:

Zathalus

Member
DLSS is an algorithm. How it works is not hidden to anyone.
Correction here: DLSS is completely black-boxed. Nobody outside Nvidia knows exactly how it works. While Nvidia has explained the general principles of how DLSS works, and it is known to utilise temporal upscaling, the specific algorithms, training data, and model details are proprietary.
 

FireFly

Member
I just pixel counted the Ratchet footage, and they somehow used a native resolution of 4K with DLSS, based on the Ratchet comparison I timestamped. Do you understand the trickery? They cleverly never say DLSS Quality (so from 1440p) here, of course. They set DLSS with a native resolution of 4K so they could say "DLSS" and make people believe the comparison was honest. But it obviously is not.
Are you sure you're counting the input and not the output resolution? You would only be able to get the input resolution on elements that are not upscaled.

Then it’s not DLSS and is a blatant lie, assuming you are correct. DLAA is DLSS without the upscaling part, y’know, the part where they reduce the input resolution. That’d be quite shocking, as DF often uses DLAA in comparison shots and always says it’s DLAA.

Edit: What about Horizon?
Technically DLAA does use upscaling to reach the higher than native resolution, which then gets downsampled. It's like SSAA with an upscaled image as the base.
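For readers unfamiliar with the pixel counting discussed here: the usual approach counts the stair-steps along a hard, near-vertical edge, since an image scaled up from a lower input resolution repeats each input scanline across several output pixels. A hypothetical sketch of the arithmetic (the function name and numbers are illustrative, not anyone's actual tooling):

```python
def estimate_input_lines(steps_counted, edge_span_output_px, output_height):
    """Each stair-step on a near-vertical edge corresponds to one input
    scanline, so steps/span approximates the input:output resolution ratio."""
    return round(output_height * steps_counted / edge_span_output_px)

# 120 steps across a 180-output-pixel span of a 2160p frame -> ~1440p input
print(estimate_input_lines(120, 180, 2160))  # 1440
```

This is also why the input/output caveat above matters: the method only works on elements the reconstruction hasn't already smoothed back to output resolution.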
 

PaintTinJr

Member
Not saying it's impossible, but we're talking 300 INT8 TOPS hardware, which is double what the Turing series did, double the likes of the 3070, and well above the 3080. It doesn't look like it would need inference optimization on that scale given what Nvidia got out of those ops.
Besides, I still say that patent was describing reconstruction of remote-rendered video streams, not locally rendered content, i.e. it would be aimed at very compute-constrained devices (not just AI, but also general compute).
The patent I read explicitly mentioned VR (i.e. high frame-rate, low-latency local real-time graphics), and the visualization at 33:30 in the State of Play is intentionally brief and rolling off-axis, IMHO, to visualize accurately but also continue to obscure the exact inner workings of the algorithm from non-PSSR engineers. The visualization also appears to show an in-place technique rather than a scaler like the hole-filling patent describes, and IIRC no one from Sony, nor any leak, has described it as an ML upscaler.

I understand your reasoning with the TOPS comparison, but I believe it only works if the Pro has that in addition to 67TF of dual-issue FP16, because on the assumption that it's all shared between rendering and PSSR, 2ms per frame redirected to PSSR (1/8, or about 37.5 TOPS of the 300) isn't much when divided across 60fps IMO, and would need cheap hole-filling optimisation.

How representative the PSSR visualization is might be better gleaned from the images just prior, which briefly show a chip block layout representing the PS5 before extending a donut of new blocks around the chip to represent the revised Pro chip. If the blocks of the PS5 chip are accurate, then it is probably fair to assume the visualizations are all good IMO, and vice versa.
 

Zathalus

Member
Not saying it's impossible, but we're talking 300 INT8 TOPS hardware, which is double what the Turing series did, double the likes of the 3070, and well above the 3080. It doesn't look like it would need inference optimization on that scale given what Nvidia got out of those ops.
Besides, I still say that patent was describing reconstruction of remote-rendered video streams, not locally rendered content, i.e. it would be aimed at very compute-constrained devices (not just AI, but also general compute).
3080 has around 500 INT8 TOPS. The Pro, based on RDNA4, is utilising sparsity for that TOPS figure. The Pro is close to a 3060Ti in terms of TOPS.
Technically DLAA does use upscaling to reach the higher than native resolution, which then gets downsampled. It's like SSAA with an upscaled image as the base.
As far as I know, DLAA is basically regular TAA with an ML model on top of it to enhance image quality, hence the cost being only a few percent higher than regular TAA.
 

FireFly

Member
As far as I know, DLAA is basically regular TAA with an ML model on top of it to enhance image quality, hence the cost being only a few percent higher than regular TAA.
You're right, I was confusing DLDSR with DLAA. DLAA is DLSS with the input and output resolutions the same, while DLDSR upscales to higher than native output resolutions.
 
Possibly, but based on this clip at least, DLSS is a cut above. That's specific to this scene, however. In other games, PSSR seemed really good.
It would be more fair to say "this version of PSSR" instead of "this chip", since we can assume PSSR will continue to evolve, much like DLSS, even on this chip.
 

PaintTinJr

Member
3080 has around 500 INT8 TOPS. The Pro, based on RDNA4, is utilising sparsity for that TOPS figure. The Pro is close to a 3060Ti in terms of TOPS.
..
But that 500 TOPS in terms of game use is theoretical AFAIK; the minute the card needs to do game rendering and RT, that TOPS number becomes far, far smaller and less efficient, as I understood it, given the split of SMs and the slower bus setup for RT, whereas the Ragnarok AI ML solution looks completely asynchronously integrated with gaming workloads on RDNA.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
3080 has around 500 INT8 TOPS. The Pro, based on RDNA4, is utilising sparsity for that TOPS figure. The Pro is close to a 3060Ti in terms of TOPS.
It would be exactly double, so 476 TOPS using sparsity, which was only introduced with Ampere. Does the PS5 Pro also use sparsity?
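The doubling being discussed is Nvidia's 2:4 structured sparsity (introduced with Ampere), which lets the hardware skip half the multiplies when two of every four weights are zero. Comparing quoted figures therefore means halving any "with sparsity" number first; a trivial sketch using the figures from this thread:

```python
def dense_equivalent_tops(sparse_tops):
    """2:4 structured sparsity doubles quoted peak TOPS; halve to compare dense."""
    return sparse_tops / 2

print(dense_equivalent_tops(476))  # 238.0, for the 3080 sparse figure quoted above
print(dense_equivalent_tops(300))  # 150.0, for the Pro's quoted 300 TOPS
```

The caveat is that the sparse peak is only reached on models actually pruned to the 2:4 pattern; dense workloads get the halved figure.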
 

BlackTron

Member
Yes.

You'd be surprised how simple it is to improve an algorithm when you know how it works and you're specifically designing the underlying hardware to advance it.

Laypeople on NeoGAF think this shit is rocket science. It's surprisingly simple when you understand the math behind it.

Wow! They should hire you!

Edit: on second thought IDK. You seem really smart...they might think you are overqualified for a task this easy.
 
Last edited: