DF: AMD FSR 4 Upscaling Tested vs DLSS 3/4 - A Big Leap Forward - RDNA 4 Delivers!

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Then what on earth are you doing here? :messenger_grinning_sweat: just kidding of course.



Indeed, although I am curious how things will progress from here: will PlayStation iterate on PSSR for the PS6 on the UDNA architecture, or will they pluck the latest iteration of FSR 4 or 5?

(GIF: "Why don't we have both?")
 

yamaci17

Member
dlss 4 runs just fine even on a 2060 super


you can even get higher performance and higher image quality, with lower VRAM usage, going from quality to performance

so ps5 pro can't do what a mere 2060 super can do? I find this hard to believe
 

SweetTooth

Gold Member
Then what on earth are you doing here? :messenger_grinning_sweat: just kidding of course.



Indeed, although I am curious how things will progress from here: will PlayStation iterate on PSSR for the PS6 on the UDNA architecture, or will they pluck the latest iteration of FSR 4 or 5?

They have the right to do whatever they want! They developed both solutions.
 

PaintTinJr

Member
They needed something that would work with the older version of RDNA (RDNA 2 in the Pro's case).
No, it is far more than that, given that consoles can't afford to waste silicon and pass the cost on to the consumer while trying to keep prices down.

These techniques (DLSS 3/4, FSR 4) use the AI matrix silicon (tensor cores) only for the tiny window between a frame's native render finishing and the 1-2 ms before the frame has to flip from the back buffer to the front buffer, so that silicon sits 80-90% idle unless it is used for offline AI/ML commercial workloads, as on Quadro or FirePro workstation GPUs.

The AI/ML silicon in the PS5 Pro is instead integrated into the generalised WGP/CUs, so developers can utilise it 100% throughout a frame's rendering. The solution is general enough that developers could create their own Generative Adversarial Network models that work at a sub-render-pass level in a deferred renderer, or operate per object type per frame, or could replace PSSR entirely for their game and enable completely different AI-inferenced rendering paradigms - not constrained to working at the end of a native frame, or on just one frame.
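
To put rough numbers on that idle-silicon claim, here's a toy utilisation calculation (the frame budget, upscale window and ML workload are illustrative assumptions, not measured figures):

```python
# Toy utilisation maths for an end-of-frame upscaler.
# All numbers are illustrative assumptions, not measurements.

frame_ms = 16.7      # 60 fps frame budget
upscale_ms = 1.5     # matrix units only busy in this end-of-frame slot

busy = upscale_ms / frame_ms
print(f"end-of-frame upscale only: matrix units {busy:.0%} busy, "
      f"{1 - busy:.0%} idle")

# If the matrix hardware lives inside the general WGP/CUs instead,
# ML work can in principle be scheduled across the whole frame:
ml_ms = 12.0         # hypothetical inference work spread through the frame
print(f"ML spread across the frame: {ml_ms / frame_ms:.0%} busy")
```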
 

HeWhoWalks

Gold Member
I know... it's not my first "Boji on a PS5 Pro thread" experience... unfortunately.
But it’s not even a Pro thread. This should just be about FSR4 (which is currently a PC-only thing). That the Pro exists in this discussion is the real mystery here (well, not if you read one of the first posts in here).

As a fan and owner of both, these sorts of back-and-forths, with needless chest bumping and trolling, are how we end up here.
 

PaintTinJr

Member
But it’s not even a Pro thread. This should just be about FSR4 (which is currently a PC-only thing). That the Pro exists in this discussion is the real mystery here (well, not if you read one of the first posts in here).

As a fan and owner of both, these sorts of back-and-forths, with needless chest bumping and trolling, are how we end up here.
Do you equally object to DF comparing it to the DLSS CNN and transformer models, then? FSR4 isn't PC only; it is RDNA4 only.

edit: And in fact DLSS is Nvidia RTX only. The only semi-crossovers on different hardware are FSR1-3, which isn't ML, and XeSS, which is compatible with RDNA2-4 and RTX - but those are universal rather than a GPU-tech walled garden like PSSR, not "PC only". And every PSSR thread has made comparisons with DLSS, FSR1-3 or XeSS, so why is the reverse not okay?
 
dlss 4 runs just fine even on a 2060 super

DLSS "performance" should boost frame rate more noticeably compared to "quality" mode. On my card going from DLSS quality to performance boost framerate by 30% compared to quality mode. On the RTX2060 performance is only 10% better compared to quality. I think the RTX2060 struggles to run DLSS.
 

yamaci17

Member
DLSS "performance" should boost frame rate more noticeably compared to "quality" mode. On my card going from DLSS quality to performance boost framerate by 30% compared to quality mode. On the RTX2060 performance is only 10% better compared to quality. I think the RTX2060 struggles to run DLSS.
no it doesn't struggle
performance improvements will depend on how the game interacts with upscaling
also what resolution, what game and what settings are you talking about? dlss performance improvements at 4k and also with ray tracing will be different compared to raster




15% performance in ratchet from quality to performance
10% performance in horizon
17% performance in black myth
42% performance in cyberpunk

also in the video I've posted 2060 super has

19% performance improvement in alan wake 2
14% performance in first descendants
19% performance in alan wake 2
10% performance in horizon
36% performance in cyberpunk

so it has nothing to do with 2060 super struggling to run DLSS or something. it just has to do with how much a game benefits from upscaling in terms of performance.

unless you think 4070 super not getting 30% performance improvement in black myth, horizon and ratchet implies 4070 super struggles to run DLSS

you can even see the same with dlss 3 on the 4070 super. unless you also think 4070 super only getting 10% performance improvement with dlss 3 in horizon means that 4070 super also struggles to run ancient dlss 3

anyways that's not the point. dlss 4 performance looks and performs better on a 2060 super no matter what.
 

SKYF@ll

Member
no it doesn't struggle
performance improvements will depend on how the game interacts with upscaling
also what resolution, what game and what settings are you talking about? dlss performance improvements at 4k and also with ray tracing will be different compared to raster
don't move goalposts

by your logic 4070 super also struggles to run dlss



15% performance in ratchet from quality to performance
10% performance in horizon
17% performance in black myth
42% performance in cyberpunk

also in the video I've posted 2060 super has

19% performance improvement in alan wake 2
14% performance in first descendants
19% performance in alan wake 2
10% performance in horizon
36% performance in cyberpunk

so it has nothing to do with 2060 super struggling to run DLSS or something. it just has to do with how much a game benefits from upscaling in terms of performance.

unless you think 4070 super not getting 30% performance improvement in black myth, horizon and ratchet implies 4070 super struggles to run DLSS

you can even see the same with dlss 3 on the 4070 super. unless you also think 4070 super only getting 10% performance improvement with dlss 3 in horizon means that 4070 super also struggles to run ancient dlss 3

anyways that's not the point. dlss 4 performance looks and performs better on a 2060 super no matter what.

So even if I lower the internal resolution by 50%, the frame rate only increases by 10-20%?
It feels like AI upscaling is a really heavy process.
 

KeplerL2

Member
I'm curious to know how the ML acceleration in RDNA4 compares to CDNA3? If they differ significantly, which path do you expect UDNA/PS6 to follow?
CDNA supports larger matrix sizes and more precision formats. Also it has an additional register file to reduce VGPR pressure. I don't know what they are doing with matrix HW in gfx13 yet.
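
For anyone curious what that gap looks like, here is a rough side-by-side summarised from AMD's public ISA documentation - simplified, possibly incomplete, and with gfx13 still unknown as noted above:

```python
# Simplified summary of the two matrix-hardware generations, based on
# AMD's public ISA docs. Illustrative only; details may be incomplete.

matrix_hw = {
    "CDNA3 (MI300, MFMA)": {
        "formats": ["FP64", "FP32", "XF32", "FP16", "BF16", "FP8", "INT8"],
        "registers": "separate AccVGPR file for accumulators (less VGPR pressure)",
        "tiles": "several shapes, including larger 32x32 tiles",
    },
    "RDNA4 (WMMA)": {
        "formats": ["FP16", "BF16", "FP8", "INT8", "INT4"],
        "registers": "single unified VGPR file shared with shader work",
        "tiles": "16x16 fragments",
    },
}

for arch, caps in matrix_hw.items():
    print(arch)
    for key, val in caps.items():
        print(f"  {key}: {val}")
```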
 

yamaci17

Member
So even if I lower the internal resolution by 50%, the frame rate only increases by 10-20%?
It feels like AI upscaling is a really heavy process.
this is performance mode against quality mode, not against actual native (per axis that's 50% vs ~67% of output resolution, a smaller drop than halving from native)
the 2060 super video doesn't show native performance, so I can't say how big the overall improvement is
the massive performance improvement usually comes with quality mode. the rest is diminishing returns, especially at 1080p and 1440p

dlss performance is at its best when heavy path tracing/ray tracing is involved

it just depends on how the game is configured to run. some games keep rendering a lot of things at native resolution even with upscaling. in such titles you just don't see a meaningful performance increase from upscaling, but you also don't get much visible quality loss with such titles
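
a toy frame-time model makes this concrete: only part of the frame scales with internal resolution, and the bigger the fixed part (fixed-res passes, CPU-driven work, the upscale pass itself), the smaller the quality-to-performance gain. all millisecond figures here are hypothetical:

```python
# Toy frame-time model: part of the frame scales with internal pixel count,
# part doesn't. All millisecond figures are hypothetical.

def frame_ms(scale, native_ms=20.0, fixed_ms=6.0):
    # scale = per-axis render scale: DLSS Quality ~0.667, Performance 0.5
    return native_ms * scale**2 + fixed_ms

for fixed in (2.0, 10.0, 30.0):
    q = frame_ms(0.667, fixed_ms=fixed)
    p = frame_ms(0.5, fixed_ms=fixed)
    gain = q / p - 1  # fps gain equals the frame-time ratio minus one
    print(f"fixed {fixed:>4.1f} ms -> quality {1000 / q:5.1f} fps, "
          f"performance {1000 / p:5.1f} fps, gain {gain:.0%}")
# prints roughly 56%, 26% and 11% gains for the three fixed costs
```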

none of this matters. no matter what preset you choose, dlss 3 still has the temporal blur. and no matter what preset you choose, dlss 4 removes the TAA blur.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I have to admit that FSR4 has absolutely exceeded expectations. I was hoping for something that was at least close to DLSS3, but I was not expecting something that is better than DLSS3 and competitive against DLSS4.

FSR4 needed to improve to the point where DLSS4 would no longer command a significant premium, and AMD absolutely delivered that. DLSS4 is still mostly better, but I would not pay more than a $100 premium to use it.

Ray tracing, however, was a disappointment: I was hoping for 4070 Ti levels of performance, but it seems we got at best a 4070 Super. If ray tracing REALLY matters that much, then maybe a 5070 Ti/4070 Ti Super MIGHT be worth it, especially if both cards end up back at MSRP.

The best part: FSR4 is only going to get better. DLSS4 will also get better, but that ceiling is rather low.
 

HeWhoWalks

Gold Member
Do you equally object to DF comparing it to the DLSS CNN and transformer models, then? FSR4 isn't PC only; it is RDNA4 only.

edit: And in fact DLSS is Nvidia RTX only. The only semi-crossovers on different hardware are FSR1-3, which isn't ML, and XeSS, which is compatible with RDNA2-4 and RTX - but those are universal rather than a GPU-tech walled garden like PSSR, not "PC only". And every PSSR thread has made comparisons with DLSS, FSR1-3 or XeSS, so why is the reverse not okay?
The general point was clearly lost on you, as a comparison to PSSR wasn’t the issue.

Also, until FSR4 is on consoles (and until the Pro, specifically, has it as an option), we are currently talking PC only.
 

Senua

Member
But it’s not even a Pro thread. This should just be about FSR4 (which is currently a PC-only thing). That the Pro exists in this discussion is the real mystery here (well, not if you read one of the first posts in here).

As a fan and owner of both, these sorts of back-and-forths, with needless chest bumping and trolling, are how we end up here.
Eh, it's inevitable PSSR will come up in these AI upscaling discussions. It's actually a testament to PSSR's competence really.
 

HeWhoWalks

Gold Member
Eh, it's inevitable PSSR will come up in these AI upscaling discussions. It's actually a testament to PSSR's competence really.
I agree - comparisons are inevitable. My main point is how we get to these silly back-and-forth battles, with people needlessly trolling and shot-taking.

If anything, it should be totally celebratory! That NVIDIA has even more competition in this area is only a great thing!
 
no it doesn't struggle
performance improvements will depend on how the game interacts with upscaling
also what resolution, what game and what settings are you talking about? dlss performance improvements at 4k and also with ray tracing will be different compared to raster
don't move goalposts

by your logic 4070 super also struggles to run dlss



15% performance in ratchet from quality to performance
10% performance in horizon
17% performance in black myth
42% performance in cyberpunk

also in the video I've posted 2060 super has

19% performance improvement in alan wake 2
14% performance in first descendants
19% performance in alan wake 2
10% performance in horizon
36% performance in cyberpunk

so it has nothing to do with 2060 super struggling to run DLSS or something. it just has to do with how much a game benefits from upscaling in terms of performance.

unless you think 4070 super not getting 30% performance improvement in black myth, horizon and ratchet implies 4070 super struggles to run DLSS

I know DLSS scales differently depending on the game, but still 10% relative difference between DLSS Quality and Performance modes seems unimpressive compared to my results.

Alan Wake 2 - Path Tracing
DLAA (Native) 37fps vs 63fps DLSS Quality = 70% relative difference
DLSS Quality 63fps vs 84fps DLSS Performance = 33%


Black Myth Wukong - Path Tracing
DLAA 36fps vs 63fps DLSS Quality = 75%
DLSS Quality 63fps vs 86fps DLSS Performance = 36%


Cyberpunk - Path Tracing
DLAA 41fps vs 75fps DLSS Quality = 83%
DLSS Quality 75fps vs 111fps DLSS Performance = 48%


Cyberpunk - Ray Tracing Ultra
DLAA 68fps vs 119fps DLSS Quality = 75%
DLSS Quality 119fps vs 160fps DLSS Performance = 34%


Cyberpunk - Raster, tested at 4K because of CPU limit at 1440p
DLAA 58fps vs 103fps DLSS Quality = 77%
DLSS Quality 103fps vs 138fps DLSS Performance = 34%


Dead Space Remake - Raster, tested at 4K because of CPU limit at 1440p
DLAA 71fps vs 115fps DLSS Quality = 62%
DLSS Quality 115fps vs 155fps DLSS Performance = 34%


GTA Enhanced Edition - Ray Tracing tested at 4K because of CPU limit at 1440p
DLAA 66fps vs 101fps DLSS Quality = 53%
DLSS Quality 101fps vs 124fps DLSS Performance = 22%


Silent Hill 2 Remake - Hardware Lumen / RT
DLAA 54fps vs 84fps DLSS Quality = 55%
DLSS Quality 84fps vs 102fps DLSS Performance = 21%


The Witcher 3 - Ray Tracing
DLAA 79fps vs 116fps DLSS Quality = 46%
DLSS Quality 116fps vs 139fps DLSS Performance = 20%


The Witcher 3 - Raster
DLAA 133fps vs 206fps DLSS Quality = 54%
DLSS Quality 206fps vs 251fps DLSS Performance = 21%
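
For reference, "relative difference" here is just the fps gain over the slower mode; a quick check against the Alan Wake 2 numbers above:

```python
def relative_gain(fps_base, fps_new):
    """Percentage fps gain of fps_new over fps_base."""
    return (fps_new - fps_base) / fps_base * 100

# Alan Wake 2 path tracing, numbers from the list above:
print(f"DLAA 37 -> Quality 63:        {relative_gain(37, 63):.0f}%")  # 70%
print(f"Quality 63 -> Performance 84: {relative_gain(63, 84):.0f}%")  # 33%
```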

 

yamaci17

Member
I know DLSS scales differently depending on the game, but still 10% relative difference between DLSS Quality and Performance modes seems unimpressive compared to my results.
well, the games tested in the 2060 super benchmark and your examples don't have that much in common
that is why I'm surprised you came to that conclusion, actually
 
well, the games tested in the 2060 super benchmark and your examples don't have that much in common
that is why I'm surprised you came to that conclusion, actually
I tested many games and I don't remember any game that would show just a 10% relative difference between DLSS Quality and Performance. The RTX 2060 has a 10% relative difference in almost every game. DLSS transformer works on the RTX 2060, but the performance scaling is just not good enough. I would never use DLSS Performance if it only offered a 10% performance boost over DLSS Quality.
 

yamaci17

Member
I tested many games and I don't remember any game that would show just a 10% relative difference between DLSS Quality and Performance. The RTX 2060 has a 10% relative difference in almost every game. I would never use DLSS Performance if it only offered a 10% performance boost over DLSS Quality.
what? it has 36% performance improvement in cyberpunk
19% in indiana jones
19% in stalker 2
20% in alan wake 2
16% in god of war ragnarok
20% in the last of us part 1

that's half the games tested in the video already
 

Bojji

Member
So based on the detailed HU video, FSR4 is better than DLSS3 in some areas and worse in others; their general conclusion was that it's better overall. DLSS4 is better than both outside of the disocclusion test.

DLSS3 < FSR4 << DLSS4

Super impressive stuff for a first public version. Finally Nvidia has some serious competition in the ML SR space.
 

kevboard

Member
I know DLSS scales differently depending on the game, but still 10% relative difference between DLSS Quality and Performance modes seems unimpressive compared to my results.

Alan Wake 2 - Path Tracing
DLAA (Native) 37fps vs 63fps DLSS Quality = 70% relative difference
DLSS Quality 63fps vs 84fps DLSS Performance = 33%


Black Myth Wukong - Path Tracing
DLAA 36fps vs 63fps DLSS Quality = 75%
DLSS Quality 63fps vs 86fps DLSS Performance = 36%


Cyberpunk - Path Tracing
DLAA 41fps vs 75fps DLSS Quality = 83%
DLSS Quality 75fps vs 111fps DLSS Performance = 48%


Cyberpunk - Ray Tracing Ultra
DLAA 68fps vs 119fps DLSS Quality = 75%
DLSS Quality 119fps vs 160fps DLSS Performance = 34%


Cyberpunk - Raster, tested at 4K because of CPU limit at 1440p
DLAA 58fps vs 103fps DLSS Quality = 77%
DLSS Quality 103fps vs 138fps DLSS Performance = 34%


Dead Space Remake - Raster, tested at 4K because of CPU limit at 1440p
DLAA 71fps vs 115fps DLSS Quality = 62%
DLSS Quality 115fps vs 155fps DLSS Performance = 34%


GTA Enhanced Edition - Ray Tracing tested at 4K because of CPU limit at 1440p
DLAA 66fps vs 101fps DLSS Quality = 53%
DLSS Quality 101fps vs 124fps DLSS Performance = 22%


Silent Hill 2 Remake - Hardware Lumen / RT
DLAA 54fps vs 84fps DLSS Quality = 55%
DLSS Quality 84fps vs 102fps DLSS Performance = 21%


The Witcher 3 - Ray Tracing
DLAA 79fps vs 116fps DLSS Quality = 46%
DLSS Quality 116fps vs 139fps DLSS Performance = 20%


The Witcher 3 - Raster
DLAA 133fps vs 206fps DLSS Quality = 54%
DLSS Quality 206fps vs 251fps DLSS Performance = 21%


the higher the target resolution, the less performance you gain. that's not surprising.
 
what? it has 36% performance improvement in cyberpunk
19% in indiana jones
19% in stalker 2
20% in alan wake 2
16% in god of war ragnarok
20% in the last of us part 1

that's half the games tested in the video already
The RTX2060 gained 8% to 27% going from DLSS Quality to DLSS Performance. This is based on 10 tests. I also did 10 tests and my card gained between 20% and 48%.

Nvidia offers support for DLSS4 and ray reconstruction on Turing GPUs, but the performance cost of using these features is much higher than on newer generations. This technology clearly needs more tensor cores / AI resources to work at its best.


(benchmark screenshots omitted; the Quality -> Performance relative differences shown were:)

Alan Wake 2 - 21% relative difference
Cyberpunk - 27% relative difference
Final Fantasy Rebirth - 10% relative difference
GOW Ragnarok - 11% relative difference
Hogwarts Legacy - 13% relative difference
Horizon 2 - 8% relative difference
Indiana Jones - 16% relative difference
Stalker 2 - 16% relative difference
The First Descendant - 14% relative difference
The Last Of Us Remake - 19% relative difference
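
One way to sanity-check the idea that Turing pays a higher fixed cost: treat frame time as a fixed part plus a part that scales with internal pixel count, then solve for both from a Quality/Performance fps pair. A minimal sketch - the 75 -> 111 fps pair is the Cyberpunk path-tracing result posted earlier, while the 60 -> 66 pair is a hypothetical 10% gain like the RTX 2060's results:

```python
# Frame-time model: t(scale) = t_fixed + t_native * scale**2.
# Solving from a (Quality, Performance) fps pair gives the fixed per-frame
# cost, which includes the upscale pass itself. Illustrative, not rigorous.

Q2, P2 = 0.667**2, 0.5**2             # internal pixel fractions, Quality/Perf

def fixed_cost_ms(fps_q, fps_p):
    tq, tp = 1000 / fps_q, 1000 / fps_p
    t_native = (tq - tp) / (Q2 - P2)  # implied shading cost at full native res
    return tq - t_native * Q2

# Cyberpunk path tracing pair posted earlier in the thread (75 -> 111 fps):
print(f"48% gain -> fixed cost ~{fixed_cost_ms(75, 111):.1f} ms/frame")  # ~3.5
# Hypothetical pair with only a 10% gain (60 -> 66 fps):
print(f"10% gain -> fixed cost ~{fixed_cost_ms(60, 66):.1f} ms/frame")   # ~13.2
```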
 

viveks86

Member
The RTX2060 gained 8% to 27% going from DLSS Quality to DLSS Performance. This is based on 10 tests. I also did 10 tests and my card gained between 20% and 48%.

Nvidia offers support for DLSS4 and ray reconstruction on Turing GPUs, but the performance cost of using these features is obviously much higher than on newer generations. It's still great that Nvidia has offered DLSS4 / RR support for old Turing cards, but this technology clearly needs more tensor cores / AI resources to work at its best.


(benchmark screenshots omitted; the Quality -> Performance relative differences shown were:)

Alan Wake 2 - 21% relative difference
Cyberpunk - 27% relative difference
Final Fantasy Rebirth - 10% relative difference
GOW Ragnarok - 11% relative difference
Hogwarts Legacy - 13% relative difference
Horizon 2 - 8% relative difference
Indiana Jones - 16% relative difference
Stalker 2 - 16% relative difference
The First Descendant - 14% relative difference
The Last Of Us Remake - 19% relative difference
Performance looks so good that quality mode is almost redundant now. Not worth the hit, imo.
 

yamaci17

Member
test the same games at the same settings and same resolution configuration
otherwise all of your arguments are worthless as you're comparing apples to oranges
unless you do so, I cannot take you seriously
have a good day

not to mention you still haven't explained why 4070 super only gets 10% performance improvement in horizon. but somehow you use the same game to claim that 2060 super cannot run DLSS 4 well.
 

viveks86

Member
Doesn't DLSS4 performance produce similar results to DLSS3 quality?
Yeah! Sometimes even better, from what we've seen. It has made DLSS4 quality useful only if you have the best of the best cards with a lot of unused headroom for the game being played. Might as well just max out fps in performance mode.
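
For reference, the commonly documented DLSS per-axis render scales make the trade concrete at a 4K output (games can override these defaults):

```python
# Commonly documented DLSS per-axis render scales (game-overridable).
modes = {"Quality": 0.667, "Balanced": 0.58,
         "Performance": 0.5, "Ultra Performance": 0.333}
out_w, out_h = 3840, 2160  # 4K output

for name, s in modes.items():
    w, h = int(out_w * s), int(out_h * s)
    print(f"{name:>17}: {w}x{h} internal "
          f"({(w * h) / (out_w * out_h):.0%} of output pixels)")
# DLSS 4 Performance reconstructs 4K from a 1080p-class input; the claim is
# that it matches what DLSS 3 Quality produced from a 1440p-class input.
```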
 
Performance looks so good that quality mode is almost redundant now. Not worth the hit, imo.
It's still worth running games at a higher internal resolution, because some effects, like screen space reflections, not to mention RT, are tied to internal resolution and will look much worse in the lower DLSS modes. Ray reconstruction is also extremely sensitive to internal resolution. DLAA + RR looks amazing (the image looks razor sharp and textures aren't filtered), but in DLSS Performance mode ray reconstruction makes the image look like a painting (filtered / overprocessed).
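
A toy calculation illustrates the point. It assumes SSR is traced at half the internal resolution, which is a common engine choice but varies per game:

```python
# Toy numbers for resolution-tied effects. Assumes SSR is traced at half
# the *internal* resolution - a common engine choice, but game-dependent.

out_w, out_h = 3840, 2160
scales = {"DLAA": 1.0, "Quality": 0.667, "Performance": 0.5}

for mode, s in scales.items():
    iw, ih = int(out_w * s), int(out_h * s)  # internal render resolution
    sw, sh = iw // 2, ih // 2                # half-res SSR buffer
    print(f"{mode:>11}: SSR traced at {sw}x{sh} "
          f"({(sw * sh) / (out_w * out_h):.1%} of output pixels)")
# DLAA traces SSR at 25% of output pixels; Performance mode at ~6%, which is
# also why ray reconstruction has far less signal to rebuild from.
```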

test the same games at the same settings and same resolution configuration
otherwise all of your arguments are worthless as you're comparing apples to oranges
unless you do so, I cannot take you seriously
have a good day
I tested some of the games from that list, but I don't have all of them, so I can't test every single one.
 

viveks86

Member
It's still worth running games at a higher internal resolution, because some effects, like screen space reflections, not to mention RT, are tied to internal resolution and will look much worse in the lower DLSS modes. Ray reconstruction is also extremely sensitive to internal resolution. DLAA + RR looks amazing, but in DLSS Performance mode ray reconstruction makes the image look like a painting (filtered / overprocessed).
Any comparisons out there of this?
 

PaintTinJr

Member
I don't think it will be possible to use FSR4 on an nvidia GPU.
And there isn't any reason for anyone to want that, because dlss is still the best upscaler.
One reason would be to take Nvidia's DLSS stick away and destigmatize GPUs that aren't Nvidia - resulting in better competition and a more open, inclusive standard for ML/AI scaling on PC, lowering GPU prices.
 

GHG

Gold Member
I don't think it will be possible to use FSR4 on an nvidia GPU.
And there isn't any reason for anyone to want that, because dlss is still the best upscaler.

Yeh I'm more asking out of curiosity (if it were possible). Also because with the rate of improvement AMD have shown here, there's every possibility that FSR4 might evolve to do certain things better than DLSS (so might end up being the better solution for certain individual games).
 

poppabk

Cheeks Spread for Digital Only Future
Holy shit did people actually use FSR 3.1? That looks like unrelenting dog shit. It's honestly so bad that it's not even worth testing.
It's only any good (as in an acceptable trade-off in image quality versus performance gain) in quality mode (at 4K output; not sure about any other output res) - anything below and it turns to absolute shit rapidly. DLSS3 suffered in quality in performance mode, but could work in balanced mode and was pretty much on par with native in quality mode. The new upscalers can finally give acceptable results in performance mode.
 

poppabk

Cheeks Spread for Digital Only Future
No, it is far more than that, given that consoles can't afford to waste silicon and pass the cost on to the consumer while trying to keep prices down.

These techniques (DLSS 3/4, FSR 4) use the AI matrix silicon (tensor cores) only for the tiny window between a frame's native render finishing and the 1-2 ms before the frame has to flip from the back buffer to the front buffer, so that silicon sits 80-90% idle unless it is used for offline AI/ML commercial workloads, as on Quadro or FirePro workstation GPUs.

The AI/ML silicon in the PS5 Pro is instead integrated into the generalised WGP/CUs, so developers can utilise it 100% throughout a frame's rendering. The solution is general enough that developers could create their own Generative Adversarial Network models that work at a sub-render-pass level in a deferred renderer, or operate per object type per frame, or could replace PSSR entirely for their game and enable completely different AI-inferenced rendering paradigms - not constrained to working at the end of a native frame, or on just one frame.
I knew this was a PaintTinJr post before I scrolled up to see the name.
 
Any comparisons out there of this?

Look at the car engine texture.

DLAA + RR vs DLSS Performance + RR


and a 2nd comparison


RR looks absolutely stunning at native / DLAA resolution. Sharpness is perfect and ray tracing effects are reconstructed without blur. Unfortunately, when you use DLSS Performance with Ray Reconstruction, texture details are no longer as sharp, and what's more, RR also makes the lighting less stable. It's impossible to show that difference in a static image, but the textures look like they start to boil.

DLSS Performance looks amazing, just not with RR on top of that.
 
I don't game on PC, but is there a rule saying AMD is not allowed to keep FSR 4 exclusive on AMD cards?

From a competitive perspective, I'm sure they'd want to.
 

PaintTinJr

Member
I knew this was a PaintTinJr post before I scrolled up to see the name.
And that matters why, exactly?

Should I know who you are, and be able to infer something from your drive-by comment?
Seems like a pretty random comment to make without even saying whether you agree or disagree.

What am I supposed to do with that? :)
 

viveks86

Member
Look at the car engine texture.

DLAA + RR vs DLSS Performance + RR


and a 2nd comparison


RR looks absolutely stunning at native / DLAA resolution. Sharpness is perfect and ray tracing effects are reconstructed without blur. Unfortunately, when you use DLSS Performance with Ray Reconstruction, texture details are no longer as sharp, and what's more, RR also makes the lighting less stable. It's impossible to show that difference in a static image, but the textures look like they start to boil.

DLSS Performance looks amazing, just not with RR on top of that.
Thanks! Yeah RR still has a ways to go. Whichever genius figures out how to stop the boiling with upscaling should get a big fat raise. I think it’s just a matter of time.
 