DLAA / DLSS image quality blind test

Which of the screenshots has the best image quality?

  • Screenshot number 1
    Votes: 11 (16.4%)
  • Screenshot number 2
    Votes: 14 (20.9%)
  • Screenshot number 3
    Votes: 13 (19.4%)
  • Screenshot number 4
    Votes: 29 (43.3%)

  • Total voters: 67
  • Poll closed.
When you say 90ms… you're talking end to end?

Either way, 10ms is significant to me. I work really hard to cut my latency to be as low as possible. And once your body gets used to that near instantaneous response, anything else feels off.

90ms engine + 10ms screen latency is, to me, the acceptable upper limit for non-competitive games.

Sadly, not all games are below that limit, even at 60fps.

So frame gen isn't the issue in itself; it's the end-to-end lag that ultimately matters.
If you play on PC with frame gen at a 120fps target and Nvidia Reflex enabled, chances are your total input lag is below nearly any console game that isn't a competitive shooter, and below most PC games that don't support Nvidia Reflex.
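The budget being argued here can be sketched as a tiny helper. The numbers and the one-frame frame-gen penalty are illustrative assumptions on my part, not measurements of any real pipeline:

```python
def end_to_end_latency(engine_ms: float, display_ms: float,
                       framegen: bool = False, target_fps: float = 120.0) -> float:
    """Rough end-to-end input-lag estimate in milliseconds.

    Frame generation has to hold back one rendered frame to interpolate
    between, so it is modeled here as adding roughly one frame time at
    the target framerate. Real pipelines differ; this is only a sketch.
    """
    total = engine_ms + display_ms
    if framegen:
        total += 1000.0 / target_fps  # one interpolation frame of delay
    return total

# the "90ms engine + 10ms screen" ceiling from the post above
BUDGET_MS = 100.0
```

Under these assumptions, frame gen at a 120fps target costs only about 8.3ms of extra delay, which is why it can still land well under the 100ms ceiling whenever the base engine latency is low.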
 
They all look fucking awful. PS5 Pro takes the win again. Number 4 is the only one without artifacting imo.

Isn't the biggest takeaway here, though, that the reason to play on PC is IQ, and that introducing DLSS, frame gen, etc. only serves to make it much worse? And no, no GPU is worth 2k to play games with a marginal improvement.
 
They all look fucking awful. PS5 Pro takes the win again. Number 4 is the only one without artifacting imo.

Isn't the biggest takeaway here, though, that the reason to play on PC is IQ, and that introducing DLSS, frame gen, etc. only serves to make it much worse? And no, no GPU is worth 2k to play games with a marginal improvement.

Show us PS5 Pro version in the same place to compare.
 
And no, no GPU is worth 2k to play games with a marginal improvement.

$1000 is even too much to pay, imo. That's pretty much why I'm done with the high end of PC gaming. And I mean "high end" as in the 4080/5080 tier, much less GPUs that cost twice as much. My next GPU will be the 5070 or 9070. I'm completely satisfied with 60fps gaming, and that's certainly achievable in the mid-range with minimal difference in image quality.
 
OP clearly made a super deceiving and devious comparison: there is zero movement and the pics are relatively small. We don't look at a 4K screen from the distances we looked at HD or Full HD screens; we need to be closer to appreciate it.
Even Nvidia themselves, in their own marketing video for DLSS/frame gen, used Wukong in motion/during fights to illustrate how it looks.

DF mentioned IQ issues in Wukong too:

Actually, I'm also very impressed with the DLSSP motion quality. In RDR2, even DLSS performance (with the updated 3.8.1 DLL) completely destroys TAA native, check it out for yourself. There's less shimmering during motion, and the image is also a lot sharper.

They all look fucking awful. PS5 Pro takes the win again. Number 4 is the only one without artifacting imo.

Isn't the biggest takeaway here, though, that the reason to play on PC is IQ, and that introducing DLSS, frame gen, etc. only serves to make it much worse? And no, no GPU is worth 2k to play games with a marginal improvement.
Dude, that is a very bold statement to make :D.

I only played the PS5 version, but the image quality looked like this:

PS5 performance mode 40-60fps

25d10d16247e97b0712c.jpg


16392e21f8f9f7fe582a.jpg


200c0f408188e0bb6aeb.jpg


My PC 80-100fps

b1-Win64-Shipping-2024-09-01-00-30-46-747.jpg

b1-Win64-Shipping-2024-09-01-00-07-52-582.jpg


b1-Win64-Shipping-2024-09-01-00-06-20-759.jpg

b1-Win64-Shipping-2024-09-01-00-25-53-709.jpg


If you have the PS5Pro, please upload your screenshots and we will be able to see how the image quality on the PS5Pro version compares to the PC version with DLSS. You don't have to go to the exact same place, but if you want to find it, here's a map.


a2.jpg


a1.jpg
 
the reason to play on PC is IQ, and that introducing DLSS, frame gen, etc. only serves to make it much worse?
Where did you take that from?

DLSS can look even better than native. No need for anti-aliasing anymore, image is crisp and clear.
 
At first I looked without zooming in and noticed that something changed, but nothing I could put my finger on. None of them looked inherently better or worse. Then I noticed some fine grass strands and branches look different, but it's hard to say which looks better.

Then I zoomed in: screenshot number 3 has the most sharpness, looking at some grass strands and the bamboo tree to the left, but it also looks the least anti-aliased (though it's still good IQ).
Screenshot number 1 seems to have the best anti-aliasing; it's smoother by a very small margin, but without seeing it in motion, it's hard to tell.
 
1.jpg is the best for me (picture 4), no contest. It has a smoother look, but things look fine.

All three others have shit going on (patterns etc.) on textures where it simply shouldn't happen. The rocks just to the left of the main character's arm have some real issues in all the other screenshots. Same for the bamboos to the left, and actually everywhere once you start looking...

We need to go back to native pictures and resolution, and give up on all of this shit... :(

Sans-titre.png



Also be careful, as the order of the links is the reverse order of the names of the pictures.
 
Where did you take that from?

DLSS can look even better than native. No need for anti-aliasing anymore, image is crisp and clear.
It can look better by losing detail, e.g. fine strands of aliased hair. The detail it can't resolve is lost; therefore it's a lossy upscaler that smooths over the lost data with AA or a form of it. I think when people speak of DLSS, a lot of them mean "the image is more pleasing to my eye" rather than "the image is higher quality". I'm just surprised that PC players fawn over what the tech essentially is. From a technical point of view I admit it's impressive, but from a fundamental point of view it's still smoke and mirrors, and that's before you even pick out clear artifacts like the ones in the images OP posted.

Top picture is 1080 DLSS and bottom is 1080 Native.

7KPw2ur.jpeg
XTG0L3h.jpeg


I'm not here to convince anyone; this is just what I see. I just don't buy into the whole PCMR thing or the cost/performance/game-improvement mantra. No matter how good the graphics card is, with PT and all the bells and whistles, the NPCs in the background of Cyberpunk still look like this (and this is coming from someone who was running high end in the days of Source, with 200-odd fps).

vhmNBWA.jpeg
 
Are you seriously using a base 720p resolution image to make your point? Why don't you use 4K native vs 4K DLSS? I suspect I know why...
I'm on my phone and they were just to hand, tbh. I had to resize on my phone to use the imgur uploader on the site.
 
It can look better by losing detail, e.g. fine strands of aliased hair. The detail it can't resolve is lost; therefore it's a lossy upscaler that smooths over the lost data with AA or a form of it.
I disagree. More below.

Top picture is 1080 DLSS and bottom is 1080 Native.

7KPw2ur.jpeg
XTG0L3h.jpeg
Honestly at first glance I'd say top looks better. After all, it's 1080p, so you can't ask for crazy level of detail. Bottom one is jagtown.

That said. What's the native resolution of top? Meaning, is it DLSS Quality or DLSS Performance?

Also... is the bottom one without any kind of AA? I doubt anyone plays their games like that, looks like shit, although maybe the non-zoomed in original pic looks better. Maybe a better comparison would be DLSS vs Native with AA?

In any case I still think DLSS Quality for 4K can look better than native 4K, probably even more so once the new model for DLSS releases. And for DLAA I'd say it's better, no question.

I'm not here to convince anyone; this is just what I see. I just don't buy into the whole PCMR thing or the cost/performance/game-improvement mantra. No matter how good the graphics card is, with PT and all the bells and whistles, the NPCs in the background of Cyberpunk still look like this (and this is coming from someone who was running high end in the days of Source, with 200-odd fps).

vhmNBWA.jpeg
Yeah well, I'd say that's a problem with the games and not just PC. Diminishing returns and all that. But hey it's an open-world so ofc it's going to look kinda shit when it's a scene with poor lighting.

Try comparing other, more linear games like Black Myth: Wukong or Space Marine 2. I'd guess you'll see more improvements there.
 
I'm on my phone and they were just to hand, tbh. I had to resize on my phone to use the imgur uploader on the site.
It's no secret that the more data you give temporal upscalers to work with, the better the results. 1080p native is usually significantly better than 1080p DLSS Q. 1440p is much closer, and 4K is often a wash at standard viewing distances.
 
I disagree. More below.


Honestly at first glance I'd say top looks better. After all, it's 1080p, so you can't ask for crazy level of detail. Bottom one is jagtown.

That said. What's the native resolution of top? Meaning, is it DLSS Quality or DLSS Performance?

Also... is the bottom one without any kind of AA? I doubt anyone plays their games like that, looks like shit, although maybe the non-zoomed in original pic looks better. Maybe a better comparison would be DLSS vs Native with AA?

In any case I still think DLSS Quality for 4K can look better than native 4K, probably even more so once the new model for DLSS releases. And for DLAA I'd say it's better, no question.


Yeah well, I'd say that's a problem with the games and not just PC. Diminishing returns and all that. But hey it's an open-world so ofc it's going to look kinda shit when it's a scene with poor lighting.

Try comparing other, more linear games like Black Myth: Wukong or Space Marine 2. I'd guess you'll see more improvements there.
Both were 1080p, but I'll have to wait till tomorrow to post both in full. It's zoomed in crazy, obviously, but the point is basically that softening the image loses detail (in all scenarios, not just games/PCs).

It's no secret that the more data you give temporal upscalers to work with, the better the results. 1080p native is usually significantly better than 1080p DLSS Q. 1440p is much closer, and 4K is often a wash at standard viewing distances.

And yes (as TintoConCasera alludes), I think ultimately the discussion should be around DLAA vs DLSS. One is focused on native IQ and one is focused on performance with lossy (albeit excellent) upscaling. The more complete the source, the better the upscale (assuming you don't go to 8K or 16K to see how far you can push it).

"DLSS is better than or equal to native" is just not right, and I think it's a band-aid for those who really want to run DLAA but don't want the power of their hardware questioned. There are a lot of variables, and some engines, some games, and some implementations of whatever AA the developer chooses play nicer with DLSS, of course. And DLSS is improving at a fair rate, but I've not seen any image comparison where it isn't obvious which one is native, as per OP's example (hoping I selected the right one, as the ordering seems messed up).

I would always choose native personally which I guess is the point of the thread.
 
"DLSS is better than or equal to native" is just not right
Lets agree to disagree then. :lollipop_content:

I think it can be, especially considering how bad games can look native with no AA, and how shitty some games' AA implementations can be.

But the cool thing is, native and AA are still there for those who want them. Although maybe they'll be fucked in the future if the trend ends up being more AI push instead of more raw power push.

It's going to be interesting to see where this goes.
 
And yes (as TintoConCasera alludes), I think ultimately the discussion should be around DLAA vs DLSS. One is focused on native IQ and one is focused on performance with lossy (albeit excellent) upscaling. The more complete the source, the better the upscale (assuming you don't go to 8K or 16K to see how far you can push it).

"DLSS is better than or equal to native" is just not right, and I think it's a band-aid for those who really want to run DLAA but don't want the power of their hardware questioned. There are a lot of variables, and some engines, some games, and some implementations of whatever AA the developer chooses play nicer with DLSS, of course. And DLSS is improving at a fair rate, but I've not seen any image comparison where it isn't obvious which one is native, as per OP's example (hoping I selected the right one, as the ordering seems messed up).

I would always choose native personally which I guess is the point of the thread.
Native isn't really a thing anymore; in this day and age, an anti-aliasing solution is needed. For instance, I used a hex editor to get rid of Cyberpunk's built-in TAA, and the aliasing is so bad that TAA is preferable. DLSS Q does offer some noticeable advantages over the ubiquitous TAA. Is the overall image quality superior with DLSS? Oftentimes it isn't, but the 30% performance increase makes the choice clear. DLSS also boasts key advantages such as the resolve of fine details, text, and other aspects of the image. On the flip side, it does have issues with ghosting, blurring, disocclusion artifacts, and other problems. Granted, TAA isn't very good, and there's even an entire subreddit dedicated to shitting on it.

While I do agree that "DLSS Q > native + TAA" is more often than not false, the differences are often minor enough that I can concede DLSS the win, since it's better in some important facets of the final image. The upgrade in motion clarity going from 45fps to 60fps offsets a lot of those losses.
 
Make videos for comparisons. I can use 20x mode in Lossless Scaling and get 1200fps, and you wouldn't notice it in still images, but you might throw up trying to use it when actually playing.
 
1.jpg is the best for me (picture 4), no contest. It has a smoother look, but things look fine.

All three others have shit going on (patterns etc.) on textures where it simply shouldn't happen. The rocks just to the left of the main character's arm have some real issues in all the other screenshots. Same for the bamboos to the left, and actually everywhere once you start looking...

We need to go back to native pictures and resolution, and give up on all of this shit... :(

Sans-titre.png



Also be careful, as the order of the links is the reverse order of the names of the pictures.

Yes, that's the way I see it as well.

DLSS has been heralded, yet it has many faults. It reminds me of checkerboarding: an upscale is an upscale, trying to resolve detail on the fly that isn't there.
 
It can look better by losing detail, e.g. fine strands of aliased hair. The detail it can't resolve is lost; therefore it's a lossy upscaler that smooths over the lost data with AA or a form of it. I think when people speak of DLSS, a lot of them mean "the image is more pleasing to my eye" rather than "the image is higher quality". I'm just surprised that PC players fawn over what the tech essentially is. From a technical point of view I admit it's impressive, but from a fundamental point of view it's still smoke and mirrors, and that's before you even pick out clear artifacts like the ones in the images OP posted.

Top picture is 1080 DLSS and bottom is 1080 Native.

7KPw2ur.jpeg
XTG0L3h.jpeg


I'm not here to convince anyone; this is just what I see. I just don't buy into the whole PCMR thing or the cost/performance/game-improvement mantra. No matter how good the graphics card is, with PT and all the bells and whistles, the NPCs in the background of Cyberpunk still look like this (and this is coming from someone who was running high end in the days of Source, with 200-odd fps).

vhmNBWA.jpeg
To be honest, The Witcher 3 has one of the worst DLSS implementations I have ever seen. Installing DLSS 3.8.1 helps a lot, though.

DLSSQ+FG max settings

4K-DLSSQ.jpg


FSRQ + FG

4K-FSRQ.jpg



TAA + FG native


4K-TAA.jpg


FXAA + FG

4K-FXAA.jpg


No AA + FG

4K-no-AA.jpg


FXAA and no AA look the sharpest, but with ray tracing the shadows are so detailed and sharp that the image starts to shimmer, so I don't recommend playing this game without AA. TAA / DLSS / FSR completely eliminate shimmering, but the image starts to look soft. Also, in this particular game it's hard to decide which temporal AA looks the best.


In RDR2 it was much easier for me to decide which AA looked the best.

TAA

N.jpg


DLSS Quality

Q.jpg


DLSS Performance

P.jpg
 
In RDR2 it was much easier for me to decide which AA looked the best.

TAA

N.jpg


DLSS Quality

Q.jpg


DLSS Performance

P.jpg

People really need to understand how bad TAA is in many games to get why people say DLSS is often better than native.
DLSS is TAA with ML-based error correction, and that error correction, whenever it works well, simply beats plain TAA.
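A minimal sketch of the hand-tuned machinery that the ML error correction replaces: exponential blending against a history buffer, with the history clamped to the current frame's local min/max to reject stale samples. This is an illustrative toy on a 2D luminance array (my own simplification, not any vendor's actual pipeline):

```python
import numpy as np

def taa_accumulate(history, current, alpha=0.1):
    """One step of temporal accumulation, the core of TAA.

    history/current: 2D luminance arrays. The history is clamped to the
    3x3 neighborhood min/max of the current frame (a standard rejection
    heuristic) before blending, which is what keeps ghosting in check.
    DLSS replaces this hand-tuned clamp with a learned model.
    """
    h, w = current.shape
    # build the 9 shifted views that make up each pixel's 3x3 neighborhood
    padded = np.pad(current, 1, mode="edge")
    stack = np.stack([padded[i:i + h, j:j + w]
                      for i in range(3) for j in range(3)])
    lo, hi = stack.min(axis=0), stack.max(axis=0)
    clamped = np.clip(history, lo, hi)   # reject stale history samples
    return alpha * current + (1 - alpha) * clamped
```

When the scene is static, the history converges and effectively supersamples; when it changes abruptly, the clamp discards the history, which is exactly where TAA smears or shimmers and where a learned correction can do better.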
 
Playing RE4, and man, FSR sucks, as does the default TAA option. Baffling that Capcom doesn't support DLSS in a lot of their games as well.

We need a better standard going forward. Thankfully FSR 4 looks to be a massive improvement, and the new transformer model in DLSS looks great.
 
OK guys, let's spill the beans :messenger_beaming:. I have updated the first post and you can also see the poll results.

Screenshot number 4 was the one with DLAA, and most people chose that answer, but I think some users might have spoiled the fun for others by not only giving the correct answer but also providing the evidence to support it. By zooming in so much, it was much easier to see which image had the best quality, so if people were willing to just read this thread, the answer was right in front of them on the first page.

I was most impressed by squidilix's answer. Not only did he guess where the DLAA was, but he also got all the other DLSS modes right.
 
What about people who chose the second screenshot?

Jennifer Lawrence Oops GIF
Both DLSSP and DLSSB provide an insane performance boost compared to native; hell, even DLSSQ offered an 82% boost. However, those who opt for DLAA will need to spend a lot of money to be satisfied with the image quality. The RTX 5090 seems to make sense from their perspective.
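For context on where boosts like that come from: each DLSS mode renders internally at a fixed fraction of the output resolution and upscales. A quick sketch using the commonly published per-axis scale factors (these are an assumption; individual games can and do deviate):

```python
# Commonly published per-axis DLSS scale factors (assumed, not per-game).
DLSS_SCALE = {
    "DLAA": 1.0,               # anti-aliasing only, no upscale
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output size and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

def shading_work(mode: str) -> float:
    """Fraction of native pixels actually shaded per frame."""
    return DLSS_SCALE[mode] ** 2
```

At 4K output, Quality renders 2560x1440 and shades roughly 44% of the native pixel count, which is broadly where an 80%+ framerate uplift in GPU-bound scenes comes from; DLAA shades all of it, hence the appeal of a 5090 for that crowd.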
 
Show us PS5 Pro version in the same place to compare.
PS5 Pro, Balanced mode. I use JPG format because I take hundreds of shots and it adds up on the SSD...
So try to ignore the compression, or skip these altogether.

The game looks very clean. Default sharpening value (5/10 I think)
llC9Srl.jpeg



KYlK0Yl.jpeg



mTh8Xhj.jpeg



ksPKvKQ.jpeg



3Q8Kvcn.jpeg



R8TIjAu.jpeg



ScAluPm.jpeg



1DFWDWu.jpeg
 
PS5 Pro, Balanced mode. I use JPG format because I take hundreds of shots and it adds up on the SSD...
So try to ignore the compression, or skip these altogether.

The game looks very clean. Default sharpening value (5/10 I think)
llC9Srl.jpeg



KYlK0Yl.jpeg



mTh8Xhj.jpeg



ksPKvKQ.jpeg



3Q8Kvcn.jpeg



R8TIjAu.jpeg



ScAluPm.jpeg



1DFWDWu.jpeg

Gameplay or photo mode?

You and/or OP need to sync up place and sharpness; I'm curious how they compare 1:1. Wukong has the best PSSR implementation in UE5 so far.
 
Gameplay or photo mode?

You and/or OP need to sync up place and sharpness; I'm curious how they compare 1:1. Wukong has the best PSSR implementation in UE5 so far.
Mix. Some are photo mode and some are gameplay.
I can take the exact OP photo in .png, no problem. Where is it?
And yeah, what sharpness?
Should I use Quality mode instead of Balanced?
 
$1000 is even too much to pay, imo. That's pretty much why I'm done with the high end of PC gaming. And I mean "high end" as in the 4080/5080 tier, much less GPUs that cost twice as much. My next GPU will be the 5070 or 9070. I'm completely satisfied with 60fps gaming, and that's certainly achievable in the mid-range with minimal difference in image quality.
Hell, at this point those cards are enough to do 4K. I'm just over it, especially with engines like Unreal 5 skyrocketing system requirements pointlessly. Frostpunk 2 is literal insanity for what's on display versus what's required to run it.
 
PS5 Pro, Balanced mode. I use JPG format because I take hundreds of shots and it adds up on the SSD...
So try to ignore the compression, or skip these altogether.

The game looks very clean. Default sharpening value (5/10 I think)
llC9Srl.jpeg



KYlK0Yl.jpeg



mTh8Xhj.jpeg



ksPKvKQ.jpeg



3Q8Kvcn.jpeg



R8TIjAu.jpeg



ScAluPm.jpeg



1DFWDWu.jpeg

There's clearly a big difference compared to the base PS5, but the image quality on the PS5 Pro still doesn't look good, to be honest, and I'm taking into account that your screenshots are way more compressed. The details are clearly upscaled, and there's also excessive image sharpening.
 
Gameplay or photo mode?

You and/or OP need to sync up place and sharpness; I'm curious how they compare 1:1. Wukong has the best PSSR implementation in UE5 so far.
OK. Found the exact spot. Maybe the color is different because I fast traveled here from chapter 4?
Corporal.Hicks

PS5 PRO
Balanced mode 5/10 sharpening
Black-Myth-Wukong-20250113182957-BAL50.png


Quality 5/10 sharpening
Black-Myth-Wukong-20250113183110-QUAL50.png


Balanced 0/10 sharpening
Black-Myth-Wukong-20250113183017-BAL0.png


Quality 0/10 sharpening
Black-Myth-Wukong-20250113183129-QUAL0.png
 
OK. Found the exact spot. Maybe the color is different because I fast traveled here from chapter 4?
Corporal.Hicks

PS5 PRO
Balanced mode 5/10 sharpening
Black-Myth-Wukong-20250113182957-BAL50.png


Quality 5/10 sharpening
Black-Myth-Wukong-20250113183110-QUAL50.png


Balanced 0/10 sharpening
Black-Myth-Wukong-20250113183017-BAL0.png


Quality 0/10 sharpening
Black-Myth-Wukong-20250113183129-QUAL0.png
50% sharpening is too strong, while without a sharpening filter the image is too soft. I would probably set the in-game sharpening to something like 10%, maybe 20%, and use TV sharpening to improve detail, because at 50% the in-game sharpening mask is ruining the image.
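The "sharpening mask" being tuned here is essentially an unsharp mask: the image minus a blurred copy, scaled by the slider and added back. A toy grayscale version shows why a high amount produces edge halos while 0 just returns the soft input; this is purely illustrative, not the game's actual filter:

```python
import numpy as np

def unsharp_mask(img: np.ndarray, amount: float) -> np.ndarray:
    """Sharpen by adding back the difference from a 3x3 box blur.

    `amount` plays the role of the 0..1 in-game slider: 0 returns the
    (soft) input unchanged; large values overshoot at edges (halos).
    """
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    # 3x3 box blur via the 9 shifted neighborhood views
    blur = sum(padded[i:i + h, j:j + w]
               for i in range(3) for j in range(3)) / 9.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)
```

A flat region is untouched at any amount (the image equals its own blur there), while a hard edge gets pushed toward pure black/white, which is the over-sharpened crunch visible at the 50% setting.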
 
50% sharpening is too strong, while without a sharpening filter the image is too soft. I would probably set the in-game sharpening to something like 10%, maybe 20%, and use TV sharpening to improve detail, because at 50% the in-game sharpening mask is ruining the image.
50 is too strong, but I'm surprised by how much I like the result.
 