DF: Nioh 2 DLSS Analysis | DLSS 2.0 has better image quality than native

The comparison is with the native TAA solution, which in itself introduces significant blur, not with a supersampled image.
We had Death Stranding, with a hyped (!!!) screenshot: the grass on the right was improved (no doubt, very impressively), but the bush above the dude was clearly blurred out, and it still carried the "better than native" claim.
 
1440p + DLSS 2.0 is sharper than native 1440p, but it has haloing artifacts and can occlude small details like particles.

Native 4K is sharper than any DLSS 2.0. If that appears not to be true in a particular game, it's because the TAA is ruining image quality.
 
We had Death Stranding, with a hyped (!!!) screenshot: the grass on the right was improved (no doubt, very impressively), but the bush above the dude was clearly blurred out, and it still carried the "better than native" claim.
That's what I mean. Parts of the image will look worse and parts will look better when compared with the game running at native resolution with an inferior TAA solution. And for that you get considerably better performance and better AA. Lowering the resolution alone gets you better performance and nothing else.
 
That's what I mean. Parts of the image will look worse and parts will look better when compared with the game running at native resolution with an inferior TAA solution. And for that you get considerably better performance and better AA. Lowering the resolution alone gets you better performance and nothing else.
It gives you a bigger performance gain as well (and the improved part is, again, just the lines).
TAA blur isn't caused just by lowering resolution.
 
Then it will be a very interesting comparison in the end between whatever it is AMD is developing, the Sony upscaling solutions and DLSS 2.0. Interesting times!
I am very skeptical of AMD having any sort of reasonable answer to DLSS, considering DLSS requires Tensor cores and there is no such hardware in the consoles. On top of using the TMUs for RT, making them allocate compute time to some DLSS-equivalent solution would bog down the already bandwidth-starved consoles even more. The solution would have to be more efficient than their reconstruction pipeline and offer better visuals than TAA at the same FPS. Considering AMD's next-gen GPUs don't stack up at all to the Ampere boards in terms of innovations, I doubt they will have something decent. We'll see...
 
Not sure how a 4K image can resolve to a "jagged mess" under any circumstances, but whatever.

I did chuckle a bit at the dismissal of the 2060 as a weak GPU, considering it's the price (or more) of a console.
 
Not sure how a 4K image can resolve to a "jagged mess" under any circumstances, but whatever.

I did chuckle a bit at the dismissal of the 2060 as a weak GPU, considering it's the price (or more) of a console.
It is the lowest spec card available that can run DLSS.
 
It gives you a bigger performance gain as well (and the improved part is, again, just the lines).
TAA blur isn't caused just by lowering resolution.
Right, but the point is that with TAA, lines end up "ghosting" in motion, and that ghosting hides detail that is originally in the image (outside of the lines themselves). So by cleaning up the ghosting you make parts of the image look sharper. I think that's why elements of the scenery look sharper in the Death Stranding videos with DLSS on, since the camera is constantly in motion.
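To make that concrete, here's a rough sketch of why generic TAA ghosts and how clamping the history helps (my own toy code, not any engine's actual implementation):

```cpp
#include <algorithm>

struct Color { float r, g, b; };

static Color lerp(const Color& a, const Color& b, float t) {
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

static Color clampColor(const Color& c, const Color& lo, const Color& hi) {
    return { std::clamp(c.r, lo.r, hi.r),
             std::clamp(c.g, lo.g, hi.g),
             std::clamp(c.b, lo.b, hi.b) };
}

// Naive TAA resolve for one pixel: blend the reprojected history sample
// with the current frame. Because ~90% of the result is history, stale
// samples from where an edge used to be smear into a ghost trail in motion.
Color taaResolveNaive(Color history, Color current, float blend = 0.1f) {
    return lerp(history, current, blend);
}

// Common mitigation: clamp the history to the min/max of the current
// frame's 3x3 neighborhood, so a stale sample can't stay wildly different
// from what's on screen now. My understanding is that DLSS 2 replaces
// heuristics like this with a learned rejection, which is why its trails
// clean up better while keeping the accumulated detail.
Color taaResolveClamped(Color history, Color current,
                        Color nbMin, Color nbMax, float blend = 0.1f) {
    return lerp(clampColor(history, nbMin, nbMax), current, blend);
}
```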
 
He probably loves AMD and thinks they can't do anything wrong, so if "evil" Nvidia is better at something, he attacks it. DLSS 2.0 so far is quite amazing and shits all over VRS, for example (I still have hope for mesh shading), in terms of giving performance; at the same time it can also IMPROVE IQ in games, which is great (and surprising), or offer just a slight reduction of it (mostly in games with RT, where reflection res is lower).
Actually I'm green team.
Any AMD fan in this forum can tell you.
 
I am very skeptical of AMD having any sort of reasonable answer to DLSS, considering DLSS requires Tensor cores and there is no such hardware in the consoles. On top of using the TMUs for RT, making them allocate compute time to some DLSS-equivalent solution would bog down the already bandwidth-starved consoles even more. The solution would have to be more efficient than their reconstruction pipeline and offer better visuals than TAA at the same FPS. Considering AMD's next-gen GPUs don't stack up at all to the Ampere boards in terms of innovations, I doubt they will have something decent. We'll see...
There was a DLSS implementation in Control called DLSS 1.9 that ran on shaders; the results weren't as good as DLSS 2 in quality, but performance-wise it had an even smaller impact.
BTW, the RTX 2060 struggles to hit 60 fps even with DLSS on, so that's good performance by the PS5 version.
 
Didn't get round to buying this on PS5 in the last few weeks, so I think I'll just get the PC version instead, considering the DLSS implementation is good.
 
Actually I'm green team.
Any AMD fan in this forum can tell you.

So I don't get your hate (?) for DLSS, considering many people who tried it (including me) are happy with the results, and as DF proves, it can give better results than the native image in the case of Nioh 2, a game that doesn't have any efficient AA solution without it.

Cherry picking works both ways; screenshots from a site hyping the hell out of DLSS 2 (Death Stranding):

CAS:

[screenshot: l1J2Cv1.png]

DLSS 2:

[screenshot: SnpwykJ.png]

[screenshot: UeykcQW.jpg]

You know setting a negative LOD bias would fix the slight blur in DS? And reflection quality is lower, of course; RT reflections are based on the internal resolution, which is lower with DLSS.
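For anyone curious, that fix is literally one sampler field. A minimal sketch assuming D3D11 (the bias value follows the usual rule of thumb from DLSS integration guidance, log2(renderWidth / displayWidth), roughly -0.58 for Quality mode; treat the exact number as something to tune):

```cpp
#include <d3d11.h>

// Sketch: create a texture sampler with a negative MipLODBias so the GPU
// selects sharper (higher-resolution) mip levels than the lowered internal
// rendering resolution would otherwise pick. -0.58f assumes DLSS Quality
// mode, where internal res is ~2/3 of output per axis: log2(2/3) = -0.58.
HRESULT createBiasedSampler(ID3D11Device* device, ID3D11SamplerState** outSampler) {
    D3D11_SAMPLER_DESC desc = {};
    desc.Filter         = D3D11_FILTER_ANISOTROPIC;
    desc.AddressU       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressV       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.AddressW       = D3D11_TEXTURE_ADDRESS_WRAP;
    desc.MaxAnisotropy  = 16;
    desc.MipLODBias     = -0.58f;  // log2(renderWidth / displayWidth)
    desc.ComparisonFunc = D3D11_COMPARISON_NEVER;
    desc.MinLOD         = 0.0f;
    desc.MaxLOD         = D3D11_FLOAT32_MAX;
    return device->CreateSamplerState(&desc, outSampler);
}
```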
 
So I don't get your hate (?) for DLSS, considering many people who tried it (including me) are happy with the results, and as DF proves, it can give better results than the native image in the case of Nioh 2, a game that doesn't have any efficient AA solution without it.



You know setting a negative LOD bias would fix the slight blur in DS? And reflection quality is lower, of course; RT reflections are based on the internal resolution, which is lower with DLSS.
Hate? I do like DLSS, and it is one of the key points to buy nVidia cards over AMD (another disappointment).

That doesn't change that the title of the thread is a lie, that the article's author did a dumb comparison (4K with good AA reaches a domain not possible with DLSS 2.0), or all the narrative being made here... native 4K gives you better IQ than DLSS 2.0.

The reason DLSS exists is performance and not IQ.
 
2560x1080 native:

[screenshot: d90Nioh2TheCompleteEdit.png]

2560x1080 DLSS Q:

[screenshot: 2e9Nioh2TheCompleteEdit.png]

5120x2160 native:

[screenshot: EHO5gkM.jpg]

5120x2160 DLSS Q:

[screenshot: qIrIImT.jpg]

5120x2160 with DLSS B (final settings for my DSR):

[screenshot: n9doehG.jpg]

And how it looks on my monitor:

[screenshot: f63Nioh2TheCompleteEdit.png]


Hate? I do like DLSS, and it is one of the key points to buy nVidia cards over AMD (another disappointment).

That doesn't change that the title of the thread is a lie, that the article's author did a dumb comparison (4K with good AA reaches a domain not possible with DLSS 2.0), or all the narrative being made here... native 4K gives you better IQ than DLSS 2.0.

The reason DLSS exists is performance and not IQ.

DLSS fixes things like TAA ghosting, or TAA's poor coverage of alpha textures in some games (DS, Control), so it CAN produce more pleasing results than the native image with TAA.
 
If you aren't posting framerates then your comparison is pointless. It seems you don't even understand what DLSS was created for in the first place.
I have never challenged that "running at 2.2 times lower resolution improves framerates", only the "better than native" lie.
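For anyone checking the "2.2 times" math (assuming a 4K output with DLSS Quality's 2560x1440 internal resolution):

```latex
\frac{3840 \times 2160}{2560 \times 1440} = \frac{8\,294\,400}{3\,686\,400} = 2.25
```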
 
I have never challenged that "running at 2.2 times lower resolution improves framerates", only the "better than native" lie.

It looks better than any other option out there matching the same/similar performance. You can try to spin that FACT however you want.
 
It looks better than any other option out there matching the same/similar performance. You can try to spin that FACT however you want.
I have NEVER challenged DLSS 2 being the best TAA derivative out there either.
Only the "better than native" lie.
 
I have NEVER challenged DLSS 2 being the best TAA derivative out there either.
Only the "better than native" lie.

You challenged it when you tried to compare it to FidelityFX without showing anything about PERFORMANCE, the entire reason all of this stuff was created in the first place. Fact is, you can't show anything that looks as good at matching performance, whether you use a custom resolution, TAAU, a sharpening filter, or anything else.
 
The second example has nothing to do with FidelityFX CAS.

I'm not arguing with the point outlined by FireFly earlier on this page; it's a fair one.
The outright lie about "better than native" is what bugs me.

You will never hear me trumpeting DLSS being better than native. I don't care about that. All that matters is it being the best option to regain performance. It's that simple. Detractors look like fools trying to downplay it.
 
Armorian, comparing them side by side, the DLSS picture is blurrier and missing texture detail. Also, one thing I want gone from future DLSS versions is the sharpening effect.

Also, DLSS aside, less aliasing =/= better. Certain games do well with a sort of filmic, blurred look (assuming the resolution is high enough), but in general I value sharpness. And inserting TAA into a game that wasn't made with it is a general no-no. Some TAA is downright horrible, though it's come a long way since Halo Reach.
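On the sharpening effect: the halos come with the territory of any unsharp-mask-style filter, which overshoots on both sides of a hard edge. A toy 1D illustration (my own; DLSS's actual sharpening filter isn't public):

```cpp
#include <cstdio>
#include <vector>

// Toy 1D unsharp mask: sharpened = original + amount * (original - blurred).
// On a hard edge, the (original - blurred) term goes negative on the dark
// side and positive on the bright side, producing the dark/bright "halo".
int main() {
    std::vector<float> img = {0, 0, 0, 0, 1, 1, 1, 1}; // a hard edge
    const float amount = 1.0f;
    for (size_t i = 1; i + 1 < img.size(); ++i) {
        float blurred   = (img[i - 1] + img[i] + img[i + 1]) / 3.0f;
        float sharpened = img[i] + amount * (img[i] - blurred);
        std::printf("pixel %zu: %.2f -> %.2f\n", i, img[i], sharpened);
    }
    // Around the edge: 0.00 -> -0.33 (dark halo), 1.00 -> 1.33 (bright halo).
    return 0;
}
```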
 
Armorian, comparing them side by side, the DLSS picture is blurrier and missing texture detail. Also, one thing I want gone from future DLSS versions is the sharpening effect.
The reason the DLSS image looks better overall is that it cleans up so much of the artifacting. The texture detail only seems to be affected at certain angles (definitely something that probably can't be helped when you are MIP mapping/filtering textures and then doing an image analysis on top of that). I'd more than happily take the DLSS image over the native one. I use DLSS in every game that has it available, because the FPS hit is too great to enjoy the game without it, and it doesn't detract from the overall rendering of the scene significantly enough for me to care. It's about the only thing I can give a pass to concerning "tricks" to allow faster rendering times.
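On the MIP mapping point: in the standard (simplified) model, the GPU picks the mip level from the screen-space UV derivatives, which is why a lower internal resolution costs texture sharpness in the first place:

```latex
% Mip level from screen-space texture-coordinate derivatives (simplified):
\lambda = \log_2 \max\!\left( \left\lVert \tfrac{\partial uv}{\partial x} \right\rVert,\;
                              \left\lVert \tfrac{\partial uv}{\partial y} \right\rVert \right)
% Rendering at scale s per axis (s = 2/3 for DLSS Quality) multiplies the
% derivatives by 1/s, raising \lambda by -\log_2 s \approx 0.58, i.e. a
% blurrier mip; a matching negative LOD bias of \log_2 s cancels it.
```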
 
Watching the video, I couldn't get past the irony of zooming in 400% to note differences in pixels when, even zoomed out, the game looks like it came from the PS3, with clunky models and awful textures. I get what the purpose of the video is, I really do, but I feel there's something to be said about focusing on superficial things these days rather than the whole picture.
 
1440p + DLSS 2.0 is sharper than native 1440p, but it has haloing artifacts and can occlude small details like particles.

Native 4K is sharper than any DLSS 2.0. If that appears not to be true in a particular game, it's because the TAA is ruining image quality.
this
 
Watching the video, I couldn't get past the irony of zooming in 400% to note differences in pixels when, even zoomed out, the game looks like it came from the PS3, with clunky models and awful textures. I get what the purpose of the video is, I really do, but I feel there's something to be said about focusing on superficial things these days rather than the whole picture.
I play on a projector; 400% is a matter of life and death.
 
The reason the DLSS image looks better overall is that it cleans up so much of the artifacting. The texture detail only seems to be affected at certain angles (definitely something that probably can't be helped when you are MIP mapping/filtering textures and then doing an image analysis on top of that). I'd more than happily take the DLSS image over the native one. I use DLSS in every game that has it available, because the FPS hit is too great to enjoy the game without it, and it doesn't detract from the overall rendering of the scene significantly enough for me to care. It's about the only thing I can give a pass to concerning "tricks" to allow faster rendering times.
As of right now, I would only use it in a resource-heavy game that I can't run with all the bells and whistles without it. I guess if I played online competitively it would be more useful to me. I am looking forward to seeing how it improves over the years.

But again, aliasing is not the only metric of image quality.
---

I know we're talking about DLSS, but as it kind of factors in, I've not been thrilled with how commonplace TAA has become in games, almost like a universal solution for western developers at this point. In terms of sharp AA methods, I liked when MLAA was developed on PS3 as a cheap but still sharp alternative to MSAA; then SMAA (1x, as it's called in Crysis 3, without the temporal component) further improved the technique. Unless it's a really good TAA solution and it fits the content (e.g. Ratchet and Clank, Bluepoint games) I'd rather SMAA be used, especially at native 4K, where the image doesn't need much AA to begin with, at least at the TV sizes I have and my distance from them.

Even back in the 360 generation at 720p, in hindsight I appreciated games like Gears 3 having no AA instead of blurry FXAA or the much worse experimental TAA stuff devs had back then. I am also not thrilled at the low resolutions Nintendo currently uses for some of their games on Switch, but it is preferable that they have no AA as opposed to FXAA or TAA blurring them further.

An extreme example of the horrors of TAA can be seen in id Tech games on the Switch, sans Doom Eternal, which has no AA. I've never seen a worse-looking game than Wolfenstein 2 on the Switch in terms of image quality. 360p + TAA :messenger_fearful:
 
Have you watched the video? DLSS wins in many games with TAA, but in Nioh, with its (shit) post-process AA, it completely destroys the native image. DLSS only really loses when comparing RT reflections in games that use them, as those are rendered at the internal resolution, but of course this doesn't apply to Nioh 2.

[comparison screenshots: jK1bZi1.png, yrHzXra.png, jimDNBD.png, 220C4ez.png]

Well done sir. Facts > Opinions.
 