[nVidia] Nioh 2 - The Complete Edition | 4K NVIDIA DLSS Comparison



Experience the Action RPG, Nioh 2 - The Complete Edition, enhanced with NVIDIA DLSS for the ultimate boost in performance. Available now.

Apparently the DLSS update is available now?

Screen grabs from the video (basically a 30-ish FPS boost)

---------------------------

Found a more in-depth comparison for the game, native vs DLSS. The video is available up to 4K/60 and is nice quality, so we can judge for ourselves. DLSS looks fantastic to me. I own the game on PS5, so I'm not gonna double dip just to check on my 3080. There are other comparisons out there too, if people want more examples; I won't flood the thread further.




Many people on Reddit are saying the game looks better in some areas with DLSS on, due to the better AA (I guess the game on PC has shoddy AA without DLSS?). Yes, the source is Reddit, so YMMV. Just sayin'. Here is one thread, but many more are out there.

 
Why can't we have an Nvidia GPU inside the consoles?

Imagine all the performance gains using this stuff...
 
4K/90+ on an RTX 3070?

Every game that doesn't implement DLSS should have its developers dragged through the streets, Game of Thrones style.
 
This is pretty good.

Ever since the FFXV days I knew this tech could be a game changer. Glad to see it wasn't just marketing.
 
Anoth.....

This is amazing. With more and more Sony exclusives coming to PC, we can finally play Sony games at 60-120 fps at the highest settings. No need to wait for the Pro. Demon's Souls at 4K/120 fps? Yes please.
 
No offense meant to OP, but those comparisons are worthless; it's like comparing two paintings to see which one is the copy, but doing it by looking through binoculars from 100m away. You need the native-resolution images, and they need to be not encoded in (too) lossy a format.

Also, I would like to see them start showing comparisons where the native 4K is sharpened as well, for fairness.

I already regret writing this because I sound like a moaning bastard while you all enjoy DLSS so much :pie_disappointed:
 
No offense meant to OP, but those comparisons are worthless; it's like comparing two paintings to see which one is the copy, but doing it by looking through binoculars from 100m away. You need the native-resolution images, and they need to be not encoded in (too) lossy a format.

Also, I would like to see them start showing comparisons where the native 4K is sharpened as well, for fairness.

I already regret writing this because I sound like a moaning bastard while you all enjoy DLSS so much :pie_disappointed:

Fair points.

They are meant just to show the FPS difference, as visual clarity isn't really gonna show up in a screengrab of a YT video. That's all we've got, though, unless there's a blog release of some sort from NVIDIA with straight grabs from the source. It's just a personal habit; I don't like posting just a link or a vid with nothing else. Maybe I need to get over that.

I figure the screens are better than just writing out "61 fps native, 90 fps DLSS"
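
For the curious, converting those two numbers to frame times shows the gain more directly (a quick sketch using the 61/90 fps figures above; the numbers are from the video, not my own testing):

```python
# Frame-time view of the 61 fps native vs 90 fps DLSS figures above.
native_fps, dlss_fps = 61.0, 90.0

native_ms = 1000.0 / native_fps  # ~16.4 ms per frame
dlss_ms = 1000.0 / dlss_fps      # ~11.1 ms per frame

print(f"native: {native_ms:.1f} ms/frame, DLSS: {dlss_ms:.1f} ms/frame")
print(f"fps gain: {dlss_fps / native_fps - 1:.0%}")       # ~48%
print(f"frame time saved: {native_ms - dlss_ms:.1f} ms")  # ~5.3 ms
```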

If better screens/comparisons pop up, or are provided, I will happily update the OP and credit them

Make no mistake about it, though: DLSS 2.0 is a game changer. Flat out. Especially when you can add a slight sharpening filter in the NVIDIA Control Panel to cut through any slight blur it may add (I usually go between 0.12 and 0.15 sharpening, never need more; it makes a big difference in Control and costs like 1 FPS).

Metro Exodus is gonna blow us away when that upgrade goes live, I bet.
 
Thanks for this update, OP. I was waiting for it in order to play the game as smoothly as possible, and it seems now I can.
 
I hope that at least one of the console manufacturers is trying to come up with something; it has been touted, but so far nothing.
 
Whenever consoles get DLSS or the AMD equivalent we will all profit. Until then... the waiting game.
Consoles and AMD's solutions are likely 3-5 years behind Nvidia and lack much of the dedicated hardware needed to produce this level of "free" performance. After DLSS 1.0 crashed and burned, AMD thought it was another PhysX-style feature they could ignore, with CAS to fall back on; then DLSS 2.0 caught them with their pants down and has them scrambling to hack something together without the hardware to really support it.
 
DLSS would get less performance than CBR. DLSS is only good for PC since the only other option is native res or adaptive sharpening.
Bro, I'm no expert, but as far as I've seen, DLSS gives a greater performance boost and better framerates.

I mean, it's AI tensor cores doing their mumbo jumbo vs painting half of the pixels on screen, right?
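
For anyone who hasn't seen the "half the pixels" part spelled out, here's a toy sketch of checkerboard rendering's pixel budget (my own illustration, not any engine's actual code):

```python
import numpy as np

# Each CBR frame shades only the pixels on one "colour" of a
# checkerboard, alternating every frame; the reconstruction step
# then fills the gaps using data from the previous frame.
h, w = 8, 8
y, x = np.indices((h, w))
frame_parity = 0  # flips between 0 and 1 each frame

mask = (x + y) % 2 == frame_parity  # True = pixel shaded this frame
print(mask.astype(int))
print(f"pixels shaded this frame: {mask.mean():.0%}")  # 50%
```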
 
DLSS would get less performance than CBR. DLSS is only good for PC since the only other option is native res or adaptive sharpening.
Yeah, not so sure about that (but I'm not a huge expert).

Horizon and Days Gone use CBR and they still run at 30 fps, with some slowdown here and there.

On PC, DLSS is the difference between playing at 30 vs 60, with even less fidelity loss compared to CBR in most cases.
 
No offense meant to OP, but those comparisons are worthless; it's like comparing two paintings to see which one is the copy, but doing it by looking through binoculars from 100m away. You need the native-resolution images, and they need to be not encoded in (too) lossy a format.

Also, I would like to see them start showing comparisons where the native 4K is sharpened as well, for fairness.

I already regret writing this because I sound like a moaning bastard while you all enjoy DLSS so much :pie_disappointed:
I don't know why, but before you posted that I never bothered to check native images against DLSS.

I went to Eurogamer's Control DLSS 2.0 article, and it seems that 1080p upscaled to 4K with 2.0 is blurrier than native 4K, but 1440p upscaled to 4K with DLSS 2.0 is sharper. That said, I did see some haloing artifacts around objects like the main character in that game, so it's still not perfect, and native 4K is probably still the way to go for purists. That is, if you have an infinite amount of resources, which we don't yet for that game lol. I do not like sharpening, so if they're using it I would like it to be optional; that would explain the haloing.
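
On the haloing: that's exactly what a sharpening pass tends to do at hard edges. Here's a toy unsharp-mask sketch of where halos come from (my illustration, not the actual filter Control or DLSS uses):

```python
import numpy as np

# Sharpening boosts the difference between the image and a blurred
# copy; at a hard edge that difference overshoots on both sides, and
# the overshoot is the bright/dark halo.
edge = np.array([0.0] * 6 + [1.0] * 6)  # a hard edge
blurred = np.convolve(edge, [0.25, 0.5, 0.25], mode="same")
sharpened = edge + 1.5 * (edge - blurred)  # sharpening amount = 1.5

print(np.round(sharpened[1:-1], 2))
# [ 0.  0.  0.  0. -0.38  1.38  1.  1.  1.  1. ]
# the -0.38 dark fringe and 1.38 bright fringe next to the edge are the halo
```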

I couldn't find native images of Death Stranding on their site, but I remember DF said it made some tiny details like particles disappear.

Basically, it's not quite magic and there are trade-offs, but if they can iron out these minor issues in a 3.0 version, at the very least it would greatly enhance rendering at whichever resolution we choose, even if it's not quite as good in extreme comparisons like 1080p DLSS vs native 4K. But then again, the leap from prior versions to 2.0 was massive, so there's no telling how good it will get.

The purist in me would hope Nintendo holds out for a 3.0 version so these artifacts go away though.
 
In before it's just AA?

Great stuff as usual. We need DLSS-like tech everywhere. People complaining the quality isn't there are just delusional.
 
Ask Microsoft with their first Xbox.
Of course, the main problem there being that Nvidia never allowed MS to do die shrinks on the GPU, no doubt one reason why they axed the console so fast.

Then Nvidia f'd Sony over by selling them a weaker GPU than what ATI did for the 360. I find it funny that Nintendo was the only one not screwed over by Nvidia to some degree lol.
 
Bro, I'm no expert, but as far as I've seen, DLSS gives a greater performance boost and better framerates.

I mean, it's AI tensor cores doing their mumbo jumbo vs painting half of the pixels on screen, right?
DLSS 1440p has about a 20% worse frame rate compared to native 1440p. The tensor cores aren't doing all the work at all.
The tensor cores on the RTX cards also take up about 20% of the chip space. So if consoles had a DLSS-type solution like Nvidia's, we're looking at maybe ~9 TF equivalents instead of 10-12, just so we could have DLSS? And the image quality would be... marginally better than the CBR we've seen in Spider-Man?
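
Putting rough numbers on both claims (back-of-envelope only; the 20% figures are from the post above, and the 100 fps baseline is made up for illustration):

```python
# Implied fixed cost of the upscale step, from "20% worse frame rate".
native_1440p_fps = 100.0                      # hypothetical baseline
dlss_4k_quality_fps = native_1440p_fps * 0.8  # "20% worse" claim

overhead_ms = 1000.0 / dlss_4k_quality_fps - 1000.0 / native_1440p_fps
print(f"implied upscale cost: {overhead_ms:.1f} ms/frame")  # 2.5 ms

# The die-space point: shaving ~20% off a 10-12 TF shader budget.
for tf in (10.0, 12.0):
    print(f"{tf:.0f} TF -> ~{tf * 0.8:.1f} TF of shaders left")  # 8.0 / 9.6
```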
 
DLSS quality, i.e. an internal res of 1440p, has a worse frame rate than just running at native 1440p.
I set my games at 1440p and they run much better with DLSS on, so I think we aren't understanding each other here.
 
He told you that "DLSS quality" is a euphemism for "I've rendered it at 1440p and used fancy temporal upscaling", and that just rendering at 1440p is, shocker, even faster.

Uh, I don't get what the issue is. Sorry.

If the core argument is that CB rendering is as good as DLSS 2.0, I just respectfully disagree. Sorry again. Just IMO.
 
I set my games at 1440p and they run much better with DLSS on, so I think we aren't understanding each other here.
OK, I'll be clearer.
DLSS 4K quality, i.e. an internal res of 1440p, has a lower frame rate than 1440p native. My point is the tensor cores are not taking the entire load; you are sacrificing a lot of shader performance as well.
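
For reference, the commonly reported internal resolutions behind the DLSS 2.x modes at a 4K output (exact scale factors can vary per title):

```python
# DLSS 2.x render-scale cheat sheet (commonly reported factors).
modes = {
    "Quality": 2 / 3,            # 4K output -> 2560x1440 internal
    "Balanced": 0.58,            # 4K output -> ~2227x1253
    "Performance": 1 / 2,        # 4K output -> 1920x1080
    "Ultra Performance": 1 / 3,  # 4K output -> 1280x720
}
out_w, out_h = 3840, 2160
for mode, scale in modes.items():
    w, h = round(out_w * scale), round(out_h * scale)
    print(f"{mode:>17}: {w}x{h} ({scale:.0%} per axis)")
```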
 
Uh, I don't get what the issue is. Sorry.

If the core argument is that CB rendering is as good as DLSS 2.0, I just respectfully disagree. Sorry again. Just IMO.

You are in a thread full of orgasms over 1440p => 4K temporal upscaling, with images in the OP having a resolution of 1151x615.

The "better than native" and "indistinguishable from native" horse has been beaten to death a number of times on this very forum.
 
OK, I'll be clearer.
DLSS 4K quality, i.e. an internal res of 1440p, has a lower frame rate than 1440p native. My point is the tensor cores are not taking the entire load; you are sacrificing a lot of shader performance as well.
But it looks much better, and I think that performance-wise it depends on the game.

Just my opinion ofc, but I vastly prefer DLSS to dynamic resolutions.
 
Serious question:

Is nVidia correct, or did I fuck that up too? I always thought it was nVidia, but Google is telling me Nvidia or NVIDIA.

But it looks much better, and I think that performance-wise it depends on the game.

Just my opinion ofc, but I vastly prefer DLSS to dynamic resolutions.

This whole thing has sent me down a rabbit hole. Googling around, I don't see anything about CB rendering providing better image quality, or better performance gains, than DLSS 2.0.

I see some videos of DLSS'd versions of a game on PC compared to CB'd versions on console, and skimming through them I don't see anything favoring CB.

Here is one article I have open; I'm going through a few at the moment:

NVIDIA DLSS 2.0 vs PS4 Checkerboard Rendering: Technical Comparison | Hardware Times
 
But it looks much better, and I think that performance-wise it depends on the game.

Just my opinion ofc, but I vastly prefer DLSS to dynamic resolutions.
You'll want dynamic resolution with DLSS as well. We'd have fewer complaints about The Medium if that were the case.
 
You'll want dynamic resolution with DLSS as well. We'd have fewer complaints about The Medium if that were the case.
Nah, why would I want dynamic resolution? I'd prefer toning a few settings down before doing that, honestly.

IMO dynamic = inconsistent.
 
I don't see anything about CB rendering providing better image quality, or better performance gains, than DLSS 2.0.

CB upscaling vs traditional:

[comparison image]


perf hit:

All this takes around 1.4 ms. In comparison, rendering a frame at native resolution takes as much as 10 ms.
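
Turning those figures into frame rates (a rough sketch; the 1.4 ms and 10 ms numbers are the article's, and "half the pixels ~ half the time" is my own crude assumption):

```python
# Rough fps math from the quoted checkerboard figures.
native_ms = 10.0             # native-res frame, per the article
cb_reconstruct_ms = 1.4      # CB resolve cost, per the article
half_res_ms = native_ms / 2  # naive: half the pixels ~ half the time

cb_total_ms = half_res_ms + cb_reconstruct_ms
print(f"native: {1000 / native_ms:.0f} fps")    # 100 fps
print(f"CB:     {1000 / cb_total_ms:.0f} fps")  # ~156 fps
```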
 
Found a more in-depth comparison for the game, native vs DLSS. The video is available up to 4K/60. Kuranghi



Many people on Reddit are saying the game looks better in some areas with DLSS on, due to the better AA (I guess the game on PC has shoddy AA without DLSS?). Yes, the source is Reddit, so YMMV. Just sayin'. Here is one thread, but many more are out there.

 
Found a more in-depth comparison for the game, native vs DLSS. The video is available up to 4K/60. Kuranghi



Many people on Reddit are saying the game looks better in some areas with DLSS on, due to the better AA (I guess the game on PC has shoddy AA without DLSS?). Yes, the source is Reddit, so YMMV. Just sayin'. Here is one thread, but many more are out there.



Yeah, I think the anti-aliasing effect is great; it really clears up the shimmering on the main character.
 