DLSS transformer model is *ridiculous*

daninthemix

Member
At 4K, DLSS4's preset K (the transformer model) is so good in Performance mode (1080p internal) that I no longer even bother with Quality mode. It offers that 'lean in' level of 4K detail you expect, free from aliasing and blurring. If you've previously stayed away from Performance mode DLSS, thinking that the internal resolution is too low - try it. It's really, really good. I presume it continues to evolve, as I know there were some issues with ghosting etc. when it released earlier in the year, but now every game I try is pin-sharp and artifact-free.

It used to be the case that Quality mode offered a significant step up in clarity, but that no longer seems to be the case. This model is so good that 1080p is 'enough' to infer a great image.
 
While 4K Performance does look "good enough" at preset K, Quality is still sharper, and I use it in games with smaller targets in the distance.
 
It is. It's amazing. I prefer preset J for the majority of games, though; it's a tad sharper than K but has less stability in foliage and such.
 
Honestly, 1080p DLAA Transformer model on my 4K Sony TV looks far, far cleaner than whatever methods my PS5 Pro uses for games.

Coming from awesome AA like SGSSAA to a decade of post-processing crap like FXAA/SMAA was not a great time, especially as so many games had specular aliasing. DLSS is a breath of fresh air.

Some talented folks are also implementing it themselves in some older DX11 games, which is cool. There are plenty of older games I'd like to see it work with in the future.
 
That must look incredibly pristine. Even for a lot of us with low-end cards like a 12GB 3060, it still looks fantastic. I don't even feel like I need to upgrade yet, thanks to DLSS.

Yeah, it looks amazing. I have a 4090 so I can afford it, but DLSS transformer is so good that I would be happy with it as well if I had a weaker computer.
 
DLSS-P was already usable (4K-like image) with DLSS 3.7 if you just added a sharpening pass.

DLSS 3.7 Performance, in-game sharpening set to 0% because I used ReShade CAS and Luma sharpening:
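For anyone curious what CAS-style sharpening actually does on top of the DLSS output, here's a toy contrast-adaptive sharpen in Python/NumPy. This is only a simplified illustration of the idea (sharpen flat areas more, back off near already-contrasty or near-clipping pixels); it is NOT AMD's exact FidelityFX CAS kernel and not what the ReShade shader literally computes:

```python
import numpy as np

def toy_adaptive_sharpen(img: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """Toy contrast-adaptive sharpen on a float grayscale image in [0, 1].

    Flat regions get the most sharpening; pixels near clipping or strong
    local contrast get less. Same spirit as CAS, much simpler math.
    Keep strength well below 2.0 so the normalization never divides by zero.
    """
    # Cross-shaped neighborhood via shifts.
    up = np.roll(img, -1, axis=0)
    down = np.roll(img, 1, axis=0)
    left = np.roll(img, -1, axis=1)
    right = np.roll(img, 1, axis=1)

    mn = np.minimum.reduce([img, up, down, left, right])
    mx = np.maximum.reduce([img, up, down, left, right])

    headroom = np.minimum(mn, 1.0 - mx)               # room before clipping
    amount = np.sqrt(np.clip(headroom / np.maximum(mx, 1e-6), 0.0, 1.0))

    w = -0.125 * strength * amount                    # negative ring weight
    out = (img + w * (up + down + left + right)) / (1.0 + 4.0 * w)
    return np.clip(out, 0.0, 1.0)
```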

[Black Myth: Wukong screenshots]


The DLSS image is razor sharp, but native resolution will always be better, because the quality of certain graphics effects is tied to the internal resolution (especially in RT / UE5 games). When using lower DLSS modes, you might see more aliased RT shadows, less detailed Nanite on distant objects, blurrier RT reflections, or more pixelated volumetric clouds and SSR. Dynamic lighting (RT GI / PT / Lumen) is also more stable at higher internal resolutions, so the higher the internal resolution, the better the image you will get in these modern RT / UE5 games.

DLAA is my preferred choice, but I've found that DLSS Ultra Quality (77% res scale) can still render RT effects with almost native-like stability and clarity while giving a very nice performance boost over native DLAA in heavy RT games. If a game is more demanding, I don't mind using lower internal resolutions (DLSS Q, DLSS B, DLSS P), because the image is still sharp (4K-like) and the lower-quality RT effects aren't so obvious if you just don't look for imperfections. I do dislike DLSS Ultra Performance, however, because it affects image clarity to a much bigger degree, even compared to standard Performance.
 
I would generally agree that DLSS is really evolving, and that is one of the reasons why a lot of people like me will continue to go for NVIDIA graphics cards. They aren't exactly sitting on their hands at the top of their perch; they continue to try to innovate.

There may come a time when they strike a different balance and maybe go the way of Intel when they got greedy, but you've got to give credit where credit's due.

I play mostly with the Quality preset, and as long as everything's over 60 I'm good. Honestly, it's really awesome to have the option to go lower, and if it makes sense I may try Balanced depending on the intensity of the game, but when you have a 4090 or higher it's easy not to compromise much.
 
DLSS has become my new way of telling whether someone has good eyesight.

Case in point: "Yeah, no difference playing DLSS Performance vs. native," a.k.a. no difference between native 4K and 1080p (4x fewer pixels). I definitely know the OP doesn't.
 
Last edited:
DLSS has become my new way of telling whether someone has good eyesight.

Case in point: "Yeah, no difference playing DLSS Performance vs. native," a.k.a. no difference between native 4K and 1080p (4x fewer pixels). I definitely know the OP doesn't.
I didn't say there's no difference - 4K native is mostly a ridiculous fantasy for modern, demanding games anyway - especially for those of us at high refresh rates. "Native" is an archaic term from 10 years ago that holds no modern relevance.

Saying people you don't agree with have bad vision is hilarious. Let's try it - I think your eyes, ears, mouth and brain are bad. Well? What has it achieved? Still - good talk.
 
DLSS4 really lets you chase perfect image quality if you want to. The higher the resolution you feed it, the better it works. I frequently use DLDSR to push resolutions over 4K combined with DLSS 4 for some really incredible results. It is extremely hardware-taxing, obviously. I can bring my 5090 to its knees for the sake of perfect image quality.
 
DLSS has become my new way of telling whether someone has good eyesight.

Case in point: "Yeah, no difference playing DLSS Performance vs. native," a.k.a. no difference between native 4K and 1080p (4x fewer pixels). I definitely know the OP doesn't.
A 4K image reconstructed with DLSS P isn't even slightly blurred, so people have a very good reason to be impressed. My Black Myth: Wukong DLSS P screenshots look perfectly sharp, and if you see a blurry image, then you are the one with vision problems.

That being said, a higher internal resolution will always produce a better-looking image. Motion will have fewer artefacts and look more stable. Most importantly, the quality of certain effects depends on internal resolution and will look best if you use DLAA (native). So technically there is a difference, but it's not significant enough for most people. Good vision isn't enough to notice the difference; you also have to know where to look and how certain graphics effects are supposed to look at native resolution.

DLAA (native) is preferable, but only if the game runs at around 100 fps. Below that, it's much better to use lower DLSS presets, because a DLAA image at a low framerate will look blurrier in motion than a DLSS image at a much higher framerate. On a sample-and-hold display, the number of frames per second (FPS) determines the clarity of motion.
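To put a rough number on that last point: on a full-persistence (sample-and-hold) display, an object your eye tracks smears across roughly speed / fps pixels, because each frame is held static while your eye keeps moving. A back-of-envelope sketch (the 1920 px/s pan speed is just an arbitrary illustration):

```python
# Rough sample-and-hold motion blur: perceived smear ~ tracking speed / fps.
def smear_px(speed_px_per_s: float, fps: float) -> float:
    """Approximate perceived smear width in pixels for a tracked object."""
    return speed_px_per_s / fps

for fps in (60, 100, 144):
    print(f"{fps:>3} fps -> ~{smear_px(1920, fps):.0f} px of smear")
# 60 fps -> ~32 px, 100 fps -> ~19 px, 144 fps -> ~13 px
```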
 
Performance mode looks great, which makes up for the performance loss in going from DLSS 3 to 4.

Having said that, I mostly use Balanced on my 4090 and DLAA on my 21:9
 
I didn't say there's no difference - 4K native is mostly a ridiculous fantasy for modern, demanding games anyway - especially for those of us at high refresh rates. "Native" is an archaic term from 10 years ago that holds no modern relevance.

Saying people you don't agree with have bad vision is hilarious. Let's try it - I think your eyes, ears, mouth and brain are bad. Well? What has it achieved? Still - good talk.
A 4K image reconstructed with DLSS P isn't even slightly blurred, so people have a very good reason to be impressed. My Black Myth: Wukong DLSS P screenshots look perfectly sharp, and if you see a blurry image, then you are the one with vision problems.

That being said, a higher internal resolution will always produce a better-looking image. Motion will have fewer artefacts and look more stable. Most importantly, the quality of certain effects depends on internal resolution and will look best if you use DLAA (native). So technically there is a difference, but it's not significant enough for most people. Good vision isn't enough to notice the difference; you also have to know where to look and how certain graphics effects are supposed to look at native resolution.

DLAA (native) is preferable, but only if the game runs at around 100 fps. Below that, it's much better to use lower DLSS presets, because a DLAA image at a low framerate will look blurrier in motion than a DLSS image at a much higher framerate. On a sample-and-hold display, the number of frames per second (FPS) determines the clarity of motion.

I still stand by my position.
I play on 8K/4K screens and use DLDSR resolutions a lot. I've spent hundreds of hours testing DLSS; I can't even count the time spent on comparisons across all the options and presets, in conjunction with DLDSR 1.78x & 2.25x.
DLSS P at 4K is noticeable and absolutely a no-go for fast games (racing/FPS): it smears, and it has blurring and oversharpened detail, mainly in the distance, because of the compensation the processing has to do to make up for the low detail.
DLSS, no matter how good it is, is still a TAA solution. It's a good TAA solution, but it's still TAA.
Also, because of the extreme blurring, motion blur, DoF, chromatic aberration, sharpening filters, etc. in "modern" slop engines like UE5, the difference seems less noticeable to your brain, simply because even at native these engines are blurry, smeary slop. Every medium-to-distant detail is lost within it.
Heck, when you think about it, even "Quality" mode is very low: it's a 0.667 scale per axis, which makes the pixel count about 0.45x of the source.
I'm just tired of seeing all these "magic voodoo" statements that constantly repeat that 25% of the pixels can reconstruct a proper 100% res. No.
But know that it's because of these false claims & false observations that developers don't bother optimizing games anymore, relying on "well, put DLSS Performance on, bruh, it will be like native".
In my testing, DLSS P started to output a satisfying image when using DLDSR 1.78x (a 5120x2880 image), with the Balanced preset still being a bit better.
At DLDSR 2.25x (5760x3240, so a 2880x1620 DLSS internal resolution), that preset finally looked good to my eye.
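For reference, here's a quick sketch of the resolution math being thrown around in this thread, using the commonly published DLSS scale factors (actual games may round the exact pixel counts slightly differently, and preset letters like J/K refer to the model, not the scale):

```python
# Internal render resolution = output resolution * scale per axis,
# so the fraction of pixels actually rendered is the scale squared.
DLSS_SCALES = {
    "DLAA": 1.0,
    "Quality": 2 / 3,         # ~0.667 per axis -> ~44% of the pixels
    "Balanced": 0.58,
    "Performance": 0.50,      # -> exactly 25% of the pixels
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    return round(out_w * scale), round(out_h * scale)

for name, s in DLSS_SCALES.items():
    w, h = internal_res(3840, 2160, s)
    print(f"4K {name:<17} -> {w}x{h} ({s * s:.0%} of the pixels)")

# DLDSR factors are pixel-count multipliers: 2.25x is 1.5x per axis
# (4K -> 5760x3240) and 1.78x is ~1.333x per axis (4K -> 5120x2880).
# Stacking DLSS Performance on DLDSR 2.25x: 5760x3240 * 0.5 = 2880x1620.
print(internal_res(5760, 3240, 0.50))  # (2880, 1620)
```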

Also, no offence was intended in my first answer; my apologies if it came across clumsily on my end.
The constant "DLSS is magic" refrain sometimes wears me out, for the aforementioned reasons.
 
So true.

I dunno if I'm gonna upgrade my GPU next gen; I might skip another generation, because using Balanced and even Performance right now lets me max out pretty much every game I play... if even that isn't enough, I might dabble with Ultra Performance and/or start lowering settings.

If the Nvidia App doesn't let you force preset K with the DLSS transformer model, you can always use DLSS Swapper and Nvidia Profile Inspector to force the game to do what you want.

Manual .dll swapping used to cause me problems; DLSS Swapper and even the Nvidia App haven't given me any problems yet.

 
The DLSS image is razor sharp, but native resolution will always be better, because the quality of certain graphics effects is tied to the internal resolution (especially in RT / UE5 games)

Admittedly I haven't done much testing with UE titles, but I completely disagree with regard to everything else. Even DLSS Balanced is superior to native. DLSS resolves everything, where minor details at native often fall away in the BVH.
 
Yes, transformer mode fucking rules, but a higher internal resolution still helps a lot with clarity in motion, especially in racing games, where it's key to have more information than in most genres.

DLSS4 DLAA is ridiculous, though, and fixed Forza Motorsport's terrible IQ.
 
DLSS P at 4K is noticeable and absolutely a no-go for fast games (racing/FPS): it smears, and it has blurring and oversharpened detail, mainly in the distance, because of the compensation the processing has to do to make up for the low detail.
I'm just tired of seeing all these "magic voodoo" statements that constantly repeat that 25% of the pixels can reconstruct a proper 100% res. No.
Right, but have you tried recently? I myself was definitely a snob when it came to Performance mode, and only in the last six months has it got good enough that I'm willing to use it. That's really my main point - we know DLSS has become orders of magnitude better than when it first launched in 2018. My point is that even since the launch of DLSS4 it seems to have got better still, so I urge people who previously dismissed Performance mode due to the low internal res to try it.

As to other points made above - yes, I'm aware that a low internal res can have knock-on effects for RT and other visual metrics.

Ultimately we all have to pick what matters most to us in terms of resolution / frame rate / RT / visual settings / framegen / latency - and for me, Performance mode has got good enough that the needle has moved and I'm prepared to pick it instead of compromising on other options on that list.
 
I usually use 4K with native DLAA, which is even better.

I can't go back

There's almost no difference between DLAA and DLSS Quality aside from better framerates when playing normally. You'd have to zoom in 200% and stand still to notice the differences. I think Corporal.Hicks posted 2 screenshots in an older thread comparing them and people couldn't figure out which was which. Maybe I'm remembering wrong, tho.
 
A big reason why I went back to Nvidia was the support DLSS has. The 9070 XT is a great card, but the image quality just isn't as good overall, and I really don't think I should have to hack every game to get FSR 4.
 
There's almost no difference between DLAA and DLSS Quality aside from better framerates when playing normally. You'd have to zoom in 200% and stand still to notice the differences. I think Corporal.Hicks posted 2 screenshots in an older thread comparing them and people couldn't figure out which was which. Maybe I'm remembering wrong, tho.

When you're as picky as I am about aliasing and artefacts, you learn to see the differences. Trust me, there is a difference, especially on a 65-inch TV.
 