DLSS transformer model is *ridiculous*

daninthemix

Member
At 4K, DLSS4's preset K (transformer model) is so good in Performance mode (1080p internal) that I no longer even bother with Quality mode. It offers that 'lean in' level of 4K detail you expect, free from aliasing and blurring. If you've previously stayed away from Performance mode DLSS, thinking the internal resolution is too low, try it. It's really, really good. I presume it continues to evolve; I know there were some issues with ghosting and the like when it released earlier in the year, but now every game I try is pin-sharp and artifact-free.

It used to be the case that Quality mode offered a significant step up in clarity, but that no longer seems to be the case. This model is so good that 1080p is 'enough' to infer a great image.
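For reference, here's a quick sketch of the internal resolution each DLSS mode renders at for a given output, using the commonly cited per-axis scale factors (Quality 2/3, Balanced ~0.58, Performance 1/2, Ultra Performance 1/3; treat the Balanced factor as approximate):

```python
# Internal render resolutions per DLSS mode. The per-axis scale factors
# are the commonly cited ones; the Balanced factor is approximate.
SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w, out_h):
    for mode, s in SCALES.items():
        w, h = round(out_w * s), round(out_h * s)
        print(f"{mode:18s} {w}x{h}  ({100 * s * s:.0f}% of output pixels)")

internal_res(3840, 2160)
# Quality            2560x1440  (44% of output pixels)
# Balanced           2227x1253  (34% of output pixels)
# Performance        1920x1080  (25% of output pixels)
# Ultra Performance  1280x720   (11% of output pixels)
```

So Performance at 4K really is reconstructing from a quarter of the output pixels, which is what makes preset K's results so surprising.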
 
While 4K Performance does look "good enough" with preset K, Quality is still sharper, and I use it in games where there are smaller targets in the distance.
 
It is. It's amazing. I prefer preset J for the majority of games, though; it's a tad sharper than K but has less stability in foliage and such.
 
Honestly, 1080p DLAA Transformer model on my 4K Sony TV looks far, far cleaner than whatever methods my PS5 Pro uses for games.

Coming from awesome AA like SGSSAA to a decade of post-processing crap like FXAA/SMAA was not a great time, especially as so many games had specular aliasing. DLSS is a breath of fresh air.

Some talented folks are also implementing it themselves in some older DX11 games, which is cool. There are plenty of older games I'd like to see it work with in the future.
 
That must look incredibly pristine. Even for a lot of us with low-end cards like a 12GB 3060, it still looks fantastic. I don't even feel like I need to upgrade yet thanks to DLSS.

Yeah, it looks amazing. I have a 4090, so I can afford it, but DLSS transformer is so good that I would be happy with it on a weaker computer too.
 
DLSS-P was already usable (4K-like image) with DLSS 3.7 if you just added a sharpening mask.

DLSS 3.7 Performance, in-game sharpening set to 0% because I used ReShade CAS and Luma sharpening:

[Screenshots: b1-Win64-Shipping (Black Myth: Wukong), 4K DLSS 3.7 Performance]
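For anyone wondering what the CAS part of that setup actually does, here's a rough CPU-side sketch of contrast-adaptive sharpening, written in the spirit of AMD's FidelityFX CAS rather than the shipping shader; the cross-shaped neighbour taps and the adaptive amount follow the same idea, but the exact constants here are illustrative:

```python
import numpy as np

def cas_like_sharpen(img, sharpness=0.5):
    """Simplified sketch of contrast-adaptive sharpening.
    img: float32 array in [0, 1], shape (H, W, 3)."""
    pad = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    # Cross-shaped neighbourhood (up, down, left, right), like CAS uses.
    up, down = pad[:-2, 1:-1], pad[2:, 1:-1]
    left, right = pad[1:-1, :-2], pad[1:-1, 2:]
    lo = np.minimum.reduce([img, up, down, left, right])
    hi = np.maximum.reduce([img, up, down, left, right])
    # Adaptive term: sharpen less where the neighbourhood is already at
    # high contrast, which is what avoids halos on strong edges.
    amp = np.sqrt(np.clip(np.minimum(lo, 1.0 - hi) / np.maximum(hi, 1e-5), 0.0, 1.0))
    # Map the sharpness slider to a negative kernel lobe, mirroring
    # CAS's roughly -1/8 .. -1/5 range.
    peak = amp * (-1.0 / (8.0 - 3.0 * sharpness))
    out = (img + peak * (up + down + left + right)) / (1.0 + 4.0 * peak)
    return np.clip(out, 0.0, 1.0)
```

The adaptive term is the whole point: flat areas get close to the full sharpening amount, while already-high-contrast edges get very little, which is why CAS rings far less than a plain unsharp mask.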


The DLSS image is razor sharp, but native resolution will always be better, because the quality of certain graphics effects is tied to the internal resolution (especially in RT / UE5 games). When using lower DLSS modes you might see more aliased RT shadows, less detailed Nanite on distant objects, blurrier RT reflections, or more pixelated volumetric clouds and SSR reflections. Dynamic lighting (RT GI / PT / Lumen) is also more stable at higher internal resolutions, so the higher the internal resolution, the better the image in these modern RT / UE5 games.

DLAA is my preferred choice, but I found that DLSS Ultra Quality (77% res scale) can still render RT effects with almost native-like stability and clarity while giving a very nice performance boost over native DLAA in heavy RT games. If a game is more demanding, I don't mind using lower internal resolutions (DLSS Q, B, P), because the image is still sharp (4K-like) and the lower-quality RT effects aren't obvious if you don't go looking for imperfections. I don't like DLSS Ultra Performance, however, because it affects image clarity to a much bigger degree, even compared to standard Performance.
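As a toy illustration of those knock-on effects: assume a hypothetical engine that traces SSR at half the internal resolution and renders volumetrics at quarter resolution (both fractions are assumptions for illustration, not pulled from any specific engine). Lowering the DLSS mode then compounds the reduction:

```python
# Toy illustration: many engines render some effects below the internal
# resolution. The exact fractions here are assumptions, not taken from
# any specific engine; the point is that DLSS modes compound them.
MODES = {"DLAA": 1.0, "Quality": 2 / 3, "Performance": 0.5}
EFFECTS = {"SSR (assumed 1/2 res)": 0.5, "Volumetrics (assumed 1/4 res)": 0.25}

out_w, out_h = 3840, 2160
for mode, s in MODES.items():
    iw, ih = round(out_w * s), round(out_h * s)
    buffers = ", ".join(
        f"{name}: {round(iw * f)}x{round(ih * f)}" for name, f in EFFECTS.items()
    )
    print(f"{mode:12s} internal {iw}x{ih} -> {buffers}")
# Under these assumptions, Performance mode ends up tracing SSR at
# 960x540 for a 4K output, which is why such effects degrade first.
```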
 
I would generally agree that DLSS is really evolving, and that is one of the reasons why a lot of people like me will continue to go for Nvidia graphics cards. They aren't exactly sitting on their hands at the top of their perch; they keep trying to innovate.

There may come a time when they strike a different balance and maybe go the way of Intel when Intel got greedy, but you've got to give credit where credit's due.

I play mostly in the Quality preset, and as long as everything's over 60 I'm good. Honestly, it's really awesome to have the option to go lower, and if it makes sense I may try Balanced depending on the intensity of the game. But when you have a 4090 or higher, it's easy not to compromise much.
 
DLSS has become my new way of telling if someone has good eyesight.

Case in point: "Yeah, no difference playing DLSS Performance and native," a.k.a. no difference between native 4K and 1080p (4x fewer pixels). I definitely know the OP doesn't.
 
DLSS has become my new way of telling if someone has good eyesight.

Case in point: "Yeah, no difference playing DLSS Performance and native," a.k.a. no difference between native 4K and 1080p (4x fewer pixels). I definitely know the OP doesn't.
I didn't say there's no difference - 4K native is mostly a ridiculous fantasy for modern, demanding games anyway, especially for those of us at high refresh rates. "Native" is an archaic term from 10 years ago that holds no modern relevance.

Saying people you don't agree with have bad vision is hilarious. Let's try it - I think your eyes, ears, mouth and brain are bad. Well? What has it achieved? Still - good talk.
 
DLSS4 really lets you chase perfect image quality if you want to. The higher the resolution you feed it, the better it works. I frequently use DLDSR to push resolutions over 4K, combined with DLSS4, for some really incredible results. It's extremely hardware-taxing, obviously; I can bring my 5090 to its knees for the sake of perfect image quality.
 
DLSS has become my new way of telling if someone has good eyesight.

Case in point: "Yeah, no difference playing DLSS Performance and native," a.k.a. no difference between native 4K and 1080p (4x fewer pixels). I definitely know the OP doesn't.
A 4K image reconstructed with DLSS P isn't blurred even slightly, so people have a very good reason to be impressed. My Black Myth: Wukong DLSS P screenshots look perfectly sharp, and if you see a blurry image then you're the one with vision problems.

That being said, a higher internal resolution will always produce a better-looking image. Motion will have fewer artefacts and look more stable. Most importantly, the quality of certain effects depends on the internal resolution and will look best with DLAA (native). So technically there is a difference, but it's not significant enough for most people. Good vision isn't enough to notice it; you also have to know where to look and how certain graphics effects are supposed to look at native resolution.

DLAA (native) is preferable, but only if the game runs at around 100fps. Below that, it's much better to use lower DLSS presets, because a DLAA image at a low framerate will look blurrier in motion than a DLSS image at a much higher framerate. On a sample-and-hold display, the framerate determines motion clarity.
 
Performance mode looks great, which makes up for the performance cost of going from DLSS3 to DLSS4.

Having said that, I mostly use Balanced on my 4090 and DLAA on my 21:9.
 
I didn't say there's no difference - 4K native is mostly a ridiculous fantasy for modern, demanding games anyway, especially for those of us at high refresh rates. "Native" is an archaic term from 10 years ago that holds no modern relevance.

Saying people you don't agree with have bad vision is hilarious. Let's try it - I think your eyes, ears, mouth and brain are bad. Well? What has it achieved? Still - good talk.
A 4K image reconstructed with DLSS P isn't blurred even slightly, so people have a very good reason to be impressed. My Black Myth: Wukong DLSS P screenshots look perfectly sharp, and if you see a blurry image then you're the one with vision problems.

That being said, a higher internal resolution will always produce a better-looking image. Motion will have fewer artefacts and look more stable. Most importantly, the quality of certain effects depends on the internal resolution and will look best with DLAA (native). So technically there is a difference, but it's not significant enough for most people. Good vision isn't enough to notice it; you also have to know where to look and how certain graphics effects are supposed to look at native resolution.

DLAA (native) is preferable, but only if the game runs at around 100fps. Below that, it's much better to use lower DLSS presets, because a DLAA image at a low framerate will look blurrier in motion than a DLSS image at a much higher framerate. On a sample-and-hold display, the framerate determines motion clarity.

I still stand by my position.
I play on 8K/4K screens and use DLDSR resolutions a lot. I've spent hundreds of hours testing DLSS; I can't even count the time spent on comparisons across all the options and presets in conjunction with DLDSR 1.78x and 2.25x.
DLSS P at 4K is noticeable and an absolute no-go for fast games (racing/FPS): it smears, and it shows blurring and oversharpening, mainly in the distance, because of the compensation the processing has to do to accommodate the low detail.
DLSS, no matter how good it is, is still a TAA solution. It's a good TAA solution, but it is still TAA.
Also, because of the extreme blurring, motion blur, DOF, chromatic aberration, sharpening filters, etc. in all the "modern" slop engines like UE5, the difference seems less noticeable to your brain, simply because even at native these engines are blurry, smeary slop. Every medium-to-distant detail gets lost in it.
Heck, when you think about it, even "quality" mode is very low: it's a 0.667 per-axis scale, which makes the pixel count about 0.45x of the output.
I'm just tired of seeing all these "magic voodoo" statements constantly repeating that 25% of the pixels can reconstruct a proper 100% res image. No.
But know that it's because of these false claims and false observations that developers don't bother optimizing games anymore, relying on "well, put DLSS Performance on, bruh, it will be like native".
In my testing, DLSS P started to output a satisfying image when using DLDSR 1.78x (a 2880-pixel-high image), with Balanced still being a bit better.
With DLDSR 2.25x (5760x3240, so a 2880x1620 DLSS internal resolution) the preset looked good to my eye.

Also, no offence was intended in my first answer; my apologies if it came across clumsily.
The constant "DLSS is magic" talk just wears me out sometimes, for the aforementioned reasons.
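For anyone checking those numbers: DLDSR factors multiply the total pixel count, so the per-axis factor is the square root, while DLSS factors are per-axis. A quick sketch:

```python
import math

# DLDSR multiplies the total pixel count, so the per-axis factor is its
# square root; DLSS scale factors are per-axis.
def chain(native_w, native_h, dldsr_factor, dlss_scale):
    axis = math.sqrt(dldsr_factor)
    out_w, out_h = round(native_w * axis), round(native_h * axis)
    in_w, in_h = round(out_w * dlss_scale), round(out_h * dlss_scale)
    return (out_w, out_h), (in_w, in_h)

print(chain(3840, 2160, 1.78, 0.5))  # ((5123, 2882), (2562, 1441)),
#   which the driver rounds to 5120x2880 -> 2560x1440 internal
print(chain(3840, 2160, 2.25, 0.5))  # ((5760, 3240), (2880, 1620))

# And the Quality-mode pixel ratio mentioned above:
print(f"{(2 / 3) ** 2:.3f}")         # 0.444 -> ~0.45x of the output pixels
```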
 
So true.

I dunno if I'm gonna upgrade my GPU next gen; I might skip another generation, because using Balanced and even Performance right now lets me max out pretty much every game I play. If even that isn't enough, I might dabble with Ultra Performance and/or start lowering settings.

If the Nvidia App doesn't let you force preset K with the DLSS transformer model, you can always use DLSS Swapper and Nvidia Inspector to force the game to do what you want.

Manual .dll swapping used to cause me problems; Swapper and even the Nvidia App haven't given me any problems yet.


 
The DLSS image is razor sharp, but native resolution will always be better, because the quality of certain graphics effects is tied to the internal resolution (especially in RT / UE5 games).

Admittedly I haven't done much testing with UE titles, but I completely disagree with regard to everything else. Even DLSS Balanced is superior to native: DLSS resolves everything, whereas at native, minor details often fall away in the BVH.
 
Yes, the transformer model fucking rules, but higher internal resolution still helps a lot with clarity in motion, especially in racing games, where it's key to have more information than in most genres.

DLSS4 DLAA is ridiculous, though, and fixed Forza Motorsport's terrible IQ.
 
DLSS P at 4K is noticeable and an absolute no-go for fast games (racing/FPS): it smears, and it shows blurring and oversharpening, mainly in the distance, because of the compensation the processing has to do to accommodate the low detail.
I'm just tired of seeing all these "magic voodoo" statements constantly repeating that 25% of the pixels can reconstruct a proper 100% res image. No.
Right, but have you tried recently? I was definitely a snob when it came to Performance mode myself, and only in the last six months has it gotten good enough that I'm willing to use it. That's really my main point: we know DLSS has become orders of magnitude better than when it first launched in 2018, but even since the launch of DLSS4 it seems to have gotten better still, so I urge anyone who previously dismissed Performance mode due to the low internal res to try it.

As to the other points made above: yes, I'm aware that a low internal res can have knock-on effects for RT and other visuals.

Ultimately we all have to pick what matters most to us in terms of resolution / frame rate / RT / visual settings / framegen / latency, and for me Performance mode has gotten good enough that the needle has moved: I'm now prepared to pick it rather than compromise on the other options on that list.
 
I usually use 4K with native DLAA, which is even better.

I can't go back

There's almost no difference between DLAA and DLSS Quality aside from better framerates when playing normally. You'd have to zoom in 200% and stand still to notice the differences. I think Corporal.Hicks posted two screenshots in an older thread comparing them and people couldn't figure out which was which. Maybe I'm remembering wrong, though.
 
A big reason why I went back to Nvidia was the support DLSS has. The 9070 XT is a great card, but the image quality just isn't as good overall, and I really don't think I should have to hack every game to get FSR4.
 
There's almost no difference between DLAA and DLSS Quality aside from better framerates when playing normally. You'd have to zoom in 200% and stand still to notice the differences. I think Corporal.Hicks posted two screenshots in an older thread comparing them and people couldn't figure out which was which. Maybe I'm remembering wrong, though.

When you're as picky as I am about aliasing and artefacts, you learn to see the differences. Trust me, there is one, especially on a 65-inch TV.
 
I still stand by my position.
I play on 8K/4K screens and use DLDSR resolutions a lot. I've spent hundreds of hours testing DLSS; I can't even count the time spent on comparisons across all the options and presets in conjunction with DLDSR 1.78x and 2.25x.
DLSS P at 4K is noticeable and an absolute no-go for fast games (racing/FPS): it smears, and it shows blurring and oversharpening, mainly in the distance, because of the compensation the processing has to do to accommodate the low detail.
DLSS, no matter how good it is, is still a TAA solution. It's a good TAA solution, but it is still TAA.
Also, because of the extreme blurring, motion blur, DOF, chromatic aberration, sharpening filters, etc. in all the "modern" slop engines like UE5, the difference seems less noticeable to your brain, simply because even at native these engines are blurry, smeary slop. Every medium-to-distant detail gets lost in it.
Heck, when you think about it, even "quality" mode is very low: it's a 0.667 per-axis scale, which makes the pixel count about 0.45x of the output.
I'm just tired of seeing all these "magic voodoo" statements constantly repeating that 25% of the pixels can reconstruct a proper 100% res image. No.
But know that it's because of these false claims and false observations that developers don't bother optimizing games anymore, relying on "well, put DLSS Performance on, bruh, it will be like native".
In my testing, DLSS P started to output a satisfying image when using DLDSR 1.78x (a 2880-pixel-high image), with Balanced still being a bit better.
With DLDSR 2.25x (5760x3240, so a 2880x1620 DLSS internal resolution) the preset looked good to my eye.

Also, no offence was intended in my first answer; my apologies if it came across clumsily.
The constant "DLSS is magic" talk just wears me out sometimes, for the aforementioned reasons.
Are you sure you aren't talking about ray reconstruction (RR) artefacts? RR does everything you mentioned. At a lower internal resolution (even DLSS Q) it starts blurring fine details and smears in motion (especially with DLSS P). RR also adds adaptive sharpening. These problems are very noticeable to me, and even in a static image RR clearly affects the result if I'm not using DLAA.

As for Deep Learning Super Resolution: the older technology (DLSS2) applied adaptive sharpening during motion as well. However, DLSS3 completely resolved this issue and now uses uniform sharpening settings (and at the default settings doesn't apply any sharpening mask at all). So I find it strange that sharpening in the DLSS image bothers you so much when the technology no longer adds sharpening.

If you're using DLDSR, it's also possible that it is sharpening the image. I used the DLDSR feature on my previous 2560x1440 monitor because downscaling allowed me to achieve a sharp image even in the blurriest TAA games. The DLDSR image wasn't perfect, though: depending on the smoothness level, this downscaling method either blurs the image or oversharpens it, so I was never able to get neutral sharpness like with normal DSR 4x downscaling, or with just MSAA / SMAA. I stopped using DLDSR after buying a 32-inch 4K OLED monitor, because TAA no longer looks blurry (the very high pixel density helps mitigate the blur caused by TAA).

My training in photography and Photoshop has taught me to look for imperfections in images. Even from that perspective, however, I found 4K DLSS image quality to be very good, and it would be an exaggeration to call it unacceptable. I need to sit about 50cm from my 32-inch monitor to notice some 4-8 pixels of ghosting / noise during motion, but from a normal viewing distance (80-100cm) both DLAA and DLSS P look indistinguishable to me when it comes to motion clarity, so the only difference comes down to the quality of RT effects. I can still notice things like RT GI instability (especially in dimly lit locations), RT noise and shadow pixelation even at 100cm from my monitor, and that's the reason I prefer to play at the highest internal resolution when possible.

DLAA looks better than DLSS for sure, but it's not always worth trading 5% better image quality for a 2x worse framerate. A game at 120fps with DLSS Performance looks sharper in motion than at 60fps with DLAA, and the whole gaming experience is also a lot better. At 60fps, OLED / LCD technology (sample-and-hold displays) blurs 16 pixels during relatively fast motion / scrolling, whereas at 120fps it's 8 pixels. From what I've seen, DLSS motion artefacts blend perfectly into that persistence blur, and if you want a reasonably sharp image in motion you need at least 240Hz. My 4K OLED is 240Hz and that's still not good enough to offer CRT-like motion clarity.

[Diagram: motion blur from display persistence]
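The numbers above follow directly from sample-and-hold persistence: each frame is held on screen for 1/fps seconds while the eye keeps tracking, so the smear is roughly the tracking speed divided by the framerate. A quick check, assuming a 960 px/s pan (a common reference speed in motion-clarity testing):

```python
# Sample-and-hold persistence blur: while the eye tracks motion, each
# frame is held static for 1/fps seconds, smearing the image by roughly
# speed / fps pixels. 960 px/s is an assumed, typical panning speed.
def persistence_blur_px(speed_px_per_s, fps):
    return speed_px_per_s / fps

for fps in (60, 120, 240):
    print(f"{fps:3d} fps -> {persistence_blur_px(960, fps):.0f} px of blur")
# 60 fps -> 16 px, 120 fps -> 8 px, 240 fps -> 4 px
```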



If you think DLSS P image quality is terrible and unacceptable, what do you think about image quality on consoles? :P

Image quality on the base PS5 in Black Myth: Wukong:

[Screenshots: Black Myth: Wukong on the base PS5]


4K DLSS Performance using the same internal resolution (1080p) as the PS5:

[Screenshots: Black Myth: Wukong, 4K DLSS Performance (1080p internal)]


I think console gamers would be blown away if the PS6 could offer similar image quality. In Black Myth: Wukong, even DLSS3 in its Performance mode looked way better to my eyes than FSR3 Native. The FSR image looked very noisy in motion even at 100% res scale, and there was also noticeable shimmering. DLSS is an extremely impressive technology, clearly superior to FSR3, and that's why so many people prefer to use it over native TAA. I haven't tested FSR4 yet, but I've heard AMD has made significant improvements to their image reconstruction.

If the DLSS image doesn't satisfy your standards, then I can't imagine you having a good gaming experience, because even an RTX 5090 can't run the latest UE5 games well without AI. I wonder what GPU you're using to play games at 4K and 8K with DLAA at 100fps or more to get reasonable motion clarity.
 
At 4K, DLSS4's preset K (transformer model) is so good in Performance mode (1080p internal) that I no longer even bother with Quality mode. It offers that 'lean in' level of 4K detail you expect, free from aliasing and blurring. If you've previously stayed away from Performance mode DLSS, thinking the internal resolution is too low, try it. It's really, really good. I presume it continues to evolve; I know there were some issues with ghosting and the like when it released earlier in the year, but now every game I try is pin-sharp and artifact-free.

The only problem is that GPU power requirements will also keep increasing, and that will mean higher failure rates. CUDA uses a shit-ton of energy.
 
There's almost no difference between DLAA and DLSS Quality aside from better framerates when playing normally. You'd have to zoom in 200% and stand still to notice the differences. I think Corporal.Hicks posted two screenshots in an older thread comparing them and people couldn't figure out which was which. Maybe I'm remembering wrong, though.
Bullshit. Play fast-paced games to make the difference extra obvious if you can't tell, but the difference is still there.
 
Bullshit. Play fast-paced games to make the difference extra obvious if you can't tell, but the difference is still there.
IMO, it's easier to notice DLSS motion artefacts in slow-paced games (especially on a gamepad), where you can pan the camera slowly and clearly see the motion quality around pixel-wide details like hair. In fast-paced games, where I need to perform a quick 180-degree camera turn, my eyes only see motion blur. The eyes can only track a moving object up to a certain speed.
 
Bullshit. Play fast-paced games to make the difference extra obvious if you can't tell, but the difference is still there.
Oh, for sure, DLAA is ultra clean, pristine, the ultimate AA method par excellence. My point is that DLSS Performance mode now has great image quality (sharpness, stability, everything) for the first time, and is completely viable if you're shooting for high frame rates (or playing a ridiculously heavy game that can only manage 60fps at 1080p...).

When people waffle about 'native', it's eye-rolling, because the TAA of the pre-DLSS era produced 'native' 4K image quality that was far, far less clean and less sharp than today's DLSS4 Performance mode.
 
DLSS4 practically saved 1080p gaming for me. It also practically allowed me to keep my 3070. I always used a 1440p DLDSR or 4K DSR and DLSS combo on my 1080p screen to make games look decent, but 8GB of VRAM was becoming problematic at 1440p lately, especially since I really like ray tracing and path tracing and enable them whenever I can. 1080p native TAA looks horrible in most games, and while DLSS3 Quality was able to match native 1080p TAA in many games and improve on it in certain cases, it was still blurry.

Thankfully, DLSS4 changed all of that, so I will keep my 3070 until GTA 6 arrives on PC.


DLSS4 still has its problems that need fixing, though.

It's really broken in AC Shadows for some reason.


 
DLSS4 is black magic.

Gaming at 4K, even Performance mode looks great; in some games you can tell slightly in motion on Performance. But 99.9% of games look outstanding in Balanced mode, and you gain 30-40 fps.
 