Do you just DLSS it for every game now?

DLSS it do you not


  • Total voters
    197
Yes. Load game, go to settings, crank everything to max, then hunt down the DLSS options (sometimes hidden in weird places) and set to Balanced. It's become a habit.
Then play the game for a little while, go back to settings, set DLSS to Quality, go back into the game to see how it looks, repeat for the first hour of gameplay, then settle on a setting and actually play the game.
 
Started playing Hogwarts on PC a few weeks ago using native res with ray tracing, but for some reason some of the ray-traced reflections flickered. After googling, people said to just turn on Ray Reconstruction to fix it, but Ray Reconstruction is a DLSS feature and can't be enabled at native res. So I turned on DLSS in Quality mode; it lowered the internal resolution, but with Ray Reconstruction on, the flickering was fixed.

So not only did the image quality stay the same with DLSS on, it actually fixed a graphical issue that was present at native res.

I still go for native res if my PC can handle the game well at max settings, but after seeing how well DLSS performs, I have no problem turning it on and playing like that if I just want some extra FPS.
 
Can I mod an old game to use it? That would be epic.

How old a game are you planning on going?
Cuz I'm pretty sure at some point even an RTX 2060 will be able to brute-force past the need for DLSS.
If you just need better image quality, you could probably get away with DLDSR for non-square scaling factors, or even just DSR for square ones.
 
If you want 1440p and >60fps in most games today, you have to.

And not because things are so technically advanced; PC ports are just that shit.
 
I cannot fathom not enabling DLSS when it's available. Native resolution is a gross waste of resources, and can often look worse.
It's weird when it's not available. Atomfall had no upscaling at all. None. It looked awful in places. DLSS would have cleaned that up no problem.
 
Maybe I'm a piece of shit, but I think games look and perform amazing with it. I also have no plans to upgrade my 3070 Ti anytime soon, so...
 
More frames, no noticeable change in image quality. At this point I just roll my eyes at anyone complaining about fake pixels or fake frames. You're leaving performance on the table if you don't toggle it on.
 
Native 4K for a while to get used to the game's graphics and set a visual baseline,

then based on performance, I'll move down (DLSS) or up (DLAA or old-fashioned downsampling).
 
Native for 1080p and below, DLSS for 1440p and 4k

You actually run your 4K panel/games at 1080p, or using scaling?

Wouldn't Transformer Performance get you a similar workload but higher-quality output?
I use Balanced or Performance depending on the workload, cuz I'd rather have all the bells and whistles whenever possible.
Actually scaling down to 1080p... well, I actually haven't tried that in forever, but maybe I should. I can't imagine it being better than using 1080p as the base resolution and having DLSS Transformer upscale to 4K.
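As a quick sanity check on the arithmetic in this exchange: DLSS Performance at a 4K output renders internally at exactly 1920x1080, which is why comparing it against running 1080p natively is a fair question. A small sketch using the commonly documented per-axis preset scale factors (individual games can and do override these, so treat the table as an assumption):

```python
# Commonly documented DLSS preset scale factors (per axis).
# Games may override these, so this is an approximation.
PRESETS = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, scale):
    """Internal render resolution for a given output size and per-axis scale."""
    return round(out_w * scale), round(out_h * scale)

for name, scale in PRESETS.items():
    w, h = internal_resolution(3840, 2160, scale)
    print(f"{name:17s} -> {w}x{h}")
```

At 4K output, Performance mode lands on 1920x1080 exactly, so the "1080p base upscaled to 4K" framing in the post above is literally what that preset does.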
 
My PC is connected to a 1440p monitor and a 4K TV. I play competitive games at 1440p at my desk, and more immersive story-driven games on the 4K TV.

I don't run anything at 1080p anymore, but that used to be the case before I upgraded my monitor.
 
DLSS all the way.

Been using 2x frame gen on Doom this last week and I really can't notice the input lag.

Feels great maxed out at over 200 FPS on a laptop.

I was using DLAA and 3x frame gen with Reflex, and it was incredibly smooth (150+ fps) and responsive.
 
The question isn't really "do you" anymore; you don't have much of a say in it. Quite frankly, devs have given up on optimisation and just default to the mindset that everyone will use an upscaling technique anyway. AAA games with demanding graphics have run like shit at native res for the last 3-4 years.
 
Most cases yes, even if I can hit my target framerate. Sometimes it depends on the game and how much it taxes my system. I've noticed in some games my GPU is running at 99% and my CPU at like 20%. I don't know why this is, but turning on DLSS, and especially frame generation, tends to change the usage. So I configure to balance performance and usage.
 
Yes, ever since DLSS 2 came out, if I play on PC.
It looks better than native 4K. It just does. Even at the Balanced setting, but for sure at Quality.
Native makes no sense.
 
dlssno.png
 
How old a game are you planning on going?
Cuz I'm pretty sure at some point even an RTX 2060 will be able to brute-force past the need for DLSS.
If you just need better image quality, you could probably get away with DLDSR for non-square scaling factors, or even just DSR for square ones.

I've got a 4070, and atm I'm playing Dragon Age: Inquisition from 2014. The game actually runs pretty well with everything maxed out, but I'm still curious.
 
Does it? I thought it was just Transformer Ray Reconstruction that runs badly on Turing and Ampere, but the regular Transformer model has roughly the same performance impact as it does on Ada and Blackwell.
Ray Reconstruction is even worse, but Transformer upscaling reduces performance by quite a lot, too.
 
There are many games in my library that can run DLAA at a playable framerate (over 60fps), but I've found that DLSS Ultra Quality (77% resolution scale) is definitely the better choice: I still gain 15fps compared to DLAA, and there's maybe a 1% visual difference. If I need more performance, I like to use DLSS Quality (67% resolution scale). If the DLSS implementation is very good, I might even use DLSS Performance (50% resolution scale). I played Robocop: Unfinished Business at 4K with DLSS Performance + FG x2 at 170-190 fps, and the image quality looked 4K-like to my eyes. With DLSS Ultra Quality (77% resolution) I had 120-130fps.
 
I really do wish there were a better option for AA beyond AI upscaling, because some older games at 4K with SMAA look very crisp.
 
Depends on my hardware.
The older it gets, the more I'll need to rely on DLSS/FSR tech.
I had a 2060 Super for a LONG time, and DLSS most definitely helped.
 
I don't go looking for issues, and I'd like GeForce Now settings to always work without double-checking. I've never had any major issues that made me turn DLSS off, from 2.0 onwards.
 
DLSS 4 with the Transformer model, set to Quality, and boom: watch your GPU shave off 25% power usage while the visuals look virtually the same when actually playing, not pixel peeping! Edit: and it runs at 30% higher fps! It's like magic.

Team Jensen ❤️
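Taking the numbers in the post above at face value (the 25% and 30% figures are that poster's claims, not measurements of mine), the combined effect on efficiency is easy to work out: 30% more frames at 75% of the power is roughly a 1.73x improvement in frames per watt.

```python
# Rough efficiency arithmetic for the claimed figures above:
# ~30% higher fps at ~25% lower GPU power draw (poster's numbers).
fps_gain = 1.30      # relative framerate vs. native
power_ratio = 0.75   # relative power draw vs. native

efficiency_gain = fps_gain / power_ratio  # frames per watt vs. native
print(f"fps per watt: {efficiency_gain:.2f}x native")  # ~1.73x
```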
You forgot to add 25% more input lag...
 
DLSS 3/4 is so good that it's not worth using shitty TAA + native res in the vast majority of games.
I agree, but there will always be some differences that nitpickers can spot. For example, in UE5 games Nanite quality is tied to internal resolution. It's not obvious just from playing, but if you take screenshots and compare them closely, you can definitely see some differences. In RT games DLSS can also make a big difference, because internal resolution adjusts the resolution of RT effects: GTA5 EE with DLAA produces reasonably sharp shadows, but even DLSS Quality makes them look slightly pixelated. Ray Reconstruction is supposed to help with this, but sometimes it can make the image look too filtered. Even in raster games there are some differences, especially with SSR resolution (Dead Space Remake has more pixelated SSR reflections with DLSS Quality). These differences aren't big, however, considering the boost in framerate. Choosing between 35 fps at native resolution and 120 fps with DLSS is an easy decision; the DLSS image will still look 4K-like and offer a much better experience overall.
 
You forgot to add 25% more input lag...
An increase in input lag of 25% is simply not true. DLSS SR (the image-reconstruction component) reduces input lag just like a real framerate increase would, while DLSS FG adds very little lag (my measurement in Cyberpunk: 33.8ms with FG vs 31.3ms at the same real framerate). MFG adds slightly more lag, but it should still be small.
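For what it's worth, the Cyberpunk measurements quoted here (which are that poster's numbers, not mine) work out to roughly an 8% latency increase from frame generation, nowhere near 25%:

```python
# Checking the "25% more input lag" claim against the measured
# numbers quoted above (Cyberpunk, frame generation on vs. off).
lag_fg_ms = 33.8     # measured with DLSS FG enabled
lag_real_ms = 31.3   # measured at the same real framerate, FG off

increase = (lag_fg_ms - lag_real_ms) / lag_real_ms
print(f"added latency: {increase:.1%}")  # about 8%, not 25%
```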
 
I agree, but there will always be some differences that nitpickers can spot. For example, in UE5 games Nanite quality is tied to internal resolution. It's not obvious just from playing, but if you take screenshots and compare them closely, you can definitely see some differences. In RT games DLSS can also make a big difference, because internal resolution adjusts the resolution of RT effects: GTA5 EE with DLAA produces reasonably sharp shadows, but even DLSS Quality makes them look slightly pixelated. Ray Reconstruction is supposed to help with this, but sometimes it can make the image look too filtered. Even in raster games there are some differences, especially with SSR resolution (Dead Space Remake has more pixelated SSR reflections with DLSS Quality). These differences aren't big, however, considering the boost in framerate. Choosing between 35 fps at native resolution and 120 fps with DLSS is an easy decision; the DLSS image will still look 4K-like and offer a much better experience overall.

You are 100% correct here.

UE5 games are usually heavy enough that a native 4K picture is not possible for most people, so you have to use TSR/FSR/XeSS or DLSS, and DLSS produces the best results.

RT games are heavy as well, so not a lot of people can run them at native 4K (or even 1440p in some cases). Of course native looks much better for reflections and some other effects (like you said, their resolution scales with internal res), but between 30 fps with sharp reflections and 60 fps with blurry reflections (and everything else looking good), I will choose the higher framerate pretty much every time.

There are some games where native res produces better results. Forza with MSAA x4 at 4K has a shit ton of jaggies, but it's also super sharp, very thin geometry looks correct (power lines), and there is zero ghosting or other issues associated with temporal effects.
 
I generally turn it on if it's available, because why not? I usually just set it to Quality, but I just got a 4K OLED, so if I play on that or my C2 I'll use Balanced or even Performance.

FG I only use if I feel like I need to, or if I'm already at 80 fps or above.
 