Do you just DLSS it for every game now?

DLSS it do you not


  • Total voters: 158
Yes: load game, go to settings, crank everything to max, then hunt down the DLSS options (sometimes hidden in weird places) and set to Balanced. It's become a habit.
Then play the game for a little while, go back to settings, set DLSS to Quality, go back into the game, see how it looks, repeat for the first hour of gameplay, then settle on a setting and actually play the game.
 
Started playing Hogwarts on PC a few weeks ago and was using native res with ray tracing, but for some reason some of the ray-traced reflections flickered. After googling, people said to just turn on Ray Reconstruction, which fixes it, but Ray Reconstruction is a DLSS feature and can't be turned on at native res. So I tried turning on DLSS in Quality mode; it lowered the render resolution, but with Ray Reconstruction on, the flickering was fixed.

So not only did the image quality remain the same with DLSS on, it actually fixed a graphical issue that was present at native res.

I still go for native res if my PC can handle the game well at max settings, but after seeing how well DLSS performs, I have no problem turning it on and playing like that if I just want some extra FPS.
 
Can I mod an old game to use it? That would be epic.

How old a game are you planning on going?
Cuz I'm pretty sure at some point even an RTX 2060 will be able to brute-force past the need for DLSS.
If you just need better image quality, then you could probably get away with DLDSR for the non-square factors, or even just DSR for the square ones.
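For anyone unsure what those DSR/DLDSR factors actually buy you in pixels, here's a quick sketch. The factor lists are the ones I believe the NVIDIA Control Panel exposes (the driver rounds targets to standard resolutions, so treat the outputs as approximate):

```python
import math

# Total-pixel multipliers the NVIDIA Control Panel exposes, as far as I know;
# DLDSR offers AI-filtered versions of two of them.
DSR_FACTORS = [1.20, 1.50, 1.78, 2.00, 2.25, 2.50, 3.00, 4.00]
DLDSR_FACTORS = {1.78, 2.25}

def target_res(width: int, height: int, factor: float) -> tuple[int, int]:
    # DSR factors scale total pixel count, so each axis scales by sqrt(factor).
    axis = math.sqrt(factor)
    return round(width * axis), round(height * axis)

for f in DSR_FACTORS:
    w, h = target_res(1920, 1080, f)
    note = "  (also offered as DLDSR)" if f in DLDSR_FACTORS else ""
    print(f"{f:.2f}x -> {w}x{h}{note}")
# 4.00x from 1080p is exactly 3840x2160 (2x per axis): the 'square' case.
# 1.78x and 2.25x land near 2560x1440 and 2880x1620 once the driver rounds.
```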
 
If you want 1440p and >60 fps in most games today, you have to.

And not because things are so technically advanced; PC ports are just that shit.
 
I cannot fathom not enabling DLSS when it's available. Native resolution is a gross waste of resources and can often look worse.
It's weird when it's not available. Atomfall had no upscaling at all. None. It looked awful in places. DLSS would have cleaned that up no problem.
 
Maybe I'm a piece of shit, but I think games look and perform amazingly with it. I also have no plans to upgrade my 3070 Ti anytime soon, so...
 
More frames, no noticeable change to image quality. At this point I just roll my eyes at anyone complaining about fake pixels or fake frames. You're leaving performance on the table if you don't toggle it on.
 
Native 4K for a while to get used to the game's graphics / set a visual baseline.

Then, based on performance, I'll move down (DLSS) or up (DLAA or old-fashioned downsampling).
 
Native for 1080p and below, DLSS for 1440p and 4K.

You actually run your 4K panel/games at 1080p, or do you use scaling?

Wouldn't Transformer Performance get you a similar workload but higher-quality output?
I use Balanced or Performance depending on the workload, cuz I'd rather have all the bells and whistles whenever possible.
Actually scaling down to 1080p... well, I actually haven't tried that in forever, but maybe I should. I can't imagine it being better than using 1080p as a base resolution and having DLSS Transformer upscale to 4K.
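For reference, that "1080p base upscaled to 4K" is exactly what the Performance preset does at a 4K output. A minimal sketch using the commonly documented per-axis scale factors (games can override these, so treat the numbers as approximate):

```python
# Commonly documented per-axis DLSS scale factors; games can override them.
PRESETS = {
    "DLAA": 1.0,
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    # DLSS renders at (scale * output) per axis, then upscales to output res.
    return round(out_w * scale), round(out_h * scale)

for name, s in PRESETS.items():
    w, h = internal_res(3840, 2160, s)
    print(f"{name:>17}: {w}x{h}")
# Performance at a 4K output renders at exactly 1920x1080, i.e. '1080p as a
# base resolution upscaled to 4K' is literally that preset.
```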
 
My PC is connected to a 1440p monitor and a 4K TV. I play competitive games at 1440p at my desk, and more immersive story-driven games on the 4K TV.

I don't run anything at 1080p anymore, but that used to be the case before I upgraded my monitor.
 
DLSS all the way.

Been using 2x frame gen on Doom this last week and I really can't notice the input lag.

Feels great maxed out at over 200 FPS on a laptop.

I was using DLAA and 3x frame gen with Reflex, and it was incredibly smooth at 150+ FPS and responsive.
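Worth spelling out what those numbers imply for latency. With Nx frame generation only one of every N presented frames is actually rendered, so input response tracks the rendered rate; back-of-envelope arithmetic, not measured data:

```python
# With Nx frame generation only 1 of every N presented frames is rendered,
# so input response tracks the rendered rate. Back-of-envelope only.
def rendered_fps(presented_fps: float, fg_multiplier: int) -> float:
    return presented_fps / fg_multiplier

print(rendered_fps(200, 2))  # 2x FG at 200 fps presented: 100 fps rendered
print(rendered_fps(150, 3))  # 3x FG at 150 fps presented: 50 fps rendered
# Which is why Reflex matters alongside frame gen: it trims render-queue
# latency so the ~50 fps input feel stays tolerable.
```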
 
The question is not "do you" but more like "you don't have a say in it anymore", because quite frankly devs have given up on optimisation and just default to the mindset that everyone will use an upscaling technique anyway. AAA games with demanding graphics have run like shit at native for the last 3-4 years.
 
Most cases yes, even if I can hit my target framerate. Sometimes it depends on the game and how much it taxes my system. I've noticed in some games my GPU is running at 99% and my CPU at like 20%. I don't know why this is, but turning on DLSS, and especially frame generation, tends to change the usage. So I configure to balance performance and usage.
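That 99%/20% split is the classic sign of being GPU-bound, which is where DLSS helps most, since it cuts the render resolution the GPU has to push. A rough way to watch it, assuming an NVIDIA card with nvidia-smi on PATH and the third-party psutil package; the 90%/50% cutoffs are arbitrary, not anything official:

```python
import subprocess

import psutil  # third-party: pip install psutil

def gpu_util() -> int:
    # nvidia-smi's CSV query mode prints one line per GPU; take the first.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip().splitlines()[0])

while True:
    cpu = psutil.cpu_percent(interval=1.0)  # blocks 1 s, doubles as poll delay
    gpu = gpu_util()
    verdict = "GPU-bound" if gpu > 90 and cpu < 50 else "not clearly GPU-bound"
    print(f"GPU {gpu:3d}%  CPU {cpu:5.1f}%  -> {verdict}")
```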
 
Yes, ever since DLSS 2 came out, whenever I play on PC.
It looks better than native 4K. It just does. Even on the Balanced setting, but for sure in Quality.
Native makes no sense.
 
(image: dlssno.png)
 
How old a game are you planning on going?
Cuz I'm pretty sure at some point even an RTX 2060 will be able to brute-force past the need for DLSS.
If you just need better image quality, then you could probably get away with DLDSR for the non-square factors, or even just DSR for the square ones.

I've got a 4070, and atm I'm playing Dragon Age: Inquisition from 2014. The game actually runs pretty well with everything maxed out, but I'm still curious.
 
Does it? I thought it was just Transformer Ray Reconstruction that runs badly on Turing and Ampere, but the regular Transformer model has roughly the same performance impact as it does on Ada and Blackwell.
Ray Reconstruction is even worse. But Transformer upscaling also reduces performance by quite a lot, too.
 