DLSS transformer model is *ridiculous*

IMO, it's easier to notice DLSS motion artefacts in slow-paced games (especially on gamepad), where you can pan the camera slowly and clearly see the motion quality around pixel-wide details like hair. In fast-paced games, where I need to perform a quick 180-degree camera turn, my eyes can only see motion blur. The eyes can only track a moving object up to a certain speed.
Playing Forza Motorsport and Tokyo Xtreme Racer, the difference between DLSS 4 quality modes is very evident: far fewer artefacts the higher the internal resolution, while static you can barely tell any of them apart. DLAA holds up best under close scrutiny, as you'd expect, because even DLSS 4 isn't magic, as brilliant as it is.
 
Playing Forza Motorsport and Tokyo Xtreme Racer, the difference between DLSS 4 quality modes is very evident: far fewer artefacts the higher the internal resolution, while static you can barely tell any of them apart. DLAA holds up best under close scrutiny, as you'd expect, because even DLSS 4 isn't magic, as brilliant as it is.
I only played Forza Motorsport for about an hour, but I noticed that the DLSS image quality was quite soft in this game. I could download the game again for testing purposes (because I still have it on my Microsoft account). However, based on what I saw, I agree with you that motion clarity could be an issue with this game.

But Forza isn't exactly a slow-paced game like Hellblade 2 or walking simulators, and it's also nowhere near as fast-paced as FPS games, where you constantly make quick camera turns. In truly fast-paced games (even Borderlands 4, for example) I can only see blur during motion regardless of framerate.
 
True enough, but for anyone in the market for a mid-range GPU, I'd easily recommend an under-$650 9070 XT over a $750 5070 Ti.
I would agree with you if games nowadays didn't have such bad optimization, requiring DLSS/FSR + FG to achieve good performance. That said, FSR 4 is a breath of fresh air for AMD cards; if they can improve it to the same level as DLSS at the lower price tag, Nvidia may have a real problem on their hands.
 
True enough, but for anyone in the market for a mid-range GPU, I'd easily recommend an under-$650 9070 XT over a $750 5070 Ti.
$100 isn't a significant difference, especially for someone planning to build a new PC. However, the Nvidia card is simply better at pretty much everything, so the $100 price difference is a silly thing to consider. The RTX 5070 Ti has better thermal performance and lower power consumption. Most importantly, the Nvidia card offers more advanced technology. The RTX 5070 Ti can run even PT games quite well, especially with FG x2 and MFG x4. The Nvidia card also supports ray reconstruction, which makes PT games less noisy. What's more, Nvidia cards can use DLSS 4 in an extremely wide range of old and new games. If I were building a new PC right now, I would choose the 5070 Ti over the 9070 XT, even if the 5070 Ti were priced at $1,000, because at least I would be happy with my gaming experience. The 9070 XT is cheaper for a reason, but at least AMD has improved FSR 4 image reconstruction to the point where it's usable in games that support it. However, FSR FG is still nowhere near as usable as DLSS FG, and I wouldn't want to use it even if I had no other choice.
 
Well, I think there must be something on my eyes, because for me DLSS 4 on anything besides Quality is a blurry mess even at 4K, and if you turn on frame generation it's complete crap.
 
$100 isn't a significant difference, especially for someone planning to build a new PC. However, the Nvidia card is simply better at pretty much everything, so the $100 price difference is a silly thing to consider. The RTX 5070 Ti has better thermal performance and lower power consumption. Most importantly, the Nvidia card offers more advanced technology. The RTX 5070 Ti can run even PT games quite well, especially with FG x2 and MFG x4. The Nvidia card also supports ray reconstruction, which makes PT games less noisy. What's more, Nvidia cards can use DLSS 4 in an extremely wide range of old and new games. If I were building a new PC right now, I would choose the 5070 Ti over the 9070 XT, even if the 5070 Ti were priced at $1,000, because at least I would be happy with my gaming experience. The 9070 XT is cheaper for a reason, but at least AMD has improved FSR 4 image reconstruction to the point where it's usable in games that support it. However, FSR FG is still nowhere near as usable as DLSS FG, and I wouldn't want to use it even if I had no other choice.
I actually 100% disagree with this, and I'm surprised that I do. In Horizon Forbidden West, I found the game performed much better and was less stuttery using AMD FG with DLSS on my 5070 Ti than with the all-NV setup.
 
At 4K, DLSS 4's preset K (transformer model) is so good in Performance mode (1080p internal) that I no longer even bother with Quality mode. It offers that 'lean in' level of 4K detail you expect, free from aliasing and blurring. If you've previously stayed away from Performance mode DLSS, thinking the internal resolution is too low, try it. It's really, really good. I presume it continues to evolve, as I know there were some issues with ghosting etc. when it released earlier in the year, but now every game I try is pin-sharp and artifact-free.

It used to be the case that Quality mode offered a significant step up in clarity, but that no longer seems to be the case. This model is so good that 1080p is 'enough' to infer a great image.
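For context, these are the commonly cited default per-axis scale factors and the internal resolutions they work out to at 4K; a minimal sketch, assuming the defaults (games can and do override them, so treat the numbers as approximate):

```python
# Commonly cited default DLSS per-axis scale factors; games can override these,
# so the resulting internal resolutions are approximate.
DLSS_SCALE = {
    "DLAA": 1.0,
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_res(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_res(3840, 2160, mode)
    print(f"{mode:>17}: {w}x{h}")
# Performance at 4K works out to 1920x1080 -- a quarter of the output pixels.
```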

It's not always that simple, though.
Any game that uses ray tracing will scale worse with resolution than games without it.

As you lower the internal resolution, you also lower the number of rays per output pixel, which can lead to more obvious boiling and pixelated reflections.
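A toy illustration of that point, assuming the ray budget is tied to internal pixels (real games split rays across different effects, so this is only indicative):

```python
# Illustrative only: assume a fixed ray budget per *internal* pixel, so the
# total rays traced per frame scale with the internal pixel count.
def rays_per_frame(width, height, rays_per_pixel=1):  # 1 ray/pixel is a hypothetical budget
    return width * height * rays_per_pixel

native_4k = rays_per_frame(3840, 2160)
dlss_perf = rays_per_frame(1920, 1080)  # 4K DLSS Performance internal resolution
# 0.25 -> each 4K output pixel is covered by a quarter of the rays, hence noisier RT
print(dlss_perf / native_4k)
```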
 
I actually 100% disagree with this, and I'm surprised that I do. In Horizon Forbidden West, I found the game performed much better and was less stuttery using AMD FG with DLSS on my 5070 Ti than with the all-NV setup.
Every game I tried felt terrible with FSR FG enabled. Even at a base of 100 fps, I could clearly feel the input lag, and if a game feels laggy to me, I don't care how many frames per second the card renders. DLSS FG has extremely low input lag (I measured a 1-2 ms difference in the best-case scenario), and I also feel like DLSS FG adjusts mouse movement to match the generated frames with better results. My aiming skills aren't negatively affected by DLSS FG, and that's why I like to use this technology. The ability to boost the framerate by 80% without experiencing the negative aspects of generating frames is a significant achievement for Nvidia. I also haven't noticed any stuttering with DLSS FG. You saw stutters with DLSS FG and not without it? I have Horizon Forbidden West on my Steam account, so I can retest the game and see if it really has trouble with its DLSS FG implementation, but based on what I remember, this game ran without such issues.
 
DLSS 4 practically saved 1080p gaming for me. It also practically allowed me to keep my 3070. I always used the 1440p DLDSR or 4K DSR plus DLSS combo on my 1080p screen to make games look decent, but 8 GB of VRAM was becoming problematic at 1440p lately, especially since I really like ray tracing and path tracing and enable them whenever I can. Native 1080p TAA looks horrible in most games, and while DLSS 3 Quality was able to match native 1080p TAA in many games and improve upon it in certain cases, it was still blurry.

Thankfully DLSS 4 changed all of that, so I will keep my 3070 until GTA 6 arrives on PC.


DLSS 4 still has its problems that need fixing, though.

It's really broken in AC Shadows for some reason.



It's still that bad? I remember going back to DLSS 3 in that game because of these bugs, even though 4 was a lot sharper.
 
The only thing I'd like, considering I mostly use DLAA on a 1080p monitor, is some built-in sharpening again, like CAS. They removed it from DLSS before DLSS 3.

Nvidia's sharpening via Profile Inspector is a bit shit. Not always needed, but UE5 games are a bit soft and full of so much blurry bullshit. I don't want to have to use ReShade.

It's fine on my Sony TV, as Reality Scaling at 10-15 is more than enough, but monitors are wank in general and don't offer scaling or sharpening on the same level.
 
Can someone help a brother out? I have an RTX 3080 and typically use DLSS Balanced at 4K.

How do I switch models? Or is it automatic based on my Performance/Balanced/Quality/DLAA setting? Can I even use different models on a 3000 series?

Feels weird not being clued in on the different models, but it's Greek to me.
 
Can someone help a brother out? I have an RTX 3080 and typically use DLSS Balanced at 4K.

How do I switch models? Or is it automatic based on my Performance/Balanced/Quality/DLAA setting? Can I even use different models on a 3000 series?

Feels weird not being clued in on the different models, but it's Greek to me.

Most new games should use DLSS 4 by default, but it's not 100% guaranteed.

In the NVIDIA app you can always choose the profile for a game and override it to use the 'latest' available version. That will give you DLSS 4 in all supported games.

You can also manually switch presets: A-F are DLSS 3, and L and K are DLSS 4. If you want to mess with it, I suggest preset E for DLSS 3 and preset K for DLSS 4.

The Nvidia app is the official way, but there's also Nvidia Profile Inspector or DLSS Swapper. With them you can change DLSS versions even in unsupported games.
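For anyone curious what the "manual" route actually does: tools like DLSS Swapper essentially back up and replace the DLSS DLL a game ships with. A rough sketch of that idea (both paths are made-up examples, and the NVIDIA app override remains the officially supported way):

```python
# Minimal sketch of a manual DLSS DLL swap (roughly what DLSS Swapper automates).
# Both paths below are hypothetical examples.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")            # hypothetical game install folder
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # newer DLSS DLL you supply yourself

for old_dll in game_dir.rglob("nvngx_dlss.dll"):
    backup = old_dll.with_name(old_dll.name + ".bak")
    shutil.copy2(old_dll, backup)   # keep the original so you can roll back
    shutil.copy2(new_dll, old_dll)  # drop in the newer version
    print(f"Replaced {old_dll} (backup at {backup})")
```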
 
Most new games should use DLSS 4 by default, but it's not 100% guaranteed.

In the NVIDIA app you can always choose the profile for a game and override it to use the 'latest' available version. That will give you DLSS 4 in all supported games.

You can also manually switch presets: A-F are DLSS 3, and L and K are DLSS 4. If you want to mess with it, I suggest preset E for DLSS 3 and preset K for DLSS 4.

The Nvidia app is the official way, but there's also Nvidia Profile Inspector or DLSS Swapper. With them you can change DLSS versions even in unsupported games.
Thanks! I don't use the NVIDIA app, inspector, or swapper today. That explains where this stuff has been hiding from me.
 
Thanks! I don't use the NVIDIA app, inspector, or swapper today. That explains where this stuff has been hiding from me.

The Nvidia app is the most "official" way, and it's quite easy to use.

Swapper offers MANY interesting options, so I suggest installing it anyway; you can see and change the DLSS .dll files that games have in their folders. With it you can also mess with games not officially supported by the NVIDIA app.
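If you just want the "see what each game ships" part, a minimal sketch along these lines also works (the library path is an example; this only lists the DLLs and does not read the version metadata the way Swapper does):

```python
# Scan a game library folder for the DLSS DLLs each game ships with.
from pathlib import Path

library = Path(r"C:\Program Files (x86)\Steam\steamapps\common")  # example path

for dll in library.rglob("nvngx_dlss.dll"):
    size_mb = dll.stat().st_size / 1_048_576
    print(f"{dll}  ({size_mb:.1f} MB)")
```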
 
A big reason why I went back to Nvidia was the support DLSS has. The 9070 XT is a great card, but the image quality just isn't as good overall, and I really don't think I should have to hack every game to get FSR 4.
I did as well. Which card did you end up going with?
 
I actually think this is an area where channels like Digital Foundry can hurt the normal everyday gamer and plant some unnecessary bias.
I'm not going to be standing still zooming in on a leaf in the distance, wondering if it has proper smooth edges. Yes, things up close and in your realistic view matter, but sometimes stuff gets highlighted that doesn't matter and that you don't see when actually playing, versus, say, ghosting, artifacting, blurriness, etc.

Go with your eyes and tweak it to what you like. Same as ray tracing: I don't care if some random house plant pot is casting a shadow or not. I don't play games like those IGN developer exclusive 15-minute playthroughs that show like 3 minutes of the actual game.
 
You have to wonder how the native absolutists reconcile and make peace with themselves when they see clear DLSS superiority as shown here. Do they continue to lie about it?
Like every outdated opinion in human history, yes. They do. Not just to others, but to themselves.
 
A few months ago I switched to the K preset in E33 and was impressed with how the image quality improved in Performance mode, not to mention that several artifacts (mainly in hair and shadows) disappeared.
 
I output to a big 4K TV, but I'd still love to see some 1080p tests with DLSS 4. At one time it was recommended to stick to native at that resolution, but I imagine the transformer model changed the outcome, maybe even down to Balanced mode. Can't find anyone testing this for some reason, which is odd.
 
If you push that to its logical conclusion as AI gets stronger, you could imagine that one day a "game engine" proper just renders the spatial scene and physics in an extremely low-res, low-fidelity form, to give consistency and reliable interactions, with an LLM layer above it creating the entire high-quality visual world atop that simple frame.
 
I actually think this is an area where channels like Digital Foundry can hurt the normal everyday gamer and plant some unnecessary bias.
I'm not going to be standing still zooming in on a leaf in the distance, wondering if it has proper smooth edges. Yes, things up close and in your realistic view matter, but sometimes stuff gets highlighted that doesn't matter and that you don't see when actually playing, versus, say, ghosting, artifacting, blurriness, etc.

No, you won't notice one leaf or one jagged edge, but you'll notice when playing the game that something feels wrong. There's a shimmer off in the distance, or a blur over a moving object, or some object that looks flat in the lighting, or just something that catches your eye and bothers you.

The zoom is an over-exaggerated investigation into the reasons why the cumulative image feels off, but it's not stuff that goes unnoticed, consciously or otherwise.

Go with your eyes and tweak it to what you like. Same as ray tracing: I don't care if some random house plant pot is casting a shadow or not. I don't play games like those IGN developer exclusive 15-minute playthroughs that show like 3 minutes of the actual game.

Again, no, one bad shadow or light bounce won't ruin your day. (In fact, it's kind of crazy, if you're an animation nerd, to look at some flubs or even cheats in blockbuster movies where there's zero shadow or the lighting is totally botched, but the scene still works.)

However, the overall impression of accurate lighting affects your unconscious acceptance of the scene. There's a reason why Pixar movies look like movies and videogame graphics look like games, even as resolution and tech catch up in realtime visuals. RT/PT looks more "right" in CG visuals, and while that's not the most important thing for enjoying a game, it helps where it matters.
 
DLSS 4 is great.

I used to use DLSS 3 Quality mode on most titles, but DLSS 4 Balanced basically gives you better image quality with slightly more frames.

I still stay away from Performance mode unless it's absolutely necessary.
 
I always use DLSS 4 Performance mode at 4K on my 4K monitor with my RTX 4070 Super. It saved my 4070 Super; no way I would be able to play current games at 4K with a 4070 otherwise.
 
No, you won't notice one leaf or one jagged edge, but you'll notice when playing the game that something feels wrong. There's a shimmer off in the distance, or a blur over a moving object, or some object that looks flat in the lighting, or just something that catches your eye and bothers you.

The zoom is an over-exaggerated investigation into the reasons why the cumulative image feels off, but it's not stuff that goes unnoticed, consciously or otherwise.



Again, no, one bad shadow or light bounce won't ruin your day. (In fact, it's kind of crazy, if you're an animation nerd, to look at some flubs or even cheats in blockbuster movies where there's zero shadow or the lighting is totally botched, but the scene still works.)

However, the overall impression of accurate lighting affects your unconscious acceptance of the scene. There's a reason why Pixar movies look like movies and videogame graphics look like games, even as resolution and tech catch up in realtime visuals. RT/PT looks more "right" in CG visuals, and while that's not the most important thing for enjoying a game, it helps where it matters.
I was saying to basically go with what looks right. If DLSS in a game causes issues, then don't use it. I've played games where DLSS caused obvious issues, from ghosting and blurriness to broken images when turning the camera. I don't care if a random object I don't pay attention to in real gameplay isn't lit properly, or if some far-off grass has jaggy edges.

I generally use DLSS in games because I like what it offers, but sometimes the analysis videos can plant things into your thinking that you would never notice.
 
If you push that to its logical conclusion as AI gets stronger, you could imagine that one day a "game engine" proper just renders the spatial scene and physics in an extremely low-res, low-fidelity form, to give consistency and reliable interactions, with an LLM layer above it creating the entire high-quality visual world atop that simple frame.
This is the idea. However, it wouldn't be an LLM, given that those are text-based (Large Language Model), but you are right. LLMs can exist thanks to the transformer model, and DLSS 4 also relies on the transformer model. In the future there might be new, improved models.

If up to 15 out of 16 pixels can be AI generated with the transformer models, we can only imagine what could be possible in the future!
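That 15-out-of-16 figure is just arithmetic from NVIDIA's own framing of 4K Performance upscaling plus 4x multi frame generation; a quick back-of-envelope check:

```python
# Back-of-envelope check of the "15 out of 16 pixels are AI generated" claim.
upscale_ratio = (3840 * 2160) / (1920 * 1080)   # 4K Performance mode renders 1/4 of the pixels
frame_gen_ratio = 4                              # MFG 4x: 1 rendered frame out of every 4
rendered_fraction = 1 / (upscale_ratio * frame_gen_ratio)
print(rendered_fraction)       # 0.0625 -> 1 in 16 pixels is conventionally rendered
print(1 - rendered_fraction)   # 0.9375 -> 15 in 16 are AI generated
```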
 
I actually 100% disagree with this, and I'm surprised that I do. In Horizon Forbidden West, I found the game performed much better and was less stuttery using AMD FG with DLSS on my 5070 Ti than with the all-NV setup.
Rise of the Ronin plays better with AMD FG than NV's version as well. It's very rare, but it does have its use cases, for whatever reason.
 
I actually 100% disagree with this, and I'm surprised that I do. In Horizon Forbidden West, I found the game performed much better and was less stuttery using AMD FG with DLSS on my 5070 Ti than with the all-NV setup.

I gotta say, as a 3060 Ti user, the only experience I have with frame gen is AMD's frame gen, and in every game I tried it was HORRENDOUS, to say the least. It almost always felt less smooth than with frame gen off, and when it looked off I could easily see the outer edges of the screen warping weirdly as new image information enters the frame.
 
I got a 5070 Ti a few days ago and it has been a ride (coming from a 3070, which I liked a lot).

I'm playing on a 1440p monitor, so there hasn't been any need to use FG, but DLSS 4 is just mind-blowing. I'm getting 90 FPS in games like Jedi Survivor and Ninja Gaiden II (ultra settings). Even RDR2 runs maxed out at a high framerate.

There are some artifacts, but nothing bad enough to make me prefer playing without DLSS. It's crazy.
 
I gotta say, as a 3060 Ti user, the only experience I have with frame gen is AMD's frame gen, and in every game I tried it was HORRENDOUS, to say the least. It almost always felt less smooth than with frame gen off, and when it looked off I could easily see the outer edges of the screen warping weirdly as new image information enters the frame.
Something could be off with your configuration; I've given it a fair shake in AC Shadows and Spider-Man 2 and liked it quite a lot. The only problem it has for me is that UI elements seemed to stay at a low FPS, but overall it was quite smooth.
 