
NVIDIA DLSS 4.5 to feature 2nd Gen Transformer model and Dynamic 6x Frame Generation

Model E (the last CNN one) was deprecated and removed once model K left beta (SDK 310.4), back in DLSS 4.0, so it's obviously missing in 4.5 as well.
Also, existing models can change between SDK versions; the docs clearly say so. Minor improvements (or regressions) won't be visible, though.
Missing? You can still select the model in the App?
 
7odjifpy.png

Dunno what the app does on 310.4+ if you select models A-E, but my guess would be nothing, as in you're actually getting the latest K (plus L/M in 310.5.x now) regardless.
 
Is it great tho?

Now people are never gonna try profile M quality because they think it's only good for perf mode, which is bullshit.

It's more straightforward: now you don't have to manually tweak the NV App or Swapper with every internal-resolution change. It chooses the optimal preset.

And if you want to use preset L DLAA, for example, no one is stopping you from setting it like that :)


Probably a recent change, in that 310.4.0 version. When 310.x.x launched, there were games using this .dll with preset E by default.
 



I was talking more about people less informed than us; they're never gonna try profile M quality because the app says that K is better...
 

True. But many people's first impressions of DLSS 4.5 were: "I have set the newest profile and now my performance is terrible" (mostly from 2xxx and 3xxx users).

So with that update they mostly avoid it, and power users still have many options to set up DLSS as they like.
 
And this new version allows for good comparisons. Mirage: Balanced K vs. Performance M vs. Ultra Performance L:

yZP41sr.jpeg
g4WvK1f.jpeg
mYScOAh.jpeg
 
Jesus, framegen is useless in Cyberpunk; it introduces a shitload of artifacts.
I absolutely love DLSS FG and I can't imagine playing Cyberpunk at 4K without it. Even at 1440p, my experience improved a lot when the frame rate jumped from 110fps to 185fps. Motion gets sharper and it's even easier to aim; it really feels like true 185fps. I'm only using FGx2, meaning one generated frame for every real frame, which makes potential artifacts harder to see. If the base framerate is high, I don't see any artifacts at all.

Even at a low base framerate, the FGx2 artifacts aren't that obvious, and I need to look for them in specific situations, like small text during motion, or places with thin straight lines, like power cables. At a low base framerate (30fps), FGx2 shows some shimmering in such places. However, standard 30fps has very strong judder EVERYWHERE that literally hurts my eyes. Generated 60fps removes that judder almost completely, so even if there is some slight shimmering around moving text, it's an extremely small price to pay considering the benefits.
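The judder point can be made concrete with simple frame-time arithmetic; a quick sketch (plain arithmetic, nothing game-specific):

```python
# Frame persistence: how long each unique frame stays on screen.
# Doubling 30fps to a generated 60fps halves the hold time, which is
# why the judder drops so dramatically.

def frame_time_ms(fps: float) -> float:
    """Time each frame is displayed, in milliseconds."""
    return 1000 / fps

print(frame_time_ms(30))  # ~33.3 ms per frame at 30fps
print(frame_time_ms(60))  # ~16.7 ms with FGx2 from a 30fps base
```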

I tried to record a video. Filming with a phone camera while moving the in-game camera was difficult, but the video still demonstrates what I said. I used the latest 310.5 FG dll in my test.




0:00 - 0:25 DLAA 30fps ○ Judder is clearly visible and it hurts my eyes. It's possible to minimise the judder by turning on motion blur in the game settings, but I still wouldn't want to play like that.

0:25 - 1:03 DLAA + FGx2 60fps (30fps base) ○ Generated 60fps removes that judder almost completely. There are some minor artifacts around the moving text and cable lines, but as I said, that's a small price to pay considering the benefits.

1:03 - 1:35 DLAA + FGx2 60fps + motion blur ○ Artifacts around text are no longer visible despite the very low base framerate (30fps). The game is surprisingly playable with these settings. Aiming feels more responsive than in a typical 30fps console game. Console gamers would certainly love that experience 😃.

1:35 - 2:18 DLSS-P + FGx2, 190-200fps ○ I can't see artifacts at such a high generated framerate (the base was around 120fps). The game is also incredibly responsive.

2:18 - 2:45 ○ The same settings, but with motion blur. I actually like that small amount of motion blur, and that's how I prefer to play. Even if there were some DLSS FG artifacts, the motion blur would hide them.

2:45 - 3:01 ○ Here I demonstrate how quickly I turn the camera when gaming with a mouse. My eyes just see a blur during movement, so even if I played at only generated 60fps with no motion blur, I still wouldn't see artifacts at that speed.

And input lag measurements:

4K DLSS-P + FGx2, RT Ultra ○ 147fps, 33.5ms lag (the NVIDIA OSD is in the upper right corner). The game is perfectly responsive at these settings, and DLSS FG makes it feel like I'm really playing at a high refresh rate.



4K DLAA 30fps ○ 66.3ms lag


4K DLAA 30fps base + FGx2 ○ 61fps, 70ms lag (3.7ms difference)



4K DLSSP ("M"), raster ultra settings ○ 129fps, 23.4ms


4K DLSSP ("M") + FGx2, raster ultra settings ○ 190fps, 27.4ms (4ms difference)


Even at generated 60fps (70ms), Cyberpunk has less lag than a typical PS5 game.
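The FG latency cost implied by those measurements is tiny; as a sketch, here is the arithmetic spelled out (values taken from the numbers above):

```python
# Extra input lag added by enabling FGx2, computed from the in-game
# measurements quoted above (all values in milliseconds).

def fg_overhead(lag_without_fg_ms: float, lag_with_fg_ms: float) -> float:
    """Latency cost of frame generation: lag with FG minus lag without."""
    return lag_with_fg_ms - lag_without_fg_ms

# 4K DLAA, 30fps base: 66.3 ms without FG vs 70 ms with FGx2
print(fg_overhead(66.3, 70.0))  # ~3.7 ms

# 4K DLSS-P raster: 23.4 ms vs 27.4 ms
print(fg_overhead(23.4, 27.4))  # ~4.0 ms
```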


CvR1cKKLACbzuOr6.jpg


If I can see artifacts in Cyberpunk, they are usually the result of PT noise/trailing, especially with aggressive DLSS settings (50% resolution scale).

Also, when I run the game with raster settings and use DLSS 4.5/4.0 (the M and K presets), I can sometimes see artifacts around the grass. The old D and E presets don't have this problem in raster, so I would recommend those presets to people who want to play Cyberpunk with raster graphics. It's almost as if 4.0/4.5 were trained exclusively on PT/RT in this game.

And I almost forgot about ray reconstruction! This feature can also create artifacts, and even 'boiling' in dimly lit locations. If people play Cyberpunk with PT and ray reconstruction at a low internal resolution (50%), they will get nasty artifacts. People might assume that DLSS FGx2 is to blame for the ghosting and artifacts, but in my experience FG is actually the last thing in this game to create artifacts.

In summary, I disagree that DLSS FG in Cyberpunk is useless. It drastically improves motion clarity and perceived smoothness. I can't notice artifacts at a high base framerate, and even at a low base framerate (30fps) I have to look hard to see them (especially with the in-game motion blur on a low setting). Without DLSS FG, Cyberpunk wouldn't be very playable with RT at 4K; I would need to play with raster settings, or with minimal RT (RT reflections in this game cost almost nothing on my GPU).
 
Yep, I can see the noise from the generated frames. Like flickering. Wish they would fix it.
Are you talking about DLSS FGx2 or the latest MFG x3/x4? My DLSS FGx2 test shows small artifacts around text or cables, but these are only visible at a low base framerate (30fps) and without motion blur. With a base framerate of around 80–120fps there are no artifacts, as I demonstrated in my video, which was filmed in the most challenging scenario for frame generation: tracking the motion of small text.

However, I didn't test MFG because my GPU doesn't support it. Generating three additional frames would certainly make these small artifacts much bigger and more noticeable.
 
Soooooo Switch 2 update?

New models are way too heavy for S2.

Interesting thing I noticed with the new DLSS.dll (310.5.2).

High on Life doesn't support preset forcing for some reason, yet it changes presets (K, L, M) automatically (as with "use recommended") with just a DLSS file swap:

ljjEgKLRAEyWU1Gi.jpg
 
DLSS 4.5, from my testing, is very suspect. It looks both sharper and blurry at the same time. It's like an artist used a stencil to outline all the edges, but the actual detail is blurry.

It's very off-putting, and it's a real struggle to rationalize the praise it's getting. It's materially sharper, but I don't know that the image is better for it.

At extremely low resolutions it's better than the TAA alternatives, but I wouldn't upscale from those terrible resolutions anyway. All this is going to do, imo, is cause devs to optimize even less than they do now.
 
Has anyone managed to make Death Stranding not look like a blurry mess?

Even with DLSS 4.5, the game still looks way too blurry. It also lacks anisotropic filtering for some reason.
 

Force AF in the driver - I've had 16x forced by default since 2007.

Are you talking about standard DS or the Director's Cut?
 
True. But many people's first impressions of DLSS 4.5 were: "I have set the newest profile and now my performance is terrible" (mostly from 2xxx and 3xxx users).

So with that update they mostly avoid it, and power users still have many options to set up DLSS as they like.
I mean, it's harder to run, and only Ada and Blackwell have beefy enough tensor cores to shrug it off. The same thing happened with DLSS 4.0 Ray Reconstruction on 2xxx and 3xxx series cards.
 
From what I have (just) seen, DLSS works correctly in base DS1. Native, DLSS2 Q (in-game), DLSS4 Q, DLSS4.5 P:

l1ztbAu.jpeg
nTMkvuK.jpeg
0AE2qQE.jpeg
xEIrO6C.jpeg
This game never looked nearly as sharp for me with previous DLSS versions.

It only looks close to these (though maybe not quite) when I choose DLSS 4.5 with the K preset (which I never had to do with other games; I would just use the "default" preset).

Keep in mind, though, I have a 1080p monitor, so it upscales from 720p, I assume.

Not sure if I'm doing something wrong. The only other thing I changed was enabling 16x anisotropic filtering through Nvidia, because the game natively had none.

I'm using DLSS Swapper.
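For reference, the render resolution behind each mode follows from the standard per-axis DLSS scale factors (Quality ≈ 0.667x, Balanced ≈ 0.58x, Performance 0.5x, Ultra Performance ≈ 0.333x; these are the commonly documented defaults, and individual games can override them). A minimal sketch with a hypothetical `internal_res` helper:

```python
# Approximate per-axis scale factors for the standard DLSS modes
# (commonly documented defaults; games may override them).
SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and mode."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

# On a 1080p display, Quality mode does indeed render at ~720p:
print(internal_res(1920, 1080, "Quality"))      # (1281, 720)
print(internal_res(1920, 1080, "Performance"))  # (960, 540)
```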
 
The more you trial it, the more you find that preset L for the 4K ultra performance mode is definitely not up to scratch, and its practical use is being over-egged a lot by the talking heads. There are artifacts galore in every game's BVH nooks and crannies, and it ain't pretty.
 
This game never looked nearly as sharp for me with previous DLSS versions.

It only looks close to these (though maybe not quite) when I choose DLSS 4.5 with the K preset (which I never had to do with other games; I would just use the "default" preset).

Keep in mind, though, I have a 1080p monitor, so it upscales from 720p, I assume.

Not sure if I'm doing something wrong. The only other thing I changed was enabling 16x anisotropic filtering through Nvidia, because the game natively had none.

I'm using DLSS Swapper.
Use the DLDSR + DLSS circus combo.
You might need to change your desktop to the DLDSR resolution if the game doesn't let you pick the DLDSR resolution.
 
Sounds like a good solution, but these factors seem too high

x7CyjxfuNG35Xove.png


Why not give me standard 1440p or 4K? I'm on a 1080p monitor.
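For what it's worth, those factors look high because DSR/DLDSR multipliers scale total pixel count, not each axis; each axis grows by the square root of the factor. A quick sketch of the mapping (the `dsr_resolution` helper is just for illustration):

```python
import math

def dsr_resolution(base_w: int, base_h: int, factor: float) -> tuple[int, int]:
    """DSR/DLDSR factors multiply the pixel COUNT, so each axis scales by sqrt(factor)."""
    s = math.sqrt(factor)
    return round(base_w * s), round(base_h * s)

# DLDSR's two factors on a 1080p panel:
print(dsr_resolution(1920, 1080, 2.25))    # (2880, 1620), i.e. 1620p
print(dsr_resolution(1920, 1080, 16 / 9))  # (2560, 1440), the factor shown as "1.78x"
```

So the 1.78x factor actually is standard 1440p, just expressed as a pixel-count multiplier.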
 
For Death Stranding you can try 2880p + DLSS ultra performance :messenger_grinning_sweat:
Your monitor probably has a virtual 4K resolution by itself. You can use that too; you don't have to use DLDSR or DSR to get the supersampling benefits.
 
I don't know what that is, how to see if I can use it, or how to use it.

I suspect the numbers Nvidia gives me are because I use two screens, though the secondary is turned off.
 
It is probably due to the second screen, indeed. Then no worries: you can still enable DLDSR; it should be enabled separately for both monitors (even though the UI won't tell you so). Otherwise you might have to set your 1080p screen as your primary screen.
 
The game doesn't see the DSR-factor resolution, but I think it looks good enough with the new version of DLSS + the K preset.

WNSET6rYEX8zo9k4.png


Not sure why it was significantly blurrier before.

Though I still feel like it's softer than Boji's examples.
 
You have to change your desktop to the virtual resolution before launching the game; then you should be able to see the higher resolution in the game.
 
I don't like to mess with my default desktop settings and refresh rate (I'm running 240Hz; the game is locked at 60), and I just don't want to switch manually every time I want to play this particular game.

But it's an interesting feature; I hope other games can work with it without having to change stuff manually.
 
Just wanted to say that I tried frame generation for the first time in two pretty demanding games, Silent Hill 2 and Talos 2, using DLSS 4.5.

The result is way more impressive than I would have imagined. I was pretty negative about this whole AI/fake-frames thing, but I can't argue with the results.

I tested these games with RT on and max settings at 1080p/locked 60fps. My 5060 Ti could barely handle that, often hitting 100% usage and dropping to the mid-50s.

With frame generation I was able to lock Silent Hill at 80fps and Talos at 90fps, and not only do they look smoother/less blurry, the card also doesn't go above 80% usage, so it's dead silent. Plus, with the GPU prices and the whole 5060/5070 situation, I want my card to last longer, so I try not to push it too much.

80fps is fine in Silent Hill; I feel no extra input lag. With Talos I felt the extra input lag at 80fps, but at 90fps it's fine.

I'm also using this with Atomic Heart. That game was extremely blurry at 60fps; for some reason motion blur seems to always be on, even when it's off in the settings. With frame generation I managed to bump it to 100fps, which cleans up the motion blur a lot, and on top of that I was able to enable ray tracing.

So, in conclusion, frame generation rocks. There's some minor artifacting that looks a tiny bit like screen tearing at times, but it's very minor and worth the "sacrifice".
 
Yeah, I'm curious what Nvidia has in store for DLSS 5 when it comes out with the 6000 series.
Maybe we'll finally see that Nvidia neural texture compression.
 
I hope they continue improving things for current hardware before the 6xxx series gets out.

And I hope whatever improvements 6xxx gets will be backwards compatible, like how 4xxx cards also get the current stuff.
 

Neural Shaders will work on RTX 5000, but not on anything before it, e.g. RTX 4000.
 
Man, Returnal @ 4K/165fps with DLSS 4.5 Model M is a sight to be seen. Ridiculous clarity, and not a single aliased edge to be found.

I tried Returnal when it came to PC and only got maybe 4 hours into it; it didn't click with me then, but it's got its claws in me now. Incredible game.

Some shots:

kR95vQ1k4RwUOgiq.png
NIMLJZPYhM0LCRVN.png
IbTKnUk3WfSUFh7p.png
SP2ktcnIEbBXVABJ.png
h4eVM7Pca245k7a1.png
 
A.I.L.A. shows why some people here hate UE5 so much:



Boiling is still present on the Epic setting.

Edit: I meant to post this in the Graphics thread, lol...
 
Man, Returnal @ 4K/165fps with DLSS 4.5 Model M is a sight to be seen. Ridiculous clarity, and not a single aliased edge to be found.

I tried Returnal when it came to PC and only got maybe 4 hours into it; it didn't click with me then, but it's got its claws in me now. Incredible game.

Some shots:

kR95vQ1k4RwUOgiq.png
NIMLJZPYhM0LCRVN.png
IbTKnUk3WfSUFh7p.png
SP2ktcnIEbBXVABJ.png
h4eVM7Pca245k7a1.png
Nice screenshots! Does the game run well now? I heard there were some stutters with RT at launch, but maybe that issue has been resolved by now.
 
Why is Saros not on PC day 1 :( I have to play it on PS
 
Thanks! I've experienced a stutter or two when entering new biomes/areas, but generally speaking it's as smooth as can be.
I've only played the early levels on base PS5 back at launch, but I don't recall some of those circular structures having a faceted octagonal look, or the overhanging cowl shape in the top middle of the first image looking faceted.

Is that just a side effect of the capture, or is that what DLSS is doing: upsampling from a (more faceted?) lower-resolution source rather than rendering natively?
 

Are you talking about the geometry at the top of the first screenshot? DLSS isn't going to do anything to change that.
 