
NVIDIA DLSS 4.5 to feature 2nd Gen Transformer model and Dynamic 6x Frame Generation

Oh boy, they will be pushing for that ridiculous value, won't they?
https://wccftech.com/nvidia-up-to-1...frame-generation-tech-future-dlss-technology/
I appreciate the resolution fix, but every time they add another frame gen multiplier, all I think of is toilet paper adding layers.

It's all smearing shit.
 
I can't wait for the AI Bubble to burst and bring nvidia back to earth again.
AI bubble bursting means Nvidia only needs to supply a few companies instead of many companies. Genie's not going back in the bottle on this one.
 
Any advancement is a good thing in the end, interested to see how much of an improvement it is.
 
AI bubble bursting means Nvidia only needs to supply a few companies instead of many companies. Genie's not going back in the bottle on this one.

Yup, anyone thinking that Nvidia is coming back with their tail between their legs when there's an arms race between the USA and China to achieve AGI is completely oblivious to the world. I wish I still had that kind of naivety.

A few AI companies might go out of business by failing to develop models against the competition but any silicon wafer for the next 10 years is gonna have a huge part of the pie going to AI.

This is the Manhattan Project + the space race wrapped into one, and then some kind of arms race to achieve the first AGI. They ain't lifting their foot off the pedal.
 
Dang I missed that. It's been thought you need a 1000Hz display to match the motion clarity of CRTs. We're about to find out.





No, there's additional hardware on the RTX 4000 cards that the 3000 cards are missing.
With them bringing back the 3060 I wouldn't be surprised if they figured out a way to introduce framegen to 30 series cards. Especially now with the whole market jacked up.
 
But 6X frame generation is just stupid.
No, it's not "stupid". We even have 720Hz and 1000Hz monitors incoming this year. There's no way to get anywhere close to those max refresh rates without FG. If you're at 100 fps or more, 6x FG would put you into 500+ fps territory with good input lag and very low persistence blur.
 
When is it coming out? After the announcement or later this year?

My Google search yielded:

NVIDIA is expected to provide full demos and video comparisons on Tuesday, January 6, 2026, during their CES presentation.

DLSS 4.5 Super Resolution release date: Expected to be available immediately via an update to the NVIDIA App.

Dynamic Multi Frame Generation (6x FG) release date: Scheduled for spring.
 
My Google search yielded:

NVIDIA is expected to provide full demos and video comparisons on Tuesday, January 6, 2026, during their CES presentation.

DLSS 4.5 Super Resolution release date: Expected to be available immediately via an update to the NVIDIA App.

Dynamic Multi Frame Generation (6x FG) release date: Scheduled for spring.
Holy, so DLSS 4.5 tomorrow?? I'm excited
 
Holy, so DLSS 4.5 tomorrow?? I'm excited
It's possible. They've been quiet on DLSS updates since August (so presumably they were working on something new) and have no real reason to announce anything in advance, as they're in the lead anyway.
 
I wonder if the cost of 4.5 will be another 5-10% depending on the game though 🤔

I suspect the same cost, just a superior algorithm (like the DLSS2 → DLSS3 jumps in the past). I doubt they'll make it heavier and still claim support for old RTX cards.

But who knows...
 
Yup, anyone thinking that Nvidia is coming back with their tail between their legs when there's an arms race between the USA and China to achieve AGI is completely oblivious to the world. I wish I still had that kind of naivety.

A few AI companies might go out of business by failing to develop models against the competition but any silicon wafer for the next 10 years is gonna have a huge part of the pie going to AI.

This is the Manhattan Project + the space race wrapped into one, and then some kind of arms race to achieve the first AGI. They ain't lifting their foot off the pedal.
They will eventually. You can only spend so much money and time on something before giving up if you don't see results. And AGI anytime soon is still extremely unlikely. The scientific consensus is pretty clear on this. LLM research especially appears more and more likely to be a dead end.

If a decade passes and nobody comes close to AGI, well, expect countries and companies to lose interest.
 
Is it recommended to swap the older dlss.dll files from older games with the newer one?

I did this with a few games and the improvements in sharpness/clarity were substantial. But Silent Hill 2 didn't work well with a newer version IIRC.
 
Yeah, but the Switch 2 is the smallest and least powerful Nvidia chip with tensor and RT cores. It doesn't have enough ML power for the transformer model.
What I mean is, the DLSS in the NSW2 will keep being updated, so it won't be shocking if these or some of these features come to the NSW2 in the future.
 
What I mean is, the DLSS in the NSW2 will keep being updated, so it won't be shocking if these or some of these features come to the NSW2 in the future.

The CNN model stopped being updated after version 3.8. Nvidia made a "tiny DLSS" for the SW2 (low quality); maybe they'll improve it in the future, but I doubt anything close to DLSS4 will happen on this hardware.
 
The CNN model stopped being updated after version 3.8. Nvidia made a "tiny DLSS" for the SW2 (low quality); maybe they'll improve it in the future, but I doubt anything close to DLSS4 will happen on this hardware.
Let's see how things turn out near the end of the NSW2 generation.
 
6x is dumb, but the thing being able to dynamically slip in frames (if I understand correctly) is pretty cool. That's what I thought FG was going to do from the start.
 
6x is dumb, but the thing being able to dynamically slip in frames (if I understand correctly) is pretty cool. That's what I thought FG was going to do from the start.

I think it's like adaptive FG from LSFG, except this maxes out at 6x. Dynamic means you set a target frame rate, then the AI generates frames based on a multiplier from 2x up to 6x max.
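If that description is right, the multiplier selection could look something like this. A minimal Python sketch of the logic as I understand it; the function name and exact behavior are my assumptions, not NVIDIA's actual algorithm:

```python
# Hypothetical sketch of dynamic frame-generation multiplier selection.
# Pick the smallest multiplier (capped at 6x) that lifts the rendered
# frame rate to the user's target -- NOT NVIDIA's real implementation.
import math

def pick_multiplier(base_fps: float, target_fps: float, max_mult: int = 6) -> int:
    """Return the FG multiplier (1 = off, up to max_mult)."""
    if base_fps <= 0:
        raise ValueError("base_fps must be positive")
    needed = math.ceil(target_fps / base_fps)
    return max(1, min(max_mult, needed))

# Rendering 100 fps with a 500 Hz target -> 5x
print(pick_multiplier(100, 500))   # 5
print(pick_multiplier(240, 240))   # 1 (already at target, no FG)
print(pick_multiplier(40, 500))    # 6 (capped at the 6x maximum)
```

The interesting part is that the multiplier can change on the fly as the base frame rate fluctuates, instead of being a fixed 2x/3x/4x setting.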
 
6x is dumb, but the thing being able to dynamically slip in frames (if i understand correctly) is pretty cool. That's what I thought fg was going to do from the start.
Are you thinking of interpolation between two finished frames (like often on traditional TVs)? That's never been DLSS FG.

DLSS FG basically uses pixels + depth buffer + motion vectors, plus Reflex low-latency syncing.

And DLSS 4.5 hopefully uses the second gen transformer model to try to neutralize some of the "shimmer" and ghosting which often is the result of pushing FG tech too far.
 
6x is dumb, but the thing being able to dynamically slip in frames (if I understand correctly) is pretty cool. That's what I thought FG was going to do from the start.

we are getting 500hz screens now, so 6x makes sense 🤷

lower persistence blur, smoother looking image.
 
Glad to see more improvements to DLSS 4. The transformer model is already excellent, but I'll take any improvements that can be made, especially if it means pushing higher "real" frames with similar quality.
NVIDIA also says users will be able to set a target FPS in the app up to the display's maximum refresh rate, with the feature dynamically adjusting how many frames are generated to meet that goal.
That's the only thing that interests me with frame-gen. If it can dynamically adjust the frame-gen to only give enough extra fake frames to max out my refresh rate and no more... maybe it could reduce the input latency hit compared to picking a static amount?
 
Are you thinking of interpolation between two finished frames (like often on traditional TVs)? That's never been DLSS FG.

DLSS FG basically uses pixels + depth buffer + motion vectors, plus Reflex low-latency syncing.

And DLSS 4.5 hopefully uses the second gen transformer model to try to neutralize some of the "shimmer" and ghosting which often is the result of pushing FG tech too far.

No I know that part. I like how you can have a target, perhaps something that you can mostly hold natively, and the fg can jump in as soon as you get into a situation where frames start dropping.
 
OverHeat Dibs on your 5090 when you get the 6090
 
They will eventually. You can only spend so much money and time on something before giving up if you don't see results. And AGI anytime soon is still extremely unlikely. The scientific consensus is pretty clear on this. LLM research especially appears more and more likely to be a dead end.

If a decade passes and nobody comes close to AGI, well, expect countries and companies to lose interest.
Jensen's keynote today at CES made it clear he's all in on this AI revolution. The way it looks, if no one achieves AGI, he'll just end up doing it himself anyway and making it open source.

 
MFG is the future. I know some may not like it, but there's not much you or we can do about it. Next-gen consoles and games will rely on fake frames (for bigger games at least) instead of relying on raw power. It is what it is, best to just accept it I guess. I don't think the AI bubble will burst any time soon; it's much different than crypto this time around.
 
MFG is the future. I know some may not like it, but there's not much you or we can do about it. Next-gen consoles and games will rely on fake frames (for bigger games at least) instead of relying on raw power. It is what it is, best to just accept it I guess. I don't think the AI bubble will burst any time soon; it's much different than crypto this time around.

I love it and wish Sony had something ready on the PS5 Pro for wide adoption of FG on the system, or at the very least the readily available FSR3 FG. The leak from MLID where devs hate it makes no sense to me.

was hoping LG would show off 240hz OLED TVs at CES, but they're still 165hz like the previous gen

big TVs with high refresh rates will be so good w/ FG

Same, though note that the G6 does support 1080p at 330hz


Really wish 240hz were on every 2026 flagship TV at the very least.
 
Can a 500hz screen do something more than a 240hz monitor can, though?

the higher the refresh rate/framerate, the less noticeable the persistence blur (which a sample-and-hold screen inherently has) will be.

you can also of course use Black Frame Insertion or backlight strobing to reduce the blur, but that will also reduce your screen brightness.
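The persistence blur described above is roughly the on-screen motion speed multiplied by how long each frame is held. A back-of-envelope sketch, assuming a full-persistence sample-and-hold display with no strobing (the function name is mine):

```python
def persistence_blur_px(speed_px_per_sec: float, refresh_hz: float) -> float:
    """Approximate motion blur in pixels on a full-persistence
    sample-and-hold display: each frame is held on screen for
    1/refresh_hz seconds, so a moving object smears across
    speed * hold_time pixels on your retina as your eye tracks it."""
    return speed_px_per_sec / refresh_hz

# An object panning across the screen at 1000 px/s:
print(persistence_blur_px(1000, 60))    # ~16.7 px of smear at 60 Hz
print(persistence_blur_px(1000, 500))   # 2.0 px at 500 Hz
print(persistence_blur_px(1000, 1000))  # 1.0 px at 1000 Hz
```

This is why higher refresh rates (and the frame rates to feed them, via FG) reduce blur even without black frame insertion: the hold time per frame shrinks, so the smear does too.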
 
the higher the refresh rate/framerate, the less noticeable the persistence blur (which a sample-and-hold screen inherently has) will be.

you can also of course use Black Frame Insertion or backlight strobing to reduce the blur, but that will also reduce your screen brightness.
Yeah, I guess with flat-panel monitors you can't have enough Hz. Old CRT monitors had less blur and lag than today's monitors.
 
Sounds good to me. I use 2x frame gen sometimes to get maxed out/ray traced games up to or near 144hz on my connected TV and it works well. Being able to just set 144hz as the output goal sounds pretty convenient.
 
Yeah, I guess with flat-panel monitors you can't have enough Hz. Old CRT monitors had less blur and lag.

I mean, a 500hz monitor has far less lag than any CRT ever produced.
even at 60hz we are getting insanely close to CRT latency; we are around 2ms away from CRT-level lag in TVs by Samsung and LG. and if you run your Xbox at 120hz while playing a 60fps game on a modern Samsung TV, you get less lag than if you'd play it on a 60hz CRT.

the lag issue is mostly resolved... unless you have a Sony TV lol.


but yes, we need far higher refresh rates to get a similar motion clarity as CRTs.
the guy who runs Blurbusters thinks above 1000hz is needed to exceed CRT clarity without black frame insertion or strobing.

and DLSS framegen is not perfect, but preferable to persistence blur.
 
I mean, a 500hz monitor has far less lag than any CRT ever produced.
even at 60hz we are getting insanely close to CRT latency; we are around 2ms away from CRT-level lag in TVs by Samsung and LG. and if you run your Xbox at 120hz while playing a 60fps game on a modern Samsung TV, you get less lag than if you'd play it on a 60hz CRT.

the lag issue is mostly resolved... unless you have a Sony TV lol.


but yes, we need far higher refresh rates to get a similar motion clarity as CRTs.
the guy who runs Blurbusters thinks above 1000hz is needed to exceed CRT clarity without black frame insertion or strobing.

and DLSS framegen is not perfect, but preferable to persistence blur.
Aha, interesting, didn't know that modern monitors are that close to a CRT technically or even better.
 
Aha, interesting, didn't know that modern monitors are that close to a CRT technically or even better.

at the same refresh they are still slightly slower than CRT, because the image processing does take 1 to 2 ms.

but if you play at high refresh rates you'll get sub-CRT lag these days.

the lowest latency CRTs were those rare ones that reached 160hz at reasonable resolutions. so to beat those you'd probably need a ~200hz gaming monitor nowadays or something along those lines.

to beat a 60hz CRT however, all you need is a decent LG, Samsung, or maybe Panasonic TV, and a way to run at 120hz, which an Xbox can force for every game for example. and I think some LG TVs have a game mode setting that runs the TV at 120hz even if the input it receives is 60hz.

if you do that you have around 5ms of lag, beating the 8.333ms of a 60hz CRT.
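The numbers above fall out of a simple model: average scan-out lag is half the refresh period (a pixel in the middle of the screen updates half a frame after scan-out starts), plus any fixed processing time. A rough sketch; the model and the processing figures are my assumptions, not measurements:

```python
def avg_display_lag_ms(refresh_hz: float, processing_ms: float = 0.0) -> float:
    """Rough average input-to-photon lag for a scanned-out display:
    half the refresh period (mid-screen pixel) plus fixed image
    processing time. A CRT has essentially zero processing time."""
    return (1000.0 / refresh_hz) / 2 + processing_ms

print(avg_display_lag_ms(60))         # 8.333... ms -- the 60 Hz CRT baseline
print(avg_display_lag_ms(120, 1.5))   # ~5.7 ms -- e.g. a 120 Hz TV with processing
print(avg_display_lag_ms(500, 1.0))   # 2.0 ms -- e.g. a 500 Hz monitor
```

So driving the panel at 120hz roughly halves the scan-out component, which is how a modern TV can undercut a 60hz CRT's average lag even with a millisecond or two of processing on top.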
 