
NVIDIA DLSS 4.5 to feature 2nd Gen Transformer model and Dynamic 6x Frame Generation

Go Away What GIF
See, you're already on the hook, ready to defend the big corporation. If I were selling GPUs, I'd do it exactly as I've just described. First you test the waters, see how the public reacts to small perf. losses on the promise of an actual 'improvement'. Then repeat the same thing until they're milked dry. Why even bother selling cheap GPUs if people will cough up more money for the more expensive GPUs? After all, the image quality WILL be better (he-he, those clowns will never know that it could've been better from the very start). The very technology and software they make is designed as the core layer of this scheme.
 
See, you're already on the hook, ready to defend the big corporation. If I were selling GPUs, I'd do it exactly as I've just described. First you test the waters, see how the public reacts to small perf. losses on the promise of an actual 'improvement'. Then repeat the same thing until they're milked dry. Why even bother selling cheap GPUs if people will cough up more money for the more expensive GPUs? After all, the image quality WILL be better (he-he, those clowns will never know that it could've been better from the very start). The very technology and software they make is designed as the core layer of this scheme.

Are you 12 or something? You think you've uncovered a big conspiracy here and had a dopamine hit or something? Cringe as fuck, your posts.
 
It's an actual fucking scam:

Step 1: initial DLSS Buzzword 4.0 release. Hidden truth: it's botched deliberately, with some IQ downsides baked in.
Step 2: sell hardware that supports the initial release well.
Step 3: announce an 'improved' Buzzword 4.5 version of it. "It's for the better": the performance is 1% to 5% worse, but the customers won't mind this because IQ is 'improved', wink-wink.
Step 4: introduce new hardware that supports DLSS Buzzword 5.0: the performance is 5% to 10% worse compared to 4.5, but holy shit the pixels now dance or something, it's +9999% image quality!
Step 5: collect huge margins because the PC gamer Stans and Joes won't even look at RTX Basic_Crap anymore, since that 10% drop makes it inconvenient, and oh boy do they have a solution for this little problem: RTX Super_Duper.
See, you're already on the hook, ready to defend the big corporation. If I were selling GPUs, I'd do it exactly as I've just described. First you test the waters, see how the public reacts to small perf. losses on the promise of an actual 'improvement'. Then repeat the same thing until they're milked dry. Why even bother selling cheap GPUs if people will cough up more money for the more expensive GPUs? After all, the image quality WILL be better (he-he, those clowns will never know that it could've been better from the very start). The very technology and software they make is designed as the core layer of this scheme.

Dummy Feeling Dumb GIF
 
Looks great but is the fps hit that bad? I might wait til they optimize it before I switch over (I know it's currently beta).

It is a small hit with older DLSS but it basically kills off native rendering

Look at the perf uplift while looking the same and drawing less power

d7aCOIg.jpeg
 
See, you're already on the hook, ready to defend the big corporation. If I were selling GPUs, I'd do it exactly as I've just described. First you test the waters, see how the public reacts to small perf. losses on the promise of an actual 'improvement'. Then repeat the same thing until they're milked dry. Why even bother selling cheap GPUs if people will cough up more money for the more expensive GPUs? After all, the image quality WILL be better (he-he, those clowns will never know that it could've been better from the very start). The very technology and software they make is designed as the core layer of this scheme.

Papito its time for the medication
 
I used Swapper once in the past and the game didn't even start, I had a black screen.

If I have to study what precise DLSS version I need for every damn game to even start, I think the app should be a bit more foolproof for a fucking noob like me...

Did they fix that, so the app now only installs something that actually works for the game, or is it still more for experienced users who know exactly what version of DLSS to use?

Weird, I never had issues with swapper. I always use the latest .dll, there should be no downsides in doing that.

Z1PynYxNxHjvENCo.png
WfVY1wTmcwVPN7Um.png
ubL0Xuy5RCsX4cYn.png


paul-rudd-anchorman-the-legend-of-ron-burgundy.gif


It is a small hit with older DLSS but it basically kills off native rendering

Look at the perf uplift while looking the same and drawing less power

d7aCOIg.jpeg

images
 
Weird, I never had issues with swapper. I always use the latest .dll, there should be no downsides in doing that.

Z1PynYxNxHjvENCo.png
WfVY1wTmcwVPN7Um.png
ubL0Xuy5RCsX4cYn.png


paul-rudd-anchorman-the-legend-of-ron-burgundy.gif
I mean I guess the game that I tried wasn't compatible with the latest DLSS or frame-gen or ray-whatever files, so it gave me the black screen...

So what are the exact steps?

Choose the game in the app
Download the latest DLSS, frame-gen (something else?) files from the app
Switch to the latest version
Then choose profile K

The end?

I don't have to physically copy and paste stuff into folders etc., right?
 
Are you 12 or something? You think you've uncovered a big conspiracy here and had a dopamine hit or something? Cringe as fuck, your posts.
This isn't a conspiracy of any kind, it's how this business operates. They're implementing the usual 'power creep' strategies from mobile / gacha games into the real-world market, except this time it's not some paywalled gacha characters but the hardware/software being engineered in a way that makes it sub-optimal to purchase basic, non-improved variants. I've seen quite enough of such practices. They introduce a problem, they even 'buff' something to make it all appear as an act of goodwill, but it's all ultimately just to sell you more. Worst of all, people are so gullible and trusting that they don't even want to consider the possibility of getting scammed in broad daylight.
 
I mean I guess the game that I tried wasn't compatible with the latest DLSS or frame-gen or ray-whatever files, so it gave me the black screen...

So what are the exact steps?

Choose the game
Download the latest DLSS, frame-gen (something else?) files from the app
Then choose profile K

The end?

I don't have to physically copy and paste stuff into folders etc., right?

Yep, you don't have to do anything outside of the app. Cyberpunk before and after (with DLSS 4.5 - M preset):

6CVlqHwFV9FDhrT1.jpg
Dc5UoEkoCZ6x7Yje.jpg


You can also turn this on to see the DLSS HUD in game:

U7X3IqvXqMHKp1Qs.jpg
 
This isn't a conspiracy of any kind, it's how this business operates. They're implementing the usual 'power creep' strategies from mobile / gacha games into the real-world market, except this time it's not some paywalled gacha characters but the hardware/software being engineered in a way that makes it sub-optimal to purchase basic, non-improved variants. I've seen quite enough of such practices. They introduce a problem, they even 'buff' something to make it all appear as an act of goodwill, but it's all ultimately just to sell you more. Worst of all, people are so gullible and trusting that they don't even want to consider the possibility of getting scammed in broad daylight.
There is a general consensus that newer Nvidia drivers tend to provide better performance over time for supported cards, although the performance gains for specific older games may be minimal.
 


Haven't had a chance to watch it myself yet, but this guy knows his stuff. Check out his other optimised settings vids, he goes pretty in-depth.
 
It is a small hit with older DLSS but it basically kills off native rendering

Look at the perf uplift while looking the same and drawing less power

d7aCOIg.jpeg

Woah, 100 fewer watts? That's actually crazy. Are people with 4090s/5090s using Quality, Balanced, or Performance mode (I assume it depends on the resolution you're playing at)?
 


Haven't had a chance to watch it myself yet, but this guy knows his stuff. Check out his other optimised settings vids, he goes pretty in-depth.


Great comparison. At this point DLAA/native rendering seems to be pointless. Just losing huge amounts of performance with virtually zero gain.
 
There is a general consensus that newer Nvidia drivers tend to provide better performance over time for supported cards, although the performance gains for specific older games may be minimal.
What's important is whether this mixed trend (i.e. IQ improvements at the cost of performance) will continue or not. When they introduced DLSS 4 (transformers, first gen), I vaguely remember the same scenario - better IQ, worse performance.

Given that they do care about growth, gobbling up as much money as possible from all sources... Where's the guarantee that they aren't screwing over the customers, taking small steps to make it 5% worse here, 2% worse there?
There aren't any external regulators involved in engineering NVIDIA products, either hardware or software. They decide how it works, they decide how the image quality appears. No one can intervene or supervise, not at any step.
 
So if I don't have the latest drivers and beta app, I don't need to worry about choosing the latest because I don't physically have 4.5 ready to use, right?

Shit is confusing as hell...
Yeah, that's the gist of it.

Personally, my setup is this:
- Nvidia app for RTX HDR and basic config adjustments like forcing 16x AF
- DLSS Swapper for forcing new DLL versions and model presets in games (even those that aren't officially supported by DLSS overrides in the official Nvidia app)
- Nvidia Profile Inspector for more low-level tweaks like disabling/enabling ReBAR or adjusting base resolution for different DLSS modes

You can technically tweak DLSS version & presets with any of those tools, but DLSS Swapper is the most straightforward and bulletproof solution if you're just focused on that.
 


This was a 19-minute video where he didn't even activate either of the new presets because he didn't follow the correct process, yet he was still convinced he was using DLSS 4.5, that there was an image quality improvement, and that there was a performance hit. Even the comments are calling him out for it.

The perfect encapsulation of the placebo effect.
 
What's important is whether this mixed trend (i.e. IQ improvements at the cost of performance) will continue or not. When they introduced DLSS 4 (transformers, first gen), I vaguely remember the same scenario - better IQ, worse performance.

Is this coming as a shock that better IQ requires worse performance? How has it not been like that for the past 3 decades of video game graphics?

Given that they do care about growth, gobbling up as much money as possible from all sources... Where's the guarantee that they aren't screwing over the customers, taking small steps to make it 5% worse here, 2% worse there?
There aren't any external regulators involved in engineering NVIDIA products, either hardware or software. They decide how it works, they decide how the image quality appears. No one can intervene or supervise, not at any step.

Dude, it's been like that since the dawn of video game graphics; no, you won't have a "guarantee" nor external regulators to monitor this shit. Saying shit like this is why I told you that's conspiracy-theory dopamine land. If they pulled that kind of stunt for years, AMD would outclass them in no time.

There's a penalty in performance because it requires a lot more transformer model parameters, thus more compute. The image quality is more stable. Voila!

Dexter Idk GIF


Expecting better image quality and more performance, in what kind of lala land did that ever happen?
 
From what I gather, 4.5 provides a clearer picture by eliminating some shimmer, ghosting, and aliasing issues, but it will come at the cost of some frames. Up to you if the new image is worth the fps hit. I have no idea atm what the best option is, it is pretty confusing.
What about Balanced and Performance? Are they as good as Quality on 4.0?
 
zWORMz is terrible, another layman who got lucky winging it (like Daniel, but not as smart and analytical).

Owens is generally fine, and yeah, I'd expect him to be smarter and more analytical given he's a maths teacher by day. I've been watching him for a few years, even when he only had 1-2k followers and he's always been pretty consistent in how he delivers information and analyses it. The extra attention hasn't gone to his head, unlike some.
 


At CES 2026, NVIDIA announced the new graphical capabilities coming to over 400 games very soon, including DLSS 4.5 super resolution and 6X multiframe generation. If you want to get a taste of some of this tech before it officially releases, here's how to enable DLSS 4.5 on compatible cards right now.
 
Yeah, that's the gist of it.

Personally, my setup is this:
- Nvidia app for RTX HDR and basic config adjustments like forcing 16x AF
- DLSS Swapper for forcing new DLL versions and model presets in games (even those that aren't officially supported by DLSS overrides in the official Nvidia app)
- Nvidia Profile Inspector for more low-level tweaks like disabling/enabling ReBAR or adjusting base resolution for different DLSS modes

You can technically tweak DLSS version & presets with any of those tools, but DLSS Swapper is the most straightforward and bulletproof solution if you're just focused on that.
Is Nvidia RTX HDR better than using the Windows HDR calibration tool?
 
Owens is generally fine, and yeah, I'd expect him to be smarter and more analytical given he's a maths teacher by day. I've been watching him for a few years, even when he only had 1-2k followers and he's always been pretty consistent in how he delivers information and analyses it. The extra attention hasn't gone to his head, unlike some.

Yeah, Daniel is great.
 


This was a 19-minute video where he didn't even activate either of the new presets because he didn't follow the correct process, yet he was still convinced he was using DLSS 4.5, that there was an image quality improvement, and that there was a performance hit. Even the comments are calling him out for it.

The perfect encapsulation of the placebo effect.

He's using DLAA, isn't that independent of DLSS? Like, same technology, different functionality?
 
He's using DLAA, isn't that independent of DLSS? Like, same technology, different functionality?

It still uses the CNN, first-gen transformer, or second-gen transformer models, so depending on which one you use the end result will be different.

DLAA just uses native-res frames to create a supersample-like image, unlike DLSS, which uses reduced resolutions (like 67%, 58%, etc.) to create a native-like image.
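If it helps make that concrete, here's a rough sketch of my own (using the commonly cited per-axis scale factors, so treat the exact numbers as approximate):

# Rough sketch (not Nvidia's numbers, just the commonly cited scale factors)
# of what each DLSS mode renders internally before upscaling to the output res.
SCALE = {
    "DLAA": 1.00,              # native in, native out - AA only
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.33,
}

def internal_res(width, height, mode):
    s = SCALE[mode]
    return round(width * s), round(height * s)

for mode in SCALE:
    print(mode, internal_res(3840, 2160, mode))
# At 4K output: Quality renders at roughly 2560x1440, Performance at 1920x1080,
# Ultra Performance at about 1270x713, while DLAA stays at the full 3840x2160.

That's the whole difference: DLAA pays the full native-res cost and only does the AA pass, while DLSS trades internal resolution for performance.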
 
For most things larger than pixel-sized particles, this is much better than native rendering. I feel like devs need to separate out particle rendering instead of feeding it through DLSS. The worst-case scenario is Horizon FW in the desert area with all the blowing sand.
 
Is Nvidia RTX HDR better than using the Windows HDR calibration tool?
They have different purposes.

RTX HDR is normally used to help simulate HDR support for games that don't have native HDR support.

The Windows HDR calibration tool on the other hand is a tool to help calibrate your monitor for HDR (it creates a color profile tied to your monitor).
 
They have different purposes.

RTX HDR is normally used to help simulate HDR support for games that don't have native HDR support.

The Windows HDR calibration tool on the other hand is a tool to help calibrate your monitor for HDR (it creates a color profile tied to your monitor).
Why the fuck would anyone use fake HDR when even the games that actually support HDR usually do a shit job with it on PC?? (Like literally there is a famous YT channel that only does HDR settings for games and 9 times out of 10 the HDR sucks balls)

I can't even imagine how bad fake HDR is in terms of accuracy...
 
Is it true that 4.5 doesn't support RR yet? If so, I wonder why they chose to release a preview at all. What's the hurry?
Fun fact: 4.0 "didn't support RR" either, because the presets that were in RR at the time of the 4.0 launch had already been there for about half a year prior.

People still can't seem to grasp that a DLSS major version is a "package" containing three different technologies, each of which has its own internal versioning that is not necessarily aligned with the "marketing" releases of major versions.
SR, RR and FG in "version 4.5" are all "version 4.5". In SR new models were added, in RR they weren't, and FG is in fact waiting till spring for a DMFG update which will still be part of "version 4.5" when it launches.
Model K in the SR 310.5 release is as much "version 4.5" as model M or L. The same goes for the current RR models D and E in the 310.5 RR release.
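If it helps, here's the same idea written out as a little data structure (purely illustrative, using the DLL/model letters mentioned in this thread rather than anything official):

# Illustrative only: one "marketing" DLSS version is really a bundle of
# separately versioned components. Values below are as discussed in this thread.
DLSS_4_5 = {
    "SR": {"dll": "310.5", "models": ["K", "L", "M"]},  # new models landed here
    "RR": {"dll": "310.5", "models": ["D", "E"]},        # no new models at launch
    "FG": {"note": "DMFG update expected in spring, still shipped as '4.5'"},
}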
 
Why the fuck would anyone use fake HDR when even the games that actually support HDR usually do a shit job with it on PC?? (Like literally there is a famous YT channel that only does HDR settings for games and 9 times out of 10 the HDR sucks balls)

I can't even imagine how bad fake HDR is in terms of accuracy...
It kinda works like DLSS in the sense that its models are also trained by compute. RTX HDR is analysing and identifying elements in frames (like light sources). It also expands the color palette and prevents clipping. It's a pretty good alternative.
 
Is this coming as a shock that better IQ requires worse performance? How has it not been like that for the past 3 decades of video game graphics?



Dude, it's been like that since the dawn of video game graphics; no, you won't have a "guarantee" nor external regulators to monitor this shit. Saying shit like this is why I told you that's conspiracy-theory dopamine land. If they pulled that kind of stunt for years, AMD would outclass them in no time.

There's a penalty in performance because it requires a lot more transformer model parameters, thus more compute. The image quality is more stable. Voila!

Dexter Idk GIF


Expecting better image quality and more performance, in what kind of lala land did that ever happen?

Excuse me? Like, in pretty much any case of upgrading your PC, at least up until AI took over the world?
You buy something or you install a software upgrade --> you expect better performance.

Of course you may say "Hey, it's the IMAGE QUALITY you get upgraded!" - but that's the entire point, such upgrades weren't reducing performance before, and who-the-hell-knows if the image quality couldn't have been better with DLSS 4, since they could've just worsened it on purpose, because their competition (AMD) was nowhere near. Why release an 'image quality upgrade' for free, if it can cost something, like... the performance? It's still money! Perhaps they're forcing us to upgrade the hardware earlier down the road.

Now, let's address this part:
> There's a penalty in performance because it requires a lot more transformer model parameters, thus more compute. The image quality is more stable. Voila!
Following that logic, shouldn't RTX 5090 be vastly superior? Because RTX 4060 Ti doesn't fall too far behind it, in terms of performance difference (DLSS 4 vs DLSS 4.5).

Raw compute power of the RTX 5090 is unmatched: even compared to the RTX 5080, if I use the latter to load some AI model, I'd get prompt processing that's twice as slow and about half the t/s in generation. I know, comparing LLMs to whatever DLSS is actually using isn't fair... but that compute power doesn't go anywhere; the RTX 5090 shouldn't suffer at all if the RTX 4060 Ti suffers only a 5% performance loss.

Anyway, I'm not here to start a holy war against Jensen Huang (after all, I myself own a workstation rig with 64GB of VRAM). It's just that I find it all weird, as if we're going in the wrong direction, where the customer buys a GPU and instead of 'longevity' he ends up with 'uncertainty', if you get what I mean. What's next, will they roll out DLSS 5 before the next RTX series comes? And what should we expect, another performance drop for the sake of some trinkets in games looking sharper?
 
It kinda works like DLSS in the sense that its models are also trained by compute. RTX HDR is analysing and identifying elements in frames (like light sources). It also expands the color palette and prevents clipping. It's a pretty good alternative.
I never use hdr unless i know it's accurate so i don't see myself using a dlss approach with it.

Dlss look better than native in some instances, i doubt fake hdr is ever gonna look better than good real hdr, i guess it can look better than sdr if you like popping colors and over-bright image but i doubt it's anything remotely close to accurate...

If it wasn't clear, i like color accuracy :lollipop_grinning_sweat:
 
I never use hdr unless i know it's accurate so i don't see myself using a dlss approach with it.

Dlss look better than native in some instances, i doubt fake hdr is ever gonna look better than good real hdr, i guess it can look better than sdr if you like popping colors and over-bright image but i doubt it's anything remotely close to accurate...

If it wasn't clear, i like color accuracy :lollipop_grinning_sweat:
In theory I'd say RTX HDR COULD potentially provide better color accuracy than SDR simply because it "uncrushes" the brightness level and expands the color palette into something that could look more natural - At least in games that sort of tend to tilt towards realism. But, yeah, I get what you're saying. I wouldn't run RTX HDR on Donkey Kong..

Anyway, some RenoDX-modded games look fantastic in HDR.
 
It kinda works like DLSS in the sense that its models are also trained by compute. RTX HDR is analysing and identifying elements in frames (like light sources). It also expands the color palette and prevents clipping. It's a pretty good alternative.

I never use hdr unless i know it's accurate so i don't see myself using a dlss approach with it.

Dlss look better than native in some instances, i doubt fake hdr is ever gonna look better than good real hdr, i guess it can look better than sdr if you like popping colors and over-bright image but i doubt it's anything remotely close to accurate...

If it wasn't clear, i like color accuracy :lollipop_grinning_sweat:

In theory I'd say RTX HDR COULD potentially provide better color accuracy than SDR simply because it "uncrushes" the brightness level and expands the color palette into something that could look more natural - At least in games that sort of tend to tilt towards realism. But, yeah, I get what you're saying. I wouldn't run RTX HDR on Donkey Kong..

Anyway, some RenoDX-modded games look fantastic in HDR.

RTX HDR is not accurate in any way. It's better than Microsoft's Auto HDR, but it's still an SDR-to-HDR post-processing filter, which is "fake" by definition.


The only way to add HDR to unsupported games accurately is to use RenoDX or SpecialK, as these tools have the ability to operate natively within the game engine and are custom-configured on a per-game basis for accurate native HDR output.
 
Woah, 100 fewer watts? That's actually crazy. Are people with 4090s/5090s using Quality, Balanced, or Performance mode (I assume it depends on the resolution you're playing at)?
I've got a 4090. Maybe I'm blind or it's the distance I'm sitting at (hooked up to my 55" QD-OLED TV), but I did some testing and Performance doesn't look that much different than Quality now. The extra (locked) frames you're getting now plus the image quality is pretty incredible. I'll have to do some more testing, but so far I like what I see.
 
Now, let's address this part:
> There's a penalty in performance because it requires a lot more transformer model parameters, thus more compute. The image quality is more stable. Voila!
Following that logic, shouldn't RTX 5090 be vastly superior? Because RTX 4060 Ti doesn't fall too far behind it, in terms of performance difference (DLSS 4 vs DLSS 4.5).
That's exactly what we see. In the below benchmarks, the 5090 executes DLSS 4.5 up to 2.61x faster than the 4070 Ti.


And bear in mind, the new models are entirely optional and DLSS 4.0 already delivers great image quality that exceeds what AMD can provide with FSR 4.
 
Excuse me? Like, in pretty much any case of upgrading your PC, at least up until AI took over the world?
You buy something or you install a software upgrade --> you expect better performance.

Lol ?

When people went from bilinear/trilinear filtering and upgraded to 16x anisotropic filtering, what do you think happened?

I've owned GPUs since 1996. I've followed every tech change. You never get a feature with better image quality than a previous solution without a performance hit, unless it's a revolutionary method, and most of the time it's cutting corners on performance and your eyes are fooled. Just like half/quarter-resolution effects in UE being hidden by temporal averaging.

Of course you may say "Hey, it's the IMAGE QUALITY you get upgraded!" - but that's the entire point, such upgrades weren't reducing performance before

Holy shit lol
  • MSAA ring any bells? SSAA?
  • Programmable vertex/pixel shaders in DirectX 8 & 9 for more sophisticated materials and lighting models, resulting in more demanding games with performance bottlenecks? Moving away from the fixed pipeline came at a cost.
  • Tessellation? Helllooo?? Anybody home? The whole fucking TressFX and Hairworks thing kneecapping performance for better visual quality? Call of Pripyat having huge performance issues? The Crysis 2 controversy?
  • Volumetric effects that can kneecap the best hardware for a 1% visual difference from medium to ultra high?
  • Ray tracing? Surely that's recent enough to remember?
  • DLAA itself: you run at native resolution to get the best image quality... at the cost of performance. Mind-blowing stuff.

and who-the-hell-knows if the image quality couldn't have been better with DLSS 4, since they could've just worsened it on purpose, because their competition (AMD) was nowhere near. Why release an 'image quality upgrade' for free, if it can cost something, like... the performance? It's still money! Perhaps they're forcing us to upgrade the hardware earlier down the road.

Ah yes, let's doubt Nvidia, who is basically the frontrunner of this tech and has changed the upscaling game since Turing.

So Nvidia is kneecapping DLSS 4; it should have better image quality with more parameters but somehow also be more performant.

Hell, let's go to r/AMD and r/Intel and simply tell them to beat the fuck out of Nvidia with their upscalers, seems easy enough.

Now, let's address this part:
> There's a penalty in performance because it requires a lot more transformer model parameters, thus more compute. The image quality is more stable. Voila!
Following that logic, shouldn't RTX 5090 be vastly superior? Because RTX 4060 Ti doesn't fall too far behind it, in terms of performance difference (DLSS 4 vs DLSS 4.5).

Because both series use FP8?

Do you understand how a pipeline works? Clearly not. There's an inherent bottleneck: an image has to come in before the AI can fix it up, and no matter how many TOPS you have, DLSS is only a fraction of the frametime. A GPU is never using all its TOPS for DLSS. The 5090 inherently generates frames faster than other cards, and of course its ML blocks can tackle more workload, but the two go hand in hand, never using the full TOPS available.



Raw compute power of the RTX 5090 is unmatched: even compared to the RTX 5080, if I use the latter to load some AI model, I'd get prompt processing that's twice as slow and about half the t/s in generation. I know, comparing LLMs to whatever DLSS is actually using isn't fair... but that compute power doesn't go anywhere; the RTX 5090 shouldn't suffer at all if the RTX 4060 Ti suffers only a 5% performance loss.

psa-do-not-force-latest-preset-with-dlss-4-5-aka-how-to-v0-snwm2453dqbg1.jpeg


Even for a 4070 Ti it's a ~40% latency increase to run the M model vs the K model, and that's narrowing it down to just the DLSS pass, not your whole frametime.

The 5090 takes a 20% penalty, but even the M model on a 5090 is 1.88x faster than the 4070 Ti's K model.

Nothing of what you've said so far makes any sense. Of course a ms difference in the overall frametime is gonna shrink down to a low % difference in performance, but the data's there: it's much faster on a 5090. That leaves room to breathe for a lot more AI effects, be it neural radiance cache path tracing, frame gen, and the upcoming neural texture compression / neural shaders, etc.
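To spell out the ms-vs-percentage point with some made-up but representative numbers (my own back-of-the-envelope sketch, not benchmark data):

# Made-up numbers, just to show why a fixed per-frame DLSS cost in ms
# only turns into a small percentage hit on the final framerate.
def fps_with_upscaler(base_fps, upscaler_ms):
    frame_ms = 1000.0 / base_fps + upscaler_ms
    return 1000.0 / frame_ms

base_fps = 100.0  # hypothetical framerate before the upscaler pass
for cost_ms in (0.5, 1.0, 1.5):  # hypothetical per-frame model cost
    print(cost_ms, round(fps_with_upscaler(base_fps, cost_ms), 1))
# 0.5 ms -> ~95 fps, 1.0 ms -> ~91 fps, 1.5 ms -> ~87 fps: tripling the model
# cost only moves the overall hit from roughly 5% to roughly 13%.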

Anyway, I'm not here to start a holy war against Jensen Huang (after all, I myself own a workstation rig with 64GB VRAM). It's just that I find it all weird, as if we're going in a wrong direction where the customer buys a GPU and instead of 'longevity' he ends up with 'uncertainty', if you get what I mean. What's next, will they roll out DLSS 5 before the next RTX series comes? And what should we expect, another performance drop for the sake of some trinkets in games looking more sharp?

The same fucking discussion happened during the DLSS 4.0 release really, I think you even mentioned it.

Now go ask Turing card owners if they're happy to have received DLSS 4 for free years later, letting them run games at Performance/Ultra Performance with never-before-seen image quality at those resolutions and play more modern games for longer instead of upgrading. They're fucking happy campers. For all the "AMD fine wine" bullshit back then suggesting you buy a 5700 XT over Turing, that ended up aging like milk.
 
RTX HDR is not accurate in any way. It's better than Microsoft's Auto HDR, but it's still an SDR-to-HDR post-processing filter, which is "fake" by definition.


The only way to add HDR to unsupported games accurately is to use RenoDX or SpecialK, as these tools have the ability to operate natively within the game engine and are custom-configured on a per-game basis for accurate native HDR output.

I was talking about the visual outcome (how it looks more natural and uncrushes details), and you're responding with a technical dismissal.. I just think it's funny that you're blaming RTX HDR for being "fake" in a thread about DLSS, because in the end they are pretty much fundamentally doing the same thing (trained models).

I think we're on the same page technically, but maybe looking at it from different angles. You're right that RenoDX is "truer" because it's a script injection that hooks directly into the engine shaders.

But my point was more about the end result. Even if RTX HDR is technically an "inverse tone mapper" working from an SDR base, the AI is good at reconstructing the highlights and clearing up the black detail that usually gets lost. For something that works in almost any game without needing a specific mod, it gets you a lot closer to a natural, uncrushed look than you'd expect. And yes, RenoDx is definitely a gold standard for accuracy if a mod exists for the game, that's why I use it as a number one priority myself.
 
I was talking about the visual outcome (how it looks more natural and uncrushes details), and you're responding with a technical dismissal.. I just think it's funny that you're blaming RTX HDR for being "fake" in a thread about DLSS, because in the end they are pretty much fundamentally doing the same thing (trained models).

I think we're on the same page technically, but maybe looking at it from different angles. You're right that RenoDX is "truer" because it's a script injection that hooks directly into the engine shaders.

But my point was more about the end result. Even if RTX HDR is technically an "inverse tone mapper" working from an SDR base, the AI is good at reconstructing the highlights and clearing up the black detail that usually gets lost. For something that works in almost any game without needing a specific mod, it gets you a lot closer to a natural, uncrushed look than you'd expect. And yes, RenoDx is definitely a gold standard for accuracy if a mod exists for the game, that's why I use it as a number one priority myself.

I mean, not even talking about the technical details, RTX HDR just fails far too often for me to consider it usable. It dramatically overbrightens UI elements in pretty much every instance. And while it expands the nit output of specular highlights, it doesn't actually retain any highlight detail the way real HDR does. It just makes it brighter/makes it pop more.

This is a good example of how RTX HDR just blows out highlights whereas a proper HDR implementation from RenoDX actually preserves highlight detail:

Nj1rYaoHtZ4LVQ8y.png
 
I mean, not even talking about the technical details, RTX HDR just fails far too often for me to consider it usable. It dramatically overbrightens UI elements in pretty much every instance. And while it expands the nit output of specular highlights, it doesn't actually retain any highlight detail the way real HDR does. It just makes it brighter/makes it pop more.

This is a good example of how RTX HDR just blows out highlights whereas a proper HDR implementation from RenoDX actually preserves highlight detail. Look at the tree next to The Knight in each shot:

V2ghUlpAFpjrd3pN.png
I don't understand why you keep quoting me as if this is an RTX HDR vs RenoDX discussion. Not interested.
 
"Untrue claims"

Holy shit... Compulsive much

Ryan Reynolds Reaction GIF


Still not interested. Please go away.
 
Is Nvidia RTX HDR better than using the Windows HDR calibration tool?
As someone else already mentioned, RTX HDR is for simulating HDR in games that don't support it natively. For example, Metaphor: ReFantazio, Black Myth: Wukong, and Clair Obscur: Expedition 33 look much better using RTX HDR over the original SDR image. It's much better than Windows 11's built-in AutoHDR since it actually uses the correct gamma curve. Of course, the introduction of tools like RenoDX allows for a much more accurate approximation of what an official native HDR implementation would achieve, but it does require some manual tweaking or game-specific profiles to be made available, whereas RTX HDR is a fully automatic process. So, I tend to use RTX HDR as a stop-gap solution until something better comes along with RenoDX. However, I wouldn't choose either over native HDR support if the game ships with it.

The Windows HDR calibration tool is just for configuring HDR to match your display specs (peak brightness - MaxCLL, black levels - MinCLL, etc.).
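On the 'correct gamma curve' point: the usual complaint is that AutoHDR assumes the piecewise sRGB curve while most games are mastered around a plain 2.2 gamma, which raises the shadows. A tiny sketch of my own (not Nvidia's or Microsoft's code) showing the difference:

# My own illustration: decoding the same SDR values with a plain 2.2 gamma
# curve vs the piecewise sRGB curve. Near-black values come out brighter
# under piecewise sRGB - the classic "raised blacks / washed out" look.
def gamma22_to_linear(c):
    return c ** 2.2

def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

for c in (0.05, 0.10, 0.25, 0.50):
    print(c, round(gamma22_to_linear(c), 4), round(srgb_to_linear(c), 4))
# 0.05 -> 0.0014 vs 0.0039, 0.10 -> 0.0063 vs 0.0100: the piecewise decode
# leaves dark tones noticeably brighter (nearly 3x at 0.05) than a 2.2 decode.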

Why the fuck would anyone use fake HDR when even the games that actually support HDR usually do a shit job with it on PC?? (Like literally there is a famous YT channel that only does HDR settings for games and 9 times out of 10 the HDR sucks balls)

I can't even imagine how bad fake HDR is in terms of accuracy...
One problem that is especially annoying is some games shipping with a black-level raise in their HDR implementation, e.g. Cyberpunk 2077 and Hogwarts Legacy. Meaning they don't allow the HDR to achieve true blacks, which is especially noticeable on an OLED since the pixels should be completely turned off. For these, I would first resort to finding a ReShade profile to address this, but if the problem isn't easily resolved then I might resort to using "simulated HDR" over native HDR. This underappreciated (and frankly, entertaining) channel offers some insight into how you can use ReShade to fix "broken" HDR implementations:
 
Lol ?

When people went from bilinear/trilinear filtering and upgraded to 16x anisotropic filtering, what do you think happened?

I've owned GPUs since 1996. I've followed every tech change. You never get a feature with better image quality than a previous solution without a performance hit, unless it's a revolutionary method, and most of the time it's cutting corners on performance and your eyes are fooled. Just like half/quarter-resolution effects in UE being hidden by temporal averaging.



Holy shit lol
  • MSAA ring any bells? SSAA?
  • Programmable vertex/pixel shaders in DirectX 8 & 9 for more sophisticated materials and lighting models, resulting in more demanding games with performance bottlenecks? Moving away from the fixed pipeline came at a cost.
  • Tessellation? Helllooo?? Anybody home? The whole fucking TressFX and Hairworks thing kneecapping performance for better visual quality? Call of Pripyat having huge performance issues? The Crysis 2 controversy?
  • Volumetric effects that can kneecap the best hardware for a 1% visual difference from medium to ultra high?
  • Ray tracing? Surely that's recent enough to remember?
  • DLAA itself: you run at native resolution to get the best image quality... at the cost of performance. Mind-blowing stuff.



Ah yes, let's doubt Nvidia, who is basically the frontrunner of this tech and has changed the upscaling game since Turing.

So Nvidia is kneecapping DLSS 4; it should have better image quality with more parameters but somehow also be more performant.

Hell, let's go to r/AMD and r/Intel and simply tell them to beat the fuck out of Nvidia with their upscalers, seems easy enough.



Because both series use FP8?

Do you understand how a pipeline works? Clearly not. There's an inherent bottleneck: an image has to come in before the AI can fix it up, and no matter how many TOPS you have, DLSS is only a fraction of the frametime. A GPU is never using all its TOPS for DLSS. The 5090 inherently generates frames faster than other cards, and of course its ML blocks can tackle more workload, but the two go hand in hand, never using the full TOPS available.





psa-do-not-force-latest-preset-with-dlss-4-5-aka-how-to-v0-snwm2453dqbg1.jpeg


Even for a 4070 Ti it's a ~40% latency increase to run the M model vs the K model, and that's narrowing it down to just the DLSS pass, not your whole frametime.

The 5090 takes a 20% penalty, but even the M model on a 5090 is 1.88x faster than the 4070 Ti's K model.

Nothing of what you've said so far makes any sense. Of course a ms difference in the overall frametime is gonna shrink down to a low % difference in performance, but the data's there: it's much faster on a 5090. That leaves room to breathe for a lot more AI effects, be it neural radiance cache path tracing, frame gen, and the upcoming neural texture compression / neural shaders, etc.



The same fucking discussion happened during the DLSS 4.0 release really, I think you even mentioned it.

Now go ask Turing card owners if they're happy to have received DLSS 4 for free years later, letting them run games at Performance/Ultra Performance with never-before-seen image quality at those resolutions and play more modern games for longer instead of upgrading. They're fucking happy campers. For all the "AMD fine wine" bullshit back then suggesting you buy a 5700 XT over Turing, that ended up aging like milk.

When MSAA or tessellation launched, the developers didn't come back a year later to say 'hey, here's MSAA 2.0 - it looks better but your old GPU now runs 20% slower'. Most importantly, MSAA, anisotropic filtering and such weren't secretive AI black-box things. DLSS, on the other hand, isn't nearly as open. It gets updated with no clear indication of whether the IQ improvements are caused by the same thing that affects performance negatively. Let's assume for a brief moment that they're lying to us, putting both 'good' and 'bad' into those updates deliberately - wouldn't that be planned obsolescence? I hope I'm wrong, I really hope I'm just a clown wearing a tinfoil hat, but there's a possibility that soon we'll all be on a treadmill where 'better DLSS quality' is just code for 'give them the carrot, but don't forget to hit them with the stick'.

I guess I just can't trust proprietary AI stuff. Good for you if you do...
 