DLSS 4 just made 4K and path tracing attainable for the masses

If you don't notice all the artifacting from frame generation at a base frame rate of ~30 fps, then you absolutely did not notice the difference in quality between DLSS 3 and DLSS 4.
 
*With frame generation.
It's wild how many people underestimate frame generation technology.
They're gonna have a bad time with future games trying to play without it.
 
Sorry, but 62 fps with frame gen means the real frame rate is close to 30 fps; the latency is going to tell you it's not really 62 fps...
I do NOT like frame gen and I never use it. I don't care about that shit. I want true frames, not fake ones. I think it could be useful in certain scenarios, but for me and many others here? Maybe not!
 
Ultimately frame gen is only useful if you have a 240 Hz monitor and the base frame rate is already high, e.g. 80 fps.

Framegen automatically causes a 10-15% reduction in baseline FPS.

e.g. if you're running a game at 60 FPS and enable 2x frame gen expecting 120 FPS, your real frame rate drops to around 50 FPS, so the output lands closer to 100.

The formula: apply the ~10% reduction to your base frame rate, then multiply by 2x/3x/4x depending on your MFG multiplier.
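That arithmetic can be sketched as follows (the ~10% overhead figure is the poster's ballpark, not an official number, and `framegen_output_fps` is just an illustrative name):

```python
def framegen_output_fps(base_fps: float, multiplier: int, overhead: float = 0.10) -> float:
    """Estimate displayed FPS after enabling (multi) frame generation."""
    real_fps = base_fps * (1 - overhead)  # enabling frame gen costs some baseline performance
    return real_fps * multiplier          # each real frame becomes `multiplier` displayed frames

# 60 FPS base with 2x frame gen: ~54 real FPS, ~108 displayed, not 120.
print(framegen_output_fps(60, 2))
```

Note this only estimates the displayed rate; input latency still tracks the real frame rate, which is the whole complaint above.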

I have a 4K120 LG OLED TV as my 'PC monitor'. I never use framegen. I would rather reduce my settings or use DLSS performance instead to maintain higher framerates.
 
Relying on frame gen to deliver your frame rate is a backwards concept. For frame gen to be effective, the base frame rate already has to be high.

The best use of framegen is actually when you don't need framegen!
 
I don't think it's the same thing.
Lossless Scaling does it differently: it needs the frame rate to already be high to even function, and it doesn't actually create anything new, it's closer to copy-paste. AI frame gen doesn't have that requirement.

Lossless doesn't work at low frame rates; DLSS frame gen does. That's the difference. You 100% will not see Lossless doing what frame gen can do at 16K resolutions.
 
100 fps framegen >>>>> 50 fps normal
Well yes, anything sub 60FPS is garbage.

100 FPS via frame gen will feel like 40-50 FPS.

But keep in mind: if frame gen is only getting you to 100 FPS, your base frame rate is too low. You'd be better off switching to DLSS Balanced/Performance if your base frame rate is 50 FPS.
 
If I had no other choice, I'd use RTSS to lock to a flat 50 fps and play the game that way, rather than feel the floaty detachment of sub-60 fps frame gen.
 
The good news is you can always tailor your framerate and optimise your game settings.

Digital Foundry often publishes 'optimised' settings for PC games that look 95% as good as the game running at max settings (minus RT), but with a huge performance boost.

Thanks to the latest DLSS transformer model, I run all my games at 4K DLSS Performance. I'm now able to achieve 90-120FPS in every game I play at 4K.
 
A ps5 pro just sold for $700. I think a $500 card is a mass market price.
The PS5 Pro is not a mass-market product, it's very much a niche one; the base PS5 is the mass product. Also, people who buy a $500 GPU won't pair it with budget CPUs and motherboards. We're talking about $1k systems at the very least.
 
Complaining about interpolated frames and reconstruction methods is like complaining about antialiasing or data compression. Computing and video games are all about smoke and mirrors; it's how it works, and it's how it's been done forever.
There is indeed a concern about greedy companies using these technologies to save on optimization work. Vote with your wallet, that's it.
 
That comparison doesn't hold, since interpolation brings objectively worse image quality and added input lag.
 
How can it be that DLSS is now better than native or TAA?
A raw native image without AA will still look pretty aliased, so that's not really a good measuring stick.
If we compare against a native ground truth (say, 4x OGSSAA), nothing temporal will beat it, not even DLAA (which also runs at native resolution). But obviously the problem there is cost.

TAA is essentially the same thing DLSS is, just resolved analytically instead of with ML, so the temporal artifacts they both introduce (ghosting, disocclusion, etc.) tend to be perceptually more pronounced with TAA.
While there are more 'real' samples when TAA runs at native resolution, the noise introduced by the temporal samples tends to counteract that, and better history handling lets ML upscalers accumulate more samples (that's most noticeable in stationary images, but that's where people compare detail anyway).
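For intuition, the temporal accumulation both TAA and DLSS build on boils down to blending each new jittered frame into a running history buffer. A minimal sketch (real implementations add motion-vector reprojection, history clamping, and per-pixel weights, all omitted here):

```python
import numpy as np

def temporal_accumulate(history: np.ndarray, current: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """Exponential moving average over frames: the core of temporal AA.

    Lower alpha keeps more history, giving smoother edges but more
    ghosting when the reprojection/clamp step (omitted here) fails.
    """
    return (1.0 - alpha) * history + alpha * current
```

For a static image, averaging many jittered samples this way converges toward the supersampled result, which is why temporal methods can approach 'ground truth' detail on stationary shots.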
 
How can it be that DLSS is now better than native or TAA? I don't get it. Can someone explain it to me? (Not talking about the FPS, I'm talking about image quality.)
So the overall image quality is usually on par with or a bit worse than native, plus there can be artifacts like object ghosting or other rendering oddities.

When people say it's "better than native" they're usually referring to AA: DLSS's AA can outperform other AA methods.
The magic is in machine learning: the network has been trained on masses of high-resolution reference images, so it's good at guessing the correct pixels, and it's been tuned to improve image quality as best it can.
The as-good/slightly-worse base IQ plus better AA is why people prefer it to native.

Native 4K with no AA is still very aliased; 4K needs good AA to shine.
TAA is super smooth but a bit blurry.
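To give a concrete sense of how much reconstruction the upscaler is doing: DLSS's preset per-axis scale factors are roughly 0.67x (Quality), 0.58x (Balanced), and 0.5x (Performance). A quick sketch of the resolution arithmetic (`internal_res` is just an illustrative name):

```python
def internal_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a given DLSS per-axis scale factor."""
    return int(out_w * scale), int(out_h * scale)

# DLSS Performance at 4K renders internally at 1920x1080,
# i.e. a quarter of the pixels of native 4K.
w, h = internal_res(3840, 2160, 0.5)
```

So "4K DLSS Performance" is really a 1080p render reconstructed to 4K, which is where the big frame-rate wins in this thread come from.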
 
I use frame generation in most of my titles and see no harm in it. If you're extremely picky and notice every detail, then I could see how it affects you. I'm a big fan and think it's one of the best things Nvidia has done in years. With a $500 card I can play games at 4K/60 and sometimes 4K/120, and it looks totally beautiful to me on my 65-inch QD-OLED. You can't tell me shit because I have two eyes.
 