Hardware Unboxed - DLSS4 vs DLSS3

Confirms what we already discerned with our own eyes.

I'm interested in more quirky comparisons, though. Whatever we can do to increase frames while maintaining good quality. For example: DLSS4 1440p Balanced compared to DLSS3 4K Performance?
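For context on that comparison, the interesting part is the internal render resolution each mode actually works from. A rough sketch, assuming the commonly cited per-axis scale factors for each DLSS quality mode (Quality ~0.667, Balanced ~0.58, Performance 0.5 — approximations, not official per-game values):

```python
# Approximate per-axis render-scale factors for DLSS quality modes.
# These are the commonly cited values; games can override them.
MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

def internal_res(width, height, mode):
    """Estimate the internal render resolution for a given output and mode."""
    s = MODES[mode]
    return round(width * s), round(height * s)

# 1440p Balanced renders internally at roughly 1485x835:
print(internal_res(2560, 1440, "Balanced"))
# 4K Performance renders internally at 1920x1080:
print(internal_res(3840, 2160, "Performance"))
```

So the two modes in the question start from fairly different internal resolutions, which is part of what makes the quality comparison non-obvious.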

Couch gaming stands to benefit much more from DLSS upgrades, especially when you consider configurations that aren't necessarily the most obvious.
 
I'm just curious if the Switch 2 would use some form of DLSS, even if it's last gen DLSS. Because that would be a good thing for some of their games that struggle to hold FPS like Zelda.
 
Thanks to DLSS4 Performance I am now able to play all my 4K games at 100-120FPS without any loss of visual quality.

Gaming has never felt or looked better.

I honestly cannot go back to 60FPS. It's so jarring switching from 120 to 60.

It felt similar to how I first discovered 60FPS gaming after playing 30FPS games for years on the PS3.
 
It might be the best thing about PC games. Getting potentially hundreds of frames really helps with input lag and reduces motion blur quite a bit. I can't stand motion blur. I'd kill myself if I had to go back to playing 30 fps games on an LCD TV at this point.
 
I'd rather quit gaming than play a game at 30FPS.

After playing 120FPS, 60FPS feels like '45'. Just doesn't look smooth at all.
 
I'm just curious if the Switch 2 would use some form of DLSS, even if it's last gen DLSS. Because that would be a good thing for some of their games that struggle to hold FPS like Zelda.
Switch 2 uses Ampere (RTX 30 series), so it will support DLSS 4 without frame generation, just like the RTX 30 series from two generations ago on PC.
 
That name rings a bell. You wouldn't be the same Unknown Soldier from Eye eye, ya know, bee?
Nope

I have no idea what "Eye eye, ya know, bee" is, but I'm pretty certain that whoever you're talking about, it's not me.

I've had this user name on GAF for 17 years.

Actually, wow, fuck, I'm old. What the FUCK, 17 years??
 
God damn, it's honestly insane how good the image holds up even in Performance mode. People like to bitch about modern Nvidia GPUs, but these GPUs will remain viable for a very, very long time because of these DLSS technologies.

Imagine the people who picked a Turing card never knowing these techs were coming.

Anyone who picked Turing over a 5700XT back then, even though it didn't seem to make sense at the time, won long term.
 
AMD will wait for Sony to develop a Transformer model for them, like they waited for Sony to develop PSSR for them.

So AMD owners will get a Transformer model only when Sony gets around to it for the PS6, I guess.

And they'll need to buy a new video card to get it, unlike Nvidia, which has back-ported the Transformer model to all RTX-capable cards.
PSSR is not RDNA 4's FP8 AI upscaler.
 
God damn, it's honestly insane how good the image holds up even in Performance mode. People like to bitch about modern Nvidia GPUs, but these GPUs will remain viable for a very, very long time because of these DLSS technologies.
DLSS doesn't solve RTX 3070 Ti's 8 GB VRAM issues.

I have an RTX 4090 24 GB, a 4080 16 GB, a 5070 Ti 16 GB, and a 3070 Ti 8 GB.
 
Imagine the people who picked a Turing card never knowing these techs were coming.

Anyone who picked Turing over a 5700XT back then, even though it didn't seem to make sense at the time, won long term.
The 2080 Ti ended up aging remarkably well. In retrospect, it was ahead of its time.
 
Even that is an understatement. I still have mine and there aren't many games that don't run decently even at 3440x1440. It's going to have a 10+ year lifespan at this rate which is probably a new record.
 
The 2080 Ti ended up aging remarkably well. In retrospect, it was ahead of its time.

Nvidia jumping on both RT acceleration and ML acceleration early gave them such a huge advantage over AMD it's kinda crazy.

AMD is still catching up to the RTX 30 series to this day, and still can't match it in some areas... they've only just overtaken the RTX 20 series, and that's wild.
 

I thought the 9070 XT was on par with the 4070 outside of RT and PT.
 