
DF reveals that NVIDIA needed two RTX 5090s for the DLSS 5 tech demo (it will run on a single card at official release)

LectureMaster

Or is it just one of Adam's balls in my throat?



There's much we don't know though. That starts with the computational cost of the ML algorithm. Nvidia actually used two RTX 5090s for its demos: one plays the game, the other exclusively runs the DLSS 5 technology. The use of two GPUs is required right now as DLSS 5 still has a long way to go in terms of optimisation - both in terms of performance and its VRAM footprint. However, DLSS 5 is designed for use on a single GPU and that's how it will ship later this year. Quite how scalable it is also remains to be seen, but in common with other DLSS technologies, Nvidia tells us that the computational cost scales with resolution.

Expect to see DLSS 5 as a further option within the graphics menus of PC games, alongside super resolution and frame generation, but the demos we saw were running with 2x frame-gen. In fact, DLSS 5 is integrated into frame-gen, which makes sense - after all, using this lighting technique, every frame is now generated. And yet, the quality is there, with few if any of the inconsistencies and mistakes typically seen in photo-realistic generative AI.
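For a rough sense of what "scales with resolution" means in practice, here's some napkin pixel-count math (nothing Nvidia has published, just arithmetic):

```python
# Relative pixel counts: if the cost of DLSS 5 scales with resolution,
# 4K is roughly 4x the work of 1080p. No real per-pixel cost figures exist yet.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} MPix, ~{w * h / base:.1f}x the work of 1080p")
# 1080p: 2.1 MPix (1.0x), 1440p: 3.7 MPix (~1.8x), 4K: 8.3 MPix (4.0x)
```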
 
We all know where the extra power went.

 
Did this really need its own thread? We've been talking about it since the announcement. Or are you trying to capitalize on it without reading that it's supposed to ship running on one card?
 
Nvidia always likes to stay one step ahead of AMD. They probably know that by next gen AMD will have caught up on ML upsampling and path tracing, so the next USP to distinguish themselves and stay ahead of the competition is DLSS 5, likely exclusive to the 60 series.
 
I do wonder how performant it could possibly be. Going from a dedicated 5090 to working on a 5070 while the 5070 does regular DLSS as well seems like a tall order.
 
So, expect a downgrade?
Expect an upgrade. PC hardware doesn't stand still. By the time DLSS 5 arrives, the 5090 will already be nearly two years old. This tech will get a performance boost on next-gen RTX 60-series GPUs...
 
Expect an upgrade. PC hardware doesn't stand still. By the time DLSS 5 arrives, the 5090 will already be nearly two years old. This tech will get a performance boost on next-gen RTX 60-series GPUs...
If it took two 5090s to run what we saw now, I expect a downgrade on one 5090.

You don't have to explain the other obvious stuff like I'm 5.
 
lol most expensive slop filter ever. bet you'll get 20fps on a single 5090 too.
but don't worry, we've got you covered with DLSS 5-exclusive 16x frame-gen
 
Some games they showed had no RT (like Starfield). So far fucking path tracing is cheaper than DLSS5 lol.

Why is it called DLSS5 when it has nothing to do with DLSS 1-4.5?
DLSS is the commercial name for Nvidia's AI tools for video games. They have all their techs integrated into the same SDK, unlike AMD, who uses separate SDKs.
 
DLSS is the commercial name for Nvidia's AI tools for video games. They have all their techs integrated into the same SDK, unlike AMD, who uses separate SDKs.

Deep Learning Super Sampling - it has nothing to do with neural rendering or whatever they want to call it. They should have a new name for this.
 
Deep Learning Super Sampling - it has nothing to do with neural rendering or whatever they want to call it. They should have a new name for this.

Yup. They're using DLSS as a brand name. It's obvious. People already accept and like DLSS.

I just hope this doesn't mean 4.5 is the end of their work on regular DLSS.
 
 
So basically $10,000 (at least here in Canada) worth of GPUs. Not sure what this will say about a solitary 5060... especially for those who ended up getting an 8GB model. Not that it matters for me, as I'm all AMD right now.
 
How did John handle this news? Hopefully better than me, because I really don't care about cutting-edge graphics chasing anymore.

So NVIDIA hasn't been completely upfront? Or have they? Either way, I'm not shocked. People will need to get loans for this stuff ;)
 
I'm expecting downgrades when running on one card, but not by a lot.

It's likely that they were running the "full" model for the preview, with weights/parameters taking up a ton of VRAM - hence the need for another 5090. A distilled model would cut down on the number of parameters significantly while still keeping a similar level of accuracy. Quantization would reduce the precision of weights (FP16 vs FP8 vs FP4) and computation to bring VRAM footprint even lower.
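To put very rough numbers on that (the parameter count below is a made-up placeholder - Nvidia hasn't said anything about the model's actual size):

```python
# Back-of-the-envelope VRAM needed just to hold model weights at different precisions.
# 7B parameters is a hypothetical figure for illustration, not an Nvidia number;
# activations, buffers and the game itself would all come on top of this.
def weights_vram_gib(num_params: float, bits_per_weight: int) -> float:
    return num_params * bits_per_weight / 8 / 1024**3

params = 7e9  # hypothetical
for label, bits in [("FP16", 16), ("FP8", 8), ("FP4", 4)]:
    print(f"{label}: ~{weights_vram_gib(params, bits):.1f} GiB for weights alone")
# FP16 ~13 GiB, FP8 ~6.5 GiB, FP4 ~3.3 GiB: each precision step roughly halves
# the weight footprint, and distillation shrinks num_params on top of that.
```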
 
DLSS5 will probably run fine on a variety of moderately modern cards. I wouldn't be surprised if any RTX card could run it with varying results, though they will probably lock certain functions behind 5000 series cards. I can't see them releasing a landmark product like DLSS5 and then totally locking it down to the ultra scarce 5000 series cards, with like, less than 1% of the PC gaming populace actually using a 5000 series card. Just doesn't make sense.

Remember when AMD said that FSR4 could not run on anything other than an RDNA4/9000 series card? And then... oopsie, the files got leaked and it turns out you can totally run it on something as weak as a Steam Deck, and there's legit merit to doing so. Is it heavy? Yes. Is the tremendous boost to IQ worth it? Absolutely.

DF even said it themselves. Once this shit hits optiscaler it's going to be on basically everything. To what degree of effectiveness? Hard to say. I wouldn't worry too much though. Furthermore, just look how much DLSS and FSR as technologies have matured over the years.

DLSS1 used to look like shit, and its actual effectiveness was questionable. Now the newest DLSS preset looks insanely good, arguably better than native, even at very low resolutions. It's flat out magic.

FSR used to be miles behind and look like shit. Now it's actually pretty good. FSR4 is genuinely worthwhile.

Same for frame generation, which used to be smeary and unusable, especially with text involved. Now it's totally fine. You might not like it, but it is usable.

People are ripping DLSS5 to fucking SHREDS and it's still in development, not even due out for at least another 6 months or so. It reminds me of all of those morons screeching that GTA6 looked like ass when it leaked years ago. Like.. no fucking shit, it's not done yet.

And to the naysayers squawking about it ruining artistic intent: if the developers choose to add DLSS5 to their game, obviously it lines up with their "artistic intent," or they just wouldn't allow it to be an option, hello? Even the slapdash demo of Starfield - endorsed by Todd Howard himself - looked amazing, graphical bugs notwithstanding. One of that game's biggest problems was how dull and flat all of the characters looked. If this technology makes games look better, and it's supported by the developers themselves, what's the problem?

DLSS and FSR are already using machine learning in the exact same way to literally reconstruct an image and generate more pixels that didn't exist before from much lower quality images, and then generate whole ass completely new frames with it. It's AI all the way down, and we've been using it for years! Why now is it the bogeyman?
 
Instagram filters run on €100 Chinese phones. This is but a glorified version of that. Of course it will run on one card. The two-5090s presentation move was so people pat them on the back when they show it running at 240fps on a 2060.
 
How did John handle this news? Hopefully better than me, because I really don't care about cutting-edge graphics chasing anymore.

So NVIDIA hasn't been completely upfront? Or have they? Either way, I'm not shocked. People will need to get loans for this stuff ;)
He cried like a bitch on Bluesky.
 