
Nvidia Live at GTC: DLSS 5

[Attached images: dlss-5.png, dlss-4.png]
Will DLSS 5 make women in western games beautiful again? B.R.U.T.A.L.
 
I don't think people quite understand how this could work to their benefit and save on rendering time. People want shorter development cycles and smaller budgets, yet they don't see the little ways this could help deliver them. It makes you wonder: on an enthusiast gaming forum, you'd think more people would understand the fundamentals of how this technology is evolving. And that isn't some cynical Nvidia torch-bearer stuff.

This could really benefit a lot of little things like that. And the cool part is you can probably turn it off or cycle between different settings. Again, I'm not sure where the outrage is coming from. Those of us who care about tech and all these little advances will gladly take part in this, and if we don't like it we'll disable it. Novel concept.
 
"JUST"

This thread clearly uncovers the non-RTX 50 owners, haha.
I mean, come on, this shit isn't going away; it can only get better. Devs can control how DLSS 5 works (according to DF), so in great games we may get potentially greater results.

Yes we will see less and less unique looking games, hurray!

Huzzah! All the games now look the same thanks to UE6 and DLSS SLOP
 
I don't like this at all. DLSS was supposed to be a scaling technology, not some specific "generative AI" shit. DLSS 5 is literally interfering with lighting and face models, totally changing how devs originally intended the game to look. This is shit, and what's worse is that more and more games will NEED this to run well unless you have an insanely powerful GPU.

I hope NVIDIA puts in a slider for how much of that "gen ai" shit I want in my video games. All I want is up-scaling without changing any of the artstyle.

Edit:
From the press release: developers get detailed controls such as intensity and color grading. Artists can use these controls to adjust blending, contrast, saturation, and gamma, and determine where and how enhancements are applied to maintain the game's unique aesthetic. Developers can also mask specific objects or areas to be excluded from enhancement.
 
Wow, at first glance I thought this was fake.

Although yes, it looks like an AI filter ("AI bad", blah blah blah), I think most of the comparisons look incredible, and to think this will be improved on....

Worried that they were using two 5090s to show it off, though.

Hopefully it's scalable and they don't pull some "only available on the 6000 series" move.
 
Like other DLSS features, if you don't like the result, simply don't activate it.

I believe there will be some kind of control over the intensity applied to the scene.
 

If you could use this without touching the character models it could be incredible.

This is their first attempt at this. I'm hyped for where it will lead. DLSS 1 stunk too and didn't really become great until DLSS 3. As they iterate on it, it will get much better.

I just know from messing around with LLMs on my own PC that generating video is very demanding and requires very good hardware. I'm very curious whether this is running a model locally on your hardware, or using their data centers. If it's the latter then they're gonna want a sub, and I have zero interest in that. If it's the former then you're gonna need 40GB of VRAM or something crazy like that. At least as things exist right now.
 
Real truth here: Sony and AMD will immediately start copying this, as they always do, so you're getting it whether you like it or not. But it will be quite a few years away.
 
Makes me wonder if some devs will rely on this to make their games look better and not even try otherwise.

PC game ports are still highly unoptimized, for Christ's sake.

This might create new problems in the industry that we can't even think of yet.
 
 
What is DLSS 5?

DLSS 5 is a real-time, 3D guided neural rendering model, unveiled at NVIDIA GTC, that infuses pixels with photoreal lighting and materials. It marks NVIDIA's most significant breakthrough in computer graphics since the debut of real-time ray tracing in 2018.

What are the key benefits of DLSS 5?

DLSS 5 delivers several significant benefits:

  • Cinematic Lighting: Reconstructs complex effects like rim lighting, subsurface scattering for realistic skin, and contact shadows with high fidelity.
  • Material Depth: Enhances PBR properties like roughness and adds micro-realism to complex objects such as eyes and hair.
  • Temporal Consistency: Provides stable image quality from frame-to-frame that adheres to the underlying game content.
  • Real-Time Performance: Delivers photorealistic enhancement at up to 4K resolution while maintaining smooth, interactive gameplay.
  • Controllability: Allows game developers to tune intensity, color, and masking to determine where and how enhancements are applied to maintain the game's unique aesthetic.
How does DLSS 5 work to achieve photorealism?

DLSS 5 is a neural rendering model that takes the game's color and motion vectors as input for each frame, then infuses the scene with photoreal lighting and materials that are anchored to the source 3D content and temporally consistent from frame-to-frame.
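The per-frame flow described above (game color and motion vectors in, a temporally consistent enhanced frame out) can be sketched roughly as follows. This is a minimal illustrative sketch in Python; every name here is hypothetical, and nothing in it reflects NVIDIA's actual implementation. The key idea is that the previous enhanced frame is reprojected along motion vectors and blended with the model's new output, which is what keeps results stable frame to frame.

```python
# Hypothetical sketch of a DLSS 5-style per-frame enhancement pass.
# Frames are plain 2D lists of floats; `model` stands in for the neural network.

def warp(prev_frame, motion_vectors):
    """Reproject the previous enhanced frame along per-pixel motion vectors."""
    h, w = len(prev_frame), len(prev_frame[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dy, dx = motion_vectors[y][x]
            sy = min(max(y - dy, 0), h - 1)  # clamp reprojected coordinates
            sx = min(max(x - dx, 0), w - 1)
            out[y][x] = prev_frame[sy][sx]
    return out

def enhance_frame(color, motion_vectors, prev_enhanced, model, history_weight=0.8):
    """One frame of enhancement anchored to the source content: the model's
    output is blended with the reprojected previous result so the image
    stays temporally consistent from frame to frame."""
    enhanced = model(color)
    if prev_enhanced is None:          # first frame: no history to blend with
        return enhanced
    history = warp(prev_enhanced, motion_vectors)
    h, w = len(color), len(color[0])
    return [[history_weight * history[y][x]
             + (1 - history_weight) * enhanced[y][x]
             for x in range(w)] for y in range(h)]
```

A higher `history_weight` trades responsiveness for stability, the same tension every temporal technique (TAA, DLSS Super Resolution) has to manage.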

When will DLSS 5 be available?

DLSS 5 releases in fall 2026. An early preview of the technology is being demonstrated this week at GTC in San Jose, CA.

Which games will support DLSS 5 at launch?

DLSS 5 will debut in games including AION 2, Assassin's Creed: Shadows, Black State, CINDER CITY, Delta Force, Hogwarts Legacy, Justice, NARAKA: BLADEPOINT, NTE: Neverness to Everness, Phantom Blade Zero, Resident Evil Requiem, Sea of Remnants, Starfield, The Elder Scrolls IV: Oblivion Remastered, Where Winds Meet, and more.

Does DLSS 5 work with DLSS Super Resolution, Ray Reconstruction, Frame Generation, and Multi Frame Generation?

Yes.

Which GPUs support DLSS 5?

Minimum GPU specifications are pending model optimizations and will be provided closer to release.

What hardware was the demo shown at the GTC booth running on?

The DLSS 5 early preview demo shown at GTC runs on two GeForce RTX 5090s: one dedicated to rendering the game, the other to running the DLSS 5 model. DLSS 5 will be optimized to run on a single GPU for release.
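The demo split described above, one GPU rendering while another runs the model, is essentially a two-stage pipeline connected by a queue so both stages work concurrently. A minimal sketch of that shape (function names and the queue-based design are my own illustration, not how the demo is actually wired):

```python
# Hypothetical two-stage pipeline: a "render" producer feeds an "enhance"
# consumer through a bounded queue, so both stages overlap in time.
import queue
import threading

def run_pipeline(num_frames, render, enhance):
    frames = queue.Queue(maxsize=2)   # shallow queue keeps latency bounded
    results = []

    def render_stage():               # stands in for the rendering GPU
        for i in range(num_frames):
            frames.put(render(i))
        frames.put(None)              # sentinel: rendering finished

    def enhance_stage():              # stands in for the model GPU
        while (frame := frames.get()) is not None:
            results.append(enhance(frame))

    t = threading.Thread(target=render_stage)
    t.start()
    enhance_stage()
    t.join()
    return results
```

The bounded queue is the important design choice: an unbounded one would let the renderer run arbitrarily far ahead of the enhancement stage, adding input latency.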

What is the memory and performance impact of DLSS 5?

DLSS 5 at GTC is an early preview and the model is still being optimized. We will share these details closer to release in fall 2026.

How do developers integrate DLSS 5?

Integration is easy and similar to DLSS Frame Generation – using the NVIDIA Streamline SDK or Unreal Engine 5 plugin.

Does DLSS 5 replace graphical features like Path Tracing?

No. Path tracing provides lighting accuracy (i.e. lighting, shadows, and reflections in the proper location) whereas DLSS 5 delivers lighting photorealism (i.e. as if you had a larger ray budget and higher quality materials). These technologies go hand in hand.

How does DLSS 5 ensure image quality is consistent with the artist's intent?

DLSS 5 honors artistic intent in two ways:

  • Inputting the game's color and motion vectors for each frame into the model, anchoring the output in the source 3D content.
  • Providing developers with detailed controls such as intensity and color grading. Artists can use these controls to adjust blending, contrast, saturation, and gamma, and determine where and how enhancements are applied to maintain the game's unique aesthetic. Developers can also mask specific objects or areas to be excluded from enhancement.
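Taken together, the intensity and masking controls above amount to a per-pixel blend between the original and enhanced images. A minimal sketch, assuming just an intensity slider and a boolean exclusion mask (all names are hypothetical; this is not NVIDIA's API):

```python
# Hypothetical sketch of artist-facing controls: an intensity slider plus a
# per-pixel mask that excludes objects (e.g. character faces) from enhancement.

def apply_enhancement(original, enhanced, intensity, exclusion_mask):
    """Blend enhanced pixels over the original. `intensity` is in [0, 1];
    masked pixels (True) keep the original value untouched."""
    h, w = len(original), len(original[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if exclusion_mask[y][x]:
                out[y][x] = original[y][x]   # excluded region: unchanged
            else:
                out[y][x] = ((1 - intensity) * original[y][x]
                             + intensity * enhanced[y][x])
    return out
```

Contrast, saturation, and gamma adjustments would be further per-pixel operations layered onto the same blend.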


 
Those of you cheering this on: imagine the PS6 as a continuation of the PS5 Pro, and the next version of PSSR copying this. With console games you usually don't have the same options that you do on PC. Now imagine if you can't turn this off. That's where we may be headed. (Just using PlayStation as an example; Switch 3 will likely have some version of this too.)
 
Apparently DLSS 5 only generates lighting with AI; geometry and all other assets are left untouched. I think it's pretty cool.

That said, they need to keep refining how much it affects characters' faces. That shot of RE9 in the streets was awful; she looked like Aubrey Plaza. It's still in development, so hopefully that will be sorted out in the end.
 
I genuinely think people don't have any idea what this could do for rendering, and that doesn't mean using generative AI to replace graphics. Developers can do whatever they want on their own before even going through the actual rendering of the image, within what the hardware can deliver.

If this could fundamentally lighten the rendering load, by using AI or whatever it's doing to help the image or the actual rendering, couldn't that free up compute for other important tasks, or even help developers turn around games quicker? People don't even know what this tech means yet, or at least the cynics don't.
 
Why is this even called DLSS? This is more akin to RTX Remix (for newer DX versions), and it's part of Streamline, meaning most games should easily integrate it, or even the user/Nvidia could.

It has that disgusting AI look on characters, but the lighting improvements seem nice. Very mixed bag; this better be highly customizable.
 
Real truth here: Sony and AMD will immediately start copying this, as they always do, so you're getting it whether you like it or not. But it will be quite a few years away.
Why aren't you including MS, the AI company, in this shit take?

Nobody wants this sloppy shit other than spoon-fed goldfish.
 