RTX Neural Texture Compression Tested, the next DLSS?

peish

Member


I don't know the tech, but it looks impressive as a way to add fake VRAM soon!

 
We'll have to wait until DXR 1.2 to get API support, and then wait for developers to implement it into their games.
So it might take several years until we see games using this.
 
Inevitable future

Imagine the bandwidth and VRAM relieved by doing inference on sample.

On top of having better quality than standard compression algorithms.

 
Guessing that Nvidia will sponsor a few titles within the next couple of years to use it.

It would be in PlayStation's favor to also adopt a very similar feature; we've already seen them develop tools to heavily optimise for things like memory bandwidth, most recently Universal Compression with the PS6 and RDNA 5.

With a heavy focus on ML for the PS6, Sony would be fools to ignore this feature, as the savings on RAM would be significant. Hopefully Nvidia pushes it, and so does PlayStation, so we can get more industry-wide adoption.
 
I don't know the tech, but it looks impressive as a way to add fake VRAM soon!
It's not 'fake'; however, to get the improvements shown in the demo, it's subject to specific conditions.
This isn't a '1:1' comparison on compressing individual textures.

That said - if VRAM savings are the primary concern - it's not dissimilar to what can already be done with SFS, or with more manual methods on older hardware; i.e. Rage implemented decompression on 'access' on PS360-class hardware as well.
Yes - these wouldn't 'quite' match the best case from the OP demo - but they would bring that 79MB down to the 20-40MB range.
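
A rough back-of-envelope of that claim, for illustration only (the 79MB figure is from the OP demo; the residency fractions below are my assumptions, not measurements):

```python
# Back-of-envelope sketch: how sampler-feedback-style streaming gets a
# conventional texture set into the 20-40 MB range without neural decompression.
# The 79 MB figure is the demo's conventional set; residency fractions are
# assumptions for illustration only.

FULL_SET_MB = 79.0  # conventional (BCn) size of the material set in the demo

def resident_mb(full_mb: float, resident_fraction: float) -> float:
    """Memory actually kept in VRAM when only sampled tiles/mips are streamed in."""
    return full_mb * resident_fraction

for fraction in (0.25, 0.35, 0.50):  # hypothetical share of tiles actually sampled
    print(f"{fraction:.0%} resident -> {resident_mb(FULL_SET_MB, fraction):.0f} MB")

# 25% resident -> 20 MB, 35% -> 28 MB, 50% -> 40 MB: roughly the 20-40 MB
# range mentioned above, with no neural decompression involved.
```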

Has any game been announced to be using this in the future? It could see widespread adoption, or it could be the next Sampler Feedback.
Implementation principles are very similar - but they are also complementary, i.e. you need some equivalent of SF to implement decompression on sampling on existing hardware.
 
It would be in PlayStation's favor to also adopt a very similar feature; we've already seen them develop tools to heavily optimise for things like memory bandwidth, most recently Universal Compression with the PS6 and RDNA 5.
Microsoft already has patents in this area, and so does Sony.


Sony's version basically works backwards from full-fat textures to distill an ML language of sorts that applies transforms to a base texture, reconstructing it in real time. Those transforms are essentially metadata and are hugely compressed versus a texture file. They refer to the metadata as an intermediate file.
 
I see nothing to complain about when it comes to cool new advancements which you can optionally disable. Bring on RTX tittie

Agreed. People complain about frame gen like it's mandatory. The only game I've ever used it in was path-traced Cyberpunk, and it was totally worth it.

Nvidia has been adding weird experimental tech to games for a long time now. The original release of Witcher 3 had that weird hair tech that completely tanked your frame rate for the benefit of… better hair.
 
Agreed. People complain about frame gen like it's mandatory. The only game I've ever used it in was path-traced Cyberpunk, and it was totally worth it.

Nvidia has been adding weird experimental tech to games for a long time now. The original release of Witcher 3 had that weird hair tech that completely tanked your frame rate for the benefit of… better hair.
And unlike other industry leaders like Intel, they are not exactly asleep at the wheel or getting complacent. Nvidia is still the premier company when it comes to PC graphics, and it's not just due to their performance superiority.
 
Agreed. People complain about frame gen like it's mandatory. The only game I've ever used it in was path-traced Cyberpunk, and it was totally worth it.

Nvidia has been adding weird experimental tech to games for a long time now. The original release of Witcher 3 had that weird hair tech that completely tanked your frame rate for the benefit of… better hair.

Thing is, this seems to be hardware-agnostic, as DirectX will be updated to add cooperative vectors. The demo right now uses a custom Nvidia API, but that's really just for testing purposes. The same goes for neural shaders, which will use cooperative vectors. So good news overall for broad support when it's implemented.
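
For a rough idea of what that per-sample work looks like, here's a minimal numpy sketch of a tiny per-texel decoder; the layer sizes and weights are made up, but the small matrix-vector products are the kind of thing cooperative vectors are meant to map onto matrix hardware inside a shader:

```python
# Minimal sketch (numpy, made-up sizes/weights) of the per-texel work that
# cooperative vectors are meant to accelerate: a tiny MLP mapping a latent
# feature vector fetched for a texel to an RGBA value.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical network: 16 latent features -> 32 hidden -> 4 output channels.
W1, b1 = rng.standard_normal((32, 16)) * 0.1, np.zeros(32)
W2, b2 = rng.standard_normal((4, 32)) * 0.1, np.zeros(4)

def decode_texel(latent: np.ndarray) -> np.ndarray:
    """Two small matrix-vector products per sampled texel."""
    hidden = np.maximum(W1 @ latent + b1, 0.0)        # ReLU
    return 1.0 / (1.0 + np.exp(-(W2 @ hidden + b2)))  # sigmoid -> [0, 1] RGBA

rgba = decode_texel(rng.standard_normal(16))
print(rgba)  # one decoded texel; a GPU does this per sample, in the shader
```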

And unlike other industry leaders like Intel, they are not exactly asleep at the wheel or getting complacent. Nvidia is still the premier company when it comes to PC graphics, and it's not just due to their performance superiority.

Nvidia is so involved with research papers in collaboration with universities that, unless they completely change their stance on this, they'll always be at the front row of new technologies. This has helped them immensely and is one of the key factors in where they are today.


They're really nothing like Intel.
 
Thing is, this seems to be hardware-agnostic, as DirectX will be updated to add cooperative vectors. The demo right now uses a custom Nvidia API, but that's really just for testing purposes. The same goes for neural shaders, which will use cooperative vectors. So good news overall for broad support when it's implemented.



Nvidia is so involved with research papers in collaboration with universities that, unless they completely change their stance on this, they'll always be at the front row of new technologies. This has helped them immensely and is one of the key factors in where they are today.


They're really nothing like Intel.
Indeed, and just so I'm clear, I meant in the way that Intel was the market leader and I supported them for years due to their superior performance. I've been on AMD since 2nd-gen Ryzen. Nvidia still has this equity with me.
 
The last few years have really just been about trying to get 720p 30fps to look like 4K 120fps.
Guess by that logic we should just say tessellation is just fake geometric detail, virtual geometry is just a lazy man's LOD, great smoke effects and leaves are just 2D sprites... etc.

Game design will always find better ways to do things. I don't see the sense in complaining about getting 720p to look like 4K if you can't tell the difference. Especially when in some cases, it even looks better than rendering natively.
 
I see nothing to complain about when it comes to cool new advancements which you can optionally disable. Bring on RTX tittie
This - if/when used - will definitely not be optional.
I mean 'ok' - for legacy GPU purposes some games 'might' ship lower-res texture packs on PC for a while, but it won't be a thing you'd actually 'choose' to use unless your GPU just can't run it.

Guess by that logic we should just say tessellation is just fake geometric detail, virtual geometry is just a lazy man's LOD, great smoke effects and leaves are just 2D sprites... etc.
Amusingly - I'm pretty sure virtualized geometry pre-dates discrete LODs in CG history 🤷‍♀️
But it's also fair to say that there's a sizeable difference between view-dependent optimizations (which is all of the above in your list) and synthesizing data that doesn't exist in the pipeline (i.e. temporal reconstructions of all types, but especially non-analytical ones).
But I'll concede that the difference will blur over time - we may live to see a day when neural-prediction fills in inputs for you for that extra bit of true-zero input latency, and at that point, does it really matter if rendering is also completely made-up on the fly and reality is no longer a benchmark for it?
 
Yes, now take a moment to think about why.

Hint: Moore's law is dead.
Everything is just badly optimized now, with all the wrong priorities as well.
Back in the PS4 era, we had a lot of games that looked as good as or better than games nowadays, and with much, much better performance.
When you see stuff like Battlefront 2 from 2017, which runs incredibly well on hardware from back then, and then you see stuff from today looking worse and being 10 times heavier to run... and then they rely on dumb new tech to just barely run at all, there's a problem.
I think graphics kind of peaked around 2018 (RDR2) and it just went downhill with worse and worse performance after that.
 
Everything is just badly optimized now, with all the wrong priorities as well.
Back in the PS4 era, we had a lot of games that looked as good as or better than games nowadays, and with much, much better performance.
When you see stuff like Battlefront 2 from 2017, which runs incredibly well on hardware from back then, and then you see stuff from today looking worse and being 10 times heavier to run... and then they rely on dumb new tech to just barely run at all, there's a problem.
I think graphics kind of peaked around 2018 (RDR2) and it just went downhill with worse and worse performance after that.

PS4 better performance? Most games were 30fps and some couldn't stick to that (Bloodborne, Cyberpunk 2077). There were no performance modes. If a game was running at 60fps, it was designed to run at 60fps (which, due to the weak-ass CPU cores in the PS4, meant quite a few cutbacks). The PS4 and XB1 variants were all hobbled by weak CPU cores and very slow storage speeds.
 
Agreed. People complain about frame gen like it's mandatory. The only game I've ever used it in was path-traced Cyberpunk, and it was totally worth it.

Nvidia has been adding weird experimental tech to games for a long time now. The original release of Witcher 3 had that weird hair tech that completely tanked your frame rate for the benefit of… better hair.
Now you can run The Witcher 3 with said tech (Nvidia HairWorks) and it runs great. That's the magic of PC: Crysis was tough to run in 2007, but later PCs showed how much that game could shine. 2011 PCs couldn't handle Skyrim with a lot of mods; now you can stack Skyrim with the most demanding mods and the game feels awesome. Let devs have options that scale to the top end and future hardware; it makes upgrading your PC so much more satisfying when games aren't held back.
 
And unlike other industry leaders like Intel, they are not exactly asleep at the wheel or getting complacent. Nvidia is still the premier company when it comes to PC graphics, and it's not just due to their performance superiority.
Nvidia is the premier company when it comes to graphics, period. No one comes close: not Apple, AMD, Qualcomm, Intel, Imagination, Arm, Microsoft, etc.
 
I've said multiple times that neural shaders and texture compression are probably the next thing, but I can see it taking some time before the implementation gets nailed down.

DLSS was just bad until we started to get into the 2nd version, but now it's a no-brainer to turn on because it looks better than any other TAA/TSR implementation, and gains performance. Frame-gen is the only part of the technology chain that has annoying drawbacks.
 
PS4 better performance? Most games were 30fps and some couldn't stick to that (Bloodborne, Cyberpunk 2077). There were no performance modes. If a game was running at 60fps, it was designed to run at 60fps (which, due to the weak-ass CPU cores in the PS4, meant quite a few cutbacks). The PS4 and XB1 variants were all hobbled by weak CPU cores and very slow storage speeds.
I'm sorry, I guess I wasn't clear enough; I just meant the PS4 era, not actual PS4 games. I was mostly talking about games from that era on PC, not limited by specific hardware, just games made during that time.
 
Depending on the resolution, the distance at which MipMap levels are loaded will be affected.
Even more so - a solution that streams or decompresses based on sample coverage directly impacts how much of that 200MB texture needs to fit into the pipeline.
I.e. 4K vs 1080p will impact how much load goes onto that decompressor in the OP demo.
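
For illustration, here's the standard LOD-from-footprint math in a quick sketch (the numbers are made up, not taken from the demo):

```python
# Sketch of standard mip selection math (illustrative numbers, not from the demo):
# the mip level is roughly log2 of how many texels land on one pixel, so halving
# the output resolution bumps the LOD by ~1 and samples a mip with 1/4 the texels.
import math

def mip_level(texture_size: int, pixels_covered: int) -> float:
    """Approximate LOD from the texel-per-pixel footprint."""
    footprint = texture_size / pixels_covered  # texels per screen pixel (1D)
    return max(0.0, math.log2(footprint))

TEXTURE = 4096  # hypothetical 4K texture, one axis
for label, screen_coverage in (("4K output", 2000), ("1080p output", 1000)):
    print(f"{label}: ~mip {mip_level(TEXTURE, screen_coverage):.1f}")

# 4K samples around mip 1, 1080p around mip 2 -> fewer and smaller mip texels
# have to go through the decompressor at lower output resolutions.
```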
 
Has any game been announced to be using this in the future? It could see widespread adoption, or it could be the next Sampler Feedback.
Judging by the Cerny x AMD presentation with Universal Compression, I'm guessing we will get some standardised, widespread use soon enough.
 
AMD has been working on something like this too, called Neural Texture Block Compression. RDNA 5 and the next consoles will support this.

In the next few years a standard will be adopted; whether it's the Nvidia one or the AMD one, we'll see.
 
People might complain that these techs will push developers toward laziness, but I see them (and so should any sane developer) as tools to reach stutter-free, bug-free games. These techs are not magic bullets; you still need to roll up your sleeves and work, work, work.

Hope we'll see more "tech magic" to reach a level where every system (low-end, mid-range, high-end, consoles) can benefit. People might get angry with me, but I really think native 4K is a waste of resources. It's a good reference point for comparisons, but a waste in actual gameplay.
 
That there would be distance differences in mipmaps for a demo with a helmet right in front of the camera?

That is not how mipmaps work.
Pretty much every game and tech demo in the last two decades uses MipMaps.
And you can be sure, just by looking at how stable the textures are, that MipMaps are being used.
 
Ok, well then, minimal impact on performance in that demo 🤷‍♂️

I'm trying to answer the dude who asked why the frame time doesn't budge. If you have a better answer, take it on.
 
Ok, well then, minimal impact on performance in that demo 🤷‍♂️

I'm trying to answer the dude who asked why the frame time doesn't budge. If you have a better answer, take it on.

You mean when he asked why resolution changes but memory stays the same?
Because it's reporting the texture data only, not the other buffers, mipmaps, etc.
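
A quick illustrative sketch of that point, with hypothetical numbers:

```python
# Quick sketch of the point above (numbers are hypothetical): the on-screen
# counter tracks only the compressed texture data (which doesn't change with
# resolution), while render targets scale with resolution and aren't counted.

TEXTURE_DATA_MB = 13.0  # hypothetical compressed material set; fixed size

def render_target_mb(width: int, height: int, bytes_per_pixel: int = 8) -> float:
    """Rough color+depth footprint for one set of frame buffers."""
    return width * height * bytes_per_pixel / (1024 ** 2)

for w, h in ((1920, 1080), (3840, 2160)):
    print(f"{w}x{h}: textures {TEXTURE_DATA_MB} MB (unchanged), "
          f"render targets ~{render_target_mb(w, h):.0f} MB (not in the counter)")
```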
 
Yeah, but I don't think this demo's setup would do this.
The texture compression format (the Nvidia one) natively works with mipmaps - in fact, to get the full benefit you 'have' to compress the whole mipmap pyramid (and a texture stack of 7-8). If you omit the mipmaps, compression efficiency would be lowered too.
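
For a sense of scale, a small sketch of the pyramid overhead (sizes here are illustrative, not the demo's):

```python
# Sketch of the mip-pyramid overhead mentioned above (sizes are illustrative):
# the full mip chain adds only about one third on top of the base level,
# which is why compressing the whole pyramid together is the sensible default.

def pyramid_texels(base: int) -> int:
    """Total texels in a square texture plus its full mip chain down to 1x1."""
    total, size = 0, base
    while size >= 1:
        total += size * size
        size //= 2
    return total

BASE = 2048                      # hypothetical base resolution
STACK = 8                        # texture stack size mentioned above
base_only = BASE * BASE
with_mips = pyramid_texels(BASE)
print(f"mip chain overhead: {with_mips / base_only - 1:.1%}")               # ~33%
print(f"stack of {STACK}: {STACK * with_mips / base_only:.1f}x base-level texels")
```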
 