"Didn't Sony pretty much do that with GOWR on PS5?"

Yes, but in the GOWR scenario it was a small ML model they made exclusively for their game; this is a framework that should work with no additional training.
Is any game announced to be using this in the future? It could have widespread adoption, or it could be the next Sampler Feedback.
Guessing that Nvidia will sponsor a few titles to use it within the next couple of years.
"I don't know the tech, but it looks impressive to add fake VRAM soon!"

It's not 'fake' - however, the improvements shown in the demo are subject to specific conditions.
"Is any game announced to be using this in the future? It could have widespread adoption, or it could be the next Sampler Feedback."

The implementation principles are very similar - but they are also complementary, i.e. you need some equivalent of Sampler Feedback to implement decompression-on-sampling on existing hardware.
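To make that concrete, here is a minimal CPU-side sketch of the idea, with made-up names, tile sizes, and sampling loop rather than anything from the actual demo: sampling records which (mip, tile) pairs were touched, and only those would need to be decompressed or kept resident.

```cpp
// Minimal CPU-side sketch of sampler-feedback-style tracking: record which
// (mip, tile) regions the shader's samples actually touch, then only
// stream/decompress those. Names and sizes are illustrative.
#include <algorithm>
#include <cstdio>
#include <set>
#include <utility>

struct FeedbackMap {
    int texSize;                            // square texture size in texels
    int tileSize;                           // square tile size in texels
    std::set<std::pair<int, int>> touched;  // (mip level, linear tile index)

    // Would be called for every texture sample the shader takes.
    void record(float u, float v, int mip) {
        int mipSize  = std::max(1, texSize >> mip);
        int tilesPer = std::max(1, mipSize / tileSize);
        int tx = std::min(tilesPer - 1, static_cast<int>(u * tilesPer));
        int ty = std::min(tilesPer - 1, static_cast<int>(v * tilesPer));
        touched.insert({mip, ty * tilesPer + tx});
    }
};

int main() {
    FeedbackMap fb{4096, 256, {}};

    // Pretend one frame only sampled a small UV region of mip 0 plus one spot of mip 2.
    for (float u = 0.10f; u < 0.20f; u += 0.01f)
        for (float v = 0.10f; v < 0.20f; v += 0.01f)
            fb.record(u, v, 0);
    fb.record(0.5f, 0.5f, 2);

    // Only these tiles would need to be resident/decompressed for the next frame.
    std::printf("tiles touched: %zu of %d mip-0 tiles\n",
                fb.touched.size(), (4096 / 256) * (4096 / 256));
    return 0;
}
```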
"It would be in PlayStation's favor to also adopt a very similar feature; we've already seen them develop tools to heavily optimise for things like memory bandwidth, most recently Universal Compression with the PS6 and RDNA 5."

Microsoft has patents in this area already, and so does Sony.
I don't know the tech, but it looks impressive to add fake VRAM soon!
Lower the resolution but the memory requirement stays the same??? Wha?
I see nothing to complain about when it comes to cool new advancements which you can optionally disable. Bring on RTX tittie
"Agreed. People complain about frame gen like it's mandatory. The only game I've ever used it in was path-traced Cyberpunk, and it was totally worth it."

And unlike other industry leaders like Intel, they are not exactly asleep at the wheel and getting complacent. Nvidia is still the premier company when it comes to PC graphics, and it's not just due to their performance superiority.
"Nvidia has been adding weird experimental tech to games for a long time now. The original release of The Witcher 3 had that weird hair tech that completely tanked your frame rate for the benefit of… better hair."

Agreed. People complain about frame gen like it's mandatory. The only game I've ever used it in was path-traced Cyberpunk, and it was totally worth it.
And unlike other industry leaders like Intel, they are not exactly asleep at the wheel and getting complacent. Nvidia is still the premier company when it comes to PC graphics, and it's not just due to their performance superiority.
"Thing is, this seems to be hardware agnostic, as DirectX will update to have cooperative vectors. The demo right now uses a custom Nvidia API, but really just for testing purposes. Same for neural shaders, which will use cooperative vectors. So good news overall for broad support when it's implemented."

Indeed, and so I'm clear, I meant it in the sense that Intel was the market leader and I supported them for years due to their superior performance. I've been on AMD since 2nd gen Ryzen. Nvidia still has this equity with me.
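For a rough idea of what "neural shaders via cooperative vectors" means in practice, here is an illustrative CPU-side toy, not the real DirectX API or NVIDIA's actual decoder: a tiny matrix-vector network evaluated per sample, which is the kind of small dense math cooperative vectors are meant to accelerate inside shaders.

```cpp
// Illustrative only: per-sample matrix-vector work of the sort cooperative
// vectors target, written as plain C++. Layer sizes and weights are made up.
#include <algorithm>
#include <array>
#include <cstdio>

template <int In, int Out>
std::array<float, Out> dense_relu(const std::array<float, In>& x,
                                  const std::array<float, In * Out>& w,
                                  const std::array<float, Out>& b) {
    std::array<float, Out> y{};
    for (int o = 0; o < Out; ++o) {
        float acc = b[o];
        for (int i = 0; i < In; ++i) acc += w[o * In + i] * x[i];
        y[o] = std::max(acc, 0.0f);  // ReLU
    }
    return y;
}

int main() {
    // A latent vector that would be fetched from the compressed texture at some UV.
    std::array<float, 8> latent{0.1f, -0.3f, 0.7f, 0.2f, 0.0f, 0.5f, -0.1f, 0.4f};

    // Two tiny layers, 8 -> 16 -> 3 (RGB). Real weights would come from training.
    std::array<float, 8 * 16> w1{}; w1.fill(0.05f);
    std::array<float, 16>     b1{};
    std::array<float, 16 * 3> w2{}; w2.fill(0.10f);
    std::array<float, 3>      b2{};

    auto hidden = dense_relu<8, 16>(latent, w1, b1);
    auto rgb    = dense_relu<16, 3>(hidden, w2, b2);
    std::printf("decoded texel: %.3f %.3f %.3f\n", rgb[0], rgb[1], rgb[2]);
    return 0;
}
```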
Nvidia is so involved in research papers and collaborations with universities that, unless they completely change their stance on this, they'll always be in the front row of new technologies. This has helped them immensely and is one of the key factors in where they are today.
Publications | Research (research.nvidia.com) - "Our publications provide insight into some of our leading-edge research."
They're really nothing like Intel.
"The last few years have really just been about trying to get 720p 30fps to look like 4K 120fps."

Guess by that logic we should just say tessellation is just fake geometric detail, virtual geometry is just a lazy man's LOD, great smoke effects and leaves are just 2D sprites... etc.
"I see nothing to complain about when it comes to cool new advancements which you can optionally disable. Bring on RTX tittie"

This - if/when used - will definitely not be optional.
"Guess by that logic we should just say tessellation is just fake geometric detail, virtual geometry is just a lazy man's LOD, great smoke effects and leaves are just 2D sprites... etc."

Amusingly, I'm pretty sure virtualized geometry pre-dates discrete LODs in CG history.
"The last few years have really just been about trying to get 720p 30fps to look like 4K 120fps."

Yes, now take a moment to think about why.
"And of course there are no games that use it, or even announced."

Same as DLSS in fall 2018 when it was announced.
"The last few years have really just been about trying to get 720p 30fps to look like 4K 120fps."

But is it working? That's the $100 question.
"Yes, now take a moment to think about why."

Everything is just badly optimized now, with all the wrong priorities.
Hint: Moore's law is dead.
Everything is just badly optimized now, with all the wrong priorities.
Back in the PS4 era, we had a lot of games that looked as good as or better than games nowadays, and with much, much better performance.
When you see stuff like Battlefront 2 from 2017, which runs incredibly well on hardware from back then, and then you see stuff from today looking worse and being ten times heavier to run... and then relying on dumb new tech to just barely run at all, there's a problem.
I think graphics kind of peaked around 2018 (RDR2) and it just went downhill with worse and worse performance after that.
"Agreed. People complain about frame gen like it's mandatory. The only game I've ever used it in was path-traced Cyberpunk, and it was totally worth it."

Now you can run The Witcher 3 with said tech (Nvidia HairWorks) and it runs great. That's the magic of PC: Crysis was tough to run in 2007, but later PCs showed how much that game could shine. 2011 PCs couldn't handle Skyrim with a lot of mods; now you can stack Skyrim with the most demanding mods and the game feels awesome. Let devs have options that scale to the top end and future hardware; it makes upgrading your PC so much more satisfying when games aren't held back.
"And unlike other industry leaders like Intel, they are not exactly asleep at the wheel and getting complacent. Nvidia is still the premier company when it comes to PC graphics, and it's not just due to their performance superiority."

Nvidia is the premier company when it comes to graphics, period. No one comes close: not Apple, AMD, Qualcomm, Intel, Imagination, Arm, Microsoft, etc.
"PS4 better performance? Most games were 30fps, and some couldn't stick to that (Bloodborne, Cyberpunk 2077). There were no performance modes. If a game was running at 60fps, it was designed to run at 60fps (which, due to the weak-ass CPU cores in the PS4, meant quite a few cutbacks). The PS4 and XB1 variants were all hobbled by weak CPU cores and very slow storage."

I'm sorry, I guess I wasn't clear enough: I just meant the PS4 era, not actual PS4 games. I was mostly talking about games from that era on PC, not limited by specific hardware, just games made during that time.
"Same as DLSS in fall 2018 when it was announced."

Not really comparable. A few Nvidia-sponsored games will use it, if at all. Otherwise, no one will adopt it voluntarily.
If you send a 200 MB texture down the pipeline, it doesn't matter if you're rendering at 4K or 1080p.
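Some back-of-the-envelope, purely illustrative numbers on why a texture's footprint tracks its own resolution and format rather than the render resolution:

```cpp
// Back-of-the-envelope texture memory: footprint is set by texture resolution
// and format, not by render resolution. Figures are illustrative, not the demo's.
#include <cstdio>

int main() {
    const double texels      = 4096.0 * 4096.0;  // a 4096 x 4096 texture
    const double bc7Bytes    = 1.0;              // ~1 byte per texel for BC7
    const double rgba8Bytes  = 4.0;              // 4 bytes per texel for RGBA8
    const double mipOverhead = 4.0 / 3.0;        // a full mip chain adds ~1/3
    const double mib         = 1024.0 * 1024.0;

    std::printf("BC7   + mips: %.1f MiB\n", texels * bc7Bytes   * mipOverhead / mib);
    std::printf("RGBA8 + mips: %.1f MiB\n", texels * rgba8Bytes * mipOverhead / mib);
    // Rendering at 1080p instead of 4K changes which mips get sampled most,
    // not how big the allocated texture is.
    return 0;
}
```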
"Depending on the resolution, the distance at which MipMap levels are loaded will be affected."

Even more so - a solution that streams or decompresses based on sample coverage directly impacts how much of that 200 MB texture needs to fit into the pipeline.
"Is any game announced to be using this in the future? It could have widespread adoption, or it could be the next Sampler Feedback."

Judging by the Cerny x AMD presentation on Universal Compression, I'm guessing we will get some standardised, widespread use soon enough.
Depending on the resolution, the distance at which MipMap levels are loaded will be affected.
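A simplified sketch of how render resolution feeds into mip selection (real hardware uses per-pixel UV derivatives; the numbers here are only an approximation):

```cpp
// Samplers pick a LOD of roughly log2(texels covered per pixel), so halving
// the render resolution pushes sampling about one mip level down.
#include <algorithm>
#include <cmath>
#include <cstdio>

// Approximate LOD when `uvExtent` of a `texSize`-texel-wide texture spans
// `pixelsOnScreen` pixels.
double mipLod(double texSize, double uvExtent, double pixelsOnScreen) {
    double texelsPerPixel = (uvExtent * texSize) / pixelsOnScreen;
    return std::max(0.0, std::log2(texelsPerPixel));
}

int main() {
    // The same close-up object (full UV range of a 4096 texture) spanning half
    // the frame width at two render resolutions.
    std::printf("4K render   : LOD ~ %.2f\n", mipLod(4096.0, 1.0, 3840.0 / 2.0));
    std::printf("1080p render: LOD ~ %.2f\n", mipLod(4096.0, 1.0, 1920.0 / 2.0));
    // That only saves memory if the streaming system actually evicts the finer mips.
    return 0;
}
```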
Yea but I don't think this demo's setup would do this.
Why wouldn't this demo use MipMaps?
That there would be distance-based differences in mipmap selection for a demo with a helmet right in front of the camera?
OK, well, then there's minimal impact on performance in that demo.
I'm trying to answer the dude who asked why the frame time doesn't budge. If you have a better answer, have at it.
"Yea but I don't think this demo's setup would do this."

The texture compression format (the Nvidia one) natively works with mipmaps - in fact, to get the full benefit you 'have' to compress the whole mipmap pyramid (and a texture stack of 7-8). If you omit the mipmaps, compression efficiency would be lowered too.
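For scale, here is generic mip-pyramid bookkeeping (placeholder sizes, not the NTC format's actual figures):

```cpp
// An N x N texture has log2(N) + 1 mip levels, and the full chain adds roughly
// a third to the base size, so a compressor aiming for maximum ratio has to
// handle all of it. Sizes are generic placeholders.
#include <cstdio>

int main() {
    const int    base          = 4096;  // 4096 x 4096 base level
    const double bytesPerTexel = 1.0;   // illustrative, e.g. a BC-class format
    const double mib           = 1024.0 * 1024.0;

    int    levels = 0;
    double total  = 0.0;
    for (int size = base; size >= 1; size /= 2, ++levels)
        total += static_cast<double>(size) * size * bytesPerTexel;

    std::printf("mip levels : %d\n", levels);  // 13 for a 4096 base
    std::printf("full chain : %.1f MiB (base level alone: %.1f MiB)\n",
                total / mib,
                static_cast<double>(base) * base * bytesPerTexel / mib);
    return 0;
}
```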