
Jen-Hsun: "We can't do computer graphics anymore without artificial intelligence", but players fear a hardware divide

LordOfChaos

Member

Upscaling tech like Nvidia's DLSS can enhance lower-resolution images and improve image quality while achieving higher frame rates. However, some gamers are concerned that this technology might become a requirement for good performance – a valid fear, even though only a few games currently list system requirements that include upscaling. As the industry continues to evolve, how developers address these concerns remains to be seen.

AI, in its current primitive form, is already benefiting a wide array of industries, from healthcare to energy to climate prediction, to name just a few. But when asked at the Goldman Sachs Communacopia + Technology Conference in San Francisco last week which AI use case excited him the most, Nvidia CEO Jensen Huang responded that it was computer graphics.

"We can't do computer graphics anymore without artificial intelligence," he said. "We compute one pixel, we infer the other 32. I mean, it's incredible... And so we hallucinate, if you will, the other 32, and it looks temporally stable, it looks photorealistic, and the image quality is incredible, the performance is incredible."

Jensen is doubling down on observations that Nvidia and other tech executives have made about AI-based upscaling in PC gaming, arguing that it is a natural evolution in graphics technology, similar to past innovations like anti-aliasing or tessellation.
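To put rough numbers on the "compute one pixel, infer the other 32" claim: the mode combination below is an assumption for illustration (Huang didn't break the figure down), but a 3x-per-axis upscale plus three generated frames per rendered frame lands in that ballpark.

    # Rough arithmetic behind "compute one pixel, infer the other 32".
    # The specific mode mix here is an assumption; Huang didn't specify
    # which upscaling ratio and frame-generation factor he meant.

    def inferred_per_computed(scale_per_axis: float, generated_frames: int) -> float:
        """Inferred pixels displayed for every pixel actually rendered."""
        spatial = scale_per_axis ** 2             # e.g. 3x per axis = 9x the pixels
        total = spatial * (1 + generated_frames)  # displayed pixels per rendered pixel
        return total - 1                          # subtract the one computed pixel

    # An Ultra Performance-style upscale (720p -> 2160p is 3x per axis)
    # plus three AI-generated frames for every rendered one:
    print(inferred_per_computed(3.0, 3))  # 35.0 -- i.e. "compute one, infer ~32-35"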

Lots of focus on upscaling in the article but what’s being ignored is the general topic of “neural rendering”.

Increasingly, ML models will be involved in the rendering process itself, not just in upscaling the picture the raster hardware has computed the traditional way.

Intel:

https://www.intel.com/content/www/u...eural-prefiltering-for-correlation-aware.html

AMD:

https://gpuopen.com/download/publications/2024_NeuralTextureBCCompression.pdf

https://gpuopen.com/download/publications/HPG2023_NeuralIntersectionFunction.pdf

Nvidia:

https://research.nvidia.com/labs/rtr/neural_appearance_models/

https://research.nvidia.com/labs/rtr/publication/diolatzis2023mesogan/

https://research.nvidia.com/labs/rtr/publication/xu2022lightweight/

https://research.nvidia.com/labs/rtr/publication/muller2021nrc/

With AMD unifying RDNA and CDNA into UDNA and a commitment to AI upscaling for FSR4, I think the path is clear for a situation where all GPU vendors and all consoles have some form of matrix acceleration hardware built in. At that point the door will be wide open for techniques like these to be leveraged.
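To make the idea concrete, here's a toy sketch in the spirit of the neural texture compression work linked above: instead of storing full texels, you keep a small latent grid and decode each sample with a tiny MLP at shading time. All the sizes and the random weights below are made up for illustration; the actual papers train the latents and decoder jointly and run inference inside the shader.

    import numpy as np

    # Toy "neural texture": a coarse latent grid plus a tiny decoder MLP,
    # queried per shading sample instead of fetching stored texels.
    # Sizes and weights are illustrative only, not from any of the papers above.

    rng = np.random.default_rng(0)

    LATENT_DIM = 8
    latent_grid = rng.standard_normal((32, 32, LATENT_DIM))  # 32x32 latent "texture"

    # Two-layer decoder: (latent, uv) -> RGB. Real systems train these weights
    # jointly with the latent grid; random weights here just make it runnable.
    W1 = rng.standard_normal((LATENT_DIM + 2, 16)) * 0.5
    b1 = np.zeros(16)
    W2 = rng.standard_normal((16, 3)) * 0.5
    b2 = np.zeros(3)

    def sample_latent(u: float, v: float) -> np.ndarray:
        """Bilinearly interpolate the latent grid at uv coordinates in [0, 1]."""
        h, w, _ = latent_grid.shape
        x, y = u * (w - 1), v * (h - 1)
        x0, y0 = int(x), int(y)
        x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
        fx, fy = x - x0, y - y0
        top = (1 - fx) * latent_grid[y0, x0] + fx * latent_grid[y0, x1]
        bot = (1 - fx) * latent_grid[y1, x0] + fx * latent_grid[y1, x1]
        return (1 - fy) * top + fy * bot

    def decode(u: float, v: float) -> np.ndarray:
        """One 'texture fetch': interpolate the latent, run the tiny MLP."""
        x = np.concatenate([sample_latent(u, v), [u, v]])
        hidden = np.maximum(x @ W1 + b1, 0.0)              # ReLU
        return 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))   # sigmoid -> RGB in [0, 1]

    print(decode(0.25, 0.75))  # RGB for one shading sample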
 

Tazzu

Member
Slightly unrelated, but I have a laptop with an RTX 4060 8GB. Is DLSS worth it for me to go from a 720p/1080p image to something higher when playing on a laptop screen? What about on a 27" 4K monitor?
 

LordOfChaos

Member
Slightly unrelated, but I have a laptop with an RTX 4060 8GB. Is DLSS worth it for me to go from a 720p/1080p image to something higher when playing on a laptop screen? What about on a 27" 4K monitor?
You didn't say the resolution of the laptop screen, but yes
 

RCX

Member
The graphics arms race is a road of diminishing returns. How much better do graphics need to be, and does absolute photorealism = a good game?

Only AAA can afford to do it, and the track record of major studios in making truly original and excellent games has gotten more spotty as time goes on.

Gameplay first. Graphics second. It's not that hard.
 

Tg89

Member
The graphics arms race is a road of diminishing returns. How much better do graphics need to be, and does absolute photorealism = a good game?

Only AAA can afford to do it, and the track record of major studios in making truly original and excellent games has gotten more spotty as time goes on.

Gameplay first. Graphics second. It's not that hard.

Yep. With how good even "bad graphics" are at this point, graphics is way further down than second even.

Good enough is good enough.
 

DeepEnigma

Gold Member
Daniel Radcliffe Bullshit GIF by Oregon Trail
 

Haint

Member
Slightly unrelated, but I have a laptop with an RTX 4060 8GB. Is DLSS worth it for me to go from a 720p/1080p image to something higher when playing on a laptop screen? What about on a 27" 4K monitor?

DLSS is always going to dramatically outperform a monitor or TV's built-in upscaling (especially a monitor's). DLSS or not, though, a laptop 4060 driving 4K is probably not going to be a great experience.
 

XXL

Member
With all the progress AI has made in the last few years, you would have to be completely delusional to think it's not going to be insanely advanced in the next 10 years.
 

Hrk69

Member
With all the progress AI has made in the last few years, you would have to be completely delusional to think it's not going to be insanely advanced in the next 10 years.
Yup. People are downplaying AI like it's some kind of crypto scam.

Games are going to benefit hugely from AI
 

Tazzu

Member
DLSS is always going to dramatically outperform a monitor or TV's built-in upscaling (especially a monitor's). DLSS or not, though, a laptop 4060 driving 4K is probably not going to be a great experience.
It's a 14.5" OLED and the resolution is 2560x1600. Fair enough. I was wondering if DLSS can make a 720/1080p image look less blurry, so I can keep playing games on the laptop at good settings even when it starts struggling with newer games. What about FG? Can you use it to get a 40-50 FPS game to 60, or a sub-30 game to 30 FPS?
 

ap_puff

Banned
It's a 14.5" OLED and the resolution is 2560x1600. Fair enough. I was wondering if DLSS can make a 720/1080p image look less blurry, so I can keep playing games on the laptop at good settings even when it starts struggling with newer games. What about FG? Can you use it to get a 40-50 FPS game to 60, or a sub-30 game to 30 FPS?
DLSS is going to be very good for you. Yes, it has artifacts when upscaling from lower resolutions, but it's not a big deal.
 

Haint

Member
It's a 14.5" OLED and the resolution is 2560x1600. Fair enough. I was wondering if DLSS can make a 720/1080p image look less blurry, so I can keep playing games on the laptop at good settings even when it starts struggling with newer games. What about FG? Can you use it to get a 40-50 FPS game to 60, or a sub-30 game to 30 FPS?

Yeah, DLSS should look pretty great on that size of screen, much better than raw 720p/1080p. Frame gen is not advised below 30fps, but may start to be worthwhile around 40fps. It's something you'll just have to try out for yourself on a game-by-game basis and see if you like it better with or without.
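That tracks with the arithmetic: frame gen roughly doubles the presented rate, but responsiveness still follows the base frame time. A minimal sketch (the 2x factor and the latency model are simplifications, not how any specific implementation measures it):

    # Simplified frame-generation arithmetic: presented rate roughly doubles,
    # but input latency still scales with the base (rendered) frame time.

    def with_frame_gen(base_fps: float) -> tuple[float, float]:
        presented_fps = base_fps * 2        # one generated frame per rendered frame
        base_frame_ms = 1000.0 / base_fps   # responsiveness is tied to this
        return presented_fps, base_frame_ms

    for base in (25, 40, 50):
        fps, ms = with_frame_gen(base)
        print(f"{base} fps base -> ~{fps:.0f} fps presented, ~{ms:.0f} ms real frame time")

    # 25 fps base "looks like" 50 fps but still feels like 40 ms frames;
    # around a 40-50 fps base the smoothness gain stops fighting the input lag.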
 

Topher

Identifies as young
I just want this but on steroids for all current and next gen games. This immediately opens up new ways of playing games.
I don't need ray tracing and fancy graphics. Give me a generational leap in physics and NPC AI!! 🤬

NxEr3D.gif

Honestly, I wouldn't mind some AAA games made with downgraded graphics by design. Look at Mass Effect Legendary Edition. That shit still rocks today.
 

Yoboman

Member
I just want this but on steroids for all current and next gen games. This immediately opens up new ways of playing games.
I don't need ray tracing and fancy graphics. Give me a generational leap in physics and NPC AI!! 🤬

NxEr3D.gif
We already had the physics trend on PS360. Devs everywhere pushing it in all different directions. Nearly none of them knew how to make a game more fun with it

You don't need advanced hardware to make physics that are fun - BOTW and TOTK are two of the best showcases of physics. Half Life 2 is 20 years old and still has some of the best use of physics

Gameplay design is still king.

Same for AI. It doesn't matter how smart the AI system is if it's not designed for fun. MGS and Hitman games always come to mind: there is no deep complexity in those AI systems, but they are designed to be super manipulable, which is what makes for fun AI
 

Puscifer

Member
The graphics arms race is a road of diminishing returns. How much better do graphics need to be, and does absolute photorealism = a good game?

Only AAA can afford to do it, and the track record of major studios in making truly original and excellent games has gotten more spotty as time goes on.

Gameplay first. Graphics second. It's not that hard.
Look at the 8K thread; people are actually saying that until we reach photorealism there's still more to climb.
 

Tazzu

Member
Yeah, DLSS should look pretty great on that size of screen, much better than raw 720p/1080p. Frame gen is not advised below 30fps, but may start to be worthwhile around 40fps. It's something you'll just have to try out for yourself on a game-by-game basis and see if you like it better with or without.
Sounds great! Bought this laptop almost accidentally; it was a great deal, but I do the vast majority of my gaming on consoles. Nice to know it's future-proof.
 

BlackTron

Member
Slightly unrelated, but I have a laptop with an RTX 4060 8GB. Is DLSS worth it for me to go from a 720p/1080p image to something higher when playing on a laptop screen? What about on a 27" 4K monitor?

If your render resolution exceeds the res of your monitor, you'll get supersampling, where the additional pixels are "squeezed" together to make a single pixel, resulting in a sharper image. Whether it's worth the extra processing cost will really come down to the game itself and the screen size/quality.

27" is small enough that 1080p won't look nearly as bad as on a 65" TV, but you'll still notice a tangible downgrade from 4K to 1080p. 1440p might be the sweet spot, but hey, you might be playing a game that can DLSS to 4K anyway, I don't know... try it. Since your laptop screen is smaller, your eyes may or may not notice the difference, especially not knowing whether the screen is any good (personally my laptop screen sucks, but I knew that when I got it lol).

Edit: For some reason I thought you were asking about increasing res beyond that of your screen, but I guess you weren't lol. My answer is still relevant though, so I'll leave it...
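For what it's worth, the "squeezed together" part is just an averaging filter. A minimal sketch of 2x2 box-filter supersampling (NumPy, purely illustrative):

    import numpy as np

    # Supersampling in a couple of reshapes: render at 2x the target resolution,
    # then average each 2x2 block of pixels down to one output pixel.

    def downsample_2x(img: np.ndarray) -> np.ndarray:
        """Box-filter an (H, W, 3) image down to (H/2, W/2, 3)."""
        h, w, c = img.shape
        return img.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

    hi_res = np.random.rand(2160, 3840, 3)  # stand-in for a frame rendered at 4K
    lo_res = downsample_2x(hi_res)          # averaged down for a 1080p display
    print(lo_res.shape)                     # (1080, 1920, 3)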
 

Bkdk

Member
AI is really good at generating pretty female faces; there shouldn't be an excuse for western devs to make ugly girl protags anymore, it's literally just copy and paste.
 

LordOfChaos

Member
I just want this but on steroids for all current and next gen games. This immediately opens up new ways of playing games.
I don't need ray tracing and fancy graphics. Give me a generational leap in physics and NPC AI!! 🤬

NxEr3D.gif

It's crazy that this was 2007 and we've largely lost it with as much power as we've gained



 

Deleted member 1159

Unconfirmed Member
I don't understand why anyone is worried or doubting this: just look at the tech in the PS5 Pro. It's new tech that's a smarter way of doing things vs brute forcing it.
 

ap_puff

Banned
I don't understand why anyone is worried or doubting this: just look at the tech in the PS5 Pro. It's new tech that's a smarter way of doing things vs brute forcing it.
The best games of the year in Astro Bot, the Elden Ring DLC, and FF7 Rebirth didn't "need" AI at all. It's just Jensen trying to upsell people like he always does. A game doesn't need machine learning for anything; Nintendo proves it every year. "The tools" aren't what makes a game good, it's a clear vision, development discipline, and a commitment to fun and excellence. Pathtracing doesn't make Cyberpunk a better game. If the story was dogshit, it'd be a nice looking game that was dogshit.
 

Deleted member 1159

Unconfirmed Member
The best games of the year in Astro Bot, the Elden Ring DLC, and FF7 Rebirth didn't "need" AI at all. It's just Jensen trying to upsell people like he always does. A game doesn't need machine learning for anything; Nintendo proves it every year. "The tools" aren't what makes a game good, it's a clear vision, development discipline, and a commitment to fun and excellence. Pathtracing doesn't make Cyberpunk a better game. If the story was dogshit, it'd be a nice looking game that was dogshit.
Yeah man, I still have fun playing games on hardware that's about 40 years old at this point, on a TV from 1999. That doesn't mean I don't want to experience new stuff at the highest possible quality and at acceptable frame rates with new tech too. FF7R could've benefited from upscaling and frame generation, because performance mode looked like ass and quality was stuck at 30FPS. The whole point here is that there's a better approach to graphical fidelity with these techniques than just brute forcing it, which will cost more and continue to have diminishing returns.
 

TrueLegend

Member
Slightly unrelated, but I have a laptop with an RTX 4060 8GB. Is DLSS worth it for me to go from a 720p/1080p image to something higher when playing on a laptop screen? What about on a 27" 4K monitor?
With some games you can, especially games from 2020 or earlier: set the output to 4K and with DLSS at Performance or Balanced you will get much better IQ. On a laptop you can even use DLDSR and DLSS in combination for better image quality, but it's not really worth it there; instead, output at your native resolution and use DLSS Balanced. If you have a 1080p screen, DLSS Balanced will also cut power use and keep the device a bit cooler.
 

BlackTron

Member
The best games of the year in Astro Bot, the Elden Ring DLC, and FF7 Rebirth didn't "need" AI at all. It's just Jensen trying to upsell people like he always does. A game doesn't need machine learning for anything; Nintendo proves it every year. "The tools" aren't what makes a game good, it's a clear vision, development discipline, and a commitment to fun and excellence. Pathtracing doesn't make Cyberpunk a better game. If the story was dogshit, it'd be a nice looking game that was dogshit.

Dude Nintendo is going to have upscaling AI tech next year, by the guy in the OP's company.

You kinda don't even know what you are saying here. Say Switch 2 renders a game at 1080, then DLSS can upscale it to 4k, leaving a lot of room for improvements in the actual game. This is what Nvidia cards do on PC right now.

He's right. If the situation is "you can flick a switch and get 4K from 1080p, or render at 4K and have no power left for anything else at all," what's the choice?
 
We went from caring about "Bits" to "pixels"

Most gamers weren't around for the transition, they're new, and they're reactionary.

The bigger problem is they haven't found any way to market other than levels of fidelity. Fidelity is just fidelity. There has been a hardware divide for years... not sure why some people would be concerned about it now.
 

Tazzu

Member
With some games you can, especially games from 2020 or earlier: set the output to 4K and with DLSS at Performance or Balanced you will get much better IQ. On a laptop you can even use DLDSR and DLSS in combination for better image quality, but it's not really worth it there; instead, output at your native resolution and use DLSS Balanced. If you have a 1080p screen, DLSS Balanced will also cut power use and keep the device a bit cooler.
So just to confirm, a 720p image with DLSS will look better than a raw 720p or even 1080p image? Like I said, still feeling the sting of buying this laptop, but if it means gaming on the go for the next few years then I can't complain!
 

TrueLegend

Member
So just to confirm, a 720p image with DLSS will look better than a raw 720p or even 1080p image? Like I said, still feeling the sting of buying this laptop, but if it means gaming on the go for the next few years then I can't complain!
No, DLSS has multiple modes. When you use DLSS Quality at 1080p output you are rendering at 720p internally, and it's still better than native because you get superior anti-aliasing compared to the native image. Many games also come with DLAA, which actually runs at the full 1080p but gives you that same superior anti-aliasing. So if a game has DLAA, native resolution + DLAA is the best option; if it doesn't, DLSS Quality will provide a better image than native while rendering at a lower internal resolution. Also, any game looks best when the panel isn't stretching a non-native image. So go into the Nvidia settings, set scaling to "No scaling", and play at 2560x1440 on your 2560x1600 panel; you'll get little black bars on the top and bottom instead of a blurry stretch. You should always use DLSS in my opinion: in Quality mode at that output the game will render at 960p, get superior AA, and run smoothly.

Basically for 4K

1. Native+DLAA Best (2160p)
2. DLSS Quality (1440p)
3. DLSS Balanced (1250p)
4. DLSS Performance (1080p)

Even DLSS Performance (i.e. 1080p internal) can produce a better image than native 2160p if the game's own AA implementation is poor or the DLSS implementation is especially good. That's the case with Remedy's Northlight engine: Control looks better in Performance mode with DLSS than at native without it, and you can spend all that reclaimed GPU power on RT.
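For reference, those internal resolutions fall straight out of DLSS's per-axis scale factors. A quick sketch (the factors below are the commonly cited defaults and are an assumption here, since games can override them):

    # Internal render resolution per DLSS mode, from per-axis scale factors.
    # These are the commonly cited defaults; games can and do override them.

    SCALE = {
        "DLAA": 1.0,
        "Quality": 2 / 3,
        "Balanced": 0.58,
        "Performance": 0.5,
    }

    def internal_res(out_w: int, out_h: int) -> None:
        for mode, s in SCALE.items():
            print(f"{mode:>12}: {round(out_w * s)} x {round(out_h * s)}")

    internal_res(3840, 2160)
    #         DLAA: 3840 x 2160
    #      Quality: 2560 x 1440
    #     Balanced: 2227 x 1253   (the "~1250p" in the list above)
    #  Performance: 1920 x 1080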
 
People complain about fake DLSS pixels, as if Doom 3 shipped with tiny John Carmacks who were waving flashlights inside your PC to make shadows. Everything is maths and tricks in graphics, and DLSS is the best feature since shaders were introduced.
 

ap_puff

Banned
Dude Nintendo is going to have upscaling AI tech next year, by the guy in the OP's company.

You kinda don't even know what you are saying here. Say Switch 2 renders a game at 1080, then DLSS can upscale it to 4k, leaving a lot of room for improvements in the actual game. This is what Nvidia cards do on PC right now.

He's right. If the situation is "you can flick a switch and get 4K from 1080p, or render at 4K and have no power left for anything else at all," what's the choice?
Okay? And what does that do except make the graphics a little prettier?
 

cireza

Member
"We compute one pixel, we infer the other 32. I mean, it's incredible... And so we hallucinate, if you will, the other 32, and it looks temporally stable, it looks photorealistic, and the image quality is incredible, the performance is incredible."
Live on drugs if you want but keep me out of it.
 