
Graphical Fidelity I Expect This Gen

The solution to the problem is simple: tweak DLSS5 to make the characters' faces change less. I am sure Nvidia will be able to do that.

That way you keep the lighting improvements without making big visual changes to the characters.
Not sure why the fuck people are panicking when that is the clear solution to the problem.
 
Not sure why the fuck people are panicking when that is the clear solution to the problem.
Because they are retards.

But it's obvious that this DLSS5 is just the first application of neural rendering tech; it's implemented as an external layer applied on top of existing games, which makes it possible to play games with this tech in late 2026.

In the future, neural rendering will be integrated natively into the rendering pipeline and the game engines, also removing duplicate work for each pixel. But that is probably happening next year, with games coming out from 2028 onwards.
 
Not to be dramatic or melancholic, but damn, this whole negative ragebait shit around Nvidia DLSS 5 (due to Nvidia's stupid marketing around Grace's face) is making me feel really down, almost depressed in a way. I am very, VERY disappointed by Nvidia's initial marketing around this. The tech is obviously transformative & it'll be the future that starts this year. The "art" people are pissing me off so bad; they think Neural Rendering is just "make it realistic & disregard art". Oh, the misinformation...

What's also puzzling me is the never-ending amount of people longing for huge generational leaps, they keep mentioning & propping up tweets like "oh gaming used to be absolute beasts on graphics in the 2010s" & "oh no we will never see big graphics leaps ever again" and all that garbage cheap talk.

Hopefully Nvidia & Capcom show off new RE9 gameplay soon that changes people's minds. Slow down on the massive graphical leaps, people can't even comprehend them yet, and fix your AI face slider a bit, make it look closer to the game, gamey if you like, and build on from there as people acclimate to it.

The other showcases were phenomenal & something I wanted to happen for so long, the environments, lighting & materials are something else.

Phantom Blade 0 is another game I'm already excited to play, and it'll have DLSS 5 too. Hopefully this doesn't push other studios to ignore DLSS 5.
 
Not to be dramatic or melancholic, but damn, this whole negative ragebait shit around Nvidia DLSS 5 (due to Nvidia's stupid marketing around Grace's face) is making me feel really down, almost depressed in a way. I am very, VERY disappointed by Nvidia's initial marketing around this. The tech is obviously transformative & it'll be the future that starts this year. The "art" people are pissing me off so bad; they think Neural Rendering is just "make it realistic & disregard art". Oh, the misinformation...

Hopefully they have a new RE9 showing soon that changes people's minds a bit. Slow down on the massive graphical leaps, people can't even comprehend them yet, and fix your AI face slider a bit, make it look closer to the game, and build on from there as people acclimate to it.

The other showcases were phenomenal & something I wanted to happen for so long, the environments, lighting & materials are something else.

Phantom Blade 0 is another game I'm already excited to play, and it'll have DLSS 5 too. Hopefully this doesn't push other studios to ignore DLSS 5.
How the hell does phantom blade have dlss 5 when you need two 5090s to run it?
 
I'm finding myself refreshing DF's channel every few minutes. Although I'm not keen on this tech yet, I'm loving the discourse around it; watching AI haters and nvidia shills have a go at one another is beautiful.

Can't wait to see the damage control video from Digital Foundry.
 
Gotta say, the Oblivion footage at the end looks spectacular even with DLSS5 turned off. UE5 assets and MetaHumans look so good in this game without any AI slop magic. You can tell the lighting is held back by it being software lumen instead of hardware, but those interiors are just gorgeous. The game looked like shit in the open world so i turned it off before i even got to the first town. Looks like i owe analog_future an apology.

Oblivion? Isn't that from the Nvidia Zorah demo?
 
Can't wait to see the damage control video from Digital Foundry.
DF needs to stop these podcast style videos when it comes to these big main event kind of news. Just the other day I was bitching about how poor their PSSR 2.0 video was. It was Oliver and Alex just giggling and stumbling over their words trying to get relatively basic info out, and that took them 29 minutes. 29 minutes to cover 4 games?

They won't be taken seriously until they take themselves seriously. Doing a video like this in a hotel room is dumb and unprofessional. Do a proper video with pros and cons of the tech in a concise manner, and people might actually watch it. They do bring up several concerns, but towards the very end of the video and almost in passing, like an afterthought. An editor like Richard should know better.
 
How the hell does phantom blade have dlss 5 when you need two 5090s to run it?
you dont need two 5090s to run it. DF mentioned that nvidia engineers already have this running on a single 5090 in their labs, and they think they will be able to run it on all 50 series cards by fall 2026 when this comes out. This scales with compute so it should downscale relatively well just like other games do. You reduce resolution or settings or both and it will run just fine on a 5070 or 5080. my guess is 4k on the 5090, 1440p on the 5080, and 1080p on the 5070.

Again, everyone missed this crucial detail because no one wants to sit through a 20 minute video of two unattractive dudes jerking off to tech porn.
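For what it's worth, the per-card resolution guess above roughly tracks the pixel arithmetic. A back-of-the-envelope sketch; the TFLOPS figures are approximate public FP32 specs and the per-card resolutions are the guesses from this post, not anything Nvidia or DF stated:

```python
# Rough sanity check of the "scales with compute" guess. TFLOPS numbers are
# approximate public FP32 specs, used here purely as assumptions.
cards = {
    "5090": (105, (3840, 2160)),  # ~TFLOPS, guessed target res (4K)
    "5080": (56, (2560, 1440)),   # 1440p
    "5070": (31, (1920, 1080)),   # 1080p
}

top_tflops, (tw, th) = cards["5090"]
for name, (tflops, (w, h)) in cards.items():
    compute = tflops / top_tflops
    pixels = (w * h) / (tw * th)
    print(f"{name}: {compute:.2f}x compute for {pixels:.2f}x pixels")
```

Each step down keeps slightly more compute per pixel than the 5090 (roughly 0.53x compute for 0.44x pixels on the 5080), so if anything the guess is conservative.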
 
Then perhaps this tech could and should be used to make the gameplay models match their cutscene counterparts instead of trying to create these hyper-realistic and uncanny models?
Idk, just an idea.
they didnt do that because they cant. their AI model is trained on photorealism, not game assets. And you cant train the model on your own game either. its one model.

i remember when DLSS first came out they said they would train the model on each game. then with DLSS2, it changed and one model was used for every game. I think they realized that it was simply not sustainable to train their AI model on every game so they chose a one size fits all model that is completely changing the look of every game to the point where they seem unrecognizable.

i would rather they work on adding path tracing support to previous games that dont support it. Or Use hero lighting on character models instead of completely overhauling them. Or use AI to make GPUs do more with less. We have hit a bottleneck here with the cost of shrinking dies exceeding the returns, and so we need AI's help to make a 10 tflops GPU run like a 20 tflops GPU. Work on that, no?

i still think its the most insane tech demo ive ever seen but there is so much more work to be done before we start inserting photorealism filters into our games. And yes, its a filter, i dont care how many times nvidia says its not.
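On the "make a 10 tflops GPU run like a 20 tflops GPU" point, the pixel-budget arithmetic behind upscaling is worth spelling out. A sketch; the 2/3 and 1/2 scale factors mirror common upscaler presets, not any specific DLSS mode:

```python
# Shading cost scales roughly with pixels shaded, so rendering internally at
# a lower resolution and reconstructing to the output resolution trades
# shading work for (cheaper) upscaling work. Numbers are illustrative only.
def internal_pixels(out_w, out_h, scale):
    """Pixels actually shaded when rendering at `scale` of output resolution."""
    return int(out_w * scale) * int(out_h * scale)

native = internal_pixels(3840, 2160, 1.0)       # 8,294,400 pixels shaded
quality = internal_pixels(3840, 2160, 2 / 3)    # "quality"-style preset
performance = internal_pixels(3840, 2160, 1 / 2)

print(native / quality)      # 2.25x fewer pixels shaded
print(native / performance)  # 4.0x fewer
```

That is where the "do more with less" headroom comes from, minus whatever the reconstruction pass itself costs.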
 
40-50 fps at 1080p. With ray tracing effects set at 1/16 resolution. I mean i expected this but once again, there is no secret sauce that some devs know that Epic's engineers dont. If you want to push visuals, you are going to tax the GPU. It's not rocket science.

 
they didnt do that because they cant. their AI model is trained on photorealism, not game assets. And you cant train the model on your own game either. its one model.

i remember when DLSS first came out they said they would train the model on each game. then with DLSS2, it changed and one model was used for every game. I think they realized that it was simply not sustainable to train their AI model on every game so they chose a one size fits all model that is completely changing the look of every game to the point where they seem unrecognizable.

i would rather they work on adding path tracing support to previous games that dont support it. Or Use hero lighting on character models instead of completely overhauling them. Or use AI to make GPUs do more with less. We have hit a bottleneck here with the cost of shrinking dies exceeding the returns, and so we need AI's help to make a 10 tflops GPU run like a 20 tflops GPU. Work on that, no?

i still think its the most insane tech demo ive ever seen but there is so much more work to be done before we start inserting photorealism filters into our games. And yes, its a filter, i dont care how many times nvidia says its not.
Oh I know, I was just trying to subtly hint that if this tech can't match the results from your examples then perhaps hand-made art for videogames is still the superior alternative for now.
 


I removed the text boxes from the image.

Bar some resolution issues (due to the image itself), these are the best graphics I've seen in my life.

Photo modes are gonna go crazy next gen.
yeah that looks crazy. Look, if you can tweak it and use it as a technique - something like normal maps - I think this can actually be pretty awesome. It just needs to be used more sparingly and specifically. But this happens all the time when new tech is introduced.
 
Death Stranding 2 is getting ray tracing on PC. Only Reflections and AO.

Additionally, players on PC with high-end PC hardware will have the additional option to enable Ray Tracing on Reflections and Ambient Occlusion via additional settings. When enabled, ray-traced reflections are applied on surfaces such as water and tar, while ray-traced ambient occlusion applies ambient lighting effects to scenes, resulting in more realistic shadows in crevices, corners, and between objects. These additional PC options are aimed at players with powerful hardware that want to push visual fidelity beyond the "Very High" graphics settings the game already offers. Though we would like to note that Ray Tracing is not part of the minimum or recommended specifications for the core experience and is not required to enjoy the game as intended.
 
Nixxes has added the option to enable ray tracing on reflections and ambient occlusion for the PC port of DS2.

This means that the Decima Engine now supports ray tracing, which is great news for Horizon 3.
 
After viewing the DLSS 5 video, the most transformative application that kept the intended "look" for me was Starfield, ironically enough (a game i consider rough looking). Hogwarts would have been up there too if it didn't turn teenagers into grizzled 40 year olds. Loving the tech, not so much the current implementation, but that will change as more people get hands-on experience with it.
 
Death Stranding 2 is getting ray tracing on PC. Only Reflections and AO.

Additionally, players on PC with high-end PC hardware will have the additional option to enable Ray Tracing on Reflections and Ambient Occlusion via additional settings. When enabled, ray-traced reflections are applied on surfaces such as water and tar, while ray-traced ambient occlusion applies ambient lighting effects to scenes, resulting in more realistic shadows in crevices, corners, and between objects. These additional PC options are aimed at players with powerful hardware that want to push visual fidelity beyond the "Very High" graphics settings the game already offers. Though we would like to note that Ray Tracing is not part of the minimum or recommended specifications for the core experience and is not required to enjoy the game as intended.

Holy shit, first Decima game with RT. Was it so hard to do on PS5 Pro Kojima?
 
Holy shit, first Decima game with RT. Was it so hard to do on PS5 Pro Kojima?
I like how Nixxes was able to add additional RT effects while porting games like Returnal, Ratchet, Spiderman 2, and Death Stranding 2, but 30-40 year old studios with far more resources than Nixxes werent able to.

I am actually kinda disappointed that Nixxes didnt add RTGI. They had actually talked about RTGI with the Horizon Remake saying it was too much work but we should be on the lookout for it in the future, but settled for just reflections and AO.

With Sony no longer making PC ports, i hope they hold on to Nixxes instead of shutting them down tomorrow after the DS2 launch. Have them go back and add RT to the PS5 pro versions. Should be able to backport stuff like this. Make them an RT focused studio that handles RT for the PS5 pro versions of their upcoming games. Clearly, they are willing to do the work, and they're definitely not as lazy as Kojipro, so dont shut them down like a bunch of greedy corporate cucks and have them do something for PS5 owners who invested an additional $700 into your ecosystem.
 
The tech is obviously transformative & it'll be the future that starts this year. The "art" people are pissing me off so bad, they think Neural Rendering is just a "make it realistic & disregard art", oh the misinformation...
If you run Nvidia and you don't want people to walk away from a presentation with this take, then you (and the developers that worked with you) have to show better examples to begin with.

Or just be slightly more aware of the public's feelings on generative AI so that you can show something that gives off a better feeling when looking at it.
 
I like how Nixxes was able to add additional RT effects while porting games like Returnal, Ratchet, Spiderman 2, and Death Stranding 2, but 30-40 year old studios with far more resources than Nixxes werent able to.

I am actually kinda disappointed that Nixxes didnt add RTGI. They had actually talked about RTGI with the Horizon Remake saying it was too much work but we should be on the lookout for it in the future, but settled for just reflections and AO.

With Sony no longer making PC ports, i hope they hold on to Nixxes instead of shutting them down tomorrow after the DS2 launch. Have them go back and add RT to the PS5 pro versions. Should be able to backport stuff like this. Make them an RT focused studio that handles RT for the PS5 pro versions of their upcoming games. Clearly, they are willing to do the work, and they're definitely not as lazy as Kojipro, so dont shut them down like a bunch of greedy corporate cucks and have them do something for PS5 owners who invested an additional $700 into your ecosystem.
I had forgotten that, so that means even back then the Decima Engine supported ray tracing, if Nixxes considered adding RTGI to the Horizon remaster.

Nixxes is probably fine. There will always be PC ports of multiplayer games to do; they also do support work now, like on the development of Saros, and they have said they are interested in doing other remasters.
 
you dont need two 5090s to run it. DF mentioned that nvidia engineers already have this running on a single 5090 in their labs, and they think they will be able to run it on all 50 series cards by fall 2026 when this comes out. This scales with compute so it should downscale relatively well just like other games do. You reduce resolution or settings or both and it will run just fine on a 5070 or 5080. my guess is 4k on the 5090, 1440p on the 5080, and 1080p on the 5070.

Again, everyone missed this crucial detail because no one wants to sit through a 20 minute video of two unattractive dudes jerking off to tech porn.
I have a 4080 so i probably need to upgrade to at least a 6070; not gonna buy a 5000 series and I'm never gonna play at anything lower than 4K.
 
The bare minimum...
I like how they said WATCH THIS SPACE almost 2 years ago, and this is the best they could do lol

Was ray tracing considered for Horizon Forbidden West, like the RT shadows in previous Nixxes PC port Shadow of the Tomb Raider?

Michiel Roza - Nixxes: It was definitely considered, and it's good that you mentioned Shadow of the Tomb Raider. There, we had to do an entire lighting pass to get RT shadows to work. And for this project, with the hours of cinematics and the scope... we decided that the game already looks really good, there's a strong direction here and we really didn't want to mess with it.

Jeroen Krebbers - Guerrilla: Don't forget a lot of the content is alpha-tested trees. I mean, a lot. Of course we have settlements and some hard surfaces, but most of it is really hard to ray trace against, even for shadows. There's a lot of content, maybe 100 square kilometres of content or more, and we'd need to go through the whole game... it's just mind-boggling... obviously having the scalability between PS4 and PS5 meant that we wanted to focus on stuff that would work on both PS4 and PS5, that sadly excluded RT. And it's not an easy addition to the PC version. We really like the tech of course. Watch this space.

i do wonder if RTAO would fix the shadows in an area like this.



Reflections are fine on Decima water. Its probably got the best water reflections on consoles. Didnt really need RT reflections there.
 
Sure they look good.
Are you okay? They don't just look good, they look 10x better than the AI slop filter.
This is how you know it's some collective amnesia virus: to justify this, you have to downplay literal CGI graphics as "meh, they look good" so you can pump your love for AI slop. Incredible.
But.

One of these games is barely even a game, took 5 years to make, and has very little gameplay. Graphics at that level of quality are not possible today on a large scale.
We saw graphics close to that level in Hellblade 2, AW2, Resident Evil Requiem and a couple of other games.
Next gen graphics will literally surpass that, yet you want to throw it away for AI slop. Are you even hearing yourself?
Until today.

The others are just tech demos. Games dont look like that yet

(until today)

And Kojima's game wont release for another 5 years if we're lucky. It will also likely not have any extended gameplay, like Hellblade 2.


DLSS 5 changes the equation. It can deliver faces that look just as good, at a fraction of the development time, in real-time, in full-scale games...something any studio could realistically achieve.
Are you okay? Those AI slop Starfield characters don't look like this, and neither does the environment.
The "amazing graphics" (trash) would have been achieved next gen with better character rendering, better environments, etc.

What are we doing here? You do realize that that 2nd 5090's power could be utilized to power a path tracing 2.0 for environments like below? The character rendering we are already getting, it's a lock. But the environment rendering we could get on PC if that extra 5090's power was focused on running path tracing with a lot more samples and more powerful approximation. Do you know how much compute power and VRAM it takes to run these deepfake models? Especially in real time? Why do you think companies like OpenAI, Google, Meta, xAI are spending trillions on GPUs?




 
You don't let AI mess with human art. Nvidia engineers are all your typical clueless autists, but surely they have a PR department?
They are too preoccupied with buying Huang biker jackets so he feels cool.

Only the Asscreed pics look good to me with DLSS5; the rest looks like cheap AI youtube videos reimagining whichever IP. Doesn't look human, doesn't look made for humans.

 
I like how they said WATCH THIS SPACE almost 2 years ago, and this is the best they could do lol



i do wonder if RTAO would fix the shadows in an area like this.



Reflections are fine on Decima water. Its probably got the best water reflections on consoles. Didnt really need RT reflections there.

Water has all typical SSR problems
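To make the "typical SSR problems" concrete: a screen-space reflection can only sample what was already rendered on screen, so any reflected ray that lands off-screen has no color data and the effect fades out or falls back. A minimal illustration of that visibility limit (hypothetical helper, not engine code):

```python
# Screen-space reflections march a ray in UV space and can only return a
# color when the hit lands inside the rendered frame; otherwise the engine
# must fade the reflection or fall back (e.g. to a cubemap). Hypothetical
# illustration of the core limitation, not real engine code.
def ssr_hit_valid(hit_uv):
    """True if the reflected sample point is inside the visible frame."""
    u, v = hit_uv
    return 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0

print(ssr_hit_valid((0.4, 0.7)))  # on-screen: SSR has data
print(ssr_hit_valid((1.2, 0.5)))  # off-screen: the classic SSR artifact
```

Anything behind the camera or occluded by foreground geometry fails the same way, which is exactly what ray-traced reflections fix.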

 
I wonder how reflections will work with dlss5, specifically off-screen stuff as it seems this is just a screen-space effect (with access to motion vectors, geometry etc). Maybe it still relies on a ray traced reflection to get the data before enhancing it.
 
this is why i think they shouldve just focused on adding path tracing to non-path traced games first instead of releasing a photorealistic filter that completely changes the look of the game. or make path tracing cheaper in general.

hell, invest in making framegen work at 30 fps so i dont have to reduce settings to get the game running at 60 fps before i can turn it on. this is too fast and too soon, and too different. i like it, but i can definitely see people just saying no, this is witchcraft.
Isn't that just RTX remix?
 
I wonder how reflections will work with dlss5, specifically off-screen stuff as it seems this is just a screen-space effect (with access to motion vectors, geometry etc). Maybe it still relies on a ray traced reflection to get the data before enhancing it.

Some detail here:


The AI model powering DLSS 5 is a single unified model. Same model for every game. It's not trained per-title, per-face, or per-object type. It takes the raw color buffer and motion vectors as input, analyzes the scene semantics from that single frame, and enhances the lighting and material response while staying anchored to the original 3D content. It recognizes the difference between skin and metal and water and stone and foliage, and it processes each of those materials differently based on how light should interact with them.
During the demo, the DLSS research team talked through the level of granularity available. Developers don't just get an on/off switch. They get intensity controls that can be dialed anywhere, not just full strength. They get spatial masking, so they can set the water enhancement to 100%, wood to 30%, characters to 120%, all independently within the same scene. They get color grading controls for blending, contrast, saturation, and gamma. All of this runs through the existing SDK, which means studios already using DLSS and Reflex have a familiar pipeline to work with.

Water looks pretty spectacular. i think they are just replacing the existing water with whatever their AI model produces. So in effect, it doesnt really matter what was there before. it will detect water and replace it with their AI trained water.
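The per-material controls described above suggest something like a per-pixel blend between the original and enhanced frames, driven by a material ID buffer. A toy sketch of how that masking might work; the function name and buffer layout are my assumptions, not Nvidia's actual SDK:

```python
# Hypothetical sketch of per-material intensity masking: blend each pixel of
# the original frame toward the AI-enhanced frame, with a weight looked up
# from a material ID buffer. A weight over 1.0 would extrapolate past the
# enhanced result (the "characters at 120%" case). Not Nvidia's actual API.
def apply_masked_enhancement(original, enhanced, material_ids, intensities):
    out = []
    for row_o, row_e, row_m in zip(original, enhanced, material_ids):
        out.append([
            o + intensities.get(m, 0.0) * (e - o)  # lerp/extrapolate per pixel
            for o, e, m in zip(row_o, row_e, row_m)
        ])
    return out

# Toy 2x2 single-channel frame: material 0 = water at 100%, 1 = wood at 30%
original = [[0.0, 0.0], [0.0, 0.0]]
enhanced = [[1.0, 1.0], [1.0, 1.0]]
materials = [[0, 0], [1, 1]]
print(apply_masked_enhancement(original, enhanced, materials, {0: 1.0, 1: 0.3}))
# -> [[1.0, 1.0], [0.3, 0.3]]
```

If the shipped controls work anything like this, "replacing the water" is just the water material's weight sitting at full strength while everything else is dialed down.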
 
I can see the point of contention, but it is stupid to think that capable devs are just gonna apply a tiktok filter to their protagonists and call it a day...
It's a technology that's gonna work on high-end PCs first, that is optional to the user, and optional to the developers. And if it is available, it has the artistic consent of the developer team. But it's easier to yell AI SLOP and throw some memes. ( the memes are good though :messenger_grinning_sweat:)
 
I can see the point of contention, but it is stupid to think that capable devs are just gonna apply a tiktok filter to their protagonists and call it a day...

It's mostly memes at this point, but I don't even agree that the lighting is "improved". It's punchy but it's extremely gaudy and unrealistic. It looks like everything is lit in a photo studio.





Is pre-rendered CGI lit like that? Are movies lit like that? Is real life lit like that?

The answer is no.


I think there's potential in the tech but the initial showing couldn't have been much worse than it was. It looks frankly ridiculous right now.


And I know it's in pre-alpha or whatever, but the fact that it took two 5090s to run this is crazy as well. Imagine what two 5090s could do with traditional rendering. If this AI enhancement is this expensive, what's the fucking point?
 
I have a 4080 so i probably need to upgrade to at least a 6070; not gonna buy a 5000 series and I'm never gonna play at anything lower than 4K.
No way the 6070 has a big enough VRAM pool or actual oomph; it won't be stronger than the current 5080 and that's way too weak. 5070 to 5080 is about +50% perf in games (and obviously 4 gigs more VRAM) :P
Imho the lowest we could hope for is a 6070 Ti, aka the same die as the 6080 but cut down a bit, 24 gigs of VRAM minimum, and likely well over 1k USD street price too. That's why my take is, even 1.5 years from now DLSS5 will still be niche af, kinda like RT/DLSS was during Turing.
 