
Graphical Fidelity I Expect This Gen

The discourse around Crimson Desert is so bizarre to me. It's a beautiful and technically impressive game; no one in the reviews is denying that. It seems to have quite a few flaws and bugs, but this was the studio's first try at a massive single-player game. If they learn the right lessons from this, it can only get better from here.
The doomerism from shills just because it's not being reviewed as a 10/10 game is hilariously idiotic.
 


6Wo1wh3Qtv88ZHXV.jpeg


Just for the record, I am pro-DLSS 5, but these made me laugh :messenger_grinning_smiling:

Lore accurate Kratos
 
Really good & well-balanced take on this whole DLSS 5 thing, from our friend Dallas. A must-watch if you're interested. This is from a game developer who is also an artist; he's making his own indie game AND another game under a development studio.

 
From the other thread on Crimson Desert's benchmarks:
JX3zWJIZF17mqidw.png

This is 4K native. With DLSS everyone will be enjoying the path traced glory of the game.

Came to post this. Very impressive.

Going to run this at 4K DLSS Quality with ray reconstruction. I don't think this is enabled in these benchmarks.

Ray Reconstruction is not enabled in these benchmarks, and you lose like 70% performance with it on. With a 5080 you'd probably have to go DLSS Balanced + Frame Gen or even DLSS Performance + Frame Gen or you'll be dipping below 60fps. Just an FYI
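For reference, the DLSS preset names being thrown around here map to fixed internal render resolutions. A quick sketch (the per-axis scale factors are the commonly documented DLSS defaults; Balanced's 58% is approximate):

```python
# Internal render resolutions for the common DLSS quality presets at a
# given output resolution.  Scale factors are the widely documented
# DLSS defaults (Quality ~66.7%, Balanced ~58%, Performance 50%).
PRESETS = {
    "DLAA": 1.0,
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width, height, preset):
    """Return the (width, height) DLSS actually renders at."""
    scale = PRESETS[preset]
    return round(width * scale), round(height * scale)

for preset in PRESETS:
    print(preset, internal_resolution(3840, 2160, preset))
# e.g. Performance at 4K renders internally at 1920x1080
```

So "DLSS Performance + Frame Gen" at 4K means a 1080p internal image doing the heavy lifting.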
 
This is some great work: less visually offensive than the default, and it looks much better integrated. I wonder if devs can tweak this within DLSS, as this is clearly better.

It's definitely less visually offensive, but if DLSS5 is going to be expensive to render, then what's the point? You might as well use those resources on RT/PT or textures or lighting or detail or whatever.


I know they say they'll have it running on one card at launch, but the fact that it takes two 5090s today, with the tech supposedly releasing in about six months, doesn't make me confident it will be a performant technology when it arrives.
 
Yeah I agree, it seems likely it's relying on MFG currently too.
 
Just catching up on the DF post regarding DLSS 5:
"it doesn't have access to original game assets, geometry, depth or per-material metadata."
Surely they should work on integrating this; it would solve a lot of the potential issues.
 
Here we actually see how demanding the game truly is fully maxed (including ray reconstruction): at 4K native on a 5090 it's 25-30 fps:


To get a mostly stable 60 u need to go with 4K DLSS Balanced, coz Quality stays below 60 all the time:


Obviously no fake frames coz fk dat shit (personally I can somewhat tolerate 2x if I play on joypad but that's it) :P

Edit:
For 5080 peons like myself:
4K DLSS Perf (so internal 1080p; IQ is still solid enough) and the Ultra preset (so not maxed/Cinematic) + ray reconstruction keeps fps in the 40s, sometimes above 50 too.

 
A 38% improvement in 1% lows and 39% in avg fps between the 4080 and 4090 is already a huge gap, but:
+63% in 1% lows and +60% in avg fps between the 5080 and 5090 is basically a digital Colorado canyon; no wonder those GPUs sell for 4k in the wild :messenger_astonished:
Don't even wanna imagine what will happen between the 6080 and 6090 lool

You get half of a 5090 with the 5080 🫠
 
The immensely negative feedback to DLSS5 has given me zero hope for the future of this industry.

Gamers are literally PETRIFIED of ANYTHING new.

This is first-gen tech. Of course it's not perfect. That's the point. The real value is where it leads, not where it starts.

You're all just screaming out A.I sLoP without even giving it a shot

Crying about needing two 5090s when they clearly stated, multiple times, that it would be optimized to run on one consumer-level GPU by launch.

As if they would be fucking dumb enough to launch a thing that requires two 5090s....

Crying about aRtIst InTent, when the first fucking thing 95% of you do in a game is disable Film Grain, C/A, Motion Blur... which all play into how the artist intended the game to look.

Crying about artist intent, when you willingly uglify the game, in order to improve performance.

This tech has the potential to cut YEARS off of development time and give us generational leaps in visual fidelity at the same time....

But no, we can't even fucking give it a chance? You're just going to make memes and shout out AI Slop like a fucking programmed robot?

Holy christ, fucking no wonder the industry has been so stagnant for years.

Companies like Nvidia, who clearly care about pushing visuals forward, are demonized for embracing new tech all the fucking time.

Then people wonder why progress feels slow.

Companies that actually try to move things forward get dragged the hardest. Meanwhile, everyone complains about stagnation. You don't get both. You either accept early, imperfect innovation OR you stay stuck refining the same ideas over and over. Period.

We won't get anywhere like this, just blatant hate before you even fucking try the thing.

It's the very first iteration of a brand new technology - and instead of having an open mind and realizing how insane this tech can be for future games down the line, we have people making disingenuous memes and screaming out whatever it is their algorithm tells them to.

The fact that EVERY SINGLE YOUTUBER has the same "opinion" is just absurd.

The backlash feels very uniform. Every. Single. Major YouTuber lands on the exact same talking points, like it's prepackaged, down to the thumbnail. That's not organic skepticism, that's herd behavior, hating what the internet tells you to hate.

At this point, it's not even about DLSS anymore. It's about a community that refuses to engage with change unless it's already safe, proven, and polished.

We will never get anywhere with an industry full of fans who just want the same old graphics at 60fps every goddamn gen.
 

We've seen tons of improvements & innovations in recent times. There's always going to be naysayers, you can't control people's opinions, but the community has largely accepted and adopted:

  • Temporal upscaling
  • Ray-tracing / Path-tracing
  • HDR rendering
  • Frame Generation
  • 60fps standard on consoles
  • Cloud streaming / in-home streaming
  • Virtualized geometry

etc.. etc..


People are pushing back hard on DLSS 5 because it was an extremely poor first showing, Nvidia has been repeatedly dishonest about it (devs' involvement, its genAI nature, etc.), and Nvidia can make vague performance claims all they want, but it needing two 5090s to run when it's supposedly due out 6 months from now should be concerning to anyone with half a brain. No one thinks it's going to require two 5090s at launch, obviously, but the performance concerns are completely valid.


And that's without even factoring in the moral/ethical dilemma that many have around genAI already.



I'm so sick of people blaming the general public's reaction to this instead of, I don't know, maybe holding the highest-valued company in the world (larger than Apple, Google, Amazon, or Microsoft) to an actual standard of responsibility for the way they rolled out this absolute clusterfuck of a reveal.
 
 
Represent. Represent. It's not that deep.

They just need to admit the faults of early tech instead of doubling down and saying 'everyone else is wrong', improve the tech a bit more, and then have a better second rollout.

This is a fleeting moment in time. The important thing is that a message was made clear about what people don't want. That doesn't mean they should 'abandon the tech entirely'.
 
Interesting thread on Reddit about fixing DLSS5. I tried to show with my Photoshop edit a few pages back how developers could get rid of the AI look, but this guy goes way further:

The tonemapping in DLSS 5 is fucked, and somehow nobody in the chain of command thought to just not do that then. But the relighting underneath genuinely does look excellent, especially from worse baselines. You can't generally just undo overbaked HDR, because it loses data, but luckily we have most of what we need already, in the comparison shot. It requires near-pixel-perfect alignment, which we don't always get in the comparison, but when you have it, the recovery strategy is simple. Here's the one I used, after a little experimentation:

  • Use DLSS 5 as base
  • Apply original image's HSV Saturation — restores design-intent color grading
  • Apply original image's LCh Lightness at 50% — reduces the local HDR effect intensity
  • Apply original image using Darken Only at 50% — reduces overbrightening
You might need to apply some masking around blacks or greys when applying saturation, to avoid obvious artifacts. I used Gimp's Color to Alpha on black with as precise a filter as I could get away with, but it needed some tweaking and didn't work for greys, so I'm sure that's not actually the right approach.
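The layer recipe above can be sketched in code. This is my own rough translation, not the Reddit poster's actual Gimp workflow: the HSV-saturation transplant and the Darken-Only blend are literal, but the LCh-lightness step is approximated with Rec. 709 luma, so treat it purely as an illustration.

```python
import colorsys
import numpy as np

def recover_look(dlss5, original):
    """Sketch of the recovery recipe on two aligned float RGB images in
    [0, 1], shape (H, W, 3).  The LCh-lightness blend is approximated
    with Rec. 709 luma -- an illustration, not the poster's exact steps."""
    base = dlss5.astype(float).copy()   # step 1: DLSS 5 frame as base layer
    orig = original.astype(float)

    # Step 2: transplant the original's HSV saturation, restoring the
    # design-intent color grading (per-pixel via colorsys for clarity).
    h, w, _ = base.shape
    for y in range(h):
        for x in range(w):
            bh, _, bv = colorsys.rgb_to_hsv(*base[y, x])
            _, osat, _ = colorsys.rgb_to_hsv(*orig[y, x])
            base[y, x] = colorsys.hsv_to_rgb(bh, osat, bv)

    # Step 3: pull lightness 50% toward the original's (luma stand-in
    # for "LCh Lightness at 50%", damping the local HDR intensity).
    def luma(img):
        return img @ np.array([0.2126, 0.7152, 0.0722])
    lb, lo = luma(base), luma(orig)
    scale = np.where(lb > 1e-6, (0.5 * lb + 0.5 * lo) / np.where(lb > 1e-6, lb, 1.0), 1.0)
    base = np.clip(base * scale[..., None], 0.0, 1.0)

    # Step 4: "Darken Only" layer at 50% opacity, cutting overbrightening.
    base = 0.5 * base + 0.5 * np.minimum(base, orig)
    return np.clip(base, 0.0, 1.0)
```

The black/grey masking the poster mentions isn't reproduced here; that part genuinely needs manual tweaking.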

Basically just by fixing Nvidia's shitty tonemapping he restores the original look of the game.

dlss-5-corrected-for-tone-mapping-v0-o9zyqtpeanpg1.jpeg


Some examples with image slider:


A few points:

1. Most of the 'slop' look is actually down to the tonemapping Nvidia employed in this reveal. I don't know why they did that; maybe to make the games look more realistic, or to make DLSS5 look like a bigger leap than it is. It's like adding a "Cyberpunk FULL GAME 60FPS HDR ULTRA REALISM" mod to each game (since those mods mostly achieve what they do by changing the tonemapping and color LUT).

2. With the original artistic intent™ restored, the difference which DLSS5 brings is much more subtle, but awesome looking in my opinion. It just becomes another welcome layer of lighting enhancement. The only question is whether that is worth the performance cost of (presently) a whole ass RTX 5090.

3. Nvidia are actually telling the truth when they say DLSS5 is not changing the assets or geometry of the game, as the poster on Reddit shows that you can completely restore the look of the character models just by changing the tonemapping.

SUUidOmS.webp


FZ7XPndH.webp


4. Presumably tonemapping is the aspect of DLSS5 that Nvidia means when they say developers will have some control. All the artists really need to do here is skip any Nvidia tonemapping changes and stick to the lighting enhancements only.
 
Ray Reconstruction is not enabled in these benchmarks, and you lose like 70% performance with it on. With a 5080 you'd probably have to go DLSS Balanced + Frame Gen or even DLSS Performance + Frame Gen or you'll be dipping below 60fps. Just an FYI
FightinCowboy said in his video that playing at 4K, Cinematic settings, DLSS 4.5 Quality w/ FG uses ~9-9.5 GB of VRAM at 70-80 FPS. If he enables RR, he loses ~15 FPS.
 
Hardware Unboxed just did a nice analysis of Ray Reconstruction. Long story short: it's a ~67% performance hit. DLSS Performance + Ray Reconstruction runs even slower than DLAA at max settings:


KUm0uqXJaM02SbX7.png







Also, using Ray Reconstruction kills all rain in the game. The denoiser interprets rain as noise and removes it from the image.
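That ~67% figure translates to frame rate roughly like this (a back-of-envelope sketch that treats the hit as a flat multiplier, which it won't exactly be in practice):

```python
# Rough effect of a ~67% performance hit from Ray Reconstruction,
# treated as a flat multiplier on whatever fps you had without it.
def fps_with_rr(fps_without, hit=0.67):
    return fps_without * (1.0 - hit)

for fps in (120, 90, 60):
    print(fps, "->", round(fps_with_rr(fps), 1))
# a 60 fps baseline lands around 20 fps with RR on
```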
 


So there is this dilemma of having better graphics most of the time vs. having better graphics when it's raining.
 
Interesting thread on Reddit about fixing DLSS5. I tried to show with my photoshop a few pages back how developers could get rid of the AI look, but this guy goes way further:



Basically just by fixing Nvidia's shitty tonemapping he restores the original look of the game.

dlss-5-corrected-for-tone-mapping-v0-o9zyqtpeanpg1.jpeg


Some examples with image slider:


A few points:

1. Most of the 'slop' look is actually down to the tonemapping Nvidia employed in this reveal. I don't know why they did that. Maybe to try and either make the games look more realistic or to make DLSS5 look like a bigger leap than it is. It's like adding a "Cyberpunk FULL GAME 60FPS HDR ULTRA REALISM" mod to each game (since those mods mostly achieve what they do by changing tonemapping and color lut).

2. With the original artistic intent™ restored, the difference which DLSS5 brings is much more subtle, but awesome looking in my opinion. It just becomes another welcome layer of lighting enhancement. The only question is whether that is worth the performance cost of (presently) a whole ass RTX 5090.

3. Nvidia are actually telling the truth when they say DLSS5 is not changing the assets or geometry of the game, as the poster on Reddit shows that you can completely restore the look of the character models just by changing the tonemapping.

SUUidOmS.webp


FZ7XPndH.webp


4. Presumably tonemapping is an aspect of DLSS5 which Nvidia are referring to the developers having some control over. All the artists really need to do here is eschew any Nvidia tonemapping changes and stick to the lighting enhancements only.
Awesome writeup dude. Thanks for the info. Be interesting to see how this develops. I highly doubt the "AI slop" look will be much of an issue when we start seeing the final product, it was just a shitty demo.
 
Very interesting: it turns out Grace's path-traced face in cutscenes IS the furthest from the original look, and that's due to PT's massive change to lighting, which isn't necessarily accounted for by the developers' largely pre-baked and hand-placed lights. That's not a flaw of PT, just a misalignment in the developers' implementation of it in many cutscenes.

Now to the main point: it's actually DLSS 5 that comes closest to the ray-traced one, which is quite something! I favor it over all the others IN THIS EXAMPLE.



PpUQUD3SxavzdCc9.jpeg
 

There are people seriously thinking DLSS 5 adds anything to the underlying art. They think video games should work like sculpture/painting or something, muh artistic vision.
 
Digital Foundry alluded to the same thing in their video.

No idea why they'd hide higher settings behind a denoiser toggle.
The terrible thing is that rain disappears with RR...

Now I don't know what to choose; rain looks pretty great in this game...

RR is also beyond heavy...
 
Hardware Unboxed just did a nice analysis of Ray Reconstruction. Long story short is that it's a ~67% performance hit. DLSS Performance + Ray Reconstruction runs even slower than DLAA max settings:


KUm0uqXJaM02SbX7.png







Also, using Ray Reconstruction kills all rain in the game. The denoiser interprets rain as noise and removes it from the image.

Absolute nonsense. The devs said that ray reconstruction adds more bounces to the ray tracing. Yeah, you could've done that without tying it to the ray reconstruction toggle: increase the rays as you go up the graphics presets. But they wanted to trick everyone into thinking you can max out the game at native 4K 60 fps.

Max out my ass. It looks like a completely different game with ray reconstruction off, especially indoors. So lame for devs to be this deceptive.
 

Agreed. Plus as I shared, RR kills the rain. You literally hear the rain sound effects and see the wet ground but there's no actual rain coming down with RR on. And in fact there's this weird translucent static image retention of rain over the image.

Here's an example:

qcXRDrWbOCsR7K9Z.gif



Half of me is thinking of just putting this game on hold and playing it in a few months when shit like this is (hopefully) ironed out.
 
Maybe RR off, Cinematic settings, and 4K DLAA is the play here, unless you have a 5080/5090 and can do RR + DLAA.
 
From the 11:18 mark:






Please tell me i understood wrong...

So this sounds like a bug. Basically, the whole point of DLSS is that even if you are rendering at 1080p when using 4K DLSS Performance, it will still use 4K-quality textures and effects. This game seems to produce blurry textures as soon as you enable DLSS, which could mean it's using 1080p-level textures and effects, but this guy thinks it's a bug, not by design.
 
Bug or not, until they fix this, it is pretty terrible for all the people who can't do native or dlaa...
 
I will just turn off ray reconstruction when i see the rain come down lol

so stupid. Stuff like this and the DLSS bug shouldve been fixed by launch.

Maybe RR off, cinematic settings and 4k dlaa is the play here unless you have a 5080-90 and can do RR+DLAA.
RR is necessary. Just drop to high or ultra settings. I dont really see a massive difference in textures when enabling DLSS.
 
So this sounds like a bug. Basically, the whole point of DLSS is that even if you are going down to 1080p when using DLSS 4k Performance, it will use 4k textures and effects. This game seems to be causing blurry textures as you enable DLSS which could mean that its using 1080p textures and effects, but this guy thinks its a bug, and not by design.

As he said in the video though, the render resolution of the game also affects the sample count of lighting effects.


The fuck is going on with this game?
 
I will just turn off ray reconstruction when i see the rain come down lol

so stupid. Stuff like this and the DLSS bug shouldve been fixed by launch.


RR is necessary. Just drop to high or ultra settings. I dont really see a massive difference in textures when enabling DLSS.
Watch his settings guide; many settings show no visible difference between Cinematic and fucking Medium...

This game is weird as fuck.

Did UE5 ever have any of these problems on PC?
 
We've seen tons of improvements & innovations in recent times. There's always going to be naysayers, you can't control people's opinions, but the community has largely accepted and adopted:

  • Temporal upscaling
  • Ray-tracing / Path-tracing
  • HDR rendering
  • Frame Generation
  • 60fps standard on consoles
  • Cloud streaming / in-home streaming
  • Virtualized geometry

etc.. etc..


People are pushing back hard on DLSS 5 because it was an extremely poor first showing, nvidia has been repeatedly dishonest about it (devs involvement / it's genAI nature / etc..), and Nvidia can make vague performance claims all they want, but it needing two 5090s to run when it's supposedly due out 6 months from now should be concerning to anyone with half a brain. No one thinks it's going to require two 5090s when it launches, obviously, but the performance concerns are completely valid.


And that's without even factoring in the moral/ethical dilemma that many have around genAI already.



I'm so sick of some people blaming the general public's reaction around this instead of, I don't know, maybe holding the highest valuated company in the world, larger than Apple, larger than Google, larger than Amazon, larger than Microsoft, to an actual standard of responsibility in the way in which they rolled out this absolute clusterfuck of a reveal.
I keep thinking of the initial UE5 videos from 2020, the Matrix demo or the Lara Croft-looking game. Those demos showed the potential of UE5, but 6 years later few games have come close to replicating them. If Nvidia can't even get their reveal videos of DLSS5 to look super polished, I'm guessing we are still many years away from DLSS5 actually being worth using. That reveal was not good.
 
As he said in the video though, the render resolution of the game also affects the sample count of lighting effects.


The fuck is going on with this game?
Watch his settings guide, many settings have no impact or difference between cinematic and fucking medium...

This game is weird as fuck.

Did ue5 ever had any of this problems on pc?
Sorry, i will watch the vid.
 
Bug or not, until they fix this, it is pretty terrible for all the people who can't do native or dlaa...

If this is related to the negative LOD bias having wrong values when upscaling, it can easily be fixed in Nvidia Profile Inspector.
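On the LOD-bias point: upscalers expect the engine to apply a negative texture mip bias so textures are sampled at output-resolution sharpness even at a lower internal resolution. A sketch of the usual formula (this follows NVIDIA's published DLSS guidance; the exact values are illustrative):

```python
import math

# Recommended texture mip LOD bias when rendering below the output
# resolution: bias = log2(render / display).  It comes out negative,
# which tells the engine to sample sharper (higher-resolution) mips.
def mip_lod_bias(render_width, display_width):
    return math.log2(render_width / display_width)

print(mip_lod_bias(1920, 3840))   # DLSS Performance at 4K -> -1.0
print(mip_lod_bias(2560, 3840))   # DLSS Quality at 4K -> about -0.58
```

A game that forgets this bias (or applies the wrong value) ends up sampling 1080p-appropriate mips, which matches the blurry-texture behavior described above.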

As he said in the video though, the render resolution of the game also affects the sample count of lighting effects.


The fuck is going on with this game?

That's normal for RT games. RT scales with internal res.
 