
Graphical Fidelity I Expect This Gen

I briefly tried DLAA the other day and the IQ looked definitely cleaner, but it was beyond heavy; it killed my framerate when used with path tracing.

I really can't play at lower than 4K. Even if I fix the IQ by going native and lowering the res, the game is still gonna look like native 1440p, and that's a big no-no for me.

Maybe I could try native 4K/DLSS, frame gen and no path tracing; maybe I can reach an almost stable 60...
Nvidia released the fixed drivers. Just install those and see if they improve your performance. Try running the game at DLSS Balanced.
 
I don't understand why one guy is allowed to derail the thread repeatedly. We've already had one regular, CowboyLou, banned from this thread because of this; I don't want others to get banned after falling for clear trolling and fanboyism.

Let's put these people on ignore to avoid future derails, and get the mods to lift the ban on CowboyLou.

I have lifted the thread ban for CowboyLou.

Let's keep it civil.
 
Raises the question of why the RTGI in Arc Raiders is so bad when some very talented ex-DICE devs worked on it for over 7 years.

HyToXobgnKZn95wE.jpg


8ugAHaQxlH4gkyCY.jpg

ksMqaRNczJenE6Uv.jpg

Looks like either baked lighting where the probes are misplaced inside geometry, leading to light leaks, or the geometry is just built in a way that leaves gaps somewhere, which causes light leaks when using RTGI.

Either way, if those are still in the game, I am very surprised they went unnoticed.
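For what it's worth, the failure mode described above is exactly what probe-based GI systems try to guard against: a probe buried inside (or behind) geometry contributes light it should never see. DDGI-style solutions store per-probe depth statistics and down-weight occluded probes with a Chebyshev visibility test. A toy Python sketch of that weighting, with all probe values invented for illustration:

```python
def chebyshev_weight(mean, mean_sq, r):
    """DDGI-style visibility test: each probe stores the mean and mean^2
    of distances seen along a direction; a shading point farther away
    than that distribution suggests the probe is behind a wall."""
    if r <= mean:
        return 1.0
    variance = max(mean_sq - mean * mean, 1e-6)
    return variance / (variance + (r - mean) ** 2)

def blend_probes(probes, point_dist):
    """Visibility-weighted average of probe irradiance. Without the
    weight, an occluded probe leaks its light into the room."""
    total_w, total_e = 0.0, 0.0
    for irradiance, mean, mean_sq in probes:
        w = chebyshev_weight(mean, mean_sq, point_dist)
        total_w += w
        total_e += w * irradiance
    return total_e / max(total_w, 1e-6)

# Two probes 2.0 units from an indoor shading point: one with clear line
# of sight, one bright outdoor probe buried behind a wall 0.5 units away.
visible  = (0.2, 2.0, 4.1)   # (irradiance, mean depth, mean depth^2)
occluded = (5.0, 0.5, 0.26)
leaky = blend_probes([visible, (5.0, 2.0, 4.1)], 2.0)  # no occlusion info
fixed = blend_probes([visible, occluded], 2.0)
```

In the `leaky` blend the buried probe is treated as fully visible and floods the point with outdoor light; in the `fixed` blend it is suppressed almost entirely. A probe misplaced inside geometry, or a gap in the geometry it traces against, puts you back in the leaky case.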
 
I have lifted the thread ban for CowboyLou.

Let's keep it civil.
Thank you. I will behave.

Raises the question of why the RTGI in Arc Raiders is so bad when some very talented ex-DICE devs worked on it for over 7 years.

HyToXobgnKZn95wE.jpg


8ugAHaQxlH4gkyCY.jpg

ksMqaRNczJenE6Uv.jpg
The flaws are super obvious in shots like these, but I don't think I've ever registered anything looking quite this bad after 60 hours in game. There are a lot of weather conditions and the overcast ones can look a bit better imo. These shots look like some of the brightest possible weather conditions which I don't see too often.

But most people, including me, aren't stopping to pore over the graphics in a game like this. Embark made the overall right call by optimizing for a lightweight solution and the game can look quite striking overall, but it's nowhere near the standards of this thread and I really wish there was a higher-quality option. I have the performance to spare. The overall visual makeup is essentially UE4 with a sub-sub-Lumen lightweight GI solution on top.

If I knew exactly where those locations were in game I'd make an effort to go check them out and see if anything's changed, but there's a lot of asset/structure reuse so I have no idea.
 
Last edited:
I think it's not needed.

Preset E is older than the .dll file RE9 uses. I have no idea why the game defaults to D.
Well, I did it anyway, for both RR and DLSS; I guess it can't hurt. The swapper doesn't let me change the profile, so I can only do it from Inspector (I don't have the Nvidia app).

Hopefully I don't spend the whole night troubleshooting instead of playing. If a couple of things don't work, I'll just leave everything as it is and play; other than the trash IQ, the game runs well.
 
No, I already used older drivers for the performance uplift, but dude, PT + DLAA at 4K is too much for a 4080 even with normal performance.
These are the Game Ready drivers for Requiem that were pulled before Requiem ever came out; they were causing all kinds of issues on Nvidia GPUs.

The game was meant to be played on these drivers. Everyone should see an improvement; 4000 and 5000 series should all see better, or at least more consistent, performance. Remember, Nvidia likely implemented the path tracing, so the latest drivers were probably needed to get the most out of this game's path tracing mode.
 
Looks like either baked lighting where the probes are misplaced inside geometry, leading to light leaks, or the geometry is just built in a way that leaves gaps somewhere, which causes light leaks when using RTGI.

Either way, if those are still in the game, I am very surprised they went unnoticed.
Thank you. I will behave.


The flaws are super obvious in shots like these, but I don't think I've ever registered anything looking quite this bad after 60 hours in game. There are a lot of weather conditions and the overcast ones can look a bit better imo. These shots look like some of the brightest possible weather conditions which I don't see too often.
The point is that some of the most technically accomplished devs in the industry, the guys who literally built Frostbite, were not able to get the most out of the same RT technique this guy on Twitter says is so much better than Lumen.

Lumen is expensive because it's better than even some hardware-based ray tracing techniques. Arc Raiders runs at 1440p 60 fps precisely because it has these edge cases where the game does not look nearly as good. Just like games such as TLOU2, which look phenomenal to this day but have issues in some rooms that don't get proper light bounce.
But most people, including me, aren't stopping to pore over the graphics in a game like this. Embark made the overall right call by optimizing for a lightweight solution and the game can look quite striking overall, but it's nowhere near the standards of this thread and I really wish there was a higher-quality option. I have the performance to spare. The overall visual makeup is essentially UE4 with a sub-sub-Lumen lightweight GI solution on top.

If I knew exactly where those locations were in game I'd make an effort to go check them out and see if anything's changed, but there's a lot of asset/structure reuse so I have no idea.
I agree with this. I believe I said this in the other UE5 thread last week. Some games, especially multiplayer games, don't need all this fancy tech, and the priority should be higher resolutions at 60 fps. Hell, I'm not even that mad at Forza Horizon 6 anymore, because I realized that anything below 1440p 60 fps is going to kill the IQ on consoles, and any further increase in foliage quality wouldn't really show up at lower resolutions.
 
These are the Game Ready drivers for Requiem that were pulled before Requiem ever came out; they were causing all kinds of issues on Nvidia GPUs.

The game was meant to be played on these drivers. Everyone should see an improvement; 4000 and 5000 series should all see better, or at least more consistent, performance. Remember, Nvidia likely implemented the path tracing, so the latest drivers were probably needed to get the most out of this game's path tracing mode.
They've already been tested on Reddit; apparently they don't have the same perf uplift as the older drivers.

I'm gonna wait to read some more testing before upgrading; Nvidia has been kinda terrible with drivers lately.
 
The point is that some of the most technically accomplished devs in the industry, the guys who literally built Frostbite, were not able to get the most out of the same RT technique this guy on Twitter says is so much better than Lumen.
I'm not familiar with this guy on Twitter and I'm not gonna pretend to understand 100% of what he's explaining, but I'm reluctant to outright take his word on this. He's not the first one claiming to have cracked Unreal Engine wide open with his oh-so-special custom fork. As you've pointed out many times, the other engines out there attempting UE5-level fidelity are dropping res and performance to fairly comparable levels in their console 30 fps modes.

I'll be playing some ARC Raiders over the next couple of days before Marathon drops, so if I see anything like Alex's shots I'll grab pics for comparison. I think its GI looks great, for what it is, the vast majority of the time. But maybe I need to look closer, which admittedly I'm usually not doing while in a raid.
 
And this is why you need UE5.
R638HaNjYk3fsUaL.jpeg
It looks a bit rough, but honestly not that bad; I haven't reached this area yet. But overall, yeah, I don't know, the first area is in a league of its own.

From that point on everything is a little downgraded. The first big area you explore with Grace still has that amazing lighting, but assets and geometry are a bit lower quality.
Playing more and more, I'm starting to notice things that aren't so next-gen anymore. It's slowly becoming 50-50. Lighting takes the cake here, imo; it masks a lot of low-quality stuff in the textures and assets.

Textures especially if you play in first person, and assets too; you notice this a lot with pipes and cables, which are last-gen as hell, very blocky.

And man, if you disable depth of field, some of the cutscenes are poor. I usually don't mind depth of field, but here it's weird, so I disabled it.
 
They've already been tested on Reddit; apparently they don't have the same perf uplift as the older drivers.

I'm gonna wait to read some more testing before upgrading; Nvidia has been kinda terrible with drivers lately.

Yeah, both the 591 driver and the two 595 drivers are shit in this game.

WXNTKfnmsMr3hbnP.jpg


581 is the last driver that performs well in this game; I have seen more than 33% better performance. Apparently the 5xxx series also performs better on older drivers.
 
Some people may disagree with me here, but Resident Evil Requiem to me is very similar to Death Stranding 2.

The most impressive parts of both games visually are right at the very start. Not to say what comes after isn't impressive, because some parts in both come close, but it's very clear and obvious to me that the highest peaks are the openings.
 
Some people may disagree with me here, but Resident Evil Requiem to me is very similar to Death Stranding 2.

The most impressive parts of both games visually are right at the very start. Not to say what comes after isn't impressive, because some parts in both come close, but it's very clear and obvious to me that the highest peaks are the openings.

DS2 never reaches the heights of its opening sequence again (at least in open-world gameplay). RE9 (so far) is the same, but the gap is much smaller IMO.

c5mcJa08kaV3ZOSM.jpeg
1iWYEqnuK7tMmeCD.jpeg


For this shot, so many small things are added thanks to PT (rough reflections of the scene on the monitors, a mirror-like reflection on the clock, lots of precise RT shadows, etc.):

MwxRClNDK025BN13.jpeg

zmLWVwajL44cojQm.jpeg
Ap0j7yTTpstD6Tzk.jpeg
 
Raises the question of why the RTGI in Arc Raiders is so bad when some very talented ex-DICE devs worked on it for over 7 years.

HyToXobgnKZn95wE.jpg


8ugAHaQxlH4gkyCY.jpg

ksMqaRNczJenE6Uv.jpg


Because RTXGI is designed around performance, not quality. RTXGI is also not traditional ray tracing, btw... it's just called RTXGI because it's made by Nvidia and part of their RTX branch of Unreal Engine. It's very coarse and uses very few rays, which is why it runs fast even on hardware without RT acceleration.

Vite uses DDGI, which is far higher quality but also has a higher render cost.

Of course, it still shits all over Lumen while only having like 10% of the render cost. The guy did a, let's call it "ground truth" check of this scene by doing a full path-traced render of it, and DDGI is higher quality than Lumen, and closer to the "ground truth" render.

Lumen is pure dog shit. It's time for you to just accept that fact.
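A "ground truth" check like the one described is easy to approximate in principle: render the same scene with each GI method, then score each result against a converged path-traced reference with a per-pixel error metric. A toy Python sketch, with all pixel values invented for illustration (a real comparison would use full images and a perceptual metric, not six luminance samples):

```python
def mse(img, ref):
    """Mean squared error between two equal-length grayscale pixel lists."""
    assert len(img) == len(ref)
    return sum((a - b) ** 2 for a, b in zip(img, ref)) / len(img)

# Invented 6-pixel luminance strips standing in for renders of one scene.
path_traced = [0.10, 0.35, 0.80, 0.40, 0.20, 0.05]  # converged reference
gi_close    = [0.12, 0.33, 0.78, 0.42, 0.22, 0.06]  # tracks the reference
gi_flat     = [0.25, 0.45, 0.60, 0.55, 0.35, 0.20]  # washed out, leaky

# The method with the lower error is "closer to ground truth".
err_close = mse(gi_close, path_traced)
err_flat  = mse(gi_flat, path_traced)
```

This is only the scoring step; the hard part of such a claim is rendering identical scenes under identical conditions in both engine branches before comparing anything.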
 
I'm not familiar with this guy on Twitter and I'm not gonna pretend to understand 100% of what he's explaining, but I'm reluctant to outright take his word on this. He's not the first one claiming to have cracked Unreal Engine wide open with his oh-so-special custom fork. As you've pointed out many times, the other engines out there attempting UE5-level fidelity are dropping res and performance to fairly comparable levels in their console 30 fps modes.

I'll be playing some ARC Raiders over the next couple of days before Marathon drops, so if I see anything like Alex's shots I'll grab pics for comparison. I think its GI looks great, for what it is, the vast majority of the time. But maybe I need to look closer, which admittedly I'm usually not doing while in a raid.
Watch the Alex video, or maybe it was the PS5/XSX video by Oliver. Plenty of examples of interiors looking like this. Outdoors the game looks fantastic, with really great lighting during the day. Foliage and trees look good too.

It looks a bit rough, but honestly not that bad; I haven't reached this area yet. But overall, yeah, I don't know, the first area is in a league of its own.

From that point on everything is a little downgraded. The first big area you explore with Grace still has that amazing lighting, but assets and geometry are a bit lower quality.
Playing more and more, I'm starting to notice things that aren't so next-gen anymore. It's slowly becoming 50-50. Lighting takes the cake here, imo; it masks a lot of low-quality stuff in the textures and assets.

Textures especially if you play in first person, and assets too; you notice this a lot with pipes and cables, which are last-gen as hell, very blocky.

And man, if you disable depth of field, some of the cutscenes are poor. I usually don't mind depth of field, but here it's weird, so I disabled it.
I don't think it's bad either, just something to keep in mind about why some games aren't as heavy on the GPU. It's like that saying, you get what you pay for; well, when it comes to graphics, you get what you invest in. The game is hitting 60 fps at high resolutions because they cut corners.
 
Because RTXGI is designed around performance, not quality. RTXGI is also not traditional ray tracing, btw... it's just called RTXGI because it's made by Nvidia and part of their RTX branch of Unreal Engine. It's very coarse and uses very few rays, which is why it runs fast even on hardware without RT acceleration.

Vite uses DDGI, which is far higher quality but also has a higher render cost.

Of course, it still shits all over Lumen while only having like 10% of the render cost. The guy did a, let's call it "ground truth" check of this scene by doing a full path-traced render of it, and DDGI is higher quality than Lumen, and closer to the "ground truth" render.

Lumen is pure dog shit. It's time for you to just accept that fact.

Why didn't Embark Studios, which was formed in 2018, go with DDGI if it's so performant and has better visuals than RTXGI? These guys built Frostbite and stunners like Battlefront 2 and BF1. Eight years of working with this engine, two shipped games, and they're still not using this tech. Why? Maybe, just maybe, it's not feasible.

I don't really care about Lumen. If it can be replaced by other tech that's better, then fine. I don't even care about ray tracing. As long as the game looks next-gen, devs should feel free to use whatever tech they want. Baked lighting, last-gen LOD-based assets, screen-space shadows and reflections, go nuts. I just go by the final results. Hell, personally I'd prefer baked GI so the remainder of the GPU power can be used to push better volumetric effects, physics, destruction, and CG-quality assets.
 
Watch the Alex video, or maybe it was the PS5/XSX video by Oliver. Plenty of examples of interiors looking like this. Outdoors the game looks fantastic, with really great lighting during the day. Foliage and trees look good too.
I watched Alex's vid at launch; might give it another watch just to see if anything's changed in the game's current state. I agreed with his take that there should be a higher-quality lighting option for those who want it, which predictably led to a bunch of comments calling him a snob and an elitist and saying "the game already looks good enough". Well, yeah, it looks good, but why not give me the option? Just gimme a Lumen option, c'mon.

My biggest issue with the visuals is the pop-in. Not necessarily the LOD management, which is comparable to most other games that aren't using virtualized geometry, but sometimes it feels like there's something weird going on with culling. Camera pans, especially fast ones, can reveal a lot of assets popping in and out. It happens even on the practice range, which is a tiny map. I think this occurred in a couple of UE4 games like Jedi: Survivor (it's not as bad as the examples I recall seeing from that game, though).
 
Why didn't Embark Studios, which was formed in 2018, go with DDGI if it's so performant and has better visuals than RTXGI? These guys built Frostbite and stunners like Battlefront 2 and BF1. Eight years of working with this engine, two shipped games, and they're still not using this tech. Why? Maybe, just maybe, it's not feasible.

They use RTXGI because they use UE RTX and not UE Vite.
Vite is a branch made by enthusiasts, not by a big company. Also, I'm not even sure how old Vite is, or how far along in development it was when Arc started development.

Also, they did use the RTX branch for The Finals, so they're probably just familiar with it and didn't want to change branches.


i dont really care about lumen. if it can be replaced by other tech thats better than fine. I dont even care about ray tracing. as long as the game looks next gen, devs should feel free to use whatever tech they want. Baked lighting, last get LOD based assets, screenspace shadows and reflections, go nuts. I just go by the final results. hell, personally id prefer baked GI so the remainder of the GPU power can be used to push better volumetric effects, physics, destruction, and CG quality assets.

We wouldn't need baked GI if developers would stop using UE5... but sadly that's not gonna happen, because devs want quick and dirty results where developers can be swapped out like AAA batteries and don't need training on a specific engine or engine branch.

Imagine a AAA studio embracing UE Vite and fully utilising it instead of using Lumen!

3+ times the performance, better results in terms of accuracy AND stability...
But that's not happening, because everyone is already using UE5 and is now used to just smearing Lumen over everything, even though it literally looks worse than last-gen games.
 
Well, no PT/RR; DLAA and frame gen solved the IQ problem.

Sure, the game took a graphics hit for the lack of PT, but the gain in IQ is major. Also, Raccoon City even with PT looked pretty dull compared to the previous locations.

Edit: nope, with profile M and DLSS 4.5, DLAA is way too sharp and introduces weird stuff on metal surfaces. Perf mode and no frame gen is the way; now the game looks perfect and runs at 100 fps almost maxed out.

4.5 is black magic, I swear.
 
They use RTXGI because they use UE RTX and not UE Vite.
Vite is a branch made by enthusiasts, not by a big company. Also, I'm not even sure how old Vite is, or how far along in development it was when Arc started development.

Also, they did use the RTX branch for The Finals, so they're probably just familiar with it and didn't want to change branches.




We wouldn't need baked GI if developers would stop using UE5... but sadly that's not gonna happen, because devs want quick and dirty results where developers can be swapped out like AAA batteries and don't need training on a specific engine or engine branch.

Imagine a AAA studio embracing UE Vite and fully utilising it instead of using Lumen!

3+ times the performance, better results in terms of accuracy AND stability...
But that's not happening, because everyone is already using UE5 and is now used to just smearing Lumen over everything, even though it literally looks worse than last-gen games.

If this thing is legit, then I hope devs use it. No point in wasting GPU cycles on something that looks and performs worse.
 
Been playing more BF6 and DICE's DLSS implementation is still fucked. DLAA and DLSS Quality are way too sharp, unless I'm a dumbass and missed a sharpening slider somewhere. It absolutely mangles character faces. And you still sometimes get glowing white outlines around characters, etc., in the lobby. This was an issue in BF2042 and I'm disappointed it's still not addressed. Obviously 99% of the time you're not looking at this stuff, but I'm still sticking with the default TAA. It's blurrier, but it's the lesser of two evils.

I hate when DLSS breaks random shit. There are some older implementations that completely remove DoF in a couple of games, for example.

I remember hearing speculation back in the day that BF2042's DLSS was busted because it wasn't fully replacing the existing TAA, like they were getting layered somehow and that was messing things up. Can't remember where I heard it, though, or if they were just talking out their ass.
 
Watch the Alex video, or maybe it was the PS5/XSX video by Oliver. Plenty of examples of interiors looking like this. Outdoors the game looks fantastic, with really great lighting during the day. Foliage and trees look good too.


I don't think it's bad either, just something to keep in mind about why some games aren't as heavy on the GPU. It's like that saying, you get what you pay for; well, when it comes to graphics, you get what you invest in. The game is hitting 60 fps at high resolutions because they cut corners.
Yeah, they cut corners. I don't know where to put this game; at first I thought it was way better than Silent Hill 2, now I'm not so sure. Maybe it's my fault, since I've disabled all those effects: lens flares, lens dirt, distortion, chromatic aberration, depth of field.

But these settings shouldn't affect stuff like this:
TyRbV5TWdZm8WEOc.png


Or this: the lighting is still nice, but that pipe in the upper left is rough; some of these assets are like a fork in my eye. I mean, no game has perfectly round shapes, but in this game they're always like this.
DkGv3TnfKnBaOTsX.png
 
Are you sure this isn't a UE5 console-related problem caused by bad upscaling?

I don't think I saw hair bad enough to be distracting, tbh.

Maybe it was slightly fuzzy sometimes, but an artifacting mess?

1379103.jpg

hq720.jpg

Are you sure this isn't a UE5 console-related problem caused by bad upscaling?

I don't think I saw hair bad enough to be distracting, tbh.

Maybe it was slightly fuzzy sometimes, but an artifacting mess?

1379103.jpg

hq720.jpg
Most UE5 games do seem to struggle with hair rendering, imo; it just looks fizzly and artifacty even at high resolutions. Looks great in these shots, though.
 