
Graphical Fidelity I Expect This Gen

I briefly tried dlaa the other day and it looked like the iq was definitely cleaner but it was beyond heavy, it killed my framerate if used with path-tracing.

I really can't play at lower than 4k, even if i fix the iq by going native and lowering the res, the game is still gonna look like native 1440p and that's a big nono for me.

Maybe i could try native 4k/dlss, framegen and no path tracing, maybe i can reach an almost stable 60...
nvidia released the fixed drivers. just install that and see if that improves your performance. try and run the game at dlss balanced.
 
I dont understand why one guy is allowed to derail the thread repeatedly. We have already had one regular, CowboyLou, banned from this thread because of this, and I dont want others to get banned after falling for clear trolling and fanboyism.

Let's put these people on ignore to avoid future derails, and get mods to lift the ban on CowboyLou.

I have lifted the thread ban for CowboyLou.

Let's keep it civil.
 
begs the question why the RTGI in Arc Raiders is so bad when some very talented ex-DICE devs worked on it for over 7 years.

HyToXobgnKZn95wE.jpg


8ugAHaQxlH4gkyCY.jpg

ksMqaRNczJenE6Uv.jpg

Looks like either baked lighting where the probes are misplaced inside geometry, leading to light leaks, or the geometry is just made such that there are gaps somewhere, which leads to light leaks when using RTGI.

Either way, if those are still in the game, I am very surprised they let those go unnoticed.
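For reference, probe-based GI systems usually guard against exactly this failure with a per-probe depth test: each probe stores the mean and mean-squared occluder distance per direction, and a Chebyshev weight kills contributions from probes that can't actually see the shading point. A toy Python sketch of that test (the function and numbers are illustrative, not Embark's actual code):

```python
def chebyshev_weight(dist_to_probe, mean_depth, mean_sq_depth):
    """Down-weight a probe that is occluded from the shading point.

    mean_depth / mean_sq_depth are the first two moments of occluder
    distance the probe recorded in this direction (a variance shadow
    map style test, as used in DDGI-like systems).
    """
    if dist_to_probe <= mean_depth:
        return 1.0  # probe sees past the shading point: full weight
    variance = max(mean_sq_depth - mean_depth ** 2, 1e-6)
    delta = dist_to_probe - mean_depth
    # Chebyshev upper bound on the probability the probe is visible
    return variance / (variance + delta * delta)

# Probe on the sunny side of a wall ~1m away, shading point 5m away:
# its bright irradiance gets near-zero weight instead of leaking indoors.
leak_weight = chebyshev_weight(5.0, 1.0, 1.1)   # ~0.006
safe_weight = chebyshev_weight(0.8, 1.0, 1.1)   # 1.0
```

If a probe ends up embedded inside geometry, its recorded depth data is garbage and the test stops helping, which is one way you get leaks like the ones in the shots above.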
 
I have lifted the thread ban for CowboyLou.

Let's keep it civil.
Thank you. I will behave.

begs the question why the RTGI in Arc Raiders is so bad when some very talented ex-DICE devs worked on it for over 7 years.

HyToXobgnKZn95wE.jpg


8ugAHaQxlH4gkyCY.jpg

ksMqaRNczJenE6Uv.jpg
The flaws are super obvious in shots like these, but I don't think I've ever registered anything looking quite this bad after 60 hours in game. There are a lot of weather conditions and the overcast ones can look a bit better imo. These shots look like some of the brightest possible weather conditions which I don't see too often.

But most people, including me, aren't stopping to pore over the graphics in a game like this. Embark made the overall right call by optimizing for a lightweight solution and the game can look quite striking overall, but it's nowhere near the standards of this thread and I really wish there was a higher-quality option. I have the performance to spare. The overall visual makeup is essentially UE4 with a sub-sub-Lumen lightweight GI solution on top.

If I knew exactly where those locations were in game I'd make an effort to go check them out and see if anything's changed, but there's a lot of asset/structure reuse so I have no idea.
 
I think it's not needed.

Preset E is older than the .dll file RE9 uses. I have no idea why the game defaults to D.
Well i did it anyway, for both RR and DLSS, i guess it can't hurt. The swapper doesn't let me change the profile, so i can only do it from Inspector (i dont have the nvidia app).

Hopefully i don't spend the whole night troubleshooting instead of playing; if a couple of things don't work i'll just leave everything as it is and play. Other than the trash iq, the game runs well.
 
No, i already used the older drivers for the performance uplift, but dude, PT+DLAA at 4k is too much for a 4080 even with normal performance.
these are the game ready drivers for Requiem that were pulled before Requiem ever came out. they were causing all kinds of issues on nvidia GPUs.

The game was meant to be played on these drivers. Everyone should see an improvement; 4000 or 5000 series should all see better or at least more consistent performance. remember, nvidia likely implemented the path tracing, so the latest drivers were probably needed to get the most out of this game's path tracing mode.
 
Looks like either baked lighting where the probes are misplaced inside geometry, leading to light leaks, or the geometry is just made such that there are gaps somewhere, which leads to light leaks when using RTGI.

Either way, if those are still in the game, I am very surprised they let those go unnoticed.
Thank you. I will behave.


The flaws are super obvious in shots like these, but I don't think I've ever registered anything looking quite this bad after 60 hours in game. There are a lot of weather conditions and the overcast ones can look a bit better imo. These shots look like some of the brightest possible weather conditions which I don't see too often.
the point is that some of the most technically accomplished devs in the industry, the guys who literally built Frostbite, were not able to get the most out of the same RT technique this guy on twitter is saying is so much better than lumen.

Lumen is expensive because it's better than even some hardware based ray tracing techniques. Arc Raiders runs at 1440p 60 fps precisely because it accepts these edge cases where the game does not look nearly as good. Just like games like TLOU2, which looks phenomenal to this day but has issues in some rooms that don't get proper light bounce.
But most people, including me, aren't stopping to pore over the graphics in a game like this. Embark made the overall right call by optimizing for a lightweight solution and the game can look quite striking overall, but it's nowhere near the standards of this thread and I really wish there was a higher-quality option. I have the performance to spare. The overall visual makeup is essentially UE4 with a sub-sub-Lumen lightweight GI solution on top.

If I knew exactly where those locations were in game I'd make an effort to go check them out and see if anything's changed, but there's a lot of asset/structure reuse so I have no idea.
I agree with this. I believe i said this in the other UE5 thread last week. Some games, especially multiplayer games, dont need all this fancy tech and the priority should be higher resolutions at 60 fps. hell im not even that mad at Forza Horizon 6 anymore because i realized that anything below 1440p 60 fps is going to kill the IQ on consoles, and any further increase in foliage quality wouldnt really show up at lower resolutions.
 
these are the game ready drivers for Requiem that were pulled before Requiem ever came out. they were causing all kinds of issues on nvidia GPUs.

The game was meant to be played on these drivers. Everyone should see an improvement; 4000 or 5000 series should all see better or at least more consistent performance. remember, nvidia likely implemented the path tracing, so the latest drivers were probably needed to get the most out of this game's path tracing mode.
They already tested them on reddit; apparently they don't have the same perf uplift as the older drivers.

I'm gonna wait to read some more testing before upgrading, nvidia has been kinda terrible with drivers lately.
 
the point is that some of the most technically accomplished devs in the industry, the guys who literally built Frostbite, were not able to get the most out of the same RT technique this guy on twitter is saying is so much better than lumen.
I'm not familiar with this guy on Twitter and I'm not gonna pretend to understand 100% of what he's explaining, but I'm reluctant to outright take his word on this. He's not the first one claiming to have cracked Unreal Engine wide open with his oh so special custom fork. As you've pointed out many times, the other engines out there attempting UE5-level fidelity are dropping res and performance to fairly comparable levels in their console 30 fps modes.

I'll be playing some ARC Raiders next couple days before Marathon drops, so if I see anything like Alex's shots I'll grab pics for comparison. I think its GI looks great - for what it is - the vast majority of the time. But maybe I need to look closer, which admittedly I'm usually not doing while in a raid.
 
And this is why you need UE5.
R638HaNjYk3fsUaL.jpeg
It looks a bit rough, but to be honest not that bad; i haven't reached that area yet. But overall yeah, idk, the 1st area is in a league of its own.

From that point on everything is a little downgraded. The first big area you explore with Grace still has that amazing lighting, but assets and geometry are a bit lower quality.
Playing more and more, I start to notice things that aren't so next gen anymore. It's slowly becoming 50-50. Lighting takes the cake here imo; it masks a lot of low quality stuff in the textures and assets.

Textures especially if you play in 1st person; and with assets you notice it a lot with pipes and cables, those are last gen as hell, very blocky.

And man, if you disable depth of field, some of the cutscenes are poor. I usually don't mind depth of field, but here it's weird and I disabled it.
 
They already tested them on reddit; apparently they don't have the same perf uplift as the older drivers.

I'm gonna wait to read some more testing before upgrading, nvidia has been kinda terrible with drivers lately.

Yeah, both the 591 and the two 595 drivers are shit in this game.

WXNTKfnmsMr3hbnP.jpg


581 is the last driver that performs well in this game; I have seen more than 33% better performance. Apparently the 5xxx series also performs better on older drivers.
 
Some people may disagree with me here, but Resident Evil Requiem to me is very similar to Death Stranding 2.

The most impressive parts of both games visually are right at the very start. Not to say what comes after isn't impressive, because some parts in both come close, but it's very clear and obvious to me that the highest peaks are the openings.
 
Some people may disagree with me here, but Resident Evil Requiem to me is very similar to Death Stranding 2.

The most impressive parts of both games visually are right at the very start. Not to say what comes after isn't impressive, because some parts in both come close, but it's very clear and obvious to me that the highest peaks are the openings.

DS2 never reaches the heights of its opening sequence again (at least in open world gameplay). RE9 (so far) is the same, but the gap is much smaller IMO

c5mcJa08kaV3ZOSM.jpeg
1iWYEqnuK7tMmeCD.jpeg


For this shot - so many small things are added thanks to PT (rough reflections of the scene on the monitors, a mirror-like reflection on the clock, lots of precise RT shadows etc.):

MwxRClNDK025BN13.jpeg

zmLWVwajL44cojQm.jpeg
Ap0j7yTTpstD6Tzk.jpeg
 
begs the question why the RTGI in Arc Raiders is so bad when some very talented ex-DICE devs worked on it for over 7 years.

HyToXobgnKZn95wE.jpg


8ugAHaQxlH4gkyCY.jpg

ksMqaRNczJenE6Uv.jpg


because RTXGI is designed around performance, not quality. RTXGI is also not traditional raytracing btw... it's just called RTXGI because it's made by Nvidia and part of their RTX branch of Unreal Engine. it's very coarse and uses very few rays, which is why it runs fast even on non-RT accelerated hardware.

Vite uses DDGI which is far higher quality but also has a higher render cost.



of course it still shits all over Lumen while only having like 10% of the render cost. the guy did a, let's call it "ground truth" check of this scene, by doing a full path tracing render of it, and DDGI is higher quality than Lumen, and closer to the "ground truth" render.

Lumen is pure dog shit. it's time for you to just accept that fact.
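For a rough sense of why coarse probe GI is so cheap: its ray budget scales with the probe count, not the screen resolution, and probe updates can be amortized over several frames. A back-of-the-envelope comparison in Python (grid size and ray counts are made-up but typical-order figures, not numbers from Arc Raiders):

```python
def probe_grid_rays(nx, ny, nz, rays_per_probe):
    # total rays needed to refresh every probe in an (nx * ny * nz) grid
    return nx * ny * nz * rays_per_probe

def per_pixel_rays(width, height, samples_per_pixel):
    # rays for one frame of per-pixel tracing (before any denoising)
    return width * height * samples_per_pixel

probes = probe_grid_rays(32, 16, 32, 256)   # 4,194,304 rays, amortizable
pixels = per_pixel_rays(3840, 2160, 1)      # 8,294,400 rays, every frame
```

And the probe cost stays fixed whether you render at 1080p or 4k, which is part of why it suits a 60 fps multiplayer target.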
 
I'm not familiar with this guy on Twitter and I'm not gonna pretend to understand 100% of what he's explaining, but I'm reluctant to outright take his word on this. He's not the first one claiming to have cracked Unreal Engine wide open with his oh so special custom fork. As you've pointed out many times, the other engines out there attempting UE5-level fidelity are dropping res and performance to fairly comparable levels in their console 30 fps modes.

I'll be playing some ARC Raiders next couple days before Marathon drops, so if I see anything like Alex's shots I'll grab pics for comparison. I think its GI looks great - for what it is - the vast majority of the time. But maybe I need to look closer, which admittedly I'm usually not doing while in a raid.
watch the Alex video or maybe it was the PS5XSX video by Oliver. Plenty of examples of interiors looking like this. Outdoors the game looks fantastic with really great lighting during the day. Foliage and trees look good too.

It looks a bit rough, but to be honest not that bad; i haven't reached that area yet. But overall yeah, idk, the 1st area is in a league of its own.

From that point on everything is a little downgraded. The first big area you explore with Grace still has that amazing lighting, but assets and geometry are a bit lower quality.
Playing more and more, I start to notice things that aren't so next gen anymore. It's slowly becoming 50-50. Lighting takes the cake here imo; it masks a lot of low quality stuff in the textures and assets.

Textures especially if you play in 1st person; and with assets you notice it a lot with pipes and cables, those are last gen as hell, very blocky.

And man, if you disable depth of field, some of the cutscenes are poor. I usually don't mind depth of field, but here it's weird and I disabled it.
I dont think its bad either, just something to keep in mind about why some games arent as heavy on the GPU. It's like that saying, you get what you pay for; well, when it comes to graphics, you get what you invest in. The game is hitting 60 fps at high resolutions because they cut corners.
 
because RTXGI is designed around performance, not quality. RTXGI is also not traditional raytracing btw... it's just called RTXGI because it's made by Nvidia and part of their RTX branch of Unreal Engine. it's very coarse and uses very few rays, which is why it runs fast even on non-RT accelerated hardware.

Vite uses DDGI which is far higher quality but also has a higher render cost.



of course it still shits over Lumen while only having like 10% of the render cost. the guy did a, let's call it "ground truth" check of this scene, by doing a full path tracing render of it, and DDGI is higher quality than Lumen, and closer to the "ground truth" render.

Lumen is pure dog shit. it's time for you to just accept that fact.

why didnt embark studios, which was formed in 2018, go with DDGI if its so performant and has better visuals than RTXGI? These guys built Frostbite and stunners like Battlefront 2 and BF1. 8 years of working with this engine, two shipped games, and they are still not using this tech. Why? Maybe, just maybe, its not feasible.

i dont really care about lumen. if it can be replaced by other tech thats better, then fine. I dont even care about ray tracing. as long as the game looks next gen, devs should feel free to use whatever tech they want. Baked lighting, last gen LOD-based assets, screenspace shadows and reflections, go nuts. I just go by the final results. hell, personally id prefer baked GI so the remainder of the GPU power can be used to push better volumetric effects, physics, destruction, and CG quality assets.
 
watch the Alex video or maybe it was the PS5XSX video by Oliver. Plenty of examples of interiors looking like this. Outdoors the game looks fantastic with really great lighting during the day. Foliage and trees look good too.
I watched Alex's vid at launch; might give it another watch just so I can see if anything's changed in the game's current state. I agreed with his take that there should be a higher quality lighting option for those who want it, which predictably led to a bunch of comments calling him a snob and elitist and saying "the game already looks good enough". Well yeah, it looks good, but why not give me the option? Just gimme a Lumen option, c'mon.

My biggest issue with the visuals is the pop-in. Not necessarily the LOD management, which is comparable to most other games that aren't using virtualized geometry, but sometimes it feels like there's something weird going on with culling. Camera pans, especially fast ones, can reveal a lot of assets popping in and out. It happens even on the practice range, which is a tiny map. I think this occurred in a couple UE4 games like Jedi Survivor (it's not as bad as the examples I recall seeing from that game though).
 
why didnt embark studios, which was formed in 2018, go with DDGI if its so performant and has better visuals than RTXGI? These guys built Frostbite and stunners like Battlefront 2 and BF1. 8 years of working with this engine, two shipped games, and they are still not using this tech. Why? Maybe, just maybe, its not feasible.

they use RTXGI because they use UE RTX and not UE Vite.
Vite is a branch made by enthusiasts not by a big company. also I'm not even sure how old Vite is and how far in development it was when Arc started development.

also they did use the RTX branch for The Finals, so they probably just are familiar with it and didn't want to change branches.


i dont really care about lumen. if it can be replaced by other tech thats better, then fine. I dont even care about ray tracing. as long as the game looks next gen, devs should feel free to use whatever tech they want. Baked lighting, last gen LOD-based assets, screenspace shadows and reflections, go nuts. I just go by the final results. hell, personally id prefer baked GI so the remainder of the GPU power can be used to push better volumetric effects, physics, destruction, and CG quality assets.

we wouldn't need baked GI if developers would stop using UE5... but sadly that's not gonna happen, because devs want quick and dirty results where developers can be swapped out like AAA batteries and don't need training on a specific engine or engine branch.

imagine a AAA studio embracing UE Vite, and fully utilising it instead of using Lumen!



3+ times the performance, better results in terms of accuracy AND stability...
but that's not happening, because everyone is already using UE5 and now used to just smearing Lumen over everything, even tho it literally looks worse than last gen games
 
Well, dropping PT/RR and using DLAA and framegen solved the iq problem.

Sure, the game took a graphical hit for the lack of PT, but the gain in iq is major; also, Raccoon City even with PT looked pretty dull compared to the previous locations.

Edit: nope, with profile M and DLSS 4.5, DLAA is way too sharp and introduces weird stuff on metal surfaces. Perf mode and no framegen is the way; now the game looks perfect and runs at 100 fps almost maxed out.

4.5 is black magic i swear.
 
they use RTXGI because they use UE RTX and not UE Vite.
Vite is a branch made by enthusiasts not by a big company. also I'm not even sure how old Vite is and how far in development it was when Arc started development.

also they did use the RTX branch for The Finals, so they probably just are familiar with it and didn't want to change branches.




we wouldn't need baked GI if developers would stop using UE5... but sadly that's not gonna happen, because devs want quick and dirty results where developers can be swapped out like AAA batteries and don't need training on a specific engine or engine branch.

imagine a AAA studio embracing UE Vite, and fully utilising it instead of using Lumen!



3+ times the performance, better results in terms of accuracy AND stability...
but that's not happening, because everyone is already using UE5 and now used to just smearing Lumen over everything, even tho it literally looks worse than last gen games

if this thing is legit then i hope devs use it. no point in wasting GPU cycles on something that looks and performs worse.
 
Been playing more BF6 and DICE's DLSS implementation is still fucked. DLAA and DLSSQ are way too sharp unless I'm a dumbass and missed a sharpening slider somewhere. Absolutely mangles character faces. And you still sometimes get glowing white outlines around characters etc. in lobby. This was an issue in BF2042 and I'm disappointed it's still not addressed. Obviously 99% of the time you're not looking at this stuff, but I'm still sticking with the default TAA. It's blurrier but it's the lesser of two evils.

I hate when DLSS breaks random shit. There are some older implementations that completely remove DoF in a couple games for example.

I remember hearing speculation back in the day that BF2042's DLSS was busted because it wasn't fully replacing the existing TAA, like they were getting layered somehow and that was messing things up. Can't remember where I heard it though or if they were just talking out their ass.
 
watch the Alex video or maybe it was the PS5XSX video by Oliver. Plenty of examples of interiors looking like this. Outdoors the game looks fantastic with really great lighting during the day. Foliage and trees look good too.


I dont think its bad either, just something to keep in mind about why some games arent as heavy on the GPU. It's like that saying, you get what you pay for; well, when it comes to graphics, you get what you invest in. The game is hitting 60 fps at high resolutions because they cut corners.
Yeah, they cut corners. idk where to put this game; at 1st I thought it was way better than Silent Hill 2, now I'm not so sure. Maybe it's my fault, I've disabled all those effects, like lens flares, lens dirt, distortion, chromatic aberration, depth of field.

But these settings shouldn't affect stuff like this:
TyRbV5TWdZm8WEOc.png


Or this, the lighting is still nice, but that pipe in the upper left is rough; some of these assets are like a fork in my eye. I mean, no game has perfect round shapes, but in this game they are always like this.
DkGv3TnfKnBaOTsX.png
 
Are you sure this is not a UE5 console-related problem because of bad upscaling?

I don't think i saw hair so bad as to be distracting tbh.

Maybe it was slightly fuzzy sometimes, but an artifacting mess?

1379103.jpg

hq720.jpg

Are you sure this is not a UE5 console-related problem because of bad upscaling?

I don't think i saw hair so bad as to be distracting tbh.

Maybe it was slightly fuzzy sometimes, but an artifacting mess?

1379103.jpg

hq720.jpg
Most UE5 games do seem to struggle with hair rendering imo, just looks fizzly and artifacty even at high resolutions. Looks great in these shots though.
 
Yeah, they cut corners. idk where to put this game; at 1st I thought it was way better than Silent Hill 2, now I'm not so sure. Maybe it's my fault, I've disabled all those effects, like lens flares, lens dirt, distortion, chromatic aberration, depth of field.

But these settings shouldn't affect stuff like this:
TyRbV5TWdZm8WEOc.png


Or this, the lighting is still nice, but that pipe in the upper left is rough; some of these assets are like a fork in my eye. I mean, no game has perfect round shapes, but in this game they are always like this.
DkGv3TnfKnBaOTsX.png
at times it looks better than silent hill 2, other times it looks worse than callisto lol

Xq4zvpL.gif

0yMJLBs.gif



thats ok though. some of the areas in the last couple of hours... holy shit. the lighting, the asset quality, the reflections. ive seen the light. capcom am god.

Most UE5 games do seem to struggle with hair rendering imo, just looks fizzly and artifacty even at high resolutions. Looks great in these shots though.

Yep. exp 33 characters had the most distracting dithering on hair. really poor. though dlss should be cleaning these up.
 
Just tried a few UE5, Anvil and Snowdrop games to see where they land performance-wise when running at native resolution. My card is a 5080, which is around 13% faster than the 7900xtx used in Crimson Desert's native 4k 60 fps benchmark.

- MGS3 - 45 fps. Can't run FSR at native res, so had to use DLAA, which has its own cost. also had hardware ray tracing enabled in the config files.
- Mafia - 43 fps native 4k fsr.
- Robocop - Easy native 4k 60 fps with dlaa with around 20% of the gpu left. didnt bother uncapping. Probably because its an older game that wasnt pushing the gpus a lot. Still looks fantastic though. Some new gifs below.
- Star Wars Outlaws - 37 fps native 4k dlaa. Didnt bother enabling path tracing.
- Avatar - 41 fps native 4k dlaa
- AC Shadows - 40 fps native 4k dlaa
- Cyberpunk - 30 fps native 4k dlaa (no path tracing)
- kingdom Come 2 - 68 fps native 4k dlaa

So basically Crimson Desert's engine is around 50% more performant than UE5, RED Engine, Anvil and Snowdrop. It shares the performance profile of Kingdom Come 2, a very handsome looking game that uses software-based RTGI. Could be they are using a slightly less intensive form of ray tracing and are skipping virtualized geometry to gain some more performance.

It IS kinda funny to see all these high end games all have virtually the same performance profile. Almost as if next gen graphics demand next gen specs.
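To make that comparison concrete, here's the arithmetic behind the "around 50% more performant" claim, using the fps numbers above and taking the quoted ~13% card delta at face value:

```python
def xtx_equivalent_fps(fps_on_5080, card_delta=0.13):
    # scale a 5080 result down to the 7900xtx used in the Crimson Desert
    # benchmark, assuming the ~13% card gap holds at these settings
    return fps_on_5080 / (1 + card_delta)

crimson_desert_fps = 60.0  # the native 4k benchmark figure quoted above
for game, fps in {"Avatar": 41, "AC Shadows": 40, "Outlaws": 37}.items():
    advantage = crimson_desert_fps / xtx_equivalent_fps(fps) - 1
    print(f"{game}: Crimson Desert comes out ~{advantage:.0%} faster")
```

By this math the gap on those three games is actually in the 65-85% range, so "around 50%" is if anything conservative; meanwhile Kingdom Come 2's 68 fps scales down to roughly 60, a dead heat, matching the post's point about shared performance profiles.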

BrKyRlJ.gif


aU7PluT.jpeg
 
That's like saying the view across the Grand Canyon is beautiful, just not in a technical way. True, but irrelevant - and absurd.
It's actually not, and two things can be true at once. The art direction, scenes and visuals in Yotei are beautiful from far away, but when you look at the assets during normal gameplay, zoom in closer, and consider the hardware, developer and time period of the current generation, it's ugly, last gen and bad looking, except for a few textures, maybe the hair quality, and the RTGI. Yotei is lacking next gen geometric density, texture quality, a better RTGI implementation, better animation quality etc. If I look at the Grand Canyon much closer it doesn't break down, and if I look at the scene in motion it doesn't break down. Things can look last gen and beautiful at the same time. Current gen should have a standard for visuals, and too many games look last gen at 60 fps with RT.
 
I have lifted the thread ban for CowboyLou.

Let's keep it civil.
Hey, thanks very much. ;)

Honestly if you can achieve a certain level of fidelity without using any of these fancy next gen features like ray tracing, virtualized geometry or metahumans, i dont really care if you dont use them.
This exactly, I just want the results to speak. Look at The Order 1886...
 
Just tried a few UE5, Anvil and Snowdrop games to see where they land performance-wise when running at native resolution. My card is a 5080, which is around 13% faster than the 7900xtx used in Crimson Desert's native 4k 60 fps benchmark.

- MGS3 - 45 fps. Can't run FSR at native res, so had to use DLAA, which has its own cost. also had hardware ray tracing enabled in the config files.
- Mafia - 43 fps native 4k fsr.
- Robocop - Easy native 4k 60 fps with dlaa with around 20% of the gpu left. didnt bother uncapping. Probably because its an older game that wasnt pushing the gpus a lot. Still looks fantastic though. Some new gifs below.
- Star Wars Outlaws - 37 fps native 4k dlaa. Didnt bother enabling path tracing.
- Avatar - 41 fps native 4k dlaa
- AC Shadows - 40 fps native 4k dlaa
- Cyberpunk - 30 fps native 4k dlaa (no path tracing)
- kingdom Come 2 - 68 fps native 4k dlaa

So basically Crimson Desert's engine is around 50% more performant than UE5, RED Engine, Anvil and Snowdrop. It shares the performance profile of Kingdom Come 2, a very handsome looking game that uses software-based RTGI. Could be they are using a slightly less intensive form of ray tracing and are skipping virtualized geometry to gain some more performance.

It IS kinda funny to see all these high end games all have virtually the same performance profile. Almost as if next gen graphics demand next gen specs.

BrKyRlJ.gif


aU7PluT.jpeg
LOL at the guy sliding across the ground in the first gif :messenger_tears_of_joy:

I love how this game went for a photoreal color grading, combined with the UE5 level of detail it's just stunning. But the characters and animations could use some work. Hopefully they level up again for their next project. Terminator to Robocop was an INSANE jump.
 
Resident Evil Requiem foliage on PC (path tracing etc). Foliage is the one thing next gen hasn't really conquered (barring some exceptions like Black Myth Wukong).

9hWW6or.gif



Resident Evil Requiem Fluid Simulation:

gFhuev4.gif
 
One thing I really hope Nvidia improves is path tracing ghosting. In Requiem, path tracing looks spectacular, but there are too many ghosting trails when moving the camera under certain conditions. It needs to look cleaner.

There must be a way to fix this
The way to fix this is to shoot more rays per pixel, leading to a less noisy image which would require less temporal accumulation to begin with.
Or just do more with AI.
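The trails come from the temporal accumulation itself: the denoiser blends each new frame into a history buffer, and the less it trusts the noisy current frame, the longer stale values linger on screen. A minimal sketch of exponential accumulation (the blend factor is illustrative, not what any actual denoiser uses):

```python
def accumulate(history, current, alpha=0.1):
    # alpha = trust in the current frame; low alpha = stable but ghosty
    return (1.0 - alpha) * history + alpha * current

# A pixel's lighting drops from 1.0 to 0.0 (say the camera pans):
history = 1.0
for frame in range(10):
    history = accumulate(history, 0.0)
# ten frames later, ~35% of the stale value still bleeds through
```

More rays per pixel means less noise per frame, which lets the denoiser raise alpha and shorten the trail; that's exactly the trade described above.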
 
It's actually not, and two things can be true at once. The art direction, scenes and visuals in Yotei are beautiful from far away, but when you look at the assets during normal gameplay, zoom in closer, and consider the hardware, developer and time period of the current generation, it's ugly, last gen and bad looking, except for a few textures, maybe the hair quality, and the RTGI. Yotei is lacking next gen geometric density, texture quality, a better RTGI implementation, better animation quality etc. If I look at the Grand Canyon much closer it doesn't break down, and if I look at the scene in motion it doesn't break down. Things can look last gen and beautiful at the same time. Current gen should have a standard for visuals, and too many games look last gen at 60 fps with RT.

I don't know what to tell you. In my honest opinion, Yotei is at least perfectly adequate in most aspects of its visuals, and the overall effect is generally - but not always, obviously - stunning. Funnily enough, the game it reminds me a lot of is Uncharted 4 in its better looking areas, just in terms of its overall look and visual quality. A bit higher fidelity, maybe 20% better or so, but that kind of ballpark. No doubt the usual cretins will find that incredibly hilarious because Uncharted 4 is a last gen game, but no open world game last gen looked close to it. But what elevates Yotei is that they push that fidelity out to the horizon with the best draw distance I've ever seen on console. If you looked into the distance in Uncharted 4 you were basically looking at PS2 graphics.

I wouldn't mind the criticism if it were balanced. Maybe these guys simply have no clue what "true next gen" games really look like on console, but games like Avatar and Alan Wake 2 and yes, even AC Shadows, are chock full of blocky models, blurry textures, abysmal IQ, billboard trees, and the like. Generally the most you can say of them in terms of unconditional praise is that they're very ambitious. Mostly too ambitious for consoles. And you know what? I think that's just fine. I think PC gamers were starved of really good high tech games for way too long, and god knows they spend enough nowadays to deserve a decent delta. But I resent the idea that for a game to be truly "current gen" it must be too ambitious to look good on CURRENT GEN hardware. That is patently absurd.
 
Been playing more BF6 and DICE's DLSS implementation is still fucked. DLAA and DLSSQ are way too sharp unless I'm a dumbass and missed a sharpening slider somewhere. Absolutely mangles character faces. And you still sometimes get glowing white outlines around characters etc. in lobby. This was an issue in BF2042 and I'm disappointed it's still not addressed. Obviously 99% of the time you're not looking at this stuff, but I'm still sticking with the default TAA. It's blurrier but it's the lesser of two evils.

I hate when DLSS breaks random shit. There are some older implementations that completely remove DoF in a couple games for example.

I remember hearing speculation back in the day that BF2042's DLSS was busted because it wasn't fully replacing the existing TAA, like they were getting layered somehow and that was messing things up. Can't remember where I heard it though or if they were just talking out their ass.
BF6 isn't leading anything visually anyway, but yeah, I find the sharpness kinda awkward. It looks soft and oversharpened at the same time. I've tried DLAA and different DLSS settings at different resolutions and sharpness settings and it always looks slightly off. Great game though, just wish it was a current gen game tech wise.
 
Yeah, they cut corners. idk where to put this game; at 1st I thought it was way better than Silent Hill 2, now I'm not so sure. Maybe it's my fault, I've disabled all those effects, like lens flares, lens dirt, distortion, chromatic aberration, depth of field.

But these settings shouldn't affect stuff like this:
TyRbV5TWdZm8WEOc.png


Or this, the lighting is still nice, but that pipe in the upper left is rough; some of these assets are like a fork in my eye. I mean, no game has perfect round shapes, but in this game they are always like this.
DkGv3TnfKnBaOTsX.png
Damn that looks absolutely ass, how could they ship with that. Japan really needs another QA pass, that shouldn't be ok to ship. Looks like a ps2 asset, totally immersion breaking.
 
Is it just me or does dokev look better than crimson graphically?? Maybe because they are not chasing realism but a more pixar-like artstyle.


 
That's because it looks ass. I sold it after 3 hours. Besides the visuals, it's a copy-paste version of Tsushima.
I remember when Yotei released, i found the vistas more beautiful than Assassin's Creed Shadows'.
There was something about the sense of scale and the foliage from afar that looked really great. The autumn region for example is beautiful. Or the top of Mount Yotei.
But then again every region is quite small, without much to do, and with very basic level design. And some of the gameplay choices were mindbogglingly stupid. Not having a weather changing option when there was one before is the best example of this decline.
Yotei was truly a step back from Tsushima in every aspect.
 