Graphical Fidelity I Expect This Gen

DS2 looks good enough, but image quality and a stable framerate are much more important to me than some RT effects. So I prefer DS2 over any UE5 game on console that runs at 800p with heavy drops. The IQ here is PRISTINE, and sadly that is not a common thing this generation.

This is my main issue with UE5. It can look good, but the image quality is always poor: even if you use TSR, the ghosting is disgusting, and the performance in general is trash. I say this as a guy with a 4090 and one of the best AMD CPUs on the market.

The engine has potential, but it's not good enough. I prefer playing a game like DS2, which looks and feels great in comparison.
 
Image quality is not poor using DLSS
 
Cyberpunk 2077 still stands out.

[Four Cyberpunk 2077 screenshots, captured 2025-06-20]
Yup, I'm replaying it currently, and with normal RT on Ultra (no Psycho, no path tracing) it still looks great on my 3080 Ti thanks to the transformer DLSS 4 model; upscaling artifacts are substantially reduced versus the previous method.
Kind of a shame the new Xbox is going to be made in cooperation with AMD, since going Nvidia could have been one objective advantage it could have had over an AMD-based PS6...
 
Got any examples to post of what you see?

These types of visual issues are mostly visible in motion; it's not something you can see in static screenshots. When you are playing the game, you get to see artifacts and problems that only happen in this engine. Ray tracing in combination with upscalers seems to exacerbate the issue even more. Another engine that has this problem is the RED Engine: in Cyberpunk there is a lot of temporal ghosting too, even with DLAA and such, but luckily it's not as bad as in an Unreal Engine game, and at least Cyberpunk performs well without stuttering or other issues.

If you want to see real ghosting, go and watch gameplay of Wukong on PlayStation 5. It's almost like having glaucoma.
 
Image quality is not poor using DLSS

A side note that affects all temporal upscalers: frame rate actually affects the quality of the upscaling.
The reason is that temporal upscalers rely on accumulating data from several frames. So if a temporal upscaler requires 32 samples, a game running at a low frame rate will take much longer to accumulate the necessary data.
The following is from one of the developers who made FSR4. He is comparing the RTX 5090 and the 9070 XT, but the same logic applies to any GPUs in the same performance range.
This means that FSR4 or DLSS4 running on a console at 30 fps will look inferior to FSR4 or DLSS4 running on a PC at 100+ fps, even if the base resolution and settings are the same.

For a 2x upscale, the sequence length that is recommended for FSR and XeSS is 32. That means 32 frames are required to complete the sequence, and fully converge to a super-sampled output pixel. You may think this sounds very long - and it is. The algorithm utilizes motion vectors and other tricks to keep samples valid so previous data can be used throughout the convergence.
Finding benchmark data for FSR 4 vs DLSS 4 at the same modes has proven tricky. However, if we grab 1440p average game data for the GPUs in question from Hardware Unboxed as a ballpark figure, we arrive at the Radeon RX 9070 XT at 119 fps versus the GeForce RTX 5090 at 192 fps.

If our jitter sequence is 32 frames for a 2x upscale, that means the Radeon RX 9070 XT manages 3.71 full accumulations per second, versus 6 on the GeForce RTX 5090.

In other terms, at these frame rates it takes 269 ms to fully accumulate on the Radeon RX 9070 XT, or 166 ms on the GeForce RTX 5090. A huge 100+ ms difference in convergence time. With animation, particles and camera motion in play during these quality comparisons, that is a lot of time to display lower-quality pixels.

These days we are all told that anti-lag improvements of 15-20 ms are groundbreaking, and that frame-generation latency hits of 30 ms are unacceptable. So should a convergence difference of 2-3 times that be acceptable during a quality review?
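
To make those numbers concrete, here is a minimal Python sketch of the accumulation arithmetic. The 32-frame jitter sequence and the Hardware Unboxed frame rates come straight from the quote above; the function names and the 30 fps console row are just my own illustration, and the rounding differs slightly from the quote, which truncates.

# Convergence math for a temporal upscaler with a fixed jitter sequence.
# sequence_length = 32 is the recommended sequence for a 2x upscale,
# per the FSR/XeSS figure quoted above.

def accumulations_per_second(fps: float, sequence_length: int = 32) -> float:
    # How many full jitter sequences complete per second at a given frame rate.
    return fps / sequence_length

def convergence_time_ms(fps: float, sequence_length: int = 32) -> float:
    # How long it takes (in milliseconds) to accumulate one full sequence.
    return sequence_length / fps * 1000.0

for name, fps in [("Radeon RX 9070 XT", 119), ("GeForce RTX 5090", 192), ("30 fps console", 30)]:
    print(f"{name}: {accumulations_per_second(fps):.2f} accumulations/s, "
          f"{convergence_time_ms(fps):.0f} ms to fully converge")

At 30 fps the same math gives roughly a full second to converge, which is exactly why the same upscaler looks worse on a console than on a fast PC, even at the same base resolution.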


 
These types of visual issues are mostly visible in motion; it's not something you can see in static screenshots. When you are playing the game, you get to see artifacts and problems that only happen in this engine. Ray tracing in combination with upscalers seems to exacerbate the issue even more. Another engine that has this problem is the RED Engine: in Cyberpunk there is a lot of temporal ghosting too, even with DLAA and such, but luckily it's not as bad as in an Unreal Engine game, and at least Cyberpunk performs well without stuttering or other issues.

If you want to see real ghosting, go and watch gameplay of Wukong on PlayStation 5. It's almost like having glaucoma.

You are probably talking about Lumen artifacts and the noise created by a poor denoiser. There are also motion trails with Lumen that are not related to DLSS or any other reconstruction technique.
 

I will "play" some Hellblade 2 and look out for these artifacts. I did not notice them before, but now that I know what to look for, I'm interested. I will definitely agree, though, that UE5 is lacking on consoles.
 
This is my main issue with UE5. It can look good, but the image quality is always poor: even if you use TSR, the ghosting is disgusting, and the performance in general is trash. I say this as a guy with a 4090 and one of the best AMD CPUs on the market.

The engine has potential, but it's not good enough. I prefer playing a game like DS2, which looks and feels great in comparison.
lol, you have a 4090 and are complaining about image quality? Dude, I play on my 3080 and nearly every game runs at 4K DLSS Quality at 60 fps. All I have to do is set graphics to High instead of Ultra. There is no way you aren't able to max out these games at 4K DLSS Quality.

Absolute nonsense.
 

Do you know how to read?
 
Speaking of IQ, I don't think they are using GG's new upscaling system that we saw in HFW's Pro update. There is a lot of breakup and shimmering in foliage and water that I typically see in PSSR, or in the older version of HFW. It doesn't always happen, but I see it every now and then when rain or other alpha effects come into play. Horizon fixed this three years ago, and the Pro's scaler made it even better.

I wouldn't be surprised if KojiPro basically went with the very first version of checkerboarding here that GG launched with 3.5 years ago.
 
I admit my ability to read nonsense is a bit limited.


Man, your fanaticism blinds you sometimes; you can't even read between the lines.

Just because I'm calling the performance trash doesn't mean my PC isn't pulling high frame rates. Of course it is. I'm talking about stuttering, frame-time inconsistencies, and general instability: the kind of issues that ruin the experience while playing, regardless of how powerful your hardware is.

This has become common with Unreal Engine 5 games lately. Just look at Oblivion Remastered; it's one of the worst offenders. No matter how high-end your setup is, the performance is baffling.

So yes, UE5 can deliver impressive visuals, but that means nothing if the performance is a mess. Add to that the smearing, ghosting, and poor image quality, and the end result just isn't worth it.

That said, I'll give credit where it's due: as Senua mentioned, Hellblade 2 is one of the few titles that shows UE5's potential. But overall, this engine has been more of a pain than anything else, no matter how hard that is for you to accept.
 
Add RoboCop to the list. It's the only UE5 game with no performance issues in my experience.
 