
Digital Foundry: What Can Be Done About Unreal Engine 5 Games With Image Quality/Performance Issues?

Wukong looks very good in motion. Especially with DLSS.
Ah… The famous DLSS of the XSX version!
Jim Carrey Reaction GIF
 
It depends on the particular game. Native 4K60, even without RT, is out of the question in some titles, but the DLSS transformer model makes a huge difference. You still have to fiddle with settings; for example, using DLSS Balanced, so native 1080p upscaled to 4K, is fine enough.
There are plenty of ways to make it work, but true native 4K at a stable 60fps with everything maxed can bring even a much stronger 5090 (with a base TDP of 575W) to its knees.

For comparison vs. the PS5 (or Pro): KCD recently got a next-gen patch, and it obviously doesn't hold a stable 60 at those settings even on the Pro. Here, on my 9800X3D and 5080, I max it out at 4K and easily get 4K60; most of the time GPU usage varies between 50 and 70% (there's one room where GPU usage spikes, no idea why, it's your own room in the inn near Rattay).

Forget about native 4K at maxed settings in current-gen games, but DLSS4 (transformer model) makes things much lighter on the GPU, to the point where you can max (or nearly max) those games at 1080p, AI-upscaled to 4K, and image quality is still solid (not perfect, but well above what consoles can offer at the moment).
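To put numbers on the upscaling being thrown around here: the per-axis render scales below are the commonly cited DLSS defaults (Quality about 2/3, Balanced about 0.58, Performance 0.5; games can override them, so treat the exact values as approximate). A quick sketch of what each mode renders internally for a 4K output:

```python
# Sketch of DLSS internal resolutions for a 4K output, using the
# commonly cited per-axis scale factors (games can override these).
OUTPUT = (3840, 2160)

MODES = {
    "DLAA":        1.0,    # native resolution, AA only
    "Quality":     2 / 3,  # ~1440p internal for a 4K output
    "Balanced":    0.58,
    "Performance": 0.5,    # ~1080p internal for a 4K output
}

for mode, scale in MODES.items():
    w, h = int(OUTPUT[0] * scale), int(OUTPUT[1] * scale)
    pixel_ratio = (w * h) / (OUTPUT[0] * OUTPUT[1])
    print(f"{mode:12s} {w}x{h}  ({pixel_ratio:.0%} of the output pixels)")
```

So "1080p AI-upscaled to 4K" means the GPU is shading only a quarter of the output pixels, which is where the big framerate headroom comes from.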

Here's a quick example with CP2077, just because by now everyone and their momma knows how it runs.


As you can see in the video, DLSS Quality (so a 1440p-to-4K upscale using the transformer model) gives 70-90fps, and that's still with zero RT.

Now let's check 4K Ultra with RT at Ultra (PT off).

You can see it's barely 4K30-35 in this case.

Drop the resolution from native 4K to native 1440p (same settings, so Ultra/max with RT Ultra but no PT) and suddenly it's 60-65fps.

TLDR: At that level of hardware, if we're talking impressive current-gen games, you still can't do 4K60 maxed out, and lowering settings below High usually isn't worth it. Instead you take advantage of the literal black magic that DLSS4 is (not frame gen, screw fake frames, they're only good if your starting fps is 50-60, otherwise a total bust; I mean the DLSS transformer model) and simply AI-upscale from native 1080p or 1440p, depending on the game, to keep max/near-max settings and solid IQ.

Here's a video with many more games tested; the interesting point is that they check native 4K and later DLSS Quality for comparison:


TLDR: DLSS is your friend, and the transformer model is far beyond what the PS5/Pro can do. IQ is almost as good as native during gameplay, when you're not hunting for pixels/artifacts DF-style in close-ups or slowed-down video. It's not as good as native 4K, obviously, but the degradation in IQ is relatively small, to the point that it doesn't keep you from enjoying/playing the game the way it does on the base PS5/Pro in 60fps modes :)

It even makes sense if we look at the raw power of the hardware:
the PS5 Pro GPU is roughly similar to a 9060 XT (going by actual in-game results); the 5080 is literally about double that, and on top you add a superior AI upscaling method and better RT capabilities too.

Your 5080 should run quite a few UE5 games at native 4K 60fps with console-like settings. My OC'ed 4080S offers the same performance as a stock 5080 in UE5, so I know what your card can do:

4K DLAA high settings (PS5 version has even lower texture settings). Twice the framerate of PS5 version and 4x as many pixels.

DLAA-K-Lumen.jpg


AC Rally with maxed out settings at 4K TAA

TAA.jpg


Silent Hill 2 at 4K DLAA with high settings

SHf-Win64-Shipping-2025-09-23-15-10-28-045.jpg


Robocop Unfinished Business 4K TAA native with high / epic settings

Robo-Cop-Unfinished-Business-Win64-Shipping-2025-09-16-23-47-52-977.jpg


Mafia The Old Country at 4K DLAA with high settings

4K-DLAA.jpg


With maxed-out settings (Epic), these UE5 games usually run at 40–50 fps, so DLSS Quality is necessary to get over 60 fps.

As for FG, you don't like it, but I think this technology is amazing for such demanding games as UE5, especially at 4K. With FGx2 and DLSSQ, my card can achieve 110–170 fps (28–35 ms latency) in these demanding UE5 games, and if I didn't know I was using FG, I would think I was playing at a real 110–170 fps.

As for the Cyberpunk Ultra settings, they cut the performance in half (especially SSR) compared to High settings. You should be able to achieve native 4K at 60fps with a mix of Ultra/High settings on the RTX 5080. With DLSS4.5 in Performance mode, you can reach 120fps, which looks better than 4K TAA. To achieve a locked 4K 60fps with RT Ultra, you would need to use DLSS Balanced, because Quality runs between 55 and 65fps. However, I believe an overclocked 5080 could deliver a consistent 60fps even with DLSS Q. Personally, I use DLSSB + FGx2 and play at 120fps in 4K, or at 150-190fps in 1440p with DLSSQ + FGx2.
 
I'm quoting XSX screenshots.

And yet…

FSR is usually very cheap. But I think 95% of the studios don't have a John Linneman kind of guy to tell them how the game actually looks to the hardcore audience.

To be honest I'm not sure if FSR2 is actually cheaper than TSR, maybe a bit? It's just dumb to use it.
 
The IQ on Wukong XSX in Quality and Balanced modes is very decent (in terms of artifacts) and quite sharp (~1200p-1440p base resolution).

In Quality mode, the framerate is a perfect 30fps, and in Balanced mode, it runs at ~50fps (perfect for 60Hz VRR) and a perfect 40fps at 120Hz.

The problem with UE5 on consoles is the 60fps target. Trying to hit it while keeping technologies like Lumen and Nanite and maximizing the graphics load leads to reducing the resolution to a point where it's very difficult to see the performance mode as a good option vs. the Quality modes.

In the end, it comes down to personal preference. Some people don't want anything less than 60fps, while others (like me) can tolerate a game at a perfect 30fps if it guarantees the highest possible graphical quality.
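The "~50fps for 60Hz VRR, perfect 40fps at 120Hz" point above is just divisor math: a capped framerate paces evenly on a fixed-refresh display only when the refresh rate is an integer multiple of the framerate, so each frame sits on screen for a whole number of refresh cycles. A minimal sketch:

```python
# A framerate paces perfectly on a fixed-refresh display only when the
# refresh rate is an integer multiple of the framerate (each frame is
# shown for a whole number of refresh cycles).
def paces_evenly(fps: int, refresh_hz: int) -> bool:
    return refresh_hz % fps == 0

for fps in (30, 40, 50, 60):
    for hz in (60, 120):
        tag = "even" if paces_evenly(fps, hz) else "uneven (judder, or VRR needed)"
        print(f"{fps}fps @ {hz}Hz: {tag}")
```

This is why 40fps only works cleanly in a 120Hz container (3 refreshes per frame), while ~50fps on a 60Hz panel needs VRR to avoid judder.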
 
To be honest I'm not sure if FSR2 is actually cheaper than TSR, maybe a bit? It's just dumb to use it.

Yes, FSR2 is cheaper to use than TSR, probably by around 5-10%.
TSR is especially heavy at the 2 highest quality settings, where the cvar r.TSR.History.ScreenPercentage is set to 200.
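For PC players who want to experiment, UE cvars like this can usually be forced per-game in Engine.ini (the path and section below follow the usual UE convention; whether a shipped game honors the override depends on that build, so treat this as a sketch):

```ini
; Typical location on Windows (exact path varies per game):
; %LOCALAPPDATA%\<GameName>\Saved\Config\Windows\Engine.ini
[SystemSettings]
; 200 = TSR accumulates its history buffer at 2x screen percentage,
; the heavy high-quality behavior discussed above; 100 is the cheaper default.
r.TSR.History.ScreenPercentage=200
```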
 
The IQ on Wukong XSX in Quality and Balanced modes is very decent (in terms of artifacts) and quite sharp (~1200p-1440p base resolution).

In Quality mode, the framerate is a perfect 30fps, and in Balanced mode, it runs at ~50fps (perfect for 60Hz VRR) and a perfect 40fps at 120Hz.

The problem with UE5 on consoles is the 60fps target. Trying to hit it while keeping technologies like Lumen and Nanite and maximizing the graphics load leads to reducing the resolution to a point where it's very difficult to see the performance mode as a good option vs. the Quality modes.

In the end, it comes down to personal preference. Some people don't want anything less than 60fps, while others (like me) can tolerate a game at a perfect 30fps if it guarantees the highest possible graphical quality.
Post your screenshots of Wukong.
 
Yes, FSR2 is cheaper to use than TSR, probably by around 5-10%.
TSR is especially heavy at the 2 highest quality settings, where the cvar r.TSR.History.ScreenPercentage is set to 200.

That can be easily countered with some lower settings.

I think FSR2/3 is cancer in motion if the native resolution is below 1440p (for 4K output); 5-10% is worth it for better image quality.
 
UE5 is pure shit and clearly born under a bad star. On paper it can look awesome, but in reality most games look terrible because devs keep pushing features that aren't meant for this gen's hardware.

Engines made for this gen's hardware look far better in the end, because even while lacking features UE5 has, they show infinitely better IQ, smoothness and stability.

When Rockstar releases GTA 6 for PS5/Xbox Series, it'll look a thousand times better than any UE5 game, because they targeted this gen's hardware at 30fps.
you are so close lol

UE5 games are using the same features GTA6 will use: ray tracing, virtualized geometry, etc.
UE5 games already have the same IQ as GTA6: 1440p internal resolution at 30 fps, just like every other UE5 game.
Every other engine built for this gen's hardware has the same IQ. Snowdrop, Anvil, Northlight, Creation Engine, etc. all target 1440p internal resolution using FSR at 30 fps and drop to 720p-864p in their 60 fps modes. Infinitely better? No. Hell, go watch the Avatar DF footage and you'll see it drop below 60 fps ALL the time.
 
That can be easily countered with some lower settings.

I think FSR2/3 is cancer in motion if the native resolution is below 1440p (for 4K output); 5-10% is worth it for better image quality.

One thing about consoles is that, unless the devs state what they use, it's difficult to know the upscaler.
TSR is better than FSR2, but not by a huge difference, and FSR3.1 is much closer to TSR.
The real difference is when TSR uses r.TSR.History.ScreenPercentage=200, but this is rather heavy, probably as heavy as XeSS DP4A.

But the real problem is UE5, which uses so many low-res buffers and dithered effects that any upscaler can struggle with it.
It's impressive just how many corners are cut in UE5, and it's still extremely heavy to run.
 
One thing about consoles is that, unless the devs state what they use, it's difficult to know the upscaler.
TSR is better than FSR2, but not by a huge difference, and FSR3.1 is much closer to TSR.
The real difference is when TSR uses r.TSR.History.ScreenPercentage=200, but this is rather heavy, probably as heavy as XeSS DP4A.

But the real problem is UE5, which uses so many low-res buffers and dithered effects that any upscaler can struggle with it.
It's impressive just how many corners are cut in UE5, and it's still extremely heavy to run.

That's why upscaling makes such a massive performance difference in UE5 vs. some other engines. Everything scales with internal resolution.
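As a rough mental model of "everything scales with internal resolution" (assuming a fully GPU-bound game whose whole frame cost scales with pixel count, which is an idealization; real frames also have resolution-independent costs, so actual gains are lower):

```python
# Idealized model: frame time scales linearly with the number of pixels
# shaded, so fps scales with the inverse of the pixel count.
def estimated_fps(base_fps: float, base_res: tuple, new_res: tuple) -> float:
    base_px = base_res[0] * base_res[1]
    new_px = new_res[0] * new_res[1]
    return base_fps * (base_px / new_px)

# e.g. 30fps at native 4K, then rendering internally at 1440p instead:
print(estimated_fps(30, (3840, 2160), (2560, 1440)))  # 67.5 in this ideal model
```

That lines up loosely with the Cyberpunk numbers earlier in the thread (4K30-35 at Ultra RT dropping to roughly 60-65 at 1440p): a bit under the ideal 2.25x, as expected once fixed per-frame costs are in the mix.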
 
That's why upscaling makes such a massive performance difference in UE5 vs. some other engines. Everything scales with internal resolution.

And that is why UE5 games frequently look fuzzy and undetailed in motion.
For all the features it has, it also has some of the worst image quality of any game engine.
 
That can be easily countered with some lower settings.

I think FSR2/3 is cancer in motion if the native resolution is below 1440p (for 4K output); 5-10% is worth it for better image quality.
TSR surprised me last night. I was expecting shimmering and Lumen boiling at 720p internal resolution, but the only shimmering artifacts I noticed were on fences, where it failed to properly resolve detail. The grass in Mafia had some noise, but nothing as bad as what I've seen in PSSR. Far-distance detail resolved surprisingly well. The trees in MGS Delta looked OK. I know FSR has issues with trees in Avatar. The detail on characters looked fine.

I had to restart Mafia several times because I was like, this doesn't look right, it shouldn't look this good. I had to switch between native upscaling and performance to confirm the GPU load was going up. It looked that clean.

I still remember people thinking the original PS5 UE5 demo was native 4K because it looked so clean. It wasn't until Epic confirmed the internal resolution that everyone realized it wasn't true 4K. TSR was built for UE5 and looks excellent on PC. If consoles look like trash, it's probably because other settings are adding those shimmering artifacts, not because of TSR or some inherent issue with UE5. In terms of IQ, I've almost never had issues with UE5. The mansion in Expedition 33 is the only time I've seen Lumen fail, and that was mostly because Lumen reflections weren't implemented well.
 
Core claim: "You can't have it all" isn't universally true
• They push back on the idea that on consoles you must pick only two of:
  • Image quality / visual features
  • Performance
Not universally true, but still very common, so I'm not sure what's supposed to have changed now.
 
TSR surprised me last night. I was expecting shimmering and Lumen boiling at 720p internal resolution, but the only shimmering artifacts I noticed were on fences, where it failed to properly resolve detail. The grass in Mafia had some noise, but nothing as bad as what I've seen in PSSR. Far-distance detail resolved surprisingly well. The trees in MGS Delta looked OK. I know FSR has issues with trees in Avatar. The detail on characters looked fine.

I had to restart Mafia several times because I was like, this doesn't look right, it shouldn't look this good. I had to switch between native upscaling and performance to confirm the GPU load was going up. It looked that clean.

I still remember people thinking the original PS5 UE5 demo was native 4K because it looked so clean. It wasn't until Epic confirmed the internal resolution that everyone realized it wasn't true 4K. TSR was built for UE5 and looks excellent on PC. If consoles look like trash, it's probably because other settings are adding those shimmering artifacts, not because of TSR or some inherent issue with UE5. In terms of IQ, I've almost never had issues with UE5. The mansion in Expedition 33 is the only time I've seen Lumen fail, and that was mostly because Lumen reflections weren't implemented well.

I'll give you a hint: r.TSR.History.ScreenPercentage=200

 
And that is why UE5 games frequently look fuzzy and undetailed in motion.
For all the features it has, it also has some of the worst image quality of any game engine.
What games are these? I play on PC like you do. I used to have a 3080 and only just upgraded to a 5080. I did not have these image quality issues with UE5. Stuttering in SH2, definitely. Traversal stutter in Immortals, 100%. But IQ issues in motion? UE5 to me looks just as detailed as any other game, if not more detailed thanks to Nanite. I do use DLSS instead of the TSR used on consoles, but you're a PC gamer too.
 
What games are these? I play on PC like you do. I used to have a 3080 and only just upgraded to a 5080. I did not have these image quality issues with UE5. Stuttering in SH2, definitely. Traversal stutter in Immortals, 100%. But IQ issues in motion? UE5 to me looks just as detailed as any other game, if not more detailed thanks to Nanite. I do use DLSS instead of the TSR used on consoles, but you're a PC gamer too.

Nanite is just a software rasterizer.
Yes, UE5 looks a lot better with good upscalers, such as DLSS4. But it always has a soft look to it. And hair and vegetation tend to look more shimmery than in other games, due to it using so many low res buffers and dithering.
 

"What Can Be Done About Unreal Engine 5"


Nothing. Devs should go back to developing their own proprietary engines. UE isn't even worth all the trouble.
Heck, I'd say Trump should designate Tim Sweeney as a terrorist; his engine has terrorized millions of gamers this whole generation.
 
So it's not an engine issue, it's just devs not utilizing these upscaling techniques correctly. Or they just don't have the GPU power in the consoles to run a slightly more expensive version of TSR.

This option is a bit expensive, so it is only enabled on the 2 highest quality settings for TSR.
 
Nanite is just a software rasterizer.
Yes, UE5 looks a lot better with good upscalers, such as DLSS4. But it always has a soft look to it. And hair and vegetation tend to look more shimmery than in other games, due to it using so many low res buffers and dithering.
I tested the motion clarity in Black Myth: Wukong on my CRT monitor because it has no persistence blur during motion. With DLAA enabled, details remained clear, even during very fast motion. I could see some minor artifacting around pixel-thin grass, but that's just nitpicking. With DLSS-Quality, however, I started to notice noise around Wukong's hair even during slow movement. I also tested the motion blur filter in Black Myth: Wukong and noticed that it drastically increased the noise around the hair. Even with DLAA, there was some noise around the hair with that motion blur filter turned on, so I think this blur filter isn't implemented well in this game. I remember similar motion blur problems in UE4 games, for example "The Quarry", and that game used TAA.

But DLSSQ in Black Myth: Wukong isn't soft by any means. Here are my 4K DLSS-Quality (preset K) screenshots, and the image looks razor sharp:

b1-Win64-Shipping-2026-02-19-23-21-45-474.jpg


b1-Win64-Shipping-2026-02-19-23-26-21-679.jpg


b1-Win64-Shipping-2026-02-19-23-24-55-961.jpg


b1-Win64-Shipping-2026-02-19-23-26-40-170.jpg
 
God bless UE5, y'all are just too entitled for this. Epic has been pushing and handing out all these features that are going to be the future of standard real-time graphics.
Also, Sony's crusty engines, including Decima, aren't pushing anything: no RT, no nothing.

And a shoutout goes out to Bluepoint, man. Demon's Souls is still top-5 graphics this gen; they were the only ones who cared about pushing their engine to the max, and it was so clean, but then they got done dirty like that. Sick industry.
 
I tested the motion clarity in Black Myth: Wukong on my CRT monitor because it produces no persistence blur during motion. With DLAA enabled, details remained clear, even during very fast motion. I could see some minor artifacting around pixel-thin grass, but that's just nitpicking. With DLSS-Quality, however, I started to notice noise around Wukong's hair even during slow movement. I also tested the motion blur filter in Black Myth: Wukong and noticed that it drastically increased the noise around the hair. Even with DLAA, there was some noise around the hair with that motion blur filter turned on.

But DLSSQ isn't soft by any means. UE5 looks razor sharp. Here are my 4K DLSS-Quality screenshots:

b1-Win64-Shipping-2026-02-19-23-21-45-474.jpg


b1-Win64-Shipping-2026-02-19-23-26-21-679.jpg


b1-Win64-Shipping-2026-02-19-23-24-55-961.jpg


b1-Win64-Shipping-2026-02-19-23-26-40-170.jpg

Yes, but you need the absolute best AI temporal upscaler to make it look good.
Anything lower and it starts to fall apart, and this is why so many UE5 games on consoles look terrible.
 
Your 5080 should run quite a few UE5 games at native 4K 60fps with console-like settings. My OC'ed 4080S offers the same performance as a stock 5080 in UE5, so I know what your card can do:
I don't want to play at console-like settings though, that's the key point. You can easily bump up the settings, especially RT, which the base PS5 has minuscule capability for (in CP2077's "RT/Quality mode" it only has RT shadows, nothing else; it has better SSR quality than the performance mode, but again that's not RT of any kind, and it still runs at 1440p30. And let's not forget the other settings in that mode aren't maxed PC settings but somewhere around High, some even lower).

If we want that current-gen/next-gen graphical fidelity plus a stable 60fps, then of course the game is going to be demanding as hell, so even on high-end machines (not saying top-end, since the 5090 is still roughly 50% stronger than the 5080; yup, that's probably the biggest performance gap between 80- and 90-class cards of the same family, and the price gap is even crazier :P) you have to accept a slight compromise, which is DLSS Quality or even Balanced (i.e., AI upscaling from 1080p or 1440p to 4K).

Again, here's how it looks for me now (and of course it's personal preference): I go for a stable 60fps (IMHO that's a must unless we're playing a turn-based strategy, a point-and-click game, or anything that doesn't care about your reflexes/response time). Then you compromise between IQ and both raster and RT settings: you still don't want native 4K (since that's tons of GPU performance "wasted"), but good-enough IQ (DLSS Quality/Balanced upscaled to 4K) while pushing settings as close to max as you can, with some leeway for RT. Some RT settings are crucial and really change how a game looks/feels; some are resource hogs and/or more optional.

Here's a good example, High on Life 2: actual UE5, and new too, not some 3-year-old game, it just launched this February.
On a 5090, maxed out but at 4K DLSS Quality, and later in the more demanding sections you see the game drop to the mid 70s. Since this is all at the beginning of the game, it's very likely it could drop even lower in some rare, super-demanding spots:
 
I don't want to play at console-like settings though, that's the key point. You can easily bump up the settings, especially RT, which the base PS5 has minuscule capability for (in CP2077's "RT/Quality mode" it only has RT shadows, nothing else; it has better SSR quality than the performance mode, but again that's not RT of any kind, and it still runs at 1440p30. And let's not forget the other settings in that mode aren't maxed PC settings but somewhere around High, some even lower).

If we want that current-gen/next-gen graphical fidelity plus a stable 60fps, then of course the game is going to be demanding as hell, so even on high-end machines (not saying top-end, since the 5090 is still roughly 50% stronger than the 5080; yup, that's probably the biggest performance gap between 80- and 90-class cards of the same family, and the price gap is even crazier :P) you have to accept a slight compromise, which is DLSS Quality or even Balanced (i.e., AI upscaling from 1080p or 1440p to 4K).

Again, here's how it looks for me now (and of course it's personal preference): I go for a stable 60fps (IMHO that's a must unless we're playing a turn-based strategy, a point-and-click game, or anything that doesn't care about your reflexes/response time). Then you compromise between IQ and both raster and RT settings: you still don't want native 4K (since that's tons of GPU performance "wasted"), but good-enough IQ (DLSS Quality/Balanced upscaled to 4K) while pushing settings as close to max as you can, with some leeway for RT. Some RT settings are crucial and really change how a game looks/feels; some are resource hogs and/or more optional.

Here's a good example, High on Life 2: actual UE5, and new too, not some 3-year-old game, it just launched this February.
On a 5090, maxed out but at 4K DLSS Quality, and later in the more demanding sections you see the game drop to the mid 70s. Since this is all at the beginning of the game, it's very likely it could drop even lower in some rare, super-demanding spots:

I saw this "High on Life 2" gameplay on the RTX 5090, and the performance in the 4K DLAA (native) segment was actually good compared to other UE5 games (59–85 fps). However, DLSS-Q did not improve the performance by much. B4BPCgamer turned on the average fps stats, and the 5090 averaged 84 fps in the DLSSQ segment. Maybe that DLSSQ segment was more demanding than the introduction where he tested DLAA.
 