
DF Direct Q+A: The Big Unreal Engine 5 Image Quality/Performance Debate

I don't think next gen will solve the elephant in the room: RT, bad optimization, shader compilation...

I don't remember the last time I praised UE.

The resolution is getting lower and lower, dynamic resolution is not enough, and we're on the verge of going back to 30 fps games. For what?
 
Nah, fanboys are just too nitpicky, while the masses don't really give a shit unless it's an extreme case.
We've been getting 1080p since 2013, and now it's 2026. I don't think expecting 1080p-1440p/60fps is too much to ask. RTGI/Lumen is not worth it when the IQ looks like shit.
 
I don't think next gen will solve the elephant in the room: RT, bad optimization, shader compilation...

I don't remember the last time I praised UE.

The resolution is getting lower and lower, dynamic resolution is not enough, and we're on the verge of going back to 30 fps games. For what?
I think next gen will solve most issues. Upscalers should be much better, the RT hardware should be much better, and the consoles a good deal more powerful overall, so there should be no excuses for decent developers.
 
The elephant in the room is that the general public has no clue about graphics rendering and can't appreciate what huge uplifts Lumen and Nanite bring to the graphics. We literally have movie-like assets now, every pebble in 3D at high poly counts, and dynamic lighting that looks like reality. Yet people still compare that to last-gen games like Ghost of Tsushima and Cyberpunk as if they came even close to what UE5 renders on screen. According to the average joe, Cyberpunk is optimized and UE5 is not, simply because UE5 renders at a much lower resolution and framerate. They fail to see that UE5 games have much more sophisticated lighting (excluding path tracing on PC, of course) and a MUCH higher polygon count everywhere. Look at any ground surface in Cyberpunk and you will see a flat plane; in UE5 games there's a ton of depth and micro detail that even casts shadows.

UE5 is not unoptimized; it is just doing a lot more than last-gen games, which were programmed with the PS4 in mind. That's not to say the engine is without quirks, of course, and there are some significant issues, but it is still incredible what UE5 renders on screen. And of course that comes with a huge performance and resolution penalty, especially on consoles, but that is the price you pay for progress.
 
The elephant in the room is that the general public has no clue about graphics rendering and can't appreciate what huge uplifts Lumen and Nanite bring to the graphics. We literally have movie-like assets now, every pebble in 3D at high poly counts, and dynamic lighting that looks like reality. Yet people still compare that to last-gen games like Ghost of Tsushima and Cyberpunk as if they came even close to what UE5 renders on screen. According to the average joe, Cyberpunk is optimized and UE5 is not, simply because UE5 renders at a much lower resolution and framerate. They fail to see that UE5 games have much more sophisticated lighting (excluding path tracing on PC, of course) and a MUCH higher polygon count everywhere.
The tech can be impressive but still not really useful. I love tech, and those ray tracing cards are impressive. But as an end user, I would say these techniques have a huge performance impact and a really small ROI. They are not ready. It's like the beginning of 3D: while technically impressive, a lot of 2D games looked better than the 3D ones.
 
I think 900p should be the baseline, upscaled to 1800p. The PS4 era had some solid-looking 1800p checkerboard-rendered games.

900p internal then whatever upscaling method to target 1800p.
 
The elephant in the room is that the general public has no clue about graphics rendering and can't appreciate what huge uplifts Lumen and Nanite bring to the graphics. We literally have movie-like assets now, every pebble in 3D at high poly counts, and dynamic lighting that looks like reality. Yet people still compare that to last-gen games like Ghost of Tsushima and Cyberpunk as if they came even close to what UE5 renders on screen. According to the average joe, Cyberpunk is optimized and UE5 is not, simply because UE5 renders at a much lower resolution and framerate. They fail to see that UE5 games have much more sophisticated lighting (excluding path tracing on PC, of course) and a MUCH higher polygon count everywhere. Look at any ground surface in Cyberpunk and you will see a flat plane; in UE5 games there's a ton of depth and micro detail that even casts shadows.

UE5 is not unoptimized; it is just doing a lot more than last-gen games, which were programmed with the PS4 in mind. That's not to say the engine is without quirks, of course, and there are some significant issues, but it is still incredible what UE5 renders on screen. And of course that comes with a huge performance and resolution penalty, especially on consoles, but that is the price you pay for progress.
IQ and fps are always going to be the most important things people notice now, because graphics already look good enough.




Compare this to High on Life on PS5 and this looks a generation ahead, because the damn IQ there is so bad. Like John said, maybe they should use UE5 but skip Lumen if it's going to ruin the graphics.
 
I don't think next gen will solve the elephant in the room: RT, bad optimization, shader compilation...

I don't remember the last time I praised UE.

The resolution is getting lower and lower, dynamic resolution is not enough, and we're on the verge of going back to 30 fps games. For what?
Frame generation, dynamic resolution, and ray and path tracing are all fine technologies and techniques; however, it is my personal opinion that all of those things have just made developers lazier. Optimization is a dying skill set in the industry outside of the devs that use custom or proprietary engines, and even then it's not a primary focus for a "finished" game now. Just like patching led to unfinished games being released to meet whatever release date the higher-ups set, rather than making sure the game is as good as it can be at launch.

If some studio really focused on releasing a sound, solid, optimized game and only used the aforementioned tech as a bonus, or for the end user to push things further, they would be in rare company. Games without a day/night cycle or real-time lighting for gameplay purposes DO NOT NEED FUCKING RAY TRACING. Let's knock off using it "cause it's cool" when it's not necessary. It's lazy.
 
The elephant in the room is that the general public has no clue about graphics rendering and can't appreciate what huge uplifts Lumen and Nanite bring to the graphics. We literally have movie-like assets now, every pebble in 3D at high poly counts, and dynamic lighting that looks like reality. Yet people still compare that to last-gen games like Ghost of Tsushima and Cyberpunk as if they came even close to what UE5 renders on screen. According to the average joe, Cyberpunk is optimized and UE5 is not, simply because UE5 renders at a much lower resolution and framerate. They fail to see that UE5 games have much more sophisticated lighting (excluding path tracing on PC, of course) and a MUCH higher polygon count everywhere. Look at any ground surface in Cyberpunk and you will see a flat plane; in UE5 games there's a ton of depth and micro detail that even casts shadows.

UE5 is not unoptimized; it is just doing a lot more than last-gen games, which were programmed with the PS4 in mind. That's not to say the engine is without quirks, of course, and there are some significant issues, but it is still incredible what UE5 renders on screen. And of course that comes with a huge performance and resolution penalty, especially on consoles, but that is the price you pay for progress.
I agree that most people can't appreciate improvements in graphics fidelity because they don't understand what's being rendered on the screen. These people always cite Uncharted 4, The Last of Us 2 and Star Wars Battlefront because, in their minds, these games can look as good as modern current-gen games. However, if you view the assets up close and look at the quality of the indirect lighting, you will see a truly generational gap.

However, I wouldn't classify Cyberpunk as a game that uses outdated technology. The game was updated on PC and now uses PT lighting, the best technology available. The assets look very good too, considering it's a huge open-world game. The buildings are detailed everywhere, even when they are tall, and CDPR also used a cool trick with the windows so that you can see inside. Of course, games using UE5 can achieve even higher polycounts; for example, the ground surface can be made up of tiny rocks instead of a flat texture or POM (Hellblade 2 is a good example). Nevertheless, Cyberpunk still holds up very well compared to most current-gen games, and it's also extremely well optimized considering what's being rendered and the scale of the game.

Frame generation, dynamic resolution, and ray and path tracing are all fine technologies and techniques; however, it is my personal opinion that all of those things have just made developers lazier. Optimization is a dying skill set in the industry outside of the devs that use custom or proprietary engines, and even then it's not a primary focus for a "finished" game now. Just like patching led to unfinished games being released to meet whatever release date the higher-ups set, rather than making sure the game is as good as it can be at launch.

If some studio really focused on releasing a sound, solid, optimized game and only used the aforementioned tech as a bonus, or for the end user to push things further, they would be in rare company. Games without a day/night cycle or real-time lighting for gameplay purposes DO NOT NEED FUCKING RAY TRACING. Let's knock off using it "cause it's cool" when it's not necessary. It's lazy.
You think that good optimization simply means good performance, but I have a different view. In my opinion, a game can be extremely demanding (only running at 30 fps on a 5090) while still being perfectly optimized, meaning the available hardware resources are well used and not wasted on calculating unnecessary things. For example, the tessellated water in Crysis 2 was rendered even when it wasn't displayed on the screen. The MaldoHD mod fixed that and made the game run faster and look better at the same time, and that's true optimization in my view.
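To illustrate what "not wasting resources on unnecessary things" means in practice, here's a tiny Python sketch of the idea behind that water fix: cull geometry against the camera before paying for expensive passes like tessellation. It's purely illustrative; none of the names here (Mesh, visible, submit_frame) correspond to CryEngine or any real engine API.

```python
# Toy example (not CryEngine code): cull geometry against the view frustum
# before paying for expensive passes such as tessellation. This is the kind
# of wasted work the MaldoHD mod removed for the always-rendered water.
from dataclasses import dataclass
from typing import List, Tuple

Plane = Tuple[float, float, float, float]  # nx, ny, nz, d; a point is "inside" if n.p + d >= 0

@dataclass
class Mesh:
    name: str
    bounds_min: Tuple[float, float, float]
    bounds_max: Tuple[float, float, float]
    tessellated: bool                      # e.g. the ocean mesh

def visible(mesh: Mesh, frustum: List[Plane]) -> bool:
    """Conservative AABB test: invisible only if the whole box is behind one plane."""
    (ax, ay, az), (bx, by, bz) = mesh.bounds_min, mesh.bounds_max
    corners = [(x, y, z) for x in (ax, bx) for y in (ay, by) for z in (az, bz)]
    return not any(
        all(nx * x + ny * y + nz * z + d < 0 for x, y, z in corners)
        for nx, ny, nz, d in frustum
    )

def submit_frame(meshes: List[Mesh], frustum: List[Plane]) -> List[str]:
    draw_calls = []
    for mesh in meshes:
        if not visible(mesh, frustum):
            continue                        # culled: no tessellation, no draw
        cost = "tessellated" if mesh.tessellated else "plain"
        draw_calls.append(f"draw {mesh.name} ({cost})")
    return draw_calls

# A camera at the origin looking down +Z, reduced to a single near half-space.
frustum = [(0.0, 0.0, 1.0, 0.0)]
ocean = Mesh("ocean", (-100, -1, -50), (100, 0, -10), tessellated=True)  # entirely behind the camera
street = Mesh("street", (-10, 0, 5), (10, 1, 50), tessellated=False)
print(submit_frame([ocean, street], frustum))  # only the street gets drawn
```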

Modern games are becoming increasingly demanding, not because of poor optimization but because developers have big ambitions and are using very demanding technologies, such as real-time lighting, to realize their vision.

People praised Doom Eternal for its incredible optimization, but when id added RT to The Dark Ages, performance declined, not because the engine is unoptimized but simply because RT is demanding. Some YouTubers claimed that current games could be better optimized, but they never proved it.

DLSS will not make UE5's Lumen lighting more optimized. Thanks to DLSS, though, even my 4080S can run these demanding UE5 games at 120-180fps without waiting for my next upgrade, and the image still looks perfect (4K-like). Some developers expect people to use DLSS, so they include it in their recommended requirements. But that's only because times have changed and native resolution doesn't make much sense anymore.

We've been getting 1080p since 2013, and now it's 2026. I don't think expecting 1080p-1440p/60fps is too much to ask. RTGI/Lumen is not worth it when the IQ looks like shit.
I agree if we're talking about current-gen consoles, because they simply can't run UE5 games with all their signature features enabled (VSM, Lumen and Nanite) while achieving reasonable image quality and 60 fps. Nanite and Lumen can make a difference, but not at 4K with a 25% resolution scale and poor image reconstruction (or simple bilinear upscaling). The benefits of these features are obscured by a blurry image and intense noise.
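To put numbers on that 25% figure, here's some quick arithmetic (Python, illustrative only; I'm assuming the scale is applied per axis, the way UE's r.ScreenPercentage works):

```python
# Illustrative arithmetic: internal render resolution for a 3840x2160 output
# at various resolution scales (scale applied per axis, as with UE's
# r.ScreenPercentage). Not measured data from any specific game.
OUTPUT_W, OUTPUT_H = 3840, 2160

def internal_res(scale_percent: float):
    w = round(OUTPUT_W * scale_percent / 100)
    h = round(OUTPUT_H * scale_percent / 100)
    return w, h, (w * h) / (OUTPUT_W * OUTPUT_H)

for label, scale in [("25% scale (the worst case above)", 25),
                     ("50% scale (1080p internal)", 50),
                     ("66.7% scale (~1440p, typical 'quality' upscaling)", 66.7)]:
    w, h, frac = internal_res(scale)
    print(f"{label}: {w}x{h} -> {frac:.0%} of the output pixels")
# 25% scale is 960x540: the reconstruction has to invent roughly 94% of the final pixels.
```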

"High on life" is a good example. The first game runs on UE4 at 1800p on PS5, while the sequel runs on UE5 at 720p and use lumen / nanite / vsm. Technically, High on Life 2 has more advanced graphics, but the blur makes the game look worse. That's not the engine's fault, but the developers' fault because they opted for technology that can't run well on PS5 hardware at 60fps. I think PS5 has the power to run UE5 with all it's features, but only at 30fps.

For example, Black Myth: Wukong in quality mode on PS5 runs at 1440p reconstructed to 4K (with either FSR3 or TSR), and the image quality looks 4K-like to my eyes in this screenshot. I think this level of image clarity would make all PS5 owners happy, but it's only 30fps.


[Screenshot: Black Myth: Wukong PC vs PS5 image quality comparison]


In the 60fps mode, though, the image looks like crap because the game is running at 1080p simply upscaled (bilinear filtering) to 4K. The game also uses FSR frame generation to hit 60fps even at 1080p. The cost of 60fps in UE5 on the PS5 is just too high.

[Screenshot: Black Myth: Wukong PC vs PS5 image quality comparison]


Since the PS5 has the power to run UE5 at its full potential (Lumen / Nanite / VSM) at 30 fps, I think it makes sense to use it for open-world games with a dynamic time of day and slower combat that can be enjoyed at 30 fps. Maybe it would also be possible to achieve 40fps with something like 1200p reconstructed to 4K. IMO, on a gamepad even 40fps already feels great and makes first-person shooters enjoyable.
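Here's the frame-time arithmetic behind that 30/40/60 fps trade-off (plain Python arithmetic, nothing game-specific):

```python
# Frame-time budgets behind the 30/40/60 fps discussion (plain arithmetic).
# 40 fps sits halfway between 30 and 60 in frame time, not frame rate,
# and divides evenly into a 120 Hz refresh (one frame per 3 refreshes).
for fps in (30, 40, 60):
    frame_ms = 1000 / fps
    refreshes_at_120hz = 120 / fps
    print(f"{fps} fps -> {frame_ms:.1f} ms per frame, "
          f"{refreshes_at_120hz:g} refreshes per frame at 120 Hz")
# 30 fps: 33.3 ms, 40 fps: 25.0 ms, 60 fps: 16.7 ms.
# Going from 30 to 40 fps only tightens the GPU budget by 8.3 ms, which is
# why something like 1200p reconstructed at 40 fps is a plausible middle ground.
```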

I've tested quite a few UE5 games on PC and this engine is very scalable. Here's a good example: Cronos at its lowest settings doesn't use Lumen or Nanite, and the game runs at 260fps at native 4K.

[Screenshot: Cronos at native 4K, lowest settings]


If I enable Lumen and Nanite, however, I get 57fps at the same resolution.

[Screenshot: Cronos at native 4K, high settings]


57fps is playable but not ideal, so I used DLSS Quality + FGx2 to run the game at a high refresh rate.

[Screenshot: Cronos at 4K, DLSS Quality + FGx2, high settings]


UE5 can run well if developers carefully select the technology they use.
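For anyone who prefers frame times to fps, here's the same Cronos data expressed that way (Python, arithmetic on the figures above; the 66.7% per-axis scale for DLSS Quality is that preset's usual value, not something I measured in this game):

```python
# Frame-time view of the Cronos figures quoted above (arithmetic only;
# the fps values are the ones reported in this post, not new measurements).
def ms(fps):
    return 1000 / fps

low_4k_native = ms(260)    # lowest settings, no Lumen/Nanite
high_4k_native = ms(57)    # high settings with Lumen + Nanite
print(f"lowest settings: {low_4k_native:.1f} ms/frame")
print(f"high + Lumen/Nanite: {high_4k_native:.1f} ms/frame "
      f"(+{high_4k_native - low_4k_native:.1f} ms for the full feature set)")

# DLSS Quality renders at ~66.7% per axis, i.e. ~44% of the native pixel
# count, and frame generation roughly doubles the presented (not rendered)
# frame rate, which is how 57 fps native becomes a high-refresh experience.
dlss_q_pixel_fraction = 0.667 ** 2
print(f"DLSS Quality internal pixels: {dlss_q_pixel_fraction:.0%} of native 4K")
```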
 
You think that good optimization simply means good performance, but I have a different view. In my opinion, a game can be extremely demanding (only running at 30 fps on a 5090) while still being perfectly optimized, meaning the available hardware resources are well used and not wasted on calculating unnecessary things. For example, the tessellated water in Crysis 2 was rendered even when it wasn't displayed on the screen. The MaldoHD mod fixed that and made the game run faster and look better at the same time, and that's true optimization in my view.

Modern games are becoming increasingly demanding, not because of poor optimization but because developers have big ambitions and are using very demanding technologies, such as real-time lighting, to realize their vision.

People praised Doom Eternal for its incredible optimization, but when id added RT to The Dark Ages, performance declined, not because the engine is unoptimized but simply because RT is demanding. Some YouTubers claimed that current games could be better optimized, but they never proved it.

DLSS will not make UE5's Lumen lighting more optimized. Thanks to DLSS, though, even my 4080S can run these demanding UE5 games at 120-180fps without waiting for my next upgrade, and the image still looks perfect (4K-like). Some developers expect people to use DLSS, so they include it in their recommended requirements. But that's only because times have changed and native resolution doesn't make much sense anymore.

I never made the statement that optimization = performance.

The example you cited for Crysis 2 is a good observation: it was perhaps an oversight by the devs at Crytek, or they hit an internal performance metric that was satisfactory for release. And as you shared, a simple fix from the modding community paid off with improved visuals and performance. This reinforces what I was saying about chasing release dates instead of making sure the game is as good as it can be prior to launch. Lazy, accidental, or inept, it's always at least one of those things.

The Doom: The Dark Ages example also proves my point: they didn't need path tracing. The game ran stupidly fast on most modern hardware with little to no drops or image quality issues WITHOUT frame generation or dynamic resolution. There is such a thing as diminishing returns, and if your title can run near 200 fps at max settings without DLSS/FSR or resolution scaling, then bravo. If you then want to push it a little with path tracing and it takes a marginal hit, by all means. It's not being forced on the end user, it's not detracting from the experience, and it doesn't make your team look inept, lazy, or dumb. Leave it up to the players to decide whether they would rather have the path-traced visuals (I could see the difference, but it wasn't a completely new experience with it on) or slightly higher framerates.
 
I never made the statement that optimization = performance.

The example you cited for Crysis 2 is a good observation: it was perhaps an oversight by the devs at Crytek, or they hit an internal performance metric that was satisfactory for release. And as you shared, a simple fix from the modding community paid off with improved visuals and performance. This reinforces what I was saying about chasing release dates instead of making sure the game is as good as it can be prior to launch. Lazy, accidental, or inept, it's always at least one of those things.

The Doom: The Dark Ages example also proves my point: they didn't need path tracing. The game ran stupidly fast on most modern hardware with little to no drops or image quality issues WITHOUT frame generation or dynamic resolution. There is such a thing as diminishing returns, and if your title can run near 200 fps at max settings without DLSS/FSR or resolution scaling, then bravo. If you then want to push it a little with path tracing and it takes a marginal hit, by all means. It's not being forced on the end user, it's not detracting from the experience, and it doesn't make your team look inept, lazy, or dumb. Leave it up to the players to decide whether they would rather have the path-traced visuals (I could see the difference, but it wasn't a completely new experience with it on) or slightly higher framerates.
Doom TDA wouldn't look much worse with pre-baked lighting, but the levels in this game are so huge and so varied that real-time lighting makes sense. You can also destroy many objects and even some buildings when piloting Atlan mechs. Real-time lighting makes that destruction much more realistic.

id Software said that without RT they wouldn't have been able to develop The Dark Ages so quickly, and even if they had prebaked the lighting, the game would have required an enormous amount of disk space. I also think Doom: The Dark Ages has good performance considering what's being rendered. I got around 55-70fps at native 4K with TAA (about 5fps less with DLAA), and I consider that a good result for a game with heavy RT. Thanks to DLSS I was able to play TDA at 90-110fps (DLSS Quality) and 140-160fps with FGx2. At 1800p with the same settings I got around 200fps, so I was fully satisfied with the performance. Doom TDA is much more demanding with PT, though (in fact it's probably the most demanding PT I've played), but since PT is optional in this game, I don't have a problem with it.
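As a rough illustration of the disk-space argument, here's a back-of-envelope calculation (Python). Every parameter in it (surface area, texel density, bytes per texel, level count) is a made-up placeholder to show how the numbers scale, not anything id Software has published:

```python
# Back-of-envelope only: every parameter below is an assumed, hypothetical
# value chosen to illustrate the scaling, not data about Doom: The Dark Ages.
LEVEL_AREA_M2     = 2_000_000   # assumed: ~2 km^2 of lightmapped surface per level
TEXELS_PER_M2     = 64          # assumed: 8x8 lightmap texels per square metre
BYTES_PER_TEXEL   = 4           # assumed: compressed HDR lightmap data
LIGHTING_VARIANTS = 1           # baked data multiplies further with time-of-day variants
NUM_LEVELS        = 20          # assumed

bytes_per_level = LEVEL_AREA_M2 * TEXELS_PER_M2 * BYTES_PER_TEXEL * LIGHTING_VARIANTS
total_gb = bytes_per_level * NUM_LEVELS / (1024 ** 3)
print(f"~{bytes_per_level / (1024 ** 2):.0f} MB of lightmaps per level, "
      f"~{total_gb:.0f} GB across {NUM_LEVELS} levels")
# With real-time GI there is no comparable baked data to ship on disk,
# which is the disk-space argument referenced above.
```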

4K DLAA


[Screenshot: Doom: The Dark Ages at 4K DLAA]



4K DLSS-Quality + DLSS FGx2


[Screenshot: Doom: The Dark Ages at 4K DLSS Quality + FGx2]
 
Doom TDA wouldn't look much worse with pre-baked lighting, but the levels in this game are so huge and so varied that real-time lighting makes sense. id Software said that without RT they wouldn't have been able to develop The Dark Ages so quickly, and even if they had prebaked the lighting, the game would have required an enormous amount of disk space. I also think Doom: The Dark Ages has good performance considering what's being rendered. I got around 55-70fps at native 4K with TAA and I consider that a good result for a game with heavy RT. Thanks to DLSS I was able to play at 90-100fps and 140-150fps with FGx2. At 1800p with the same settings I got 200fps, so I was fully satisfied with the performance. Doom TDA is much more demanding with PT, though (in fact it's probably the most demanding PT I've played), but since PT is optional in this game, I don't have a problem with it.
For file size and efficiency, those are great use cases for RT.

When Dark Ages launched, pre-path tracing, I was averaging 80 fps at 3840x1080 with no FSR or frame generation. I never tried it with PT, as I had 100%'d it before that launched. I've only messed with PT in CP2077, but since I have a 7900 XTX I don't get to use the AI RT shit that AMD added last year, and with the cost of shit now I just get to brute force things.
 