The elephant in the room is that the general public has no clue about graphics rendering and can't appreciate what huge uplifts Lumen and Nanite bring. We literally have movie-like assets now, every pebble in 3D at high poly counts, and dynamic lighting that looks like reality. Yet people still compare that to last-gen games like Ghost of Tsushima and Cyberpunk, as if they came even close to what UE5 renders on screen. According to the average Joe, Cyberpunk is optimized and UE5 is not, simply because UE5 renders at a much lower resolution and framerate. They fail to see that UE5 games have much more sophisticated lighting (excluding path tracing on PC, of course) and a MUCH higher polygon count everywhere. Look at any ground surface in Cyberpunk and you will see a flat plane; in UE5 games there's a ton of depth and micro detail that even casts shadows.
UE5 is not unoptimized, it is just doing a lot more than last-gen games, which were built with the PS4 in mind. That's not to say the engine is without quirks, of course, and there are some significant issues, but it is still incredible what UE5 renders on screen. And of course that comes with a huge performance and resolution penalty, especially on consoles, but that is the price you pay for progress.
I agree that most people can't appreciate improvements in graphics fidelity because they don't understand what's being rendered on the screen. These people always cite Uncharted 4, The Last of Us 2 and Star Wars Battlefront because, in their minds, these games can look as good as modern current-gen games. However, if you view the assets up close and look at the quality of the indirect lighting, you will see a truly generational gap.
However, I wouldn't classify Cyberpunk as a game that uses outdated technology. The PC version was updated and now uses path-traced lighting, the best technology available. The assets look very good too, considering it's a huge open-world game. The buildings are detailed everywhere, even the tall ones. CDPR also used a cool trick with the windows so that you can see inside. Of course, games using UE5 can achieve even higher polycounts; for example, the ground surface can be made up of tiny individual rocks instead of a flat texture or POM (Hellblade 2 is a good example). Nevertheless, Cyberpunk still holds up very well compared to most current-gen games, and it's also extremely well optimized considering what's being rendered and the scale of the game.
Frame generation, dynamic resolution, and ray/path tracing are all fine technologies and techniques; however, it is my personal opinion that all of those things have just made developers lazier. Optimization is a dying skill set in the industry outside of the devs that use custom or proprietary engines, and even then it's not a primary focus for a "finished" game now. It's just like how patching led to unfinished games being released to meet whatever release date the higher-ups set, rather than making sure the game is as good as it can be at launch.
If a studio really focused on releasing a sound, solid, optimized game and only used the aforementioned tech as a bonus or for the end user to push things further, they would be in rare company. Games without a day/night cycle or real-time lighting for gameplay purposes DO NOT NEED FUCKING RAY TRACING. Let's knock off using it "cause it's cool" when it's not necessary. It's lazy.
You think that good optimization simply means good performance, but I have a different view. In my opinion, a game can be extremely demanding (only running at 30 fps on a 5090) while still being perfectly optimized, meaning the available hardware resources are used well and not wasted calculating unnecessary things. For example, the tessellated water in Crysis 2 was rendered even when it wasn't displayed on the screen. Modder Maldo fixed that and made the game run faster and look better at the same time, and that's true optimization in my view.
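To make that concrete, here's a toy sketch (made-up names and numbers, not actual engine or CryEngine code) of what "don't waste resources" means in practice: cull anything whose bounds fall entirely outside the view before it's ever submitted, so something like a hidden ocean mesh never gets tessellated in the first place.

```python
# Toy visibility cull in screen space (illustration only, not real engine code).
# An optimized renderer skips work like this before submitting draw calls.

def on_screen(rect, viewport=(0, 0, 1920, 1080)):
    """Return True if the axis-aligned rect overlaps the viewport at all."""
    x0, y0, x1, y1 = rect
    vx0, vy0, vx1, vy1 = viewport
    return x1 > vx0 and x0 < vx1 and y1 > vy0 and y0 < vy1

scene = [
    ("ocean_mesh",  (-6000, 2500, -1000, 4000)),  # entirely off screen
    ("street_mesh", (0, 400, 1920, 1080)),        # visible
]

draw_list = [name for name, rect in scene if on_screen(rect)]
print(draw_list)  # ['street_mesh'] -- the hidden ocean is never drawn or tessellated
```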
Modern games are becoming increasingly demanding, not because of poor optimization but because developers have big ambitions and are using very demanding technologies, such as real-time lighting, to realize their vision.
People praised Doom Eternal for its incredible optimization, but when id added RT in The Dark Ages, performance declined, not because the engine is unoptimized but simply because RT is demanding. Some YouTubers claimed that current games could be better optimized, but they never proved it.
DLSS will not make UE5's Lumen lighting more optimized. Thanks to DLSS, though, even my 4080S can run these demanding UE5 games at 120-180fps without waiting for my next upgrade, and the image still looks nearly perfect (4K-like). Some developers expect people to use DLSS, so they include it in their recommended requirements. But that's only because times have changed and native resolution doesn't make much sense anymore.
We've been getting 1080p since 2013, and now it's 2026. I don't think expecting 1080p-1440p/60fps is too much to ask. RTGI/Lumen is not worth it when the image quality looks like shit.
I agree if we're talking about current-gen consoles, because they simply can't run UE5 games with all their iconic features enabled (VSM, Lumen, and Nanite) while achieving reasonable image quality and 60 fps. Nanite and Lumen can make a difference, but not at 4K with a 25% resolution scale and poor image reconstruction (or simple bilinear upscaling). The benefits of these features are obscured by a blurry image and intense noise.
"High on life" is a good example. The first game runs on UE4 at 1800p on PS5, while the sequel runs on UE5 at 720p and use lumen / nanite / vsm. Technically, High on Life 2 has more advanced graphics, but the blur makes the game look worse. That's not the engine's fault, but the developers' fault because they opted for technology that can't run well on PS5 hardware at 60fps. I think PS5 has the power to run UE5 with all it's features, but only at 30fps.
For example, Black Myth: Wukong in quality mode on PS5 runs at 1440p reconstructed to 4K (with either FSR3 or TSR), and the image quality looks 4K-like to my eyes in this screenshot. I think this level of image clarity would make all PS5 owners happy, but it's only 30fps.
In the 60fps mode, though, the image looks like crap because the game is running at 1080p that's simply upscaled (bilinear filtering) to 4K. The game also uses FSR frame generation to hit 60fps even at 1080p. The cost of 60fps in UE5 on the PS5 is too high.
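For anyone curious how big the gap actually is, here's the quick pixel math on the resolutions mentioned in this thread (plain arithmetic; I'm reading "25% resolution scale" as a per-axis screen percentage, the way Unreal usually defines it):

```python
# Pixel counts for the resolutions discussed above (plain arithmetic).

def megapixels(w, h):
    return w * h / 1e6

print(megapixels(3840, 2160))  # 8.29 -- native 4K output
print(megapixels(2560, 1440))  # 3.69 -- Wukong quality mode, reconstructed to 4K (~2.25x gap)
print(megapixels(1920, 1080))  # 2.07 -- Wukong 60fps mode, bilinear upscale (4x gap)
print(megapixels(960, 540))    # 0.52 -- "4K at 25% resolution scale" (16x gap)
```

A reconstructor can plausibly hide a 2.25x gap; asking it to invent 16x the pixels is where the blur and noise come from.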
Since the PS5 has the power to run UE5 at its full potential (Lumen / Nanite / VSM) at 30 fps, I think it makes sense to use it for open-world games with a dynamic time of day and slow combat that can be enjoyed at 30 fps. Maybe it would also be possible to achieve 40fps with something like 1200p reconstructed to 4K. IMO, on a gamepad even 40fps already feels great and makes first-person shooters enjoyable.
I tested quite a few UE5 games on PC, and this engine is very scalable. Here's a good example: Cronos at its lowest settings doesn't use Lumen or Nanite, and the game runs at native 4K at 260fps.
If I enable Lumen and Nanite, however, I get 57fps at the same resolution.
57fps is playable but not ideal, so I used DLSS Quality + FG x2 to run the game at a high refresh rate.
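Here's the back-of-envelope version of what that combo does. My assumptions: DLSS Quality renders at roughly 2/3 scale per axis, and FG x2 roughly doubles presented frames minus some overhead; the internal frame rate below is a hypothetical number for illustration, not a measurement.

```python
# Back-of-envelope math for DLSS Quality + FG x2 at a 4K output (assumed ratios).

out_w, out_h = 3840, 2160
scale = 2 / 3                         # DLSS Quality preset, ~2/3 per axis (assumption)
render_w, render_h = round(out_w * scale), round(out_h * scale)
pixel_work = (render_w * render_h) / (out_w * out_h)
print(render_w, render_h, round(pixel_work, 2))  # 2560 1440 0.44 -> ~44% of native 4K pixel work

render_fps = 95                       # hypothetical internal frame rate after DLSS (not measured)
presented_fps = render_fps * 2 * 0.9  # FG x2 with an assumed ~10% overhead
print(round(presented_fps))           # ~171 presented fps
```

None of this makes Lumen itself cheaper; it just spreads the cost across fewer rendered pixels and interpolated frames.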
The UE5 engine can run well if developers carefully select the technology they use.