Completely understand what you are saying, but other than Nanite, why is there such a discrepancy between UE4 and UE5 games in the stability of their performance? Surely those misgivings about developer skill should carry over from UE4 to UE5 and just tank performance outright, rather than make it unstable like it seems to be in many UE5 games, no?
The reason I had believed that flow control with interpreted bytecode vs native code would be a noticeable issue was rooted in using Java two decades ago.
I realised that even with benchmarks showing Java's throughput comparing well against C, it was actually the latency at the interfaces that really highlighted the performance difference in task switching.
So I'd assumed the latency stability of the Blueprints interpreter would be similar, and as flow control is typically bound to a single core/thread, I assumed that instability in latency would cascade up through the engine, causing marginally bigger performance losses. But it is interesting that UE's VM doesn't have that issue, and you have displaced a misplaced view of VM interpreters in general that I've held over the years.
Just to be clear, VMs _are indeed_ slower. Blueprint code runs slower than C++ code despite achieving the same thing. But if, say, you have a single metric (average fps) that you measure performance by, that in itself won't tell you why performance is what it is, and you have to look closer: what's the CPU doing, what's the GPU doing, and so on.
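To make "look closer" concrete: in Unreal that usually means `stat unit` / `stat GPU` in the console plus Unreal Insights, and instrumenting your own suspect code so you can see where CPU time actually goes. A minimal sketch, with made-up stat and function names:

```cpp
#include "CoreMinimal.h"
#include "Stats/Stats.h"

// Hypothetical stat group/entry; shows up in-game via the console command "stat MyGame".
DECLARE_STATS_GROUP(TEXT("MyGame"), STATGROUP_MyGame, STATCAT_Advanced);
DECLARE_CYCLE_STAT(TEXT("Update Crowd"), STAT_UpdateCrowd, STATGROUP_MyGame);

void UpdateCrowd()
{
    // Everything inside this scope is attributed to STAT_UpdateCrowd,
    // so you can see whether this code is actually where the CPU time goes.
    SCOPE_CYCLE_COUNTER(STAT_UpdateCrowd);

    // ... game-side work ...
}
```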
And chances are you will run into performance issues because of content way before you run into them because of your own code, if you are somewhat competent.
Code on the game layer, not the engine layer, can easily tank performance just the same if you are doing things the wrong way, and some things definitely should be done in C++ instead of BP (although BP is completely fine for the average indie game).
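For what that can look like in practice, here is a minimal sketch (class, file and function names are hypothetical): a loop that would pay Blueprint VM overhead per iteration if done in the graph, collapsed into a single BlueprintCallable C++ call.

```cpp
// MyPerfHelpers.h (hypothetical file)
#pragma once

#include "CoreMinimal.h"
#include "Kismet/BlueprintFunctionLibrary.h"
#include "MyPerfHelpers.generated.h"

UCLASS()
class UMyPerfHelpers : public UBlueprintFunctionLibrary
{
    GENERATED_BODY()

public:
    // Iterating a big array in a Blueprint ForEach pays interpreter cost per element;
    // doing it here means the graph pays for one call instead.
    UFUNCTION(BlueprintCallable, Category = "Performance")
    static FVector FindClosestPoint(const TArray<FVector>& Points, const FVector& Origin)
    {
        FVector Best = Origin;
        float BestDistSq = TNumericLimits<float>::Max();
        for (const FVector& Point : Points)
        {
            const float DistSq = FVector::DistSquared(Point, Origin);
            if (DistSq < BestDistSq)
            {
                BestDistSq = DistSq;
                Best = Point;
            }
        }
        return Best;
    }
};
```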
My point isn't that game logic, especially using BP, can't be a problem, but more so that if you are semi-competent, follow best practices and have a general awareness of code performance, this isn't the first problem perf-wise you'll run into when you make a new Unreal project.
It's the C++ systems or the GPU work being made by users to do more lifting than they should, and that's due to content being abused.
Some of the knowledge required is arcane, but a lot of it is ignored best practice.
I am not even sure that UE4 games performed better on average; UE5 games run fine for the most part, although some games definitely don't.
That used to be the case with UE3 and UE4 games too, though. To what degree this has changed, I cannot say.
There is a perception of that on social media currently, but a lot of it is just voices screaming louder than before, or people grifting, such as Threat Interactive.
But let's say average performance has indeed gone down, with more frame dips, more stutters and so on: then I'd say it's a combination of new technology having to go through a ripening process, where systems aren't as optimized as they should be yet, and a misguided perception on the part of the gamer populace.
For example, people can point to bad frames without frame generation and say "why the hell does it perform this badly without frame generation?" and continue playing without DLSS or frame gen etc.
But that's because they hold on to their outdated perception of how games are supposed to work.
There is no "supposed to work": it's all about tradeoffs. One might disagree with which tradeoffs are chosen, but their existence is based in reality and cannot simply be "fixed".
For example, if modern tech is disproportionately more expensive for modern hardware due to raytracing, one can either a) wait for hardware to catch up, b) avoid raytracing as much as possible, or c) use the proposed solutions such as DLSS and frame gen.
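In engine terms, b) and c) often just come down to renderer settings a game exposes in its options menu. A rough sketch using console variables (the preset name is made up, and the exact cvar names and values are assumptions to check against your engine version):

```cpp
#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

// Hypothetical "budget" preset: back off hardware ray tracing for Lumen and
// lean on upscaling instead of native resolution.
void ApplyBudgetPreset()
{
    if (IConsoleVariable* HardwareRT = IConsoleManager::Get().FindConsoleVariable(TEXT("r.Lumen.HardwareRayTracing")))
    {
        HardwareRT->Set(0); // fall back to the software Lumen tracing path
    }

    if (IConsoleVariable* ScreenPercentage = IConsoleManager::Get().FindConsoleVariable(TEXT("r.ScreenPercentage")))
    {
        ScreenPercentage->Set(66.0f); // render at ~2/3 resolution and let the upscaler reconstruct
    }
}
```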
The idea of "fake frames" and "real frames" didn't exist before, but they are literally solutions to some of the frame problems. Solutions that one might not like, but they are solutions regardless.
It's not that people just made expensive systems and rendering for no reason, saw low frames, and called it a day. It's all an ecosystem working together.
Bright minds decided this is the best possible course, which doesn't mean it's ideal.
As for "where are things going wrong, when developer skill or lack thereof should have translated from UE4 -> UE5" (implying the engine is at fault; arguably not 100% true, but let's say it is):
Changes to technology require changes to workflows. For example, overdraw is and will continue to be a big problem in dense foliage environments.
But with Nanite one should handle foliage differently than before, namely using actual geometry for leaves rather than alpha cards. Alpha cards were costly in UE4 as well, but the strategy for working with them has changed between UE4 and UE5 with Nanite.
It's possible however that not everyone got the memo.
The same goes for Lumen. Lumen wants geometric thickness to prevent light leaks. Previously in UE4, this often wasn't necessary and walls in interior spaces could be flat planes.
This also goes for temporal artefacts/smear etc.
There are ways to mitigate and work with them, but not everyone has made the transition to new workflows or paradigms.
That being said, it's not just the developers' responsibility; or rather, it is, because it's a professional effort, but in the complex world of modern game dev no one has a complete picture of all the details. So Epic can help improve the situation, and we do with every engine release.