Cool, except that isn't true (it's mostly down to the developers, not the game engines, especially with a mature engine like UE)
Wrong. The only reasons developers re-engineer or customize an engine are that the built-in feature set doesn't cover their needs, or that performance is lacking. And why would anyone adopt an engine going in with the assumption that the engine they're paying for is lacking in performance?
So, it's always a matter of how far below expectation the final product is, and never above target.
Yes. Developer quality/investment matters, because art and design quality are massively impactful. But expecting devs to magically pull performance way above the norm is simply unrealistic. The whole premise of third-party engines is that the core render tech is already optimized; that's a key part of what these technologies sell, along with a pre-built pipeline and other productivity conveniences.
Stuff like native render resolution, and VRAM and RAM usage as a consequence of geometric and texture density, is very predictable, and because the target hardware has hard resource limits, those budgets end up being decided relatively early in development. They have to be stayed within; the major delta in the end result is the performance level achieved for those targets.
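To illustrate why these budgets are so predictable (and get locked in early), here's a minimal back-of-envelope sketch: texture memory is roughly width × height × bytes per pixel, plus about a third extra for the mip chain. All the counts, resolutions, and format sizes below are made-up assumptions, not numbers from any real project.

```python
# Rough VRAM budgeting: texture cost ~= width * height * bytes_per_pixel,
# with the full mip chain adding roughly 1/3 on top.

def texture_vram_bytes(width: int, height: int, bytes_per_pixel: float, mips: bool = True) -> float:
    """Approximate GPU memory for one texture (mip chain adds ~33%)."""
    base = width * height * bytes_per_pixel
    return base * 4 / 3 if mips else base

# Hypothetical per-scene texture set (counts and formats are invented for illustration).
scene_textures = [
    # (count, width, height, bytes_per_pixel) -- block-compressed formats are ~0.5-1 byte/px
    (200, 2048, 2048, 1.0),   # material albedo/normal sets
    (50,  4096, 4096, 1.0),   # hero assets
    (300, 1024, 1024, 0.5),   # masks / detail maps
]

total = sum(count * texture_vram_bytes(w, h, bpp) for count, w, h, bpp in scene_textures)
budget_gib = 6.0  # assumed texture pool left over after framebuffers, geometry, etc.

print(f"Estimated texture VRAM: {total / 2**30:.2f} GiB of a {budget_gib} GiB pool")
```

The point is that this kind of arithmetic can be done on day one for a given piece of target hardware, which is exactly why asset density and resolution targets get fixed early while raw frame rate remains the moving part.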
So when you see early footage, chances are it's the absolute best-case scenario, running on the most performant and least constrained hardware. By the end of development, when all the content and the overall vision are complete AND it also needs to function on the weakest target hardware, trims and compromises are almost guaranteed to have been applied across the board for consistency.
It's not an attempt to deceive when the final result shows cut-backs or reduced performance; it's just that there are limits on what can be optimized back.