I have yet to see a reasonable explanation of why AAA developers can't stick with UE4, build their assets at a lower resolution, and use upscaling tech. When I go back and play the big games from the last third of the PS4 era on my PS5, they look great and run great. I don't feel any real need for graphics like we've recently seen in Hellblade 2. I'd rather have modest graphics running at a smooth frame rate, on a reasonable budget that lets studios make the games they want to make.
Third-party engines have limitations on what they can and can't do, and to overcome those limitations you usually end up doing a lot of the dirty work yourself anyway. I'd say the main reason many devs adopt these engines at all is the large pool of workers already experienced with them, as opposed to an in-house engine that new employees would have to learn from scratch.
The issue with big-budget games isn't the quality of the assets so much as all the work required to make a game that matches those assets. The more detailed the environment, the more testing must be done, the more teams you need specialized in aspects like lighting or visual effects, and the more work goes into animation, sound, VA, directing, etc. More workers, more management, more logistics, more bureaucracy, and so on.
Late PS4-era games already had to deal with these issues, with equally long development times and ever-bigger budgets. They were just the first batch where devs looked back and were forced to consider the possibility that they can't keep this up forever. For the "modest graphics" and "reasonable budget" you speak of to mean anything, you'd have to be willing to revert to at least PS1/2/3-era visuals and presentation, and not necessarily the big hitters from those eras either.