When devs start planning a game, they build it around the technology they have at the time, so by the time it ships that tech can already be five years old if development took that long. Sure, they can slap on marketing terms like ray tracing, 4K, and VRR, but that doesn't change the game much.
That's why a GPU like the 4090 isn't used to its full potential; it's just a tool to push more frames and pixels in PS4-era games. Same for consoles: we're only now starting to see games made with the PS5 and Series X in mind, and some of those are still held back by last gen.
Nah, you're talking about hardware. The hardware might be from 2018, but tech is mostly software based and can be introduced mid-development. Sometimes devs create tech just for one level of a game. IIRC, ND had to completely rework their streaming system to make the train level in Uncharted 2 possible, and most developers in the PS3 gen introduced streaming mid-development.
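For anyone unfamiliar, "streaming" here just means loading the next chunk of a level in the background while the player keeps moving, instead of stopping for a load screen. Below is a minimal C++ sketch of the idea; the chunk layout, `load_chunk`, and `STREAM_RADIUS` are made-up illustrations, not ND's actual system.

```cpp
// Toy level-streaming loop: preload chunks ahead of the player asynchronously,
// promote finished loads into memory, and evict chunks left behind.
#include <chrono>
#include <future>
#include <iostream>
#include <iterator>
#include <map>
#include <string>

struct Chunk { std::string assets; };            // stand-in for loaded geometry/textures

Chunk load_chunk(int index) {                    // pretend the disk I/O happens here
    return Chunk{"assets_for_chunk_" + std::to_string(index)};
}

int main() {
    const int STREAM_RADIUS = 2;                 // how far ahead of the player we preload
    std::map<int, std::future<Chunk>> pending;   // chunks loading in the background
    std::map<int, Chunk> resident;               // chunks already in memory

    for (int player_chunk = 0; player_chunk < 5; ++player_chunk) {
        // Kick off async loads for chunks just ahead of the player.
        for (int i = player_chunk; i <= player_chunk + STREAM_RADIUS; ++i)
            if (!resident.count(i) && !pending.count(i))
                pending[i] = std::async(std::launch::async, load_chunk, i);

        // Promote finished loads into resident memory.
        for (auto it = pending.begin(); it != pending.end();) {
            if (it->second.wait_for(std::chrono::seconds(0)) == std::future_status::ready) {
                resident[it->first] = it->second.get();
                it = pending.erase(it);
            } else {
                ++it;
            }
        }

        // Evict chunks the player has already passed to keep memory bounded.
        for (auto it = resident.begin(); it != resident.end();)
            it = (it->first < player_chunk) ? resident.erase(it) : std::next(it);

        std::cout << "player at chunk " << player_chunk
                  << ", resident chunks: " << resident.size() << "\n";
    }
}
```

A real system juggles priorities, memory budgets, and decompression, but the core loop is the same: always be loading what's coming and dumping what's behind.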
Last gen, it was PBR materials. ND worked on engine upgrades to make the environments in Uncharted 4 and The Lost Legacy much larger while the games were in production, and they kept adding motion matching to TLOU2 right up until the very end, something no other dev has bothered messing with since.
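Motion matching, at its core, is a per-frame search: describe the character's current pose and desired trajectory as a feature vector, then pick the closest pose from a big database of captured animation. Here's a rough C++ sketch of that search; the feature layout, weights, and tiny database are invented for illustration and have nothing to do with TLOU2's real implementation.

```cpp
// Toy motion matching: brute-force nearest-pose search over a small database.
#include <array>
#include <cstddef>
#include <iostream>
#include <vector>

// A "feature" describes a pose: foot positions plus the desired root trajectory.
using Feature = std::array<float, 6>;  // {footL_x, footL_y, footR_x, footR_y, traj_x, traj_y}

float cost(const Feature& a, const Feature& b) {
    float c = 0.f;
    for (std::size_t i = 0; i < a.size(); ++i) {
        float w = (i >= 4) ? 2.f : 1.f;            // weight trajectory match more heavily
        c += w * (a[i] - b[i]) * (a[i] - b[i]);
    }
    return c;
}

// Every frame: pick the database pose with the lowest cost against the query.
std::size_t match(const std::vector<Feature>& db, const Feature& query) {
    std::size_t best = 0;
    float best_cost = cost(db[0], query);
    for (std::size_t i = 1; i < db.size(); ++i) {
        float c = cost(db[i], query);
        if (c < best_cost) { best_cost = c; best = i; }
    }
    return best;
}

int main() {
    // Tiny fake pose database (real ones hold thousands of captured frames).
    std::vector<Feature> db = {
        {0.f, 0.f, 0.3f, 0.f,   1.f,  0.f},   // walking straight
        {0.f, 0.f, 0.3f, 0.1f,  0.7f, 0.7f},  // turning right
        {0.f, 0.f, 0.2f, -0.1f, 0.f,  1.f},   // strafing
    };
    Feature query = {0.f, 0.f, 0.28f, 0.05f, 0.8f, 0.6f};  // current pose + desired heading
    std::cout << "best matching pose index: " << match(db, query) << "\n";
}
```

Shipping versions replace the brute-force loop with accelerated lookups and blend between the selected clips, which is why it eats so much engineering time.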
Watch the 1943 developer interview with Epic. They are literally building on tech that Epic is developing as we speak; it's pretty much concurrent.
Yes, most devs just slap on RT and call it a day, but plenty go above and beyond and invent new tech or use existing tech in new ways. Even movies are constantly doing that as they try to move away from green screen and CG toward full 360-degree sets.
Tech is the reason launch games always end up looking far worse than end-of-gen games. Compare TLOU1 and GTA5 to Resistance 1 and GTA4, or TLOU2 and Ghost of Tsushima to Infamous and Killzone Shadow Fall. Devs are constantly working on improving their tech. The problem is that over the last few years a lot of the A-tier devs have gone MIA, but watch them end the gen strong.