Here's a hint then... don't make 30fps games. This whole fake AI-generated stuff is the exact wrong way to go. If the hardware cannot handle something, then it can't handle it. We are actually paying more money to cover that up than we used to pay for actual performance improvements.
Unfortunately, this is a pipe dream... hell, even borderline naive lol. There will always be 30fps games. As long as you have finite power in computer hardware and ever-growing ambitions or standards... there will always be 30fps games. Even the 4090, today at the pinnacle of gaming hardware, will sometime in the future only be able to run games at, at best... 30fps.
And that thing you said about paying more money to cover it up... couldn't be further from the truth. The issue is that performance improvements as we traditionally knew them are a thing of the past. We no longer get performance doubling from a simple node shrink, and those node shrinks are becoming prohibitively expensive. Think about it: going all the way from 90nm down to 28nm barely raised chip prices by more than 10%. Going from 28nm to 7nm saw chip prices rise by over 200%. And going lower is still getting more expensive.
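To put those two figures side by side, here's a quick back-of-the-envelope compounding of the price increases quoted above (the percentages are the rough illustrative numbers from this post, not audited foundry pricing):

```python
# Compound the per-transition chip price increases quoted above.
# Figures are rough/illustrative, normalized to a price of 1.0 at 90nm.
node_cost_increase = {
    "90nm -> 28nm": 0.10,  # ~10% total rise across those shrinks
    "28nm -> 7nm": 2.00,   # ~200% rise
}

price = 1.0
for step, increase in node_cost_increase.items():
    price *= 1 + increase
    print(f"{step}: normalized chip price {price:.2f}")
# 90nm -> 28nm: normalized chip price 1.10
# 28nm -> 7nm: normalized chip price 3.30
```

So by these numbers, the same class of chip costs roughly 3.3x what it did at 90nm, and the curve only gets steeper below 7nm.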
The writing was clearly on the wall... we cannot advance the industry anymore by simply throwing more cores at the problem. We have to throw "smarter cores" at it. And hence... AI.
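For the skeptics: the core idea behind "fake frames" is cheap to sketch. Real frame generation (e.g. NVIDIA's DLSS Frame Generation) uses motion vectors and a neural network, but even a naive toy version shows the economics: synthesizing an in-between frame costs far less than rendering one, so 30 rendered fps can be displayed as 60.

```python
# Toy frame-generation sketch: linearly blend two rendered frames to
# synthesize an in-between frame. Real implementations use motion vectors
# and learned models; this is only meant to illustrate the concept.
def interpolate_frame(frame_a, frame_b, t=0.5):
    """frame_a/frame_b: nested lists of pixel intensities (0-255)."""
    return [
        [round((1 - t) * a + t * b) for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]

prev_frame = [[0, 100], [200, 255]]
next_frame = [[50, 100], [100, 255]]
mid_frame = interpolate_frame(prev_frame, next_frame)
print(mid_frame)  # [[25, 100], [150, 255]]
```

The blend is a handful of multiply-adds per pixel, versus a full shading pass per pixel to render the frame properly. That cost gap is exactly the "smarter cores" trade.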
Oh, and then there is the other thing... better-looking games, even at 30fps, generate more buzz than better-performing games at 60fps. So if a dev ever had to choose between pretty grafixxxx at 30fps or bleeding-edge performance at 60fps... we know what they would choose. Case in point: just watch GTA6 be the best-selling game of this gen again... all the while running at 30fps.