Can we get devs to work smarter too, so we don't need bullshit frames to be able to play?
That's what graphical settings are for.
In a lot of games, settings like shadow resolution look damn near identical between ultra and medium, sometimes at half the performance cost.
So why set it to ultra when it barely looks any better (if at all) than something lower?
If you insist on gaming ONLY in 4K, make use of the resolution scalers. Many games offer plenty of tools to improve performance, as the rough math below shows.
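Just to put numbers on it (this is a rough illustration, not tied to any specific game or upscaler; the scale factors are example values, e.g. the "Quality" presets in DLSS/FSR sit around 0.67 per axis):

```python
# Rough illustration: how much pixel work a resolution scaler saves at 4K.
# Scale factors below are example values, not any game's exact presets.
def render_pixels(width, height, scale):
    """Pixels actually shaded per frame at a given internal render scale."""
    return int(width * scale) * int(height * scale)

native = render_pixels(3840, 2160, 1.0)
for scale in (1.0, 0.77, 0.67, 0.5):
    px = render_pixels(3840, 2160, scale)
    print(f"scale {scale:.2f}: {px:,} px ({px / native:.0%} of native 4K)")
```

At roughly 0.67 scale you're shading under half the pixels of native 4K while the upscaler fills in the rest, which is exactly why those settings exist.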
You can only optimize so much before you have to come to terms with the fact that your hardware is just too weak.
Just because a high-end graphics card came out today doesn't mean it will be able to handle a game that also came out today.
The ratio between hardware power and what games demand is changing.
Back then, you could buy a new card and it would play games on ultra for years afterward.
But those games can't remotely compare to the technology in games today.
Most modern games have graphics that look like CGI cutscenes, and those cutscenes used to be made on render farms.
It's a bit unreasonable to expect 1 card to do all of that in real time.
The computational requirements for today's graphics are starting to magnify. Not increase, MAGNIFY.
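Some back-of-the-envelope math (purely illustrative numbers; real cost also scales with effects like ray tracing, not just pixels) shows why the demands multiply rather than creep up:

```python
# Why requirements multiply: pixel count and target frame rate stack on top of each other.
resolutions = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}
baseline = resolutions["1080p"] * 60  # pixels shaded per second at 1080p / 60 fps

for name, pixels in resolutions.items():
    for fps in (60, 144):
        work = pixels * fps
        print(f"{name} @ {fps} fps: {work / baseline:.1f}x the pixel work of 1080p60")
```

Going from 1080p60 to 4K at 144 fps is already nearly 10x the raw pixel throughput, before you add a single modern rendering feature on top.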