Honestly, since 2018 they haven't meaningfully improved their ray tracing performance. On paper:
Turing 1x ray/triangle intersection rate
Ampere 2x ray/triangle intersection rate
Ada 4x ray/triangle intersection rate
While older RT titles don't really stress this as much as path tracing does, Turing gets crippled in Cyberpunk 2077 Overdrive, where a high-poly, geometrically complex open world is just too freaking hard for a 1x ray/triangle rate.
They said the 4000 series would offer better RT performance for the same processing power, but that improvement turned out to be minimal at best.
A 4070 with 46 RT cores merely keeps up with a 3080 with 68 RT cores in CP2077 OD. Granted, Ampere runs lower clocks, but the 4070's clock advantage by itself already makes up for its roughly 1/3 fewer RT cores, so the claimed per-core improvement barely shows up in practice.
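For a rough sanity check, here's a back-of-the-envelope sketch in Python. The core counts are from above; the boost clocks (~1.71 GHz for the 3080, ~2.48 GHz for the 4070) are approximate spec-sheet figures I'm assuming here, and rt_throughput is just an illustrative toy, not how RT perf actually scales:

```python
# Toy RT throughput estimate: cores x clock x claimed per-core rate.
# Boost clocks are rough public spec figures (assumed), not measured values.
def rt_throughput(rt_cores, boost_ghz, per_core_rate):
    return rt_cores * boost_ghz * per_core_rate

rtx_3080 = rt_throughput(68, 1.71, 2)  # Ampere: 2x ray/triangle rate
rtx_4070 = rt_throughput(46, 2.48, 4)  # Ada: claimed 4x ray/triangle rate

print(f"on paper: 4070 = {rtx_4070 / rtx_3080:.2f}x the 3080")  # ~1.96x
print(f"clocks alone: {(46 * 2.48) / (68 * 1.71):.2f}x")        # ~0.98x
```

If the claimed per-core doubling were real, the 4070 should be pulling roughly 2x ahead in raw intersection throughput; instead, the clock difference alone already accounts for the parity we actually see.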
Maybe they can't improve it further, but it's been 6 years and their IPC gains have been non-existent. Just brute-forcing performance is simply not good enough, because that means the cards get bigger, hotter, and more expensive. It's literally 1:1 what happened with Intel. The only difference is that AMD didn't improve RDNA like they did Zen.
3080 628mm^2 → 4080 379mm^2
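Quick math on those numbers (GA102/AD103 are the underlying dies; the ~50% uplift figure comes from the 4080 comparison further down):

```python
# Sanity check on "the cards get bigger": die area went down, not up.
# 628mm^2 (3080, GA102) -> 379mm^2 (4080, AD103), per the sizes above.
ga102_mm2, ad103_mm2 = 628, 379
print(f"area change: {ad103_mm2 / ga102_mm2 - 1:.0%}")          # ~-40%
# Taking the ~50% uplift mentioned below, perf per mm^2 of silicon:
print(f"perf/mm^2: {1.50 / (ad103_mm2 / ga102_mm2):.2f}x")      # ~2.49x
```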
Again, what parallel universe is this?
"1:1 what happened with Intel"?
You could point to AI, but again, it's mostly on the software side. The 40 series cards get the same DLSS upgrades as the 20 series cards from 2018, aside from frame generation, which should be unlocked for all cards, but that's another topic. Now, if they can use their AI cores to somehow improve path tracing or ray tracing performance, great, but using them exclusively for DLSS for 6 years is resting on their laurels.
Maybe they take the next big leap forward with the 5000 series, but as someone who has owned both the 2080 and 3080, I'm rather underwhelmed by the progress. The 4080 being only ~50% more powerful than the 3080 despite being $500 more expensive is not a great look.
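Putting numbers on that (assuming launch MSRPs of $699 and $1199, which lines up with the "$500 more"):

```python
# Perf per dollar at launch MSRP. The $699 (3080) and $1199 (4080)
# figures are assumed here, consistent with the "$500 more" above.
perf_gain = 1.50           # 4080 ~50% faster than the 3080
price_ratio = 1199 / 699   # ~1.72x the price
print(f"perf/$ vs the 3080: {perf_gain / price_ratio:.2f}x")  # ~0.87x
```

That's a generational regression in perf per dollar, not just a smaller gain.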
AMD followed with the same pricing scheme, just scaling price gen to gen in line with the performance gain. They're both in a shit spot on pricing, and neither of them looks great. Welcome to GPUs in 2024.
CUDA, Gsync, AI, DLSS, RTX, Ray Reconstruction, ReSTIR, RTX Remix, AI HDR, etc. They're ahead in every perceivable way technologically, and judging by their papers they have many more things in the pipeline. All with a >80% market share monopoly and no competition in sight to push them toward these innovations.
There's nothing 1:1 to Intel here; what insanity is this?
Can you even name something Intel did since hyper-threading in 2002? I can't, at least not up until 1st gen Ryzen. That's nearly 15 years AMD had to get their shit together.
The only way AMD grabs a real portion of the market is if Nvidia gets so high up in AI gold money that gaming GPUs actually become a hindrance and "wasted" silicon for them, since they'd make way more cash using those wafers for AI cards. They'd give up on desktop GPUs entirely as they ascend to another realm as a >$10 trillion company.