While this is not a good look for AMD, it only affects maybe 1% of games. Cyberpunk, Metro Exodus, and Control are practically the only games in the last five years that have used ray tracing this heavily.
I'll speak from my own experience: I had a 2080 that I got for Cyberpunk. It turned out not to be enough to run Cyberpunk at a decent resolution with RT on, so I waited for a 3080. I eventually got one almost two years later, only to find out I'd have to set ray tracing to medium and DLSS to performance mode to get anywhere close to 60 fps. And now path tracing is pretty much a mess even with ray reconstruction.
Meanwhile, this year alone, the following games have come out and I've had to disable ray tracing in each for one reason or another. RE4 would crash if I used high-res textures and ray tracing together. Hogwarts Legacy would also run out of VRAM with RT at 4K DLSS Quality, so I stuck with ultra settings at 4K instead. Star Wars is a disaster, but it's a disaster on all GPUs. TLOU had no ray tracing and no Nvidia advantage.
In fact, since Cyberpunk came out, the 6950 XT, which was going for the same price as my 3080 when I got it (it's cheaper now), has beaten the 3080 in every single game. Go look at the games that came out in 2021, 2022, and 2023, and you'd be surprised how few have ray tracing, let alone ray tracing you can actually use on these VRAM-starved cards.
And yet no one really talks about the actual user experience. I wouldn't be surprised if there are fewer than 500k 4090s in circulation. AMD should definitely improve their RT performance, but once you're done with Cyberpunk's path tracing, you can look at Alan Wake 2 and then forget about another path-traced game until the next Metro comes out in three years.