LMAO. What is this?
RE4 would straight up crash if you turned RT on. No stuttering. No framerate drops. Straight up crashes. Every YouTuber, including NX Gamer, was able to replicate this. You had to turn textures down to 1GB or below just to get RT running on 8-10GB cards, regardless of resolution. You had a 4080 with 16GB, so of course you didn't have any issues. But if you didn't want textures popping in right in front of you, you had no choice but to leave RT off.
And "minor stutters" in Hogwarts? What are you on about? The game was a mess on day one, dropping from 100 fps to 25 fps if you entered the wrong room. It uses 25GB of system RAM with RT on. Hogsmeade performance was awful and CPU-limited, and it affected both AMD and Nvidia GPUs.
I really don't care for this rewriting of history when you can go back and watch the PC reviews of all of these games. Dead Space is still broken, and EA stopped patching it months ago. Just because you own a $1,200 GPU that less than 0.5% of PC gamers have doesn't mean the game wasn't broken on 99% of the GPUs out there.
Gotham Knights, Dead Space, Hogwarts, RE4, TLOU, and now Star Wars are all poorly optimized. Some of them, like Gotham Knights, Hogwarts, and TLOU, were patched after a few weeks, but Hogwarts and Gotham Knights still have a massive RT overhead. RE4 no longer crashes with RT on, but the latest patch fucked up the high-res texture settings: it now crashes if I select the 8GB or 6GB high-res textures. I have to settle for 4GB, and the LOD pop-in it causes is infuriating, even with RT off. This ran fine with 8GB textures before the latest patch, no pop-in. How do you explain that?
Some of these issues are VRAM-limited, but as the latest TLOU patch has proven, a 10GB 3080 has enough horsepower to run every PS5 game at twice the framerate, and enough VRAM to handle PS5-quality textures and effects. These games were just poorly optimized.