The game is VRAM bottlenecked almost all the time if you enable RT on an 8 GB card (primarily because the game limits itself to using only 6.4 GB of VRAM). The game then constantly begins using normal RAM as a substitute for VRAM.
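A back-of-envelope sketch of the kind of cap described above: the game appears to reserve roughly 20% of physical VRAM for itself, which turns 8 GB into a 6.4 GB budget. The 0.8 fraction is my assumption chosen to match the observed number, not anything confirmed from Nixxes' code:

```python
# Hypothetical model of the VRAM budget cap: the 0.8 fraction is an
# assumption that reproduces the observed 6.4 GB limit on an 8 GB card.

def usable_vram_gb(physical_gb: float, cap_fraction: float = 0.8) -> float:
    """Return the VRAM budget the game would allow itself under this cap."""
    return physical_gb * cap_fraction

for card_gb in (8, 10, 12):
    print(f"{card_gb} GB card -> {usable_vram_gb(card_gb):.1f} GB budget")
# An 8 GB card ends up with ~6.4 GB; a 12 GB card still keeps ~9.6 GB.
```

Under this model a 12 GB card like the 3060 keeps a budget comfortably above the ~10 GB the heaviest settings demand, which matches the behavior described later in this post.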
The test is all wrong, and he clearly failed to understand what made his 2070 perform like crap.
Not once in his video does he comment on full VRAM utilization. The performance difference between the 2070 and the PS5 stems entirely from running out of VRAM.
In this specific shot, performance drops because the card runs out of VRAM, not because of "the PS5 having better performance".
I actually proved it, in the exact same intro scene:
Even High textures cause a performance drop; with Low textures I manage to get 56 frames. So an 8 GB buffer cannot even fit High textures + ray tracing together at 4K without a performance drop.
Only at native 1440p are 8 GB + Very High textures usable, and only in short bursts of playtime. (Not 4K + upscaling: 4K + upscaling uses 4K LODs, and therefore stresses VRAM further.)
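The 4K-LOD point above can be illustrated with rough mip-chain arithmetic: engines pick mip levels from the output resolution, so 4K + upscaling still streams 4K-quality mips. The numbers below assume BC7 compression (1 byte per texel), which is an assumption for illustration, not measured from the game:

```python
# Rough estimate of per-texture memory for a full mip chain. BC7 at
# 1 byte/texel is an assumed compression format, not taken from the game.

def mip_chain_bytes(top_dim: int, bytes_per_texel: float = 1.0) -> float:
    """Total bytes for a square texture plus its full mip chain."""
    total, dim = 0.0, top_dim
    while dim >= 1:
        total += dim * dim * bytes_per_texel
        dim //= 2
    return total

for dim in (2048, 4096):
    mb = mip_chain_bytes(dim) / 2**20
    print(f"{dim}x{dim} mip chain: ~{mb:.1f} MiB per texture")
# The 4096 chain is ~21.3 MiB vs ~5.3 MiB for 2048: 4x the memory
# whenever the output resolution pulls in the top 4K mip.
```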
This is not to say NVIDIA and RTX cards are blameless here. They are to blame: lack of VRAM causes these abnormalities, and opportunists like NX Gamer carved a nice slice out of this problem for themselves.
The entire reason the game constantly hammers PCIe is not how the engine is built, but that the game artificially limits itself to a maximum of 6.4 GB of VRAM and uses normal RAM as a substitute. It is not doing anything fancy: no game on PC should ever rely on constantly transferring texture data on the fly from RAM to VRAM. PCIe 4 won't solve that either; RAM-to-VRAM transfers will always be slow, will always stall the GPU into worse performance, and will always create more problems.
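Some quick numbers on why RAM can never substitute for VRAM, and why PCIe 4 doesn't change the picture. The bandwidth figures are public spec-sheet values; the 1 GB payload is an arbitrary example:

```python
# Back-of-envelope transfer-time comparison. Bandwidths are spec-sheet
# numbers (RTX 2070 GDDR6, PCIe 3.0/4.0 x16); payload size is illustrative.

GDDR6_2070_GBPS = 448   # RTX 2070 local VRAM bandwidth, GB/s
PCIE3_X16_GBPS  = 16    # PCIe 3.0 x16, ~16 GB/s per direction
PCIE4_X16_GBPS  = 32    # PCIe 4.0 x16 doubles the link, still far behind

payload_gb = 1.0  # example: 1 GB of texture data
for name, bw in [("VRAM", GDDR6_2070_GBPS),
                 ("PCIe 3.0 x16", PCIE3_X16_GBPS),
                 ("PCIe 4.0 x16", PCIE4_X16_GBPS)]:
    ms = payload_gb / bw * 1000
    print(f"{name:>13}: {ms:5.1f} ms to move {payload_gb} GB")
# Moving 1 GB takes ~2.2 ms from VRAM, ~62.5 ms over PCIe 3.0,
# and still ~31.2 ms over PCIe 4.0: an order of magnitude slower either way.
```

Even doubling the link with PCIe 4 leaves the transfer roughly 14x slower than local VRAM, which is why streaming over the bus stalls the GPU no matter the PCIe generation.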
In this specific case, this happens because the GPU simply does not have the VRAM the game demands at 4K/RT + Very High textures. This configuration requires a solid, uninterrupted 10 GB buffer, which the RX 6800 has, so it does not run into similar performance issues.
If he actually had the courage to test the RX 6800 against the PS5 with RT enabled in similar scenes, he would also see that the PS5 is not "outperforming" the other GPUs; it would be decimated by the RX 6800. Notice how he cleverly avoided a direct comparison between the RX 6800 and the PS5: it would undermine everything he talked about in the very same video.
The PS5 only "seemingly" destroys the 2070 because the 2070's buffer is at its limits.
Just my points and observations. You can do whatever you want with this info. I don't like the dude, but sadly 8 GB of VRAM simply cannot enable the features the PS5 can enable with its uncompromised 10 GB budget, which creates anomalies like this. And it will continue to happen, especially if developers like Nixxes decide to artificially limit VRAM utilization to 6.4 GB, supposedly to leave "available VRAM" for background operations, even when you have nothing running in the background.
The RTX 3060's existence alone destroys the entire point of this video. If he had tested a 3060:
- It would match or exceed PS5 performance (unlike the 8 GB 2070)
- It would not hammer PCIe, since it has enough VRAM budget (unlike the 8 GB 2070)
Here's how the 3060 performs at NATIVE 4K (similar to the PS5's fidelity mode) with similar RT settings:
It gets 35-40 frames, just like the PS5 does. Look at his 2070 footage: constantly irregular frametimes, whereas the 3060 has a smooth ride all around.
As I said, it is indeed the 2070 that is the problematic part here.
I have no idea how 8 GB RTX GPUs will age, but considering these events, I'd simply say stay away from them for your own good. Wait for the 12 GB 4070.