I'm sorry to bump this thread, but... just WTF is going on with DirectStorage in this game on NVIDIA GPUs? We all tested this way back when there was only a demo of it, and it was clear as day: DirectStorage had a massive advantage over the traditional I/O path. So why the hell does it perform so horribly in Ratchet & Clank? Or, probably the more important question right now: if it's this bad in R&C, how bad will it be in Horizon next week, given that it's still not fixed in R&C and probably never will be?
I mean, honestly, the first time I launched the game and saw horrendous performance drops for no reason, I thought it might be a shader compilation thing and that I should just wait a bit in the main menu for the process to finish before playing. But no: the CPU did basically nothing while I waited, and utilization was extremely low, which is not a good sign. I also know it's not a GPU problem at all, because a 3080 Ti is more than capable of running this game at over 100 FPS at 1440p, almost completely maxed out with all RT features and no upscaling.
So imagine my surprise when, after checking PCGamingWiki and deleting the DirectStorage .dll files, I saw the game running almost flawlessly even during intense firefights with lots of explosions, particles, etc. It also loads worlds very fast during portal transitions, and CPU utilization is absolutely superb. PCGamingWiki has a note saying you need a beefy CPU to run the game without DirectStorage, but the 8700K is 4+ years old at this point. There are a few brief dips to 56 FPS in the prologue section, but they're barely noticeable since they happen outside of gameplay.
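For anyone wanting to try the same workaround: the DirectStorage runtime ships as dstorage.dll and dstoragecore.dll in the game folder, and renaming them (rather than deleting) makes the change easy to revert if a patch fixes things. Here's a rough sketch of the idea; the path below is a throwaway demo directory with dummy files, so substitute your actual install folder (and obviously back things up first):

```shell
# Demo directory with dummy DLLs standing in for the real game folder.
# Replace GAME_DIR with your actual install path, e.g. under steamapps/common.
GAME_DIR="$(mktemp -d)"
touch "$GAME_DIR/dstorage.dll" "$GAME_DIR/dstoragecore.dll"

# Rename instead of deleting, so the game falls back to the traditional
# Win32 I/O path and you can restore DirectStorage later by renaming back.
for dll in dstorage.dll dstoragecore.dll; do
    if [ -f "$GAME_DIR/$dll" ]; then
        mv "$GAME_DIR/$dll" "$GAME_DIR/$dll.bak"
    fi
done

ls "$GAME_DIR"
```

To undo it, just move the .bak files back to their original names.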
What the hell happened?