Because it is more complicated. As Zathalus mentioned in the very post you quoted, it's far from perfect. It's not a catch-all that works automatically in every case in its current form, and you're describing it as something it currently is not. There is setup involved; the average Steam user isn't going to go into a repo and make builds, because it's not plug and play. I mentioned the Deck because it is curated to one specific hardware setup right now, which is why everything happens in the background there.
“Fossilize works across a wide variety of hardware and software. It even works on Windows. It’s not as perfect as it is on Steam Deck (as you said, fixed hardware), but it definitely helps with Vulkan games”
Well, on PC it runs in the background as well, integrated directly into Steam. One game I know it works well with is PoE on Windows. The end user doesn't really have to do anything.
This is not really a Linux vs Windows thing, it's a Vulkan vs DX12 one. Nothing is preventing Microsoft from integrating a system like this into Windows for DX12, and it would be fundamentally invisible to the end user. It's not perfect: if you're the first person with a specific driver and hardware config to run game X, then Fossilize cannot help you, so developers should still include a compilation step in the process. But the next user on the same GPU and driver can download the pipelines and shaders that were uploaded when your PC ran the game. This applies to games without a compilation step as well: you'll get shader stutter if you're the first one, but the next user won't, because your computer already did the work. With tens of millions of users feeding Fossilize, the odds of a cache hit are very high, basically near 100% on a fixed platform like the Steam Deck. If Windows had this built in, every game on every storefront would benefit.
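For anyone curious about the mechanism this leans on: Vulkan exposes a pipeline cache object that can be seeded with a blob captured on a previous run, or, in the shared-cache case, on someone else's machine with the same GPU and driver. Here is a minimal sketch of that seeding step; the `load_blob` helper and the cache file path are hypothetical, not part of any real tool:

```c
#include <vulkan/vulkan.h>
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical helper: read a previously saved pipeline-cache blob from disk. */
static void *load_blob(const char *path, size_t *size) {
    FILE *f = fopen(path, "rb");
    if (!f) { *size = 0; return NULL; }
    fseek(f, 0, SEEK_END);
    *size = (size_t)ftell(f);
    fseek(f, 0, SEEK_SET);
    void *data = malloc(*size);
    if (data) {
        fread(data, 1, *size, f);
    } else {
        *size = 0;
    }
    fclose(f);
    return data;
}

VkPipelineCache create_seeded_cache(VkDevice device) {
    size_t size = 0;
    void *blob = load_blob("pipeline_cache.bin", &size); /* hypothetical path */

    VkPipelineCacheCreateInfo info = {
        .sType = VK_STRUCTURE_TYPE_PIPELINE_CACHE_CREATE_INFO,
        .initialDataSize = size,  /* 0 on a cold start: the cache begins empty */
        .pInitialData = blob,
    };

    VkPipelineCache cache = VK_NULL_HANDLE;
    vkCreatePipelineCache(device, &info, NULL, &cache);
    free(blob);
    /* Pipelines created against `cache` can skip recompilation if their
     * state was already present in the seeded blob. */
    return cache;
}
```

The driver validates the blob's header (vendor ID, device ID, cache UUID) before using it, which is why a shared cache only helps users on a matching GPU and driver combination, exactly the limitation described above.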
So either Microsoft gets something similar done, or developers need to switch over to Vulkan entirely. The former is a toss-up and the latter probably won't happen any time soon.
But are you not describing the recent front-end API development that is part of the response to the back-end problems/challenges/issues that I described?
Basically, with increased geometry complexity and texture quality, the number of files and the volume of data being handled increases exponentially. This results in more traffic across the board, which in turn requires micromanaging the flow (e.g., via low-level APIs)?
Then your take is that the micro stutters are primarily driven by poor use of these lower-level APIs and not the other factors. I'm just saying that I believe that is only one piece of the puzzle (for the reasons stated).
Sort of. DX12 is meant to bring more performance when developers use it well, but most games would still run fine on DX11.
But shader stutter is definitely down to developers not having a process in place to catch it, not any hardware limitation. You can ship the most demanding UE5 game and it will stutter like mad even on the most powerful hardware, but if you include a compilation step that catches all shader permutations, you would suddenly be stutter-free (see the sketch below for what that step amounts to).
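In Vulkan terms, that compilation step is just walking every pipeline state the game can hit during a loading screen, so the driver compiles them against a warm cache instead of mid-gameplay. This is a rough sketch under stated assumptions; the permutation list and function names are hypothetical, and engines like UE5 wrap the same idea in their own PSO caching systems:

```c
#include <vulkan/vulkan.h>
#include <stddef.h>

/* Hypothetical permutation list: in a real engine this would cover every
 * material / vertex-format / render-state combination the game can hit. */
typedef struct {
    VkGraphicsPipelineCreateInfo create_info;
} ShaderPermutation;

/* Compile every known permutation up front (e.g. behind a loading screen)
 * so the first frame that needs a pipeline never triggers a driver compile. */
void prewarm_pipelines(VkDevice device, VkPipelineCache cache,
                       const ShaderPermutation *perms, size_t count) {
    for (size_t i = 0; i < count; i++) {
        VkPipeline pipeline = VK_NULL_HANDLE;
        vkCreateGraphicsPipelines(device, cache, 1,
                                  &perms[i].create_info, NULL, &pipeline);
        /* The compiled result lands in `cache`; the handle itself can be
         * kept for later use or destroyed if only the cache matters. */
        vkDestroyPipeline(device, pipeline, NULL);
    }
}
```

The hard part is not the loop, it is enumerating the permutations: miss one and that pipeline still hitches the first time it is used, which is exactly the gap a shared cache like Fossilize papers over.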
This only applies to shader stutter, of course; traversal stutter and other forms are a whole other matter. The PS5's I/O setup certainly helps with those, and on PC it takes specific engine and software optimisation to get the most out of the hardware, but UE5 is terrible at this, even on console.