Mesh shaders were announced back in 2018 with Nvidia's Turing, well over a year before UE5 and Nanite were unveiled.
Alan Wake 2 in 2023 was, I believe, the first mainstream game to actually use them.
That's roughly five years even for a team of wizards like Remedy to get there.
And even Shadows took time to implement it, and they still don't have full coverage yet.
So yes, adoption of mesh shaders/virtual geometry has NOT been timely, and most studios would rather take the advantages Epic and Unreal already ship in their solution than spend dev time building their own.
Fun fact: the PS2 basically had the ancestor of mesh shaders in its vector units. They didn't operate on workgroups (as far as I know) and didn't have much local memory (just a tiny scratchpad), but maybe that was sufficient for that generation's geometry compared to what we push now. There are probably quite a few things they couldn't have handled, but still, the ancestor.
Fafalada would probably know a lot more. Weren't they programmable only in assembly? That's fucking nuts.
They were never meant to be a Nanite back then, but they were fully programmable.
GeForce then introduced hardware T&L, and I guess acceleration was the biggest point of it all back then, programmability be damned; geometry just needed to be faster. From hardware T&L → vertex shaders → geometry shaders (avoided at all costs, they performed like shit) → mesh shaders. I wonder what would have happened if Sony had kept the vector unit idea and improved on it.
So the industry turned to fixed-function hardware to speed things up, but it's now at a point where it's back to compute, provided of course you have enormous geometric detail; otherwise fixed function is still faster.
Nanite is a tiny part of what mesh shaders can do; in its case the job is meshlet LOD (rough sketch of what that means below). Nanite is also still a hybrid with vertex shaders, and not all assets use mesh shaders, although Epic is slowly transitioning more of it to mesh shaders with each iteration, like 5.5 adding skinned meshes and tessellation/displacement.
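To make "meshlet LOD" concrete, here's a toy sketch of cluster selection in an object (amplification) stage, written in Metal since that's one of the mesh-shader APIs. Everything here is hypothetical for illustration (the `Cluster` layout, `errorThreshold`, the 64-per-group sizing); it's the generic cluster-LOD selection rule from the literature, not Nanite's actual code:

```cpp
#include <metal_stdlib>
using namespace metal;

// Hypothetical cluster record; real engines (and Nanite) pack this differently.
struct Cluster {
    float3 center;
    float  lodError;     // geometric error of this cluster's simplification
    float  parentError;  // error of its coarser parent in the LOD hierarchy
};

struct Payload {
    uint clusterIndices[64];  // survivors handed to the mesh stage
};

// Object stage: one thread per candidate cluster. A cluster draws when it is
// fine enough but its parent is not, so exactly one level of the hierarchy
// gets rasterized. One mesh threadgroup is launched per surviving cluster.
[[object]]
void selectClusters(object_data Payload &payload [[payload]],
                    mesh_grid_properties grid,
                    const device Cluster *clusters  [[buffer(0)]],
                    constant uint &clusterCount     [[buffer(1)]],
                    constant float &errorThreshold  [[buffer(2)]],
                    constant float4x4 &viewProj     [[buffer(3)]],
                    uint tid [[thread_index_in_threadgroup]],
                    uint gid [[threadgroup_position_in_grid]])
{
    threadgroup atomic_uint survivors;
    if (tid == 0) atomic_store_explicit(&survivors, 0u, memory_order_relaxed);
    threadgroup_barrier(mem_flags::mem_threadgroup);

    uint i = gid * 64 + tid;
    if (i < clusterCount) {
        Cluster c = clusters[i];
        // Crude perspective scaling of the error by 1/w at the cluster center.
        float w = max(abs((viewProj * float4(c.center, 1.0)).w), 1e-3);
        bool fineEnough      = c.lodError / w    <= errorThreshold;
        bool parentTooCoarse = c.parentError / w >  errorThreshold;
        if (fineEnough && parentTooCoarse) {
            uint slot = atomic_fetch_add_explicit(&survivors, 1u, memory_order_relaxed);
            payload.clusterIndices[slot] = i;
        }
    }

    threadgroup_barrier(mem_flags::mem_threadgroup);
    if (tid == 0) {
        uint n = atomic_load_explicit(&survivors, memory_order_relaxed);
        grid.set_threadgroups_per_grid(uint3(n, 1, 1));  // one mesh group per cluster
    }
}
```

The point is that the LOD decision happens on the GPU inside the render pipeline itself; no CPU round trip to decide what to draw.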
AMD / Nvidia / Apple / Intel have meshlet compression/decompression that happens in the pipeline, while I believe Nanite's was CPU-side (though it's been through so many versions it might not be anymore).
Mesh shaders are in their infancy, like programmable shaders were when they were introduced in 2001.
Mesh shaders are insane for procedural instancing: say, generating procedural hair on a bunny and going directly to rasterization within the same pipeline, without ping-ponging back through device memory. The old method for hair is a compute kernel for the hair geometry, then out to device memory, then over to a render encoder for traditional draw calls, and back to device memory again. It works, but it's slower, takes more memory, adds latency, and is a bit clunky; it doesn't really exploit current hardware capabilities. A mesh shader can also run its own algorithm to change that geometry on the fly without ever leaving the pipeline, because everything is programmable and custom (toy sketch below). Mesh shaders can also do isosurfaces like metaballs, fluids, marching cubes, etc. Think Blender's offline-rendering geometry tricks, but real-time.
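As a taste of that, here's a toy Metal mesh stage that grows hair strands procedurally and hands them straight to the rasterizer as line primitives, with no compute pass and no intermediate buffer in device memory. The strand math, hash, and sizes are made up for illustration; a real groom would pull root positions from the scalp mesh:

```cpp
#include <metal_stdlib>
using namespace metal;

struct HairVertex {
    float4 position [[position]];
    half3  color;
};

// Up to 64 vertices / 32 primitives per threadgroup, emitted as lines.
using HairMesh = mesh<HairVertex, void, 64, 32, topology::line>;

constant uint SEGMENTS = 16;  // line segments per strand (illustrative)

// Cheap hash for pseudo-random strand placement (illustrative only).
float hash11(float p) {
    p = fract(p * 0.1031);
    p *= p + 33.33;
    return fract(p * (p + p));
}

// Mesh stage: each threadgroup generates one strand. The geometry is created
// right here and rasterized directly; nothing round-trips through device memory.
[[mesh]]
void hairStrand(HairMesh outMesh,
                constant float4x4 &viewProj [[buffer(0)]],
                uint tid [[thread_index_in_threadgroup]],
                uint gid [[threadgroup_position_in_grid]])
{
    if (tid == 0) outMesh.set_primitive_count(SEGMENTS);

    // One thread per vertex along the strand (SEGMENTS + 1 vertices).
    if (tid <= SEGMENTS) {
        float t = float(tid) / float(SEGMENTS);
        float3 root = float3(hash11(float(gid)) * 2.0 - 1.0, 0.0,
                             hash11(float(gid) + 17.0) * 2.0 - 1.0);
        // Made-up curl: bend the strand sideways as it grows upward.
        float3 p = root + float3(0.1 * sin(6.0 * t + float(gid)), 0.3 * t, 0.0);

        HairVertex v;
        v.position = viewProj * float4(p, 1.0);
        v.color    = half3(0.4h, 0.25h + half(t) * 0.5h, 0.1h);
        outMesh.set_vertex(tid, v);
    }

    // Connect consecutive vertices into line segments.
    if (tid < SEGMENTS) {
        outMesh.set_index(tid * 2 + 0, uchar(tid));
        outMesh.set_index(tid * 2 + 1, uchar(tid + 1));
    }
}
```

Host-side you kick this off straight from the render encoder with drawMeshThreadgroups, which is exactly the no-ping-pong point.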