I think people have rose-tinted glasses about CRTs. I recall plenty of image quality issues with the technology. Lack of 1:1 pixel mapping (overscan, geometry distortion), moiré, and flicker are the biggest ones that come to mind. CRT was practically the definition of image quality tradeoffs, as you could tune parameters to do things like reduce moiré at the cost of losing sharpness.
No one is saying they were perfect, but motion on a CRT is still far better than on just about anything else. They have effectively zero latency, far better viewing angles than anything else, are far better at displaying low-resolution pixel art, don't have to worry about "non-native" input resolutions, and have better black levels and uniformity than most LCDs.
Flicker wasn't a detriment; it was an asset, and it's why CRTs had such good motion handling: each phosphor flash is only lit for a tiny fraction of the frame, so the image doesn't smear across your retina as your eyes track motion.
With LCD or OLED you have to introduce flicker (backlight strobing or black frame insertion) to make any significant improvement to motion handling.
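For a rough sense of the numbers: on a tracked moving object, perceived blur is roughly the display's persistence (how long each frame stays lit) multiplied by the eye-tracking speed. Here's a back-of-envelope sketch in Python, where the 960 px/s pan speed and the ~1.5ms phosphor persistence are illustrative assumptions, not measurements:

    # Rough estimate of perceived motion blur for a tracked object:
    # blur trail (px) ~= persistence (s) * eye-tracking speed (px/s).

    def motion_blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
        """Approximate blur trail length in pixels for tracked motion."""
        return speed_px_per_s * persistence_ms / 1000.0

    speed = 960.0  # px/s panning speed (illustrative assumption)

    # 60Hz sample-and-hold LCD/OLED: each frame stays lit the full ~16.7ms.
    print(motion_blur_px(speed, 1000 / 60))   # ~16 px of smear

    # CRT-like impulse display: phosphor is bright for only ~1.5ms per frame.
    print(motion_blur_px(speed, 1.5))         # ~1.4 px

    # Doubling refresh to 120Hz sample-and-hold only halves the blur.
    print(motion_blur_px(speed, 1000 / 120))  # ~8 px

That's also why strobing or BFI helps so much more than raw refresh rate: it cuts persistence directly instead of merely shortening the frame.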
If they were still being manufactured today, I would absolutely buy a brand new CRT monitor if there were something built to the spec of a Sony FW900.
I've given up on buying used displays at this point, though, because the last ones were manufactured 10-15 years ago now.
That's why I hope it's not too much longer before motion handling is something that manufacturers start trying to compete on again.
It looks like Philips may have finally started to do something about it with their OLED this year (though I suspect it strobes at 120Hz rather than 60Hz), and I'm hearing rumors that Sony are going to have a blur reduction mode as well.