Motion blur found in video footage is due to the exposure time. A single frame doesn't represent a single instant in time but a period of time in which objects might move, creating the typical stretched look known as motion blur. The higher the framerate at which a camera captures video, the shorter the exposure time has to be, leading to LESS motion blur.
I've said that exact thing multiple times, and I'm not sure what you think you're responding to.
I never, EVER claimed that high-framerate video tends to use more motion blur frame-for-frame. My claim is that, if the human visual system were to view absurdly (like, SUPER absurdly) high-framerate video, the time-averaging effect of vision would produce a motion-blurred result through exactly the same mechanism by which it produces motion blur when looking at "real" moving objects.
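To make that concrete, here's a minimal numeric sketch of the mechanism (my own illustration; the 10,000 fps framerate and 10 ms integration window are placeholder assumptions, not measured values). Every individual frame contains a perfectly sharp dot; the streak appears purely from averaging the frames that fall inside one integration window:

    import numpy as np

    # Hypothetical numbers, picked for illustration only.
    FPS = 10_000          # absurdly high capture/display framerate
    WINDOW_S = 0.010      # assumed ~10 ms integration window of the visual system
    WIDTH = 100           # 1-D "screen", in pixels
    SPEED = 2_000         # dot speed, pixels per second

    frames_per_window = int(FPS * WINDOW_S)
    perceived = np.zeros(WIDTH)

    for i in range(frames_per_window):
        frame = np.zeros(WIDTH)
        x = int(SPEED * i / FPS) % WIDTH        # dot position in this frame
        frame[x] = 1.0                          # every frame is perfectly sharp
        perceived += frame / frames_per_window  # time-average, a la persistence of vision

    # 'perceived' is now a streak about SPEED * WINDOW_S = 20 pixels long,
    # even though no individual frame contained any blur at all.
    print(np.count_nonzero(perceived), "pixels lit in the time-averaged result")

That averaged streak is the same image a camera would have produced with a 10 ms exposure, which is the whole point.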
If you have any evidence that supports your theoretical framework of motion perception, please share it, because otherwise it seems completely made up in light of observations of the real world.
I would, but unfortunately the phrase "persistence of vision" is wrapped up in an absolutely idiotic discussion amongst film theorists, and it's hard to google this stuff. Things like this VR discussion are some of the most interesting tidbits, though unfortunately not entirely hard science; that particular example leans on the concept I've been describing plenty (though it only explicitly says "persistence of vision" once).
Suffice it to say, it's reasonably well recognized that the human visual system forms a temporal low-pass filter of sorts, time-averaging the incoming light rather than registering each instant crisply. It explains why stroboscopic ghosting is noticeable in cases where you're pushing a decent frequency*, it explains why CRTs and film projection don't flicker if you use a respectably high refresh rate**, and it explains why motion blur occurs in human vision***.
I can't fathom how you think my claims "seem completely made up in light of observations of the real world," except that you seem to be misinterpreting most of what I say. In any case, if my assumption about visual time-averaging is wrong, I also can't fathom what you think causes motion blur in human vision.
*Because your vision doesn't drop stuff instantly, you might still be in the process of perceiving a past event when you begin perceiving a subsequent event.
**Human vision literally smears the moments where the phosphors are dark together with the bright moments (see the sketch after these notes). This is a good thing, because if our vision had the temporal sharpness to cleanly pick out the dark moments, we'd see the flicker pretty raw, which would make CRTs and film projection obnoxious enough to be basically unwatchable.
***Essentially the same reason that stroboscopic ghosting occurs, but over more continuous imagery.
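To put a rough number on the ** point: treat persistence of vision as a leaky integrator (a first-order low-pass filter; that model and the 20 ms time constant are simplifying assumptions on my part) and feed it a CRT-like pulse train. The filtered output barely ripples at a respectably high refresh rate and ripples badly at 24 Hz, i.e. visible flicker:

    import numpy as np

    def perceived_ripple(refresh_hz, tau_s=0.020, sim_fps=100_000, seconds=0.5):
        """Peak-to-peak brightness ripple after a first-order 'persistence' filter."""
        t = np.arange(int(sim_fps * seconds)) / sim_fps
        # Phosphor lit only briefly each refresh cycle (~10% duty cycle).
        stimulus = ((t * refresh_hz) % 1.0 < 0.1).astype(float)
        # Leaky integrator: y[n] = y[n-1] + alpha * (x[n] - y[n-1])
        alpha = (1.0 / sim_fps) / tau_s
        y = np.zeros_like(stimulus)
        for n in range(1, len(stimulus)):
            y[n] = y[n - 1] + alpha * (stimulus[n] - y[n - 1])
        steady = y[len(y) // 2:]            # skip the initial ramp-up
        return steady.max() - steady.min()  # residual "flicker" after filtering

    for hz in (24, 48, 85):
        print(f"{hz:3d} Hz refresh -> ripple ~ {perceived_ripple(hz):.3f}")

Same filter in every case; only the refresh rate changes, and the residual flicker falls off accordingly.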
//=================
Anyway, we've been going in circles literally all day, and I'm not going to respond to another "your claim is contradictory because high-framerate video has less motion blur" post.