Yes, I never denied that there was judder for 24p, but that's technically supposed to be there as it's inherent with the source.
It's not supposed to be there though.
If you view a 24 FPS source at 24Hz on a CRT or other low-persistence display, motion is perfectly fluid.
It's only once you start repeating frames or viewing it on a high-persistence display that it is no longer smooth.
With that low of a framerate, you're just going to be missing information between frames. Doubling, tripling, or quadrupling the frames should, however, still result in smooth motion most of the time. This is at lower speeds of motion, of course, but you at least won't lose resolution. This is most obvious with long camera pans in movies. They're slower more often than not, so loss of resolution tends to be more noticeable than judder in these instances.
I've posted it here before, but this image should demonstrate how reducing persistence improves motion smoothness - so long as frames are not being repeated.
Caution: low framerate flashing images:
Both circles here are moving back and forth at 10 FPS.
The upper circle has full persistence and is displayed the entire time.
The lower circle has low persistence and is displayed for 1/6 the time.
If you cover up each half, you should see that the lower circle appears to be moving much more smoothly than the upper circle.
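A rough way to see why persistence matters here: when your eye tracks the moving circle at a steady speed, anything held on screen smears across the retina by roughly (tracking speed × time the frame stays lit). A quick back-of-the-envelope sketch, with an assumed speed since the image doesn't specify one:

```python
# Rough sketch of retinal smear while the eye tracks a moving object.
# Smear width ≈ tracking speed × time each frame stays lit (persistence).

fps = 10                     # both circles update at 10 FPS
speed = 600                  # assumed tracking speed in pixels/second (illustrative)
frame_time = 1 / fps         # 100 ms between new frames

full_persistence = frame_time       # upper circle: lit the whole frame
low_persistence = frame_time / 6    # lower circle: lit 1/6 of the time

smear_full = speed * full_persistence
smear_low = speed * low_persistence

print(f"full persistence smear: {smear_full:.0f} px")  # 60 px
print(f"low persistence smear:  {smear_low:.0f} px")   # 10 px
```

Six times less time on screen means six times less smear for the lower circle, which is why it reads as sharper and smoother even at the same 10 FPS.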
The same thing happens with 24 FPS motion displayed at 24Hz on a CRT. It looks as smooth as if you turned up the interpolation to its smoothest setting on a modern TV.
There are no artifacts though, because it's not using interpolation. There is, however, an awful lot of flickering.
As soon as you start repeating frames, it no longer looks smooth. Even going to 48Hz means that it starts to judder like you see on any recent TV with interpolation disabled.
48Hz on a low-persistence display arguably looks worse than a high-persistence display since you get very clear and distinct double-images instead of motion blur.
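The spacing of those double images is easy to estimate: each film frame gets flashed twice, 1/48 s apart, so while your eye keeps tracking the pan, the second flash lands offset on the retina by (pan speed × 1/48). A sketch with an assumed pan speed:

```python
# Double-image separation when 24 FPS film is double-flashed at 48Hz
# on a low-persistence display, assuming the eye tracks the pan.

pan_speed = 960           # assumed pan speed in pixels/second (illustrative)
refresh_hz = 48           # each film frame is flashed twice at 48Hz
flash_interval = 1 / refresh_hz

# The eye keeps moving between the two identical flashes, so the second
# flash lands offset on the retina by this many pixels:
separation = pan_speed * flash_interval
print(f"double-image separation: {separation:.0f} px")  # 20 px
```

20 pixels apart is far enough that you see two distinct copies rather than a blur, which is why 48Hz double-flash can look worse than sample-and-hold.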
I think we're arguing about two different things here. Your argument isn't so much against 24p on a low-persistence display, but instead against the inherent limitations of 24fps material. I can respect that argument as it's all fairly subjective anyway. High-persistence displays at least mask that with blur (at the expense of resolution). Given the choice of a good low- or high-persistence display for 24p, however, I personally would choose low-persistence. I prefer it because I like the clarity, am not bothered by the judder, and want something that's more accurate to the source.
I agree that 24 FPS is an inherently limited format, but it's made worse by the way that modern displays present it.
I don't know that I would agree with repeated frames being "more accurate to the source".
It's true that the frames being displayed on your television have less processing applied to them, which is closer to the data being sent to the TV.
However when you use motion interpolation to smooth things out, that is closer to the smoothness that 24 FPS sources were originally supposed to have - though it does add unintended artifacts.
So the question is really about whether you believe that leaving the data untouched is being more faithful to the source, or preserving the original look of motion is more faithful to the source.
Both have serious drawbacks caused by the low framerate.
Ideally the framerate would be high enough that interpolation is no longer necessary. (120 FPS or higher)
Motion interpolation isn't a terrible compromise on paper, but I've yet to see it implemented without artifacts. They often stick out like a sore thumb to me when I'm watching something with it enabled. Artifacts are kind of a given anyway since the processing has to guess what's in the next frame on the fly. I don't care for the soap opera effect either (and I'm certainly not alone here), but that can usually be reduced at least. I just prefer mucking with the source as little as possible. Interpolation shouldn't be necessary, however, to get decent motion on low-persistence displays.
"Soap opera effect" is just a disparaging term that 'purists' use because they have been conditioned to prefer bad motion.
These people just don't like smooth motion and are the sort that would argue against 60 FPS gaming too. (or worse, 120-240 FPS on PC)
Once OLEDs have native 120Hz input support, it should hopefully be possible to demonstrate what perfect 24Hz motion looks like again as you will be able to encode (or play back) a video using black frame insertion to reduce the effective displayed refresh rate to 24Hz - assuming the response time can keep up.
Perhaps that will convince them that the result you get from interpolation is how smooth 24 FPS material was originally intended to look before displays started repeating frames or using high-persistence.
It won't be CRT-like, but you could reduce persistence to ~8ms which would hopefully be low enough.
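The 120Hz arithmetic here works out cleanly: 120 / 24 = 5 refresh slots per film frame, so lighting the frame in one slot and inserting black in the other four leaves it visible for 1/120 s ≈ 8.3 ms. A sketch of that schedule:

```python
# Black frame insertion schedule: 24 FPS content on a 120Hz display,
# showing each film frame for one refresh and black for the rest.

display_hz = 120
content_fps = 24
slots = display_hz // content_fps            # 5 refresh slots per film frame

# Light the frame in the first slot only; black frames fill the rest.
schedule = ["frame"] + ["black"] * (slots - 1)
persistence_ms = 1000 / display_hz           # time the frame is actually lit

print(slots, schedule, f"{persistence_ms:.1f} ms")
```

That ~8.3 ms persistence is the figure mentioned above: not as low as a CRT's phosphor decay, but each frame is shown exactly once with no duplicates.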
I do agree that interpolation artifacts can be distracting but I'm not sure that they're worse than the severe judder you get without interpolation.
Yes, I know how plasma displays work. I never denied the trails were there or even noticeable. My argument was that most of those flaws aren't as perceptible with a 60p signal. The picture you posted, for example, doesn't reflect what it looks like at low or medium motion speeds, only at high ones. Not to mention some content masks this better than others. Thus, my point was that in typical situations, motion on a plasma can still look very good (particularly in comparison to the average LCD). Further, while its motion is not quite as good as CRT or comparable tech (as I had also mentioned), it's still relatively decent. Maybe I misread your earlier post, but I got the impression you didn't think very highly of plasma in this regard.
Well my point is that you don't really get a 60Hz image with a Plasma TV.
It depends on the model, but with the example used above, you get a low-persistence 600Hz presentation of a 60 FPS source - which brings about the same problems as the CRT displaying 24 FPS at 48/72/96Hz.
It's just that, the higher the refresh rate gets, the more those duplicated frames start to look like motion blur, because the difference in position between them gets smaller.
It's the same thing with LCDs that use high frequency PWM backlighting.
You won't find many people defending LCDs which use PWM-controlled backlights, but the motion artifacts from that are similar to a Plasma TV.
These examples would be 300Hz PWM (5 repeats at 60Hz):
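The copy spacing works the same way as in the 48Hz case above: for a tracked pan, the duplicates land (pan speed / flash rate) apart on the retina. A sketch comparing the two cases, with the same assumed pan speed:

```python
# Spacing between duplicate images for a tracked pan: the copies sit
# (pan speed / flash rate) apart on the retina, so higher flash rates
# pack them closer together until they blend into motion blur.

pan_speed = 960  # assumed pan speed in pixels/second (illustrative)

for flash_hz, label in [(48, "24 FPS double-flashed at 48Hz"),
                        (300, "60 FPS with 5 repeats (300Hz PWM)")]:
    spacing = pan_speed / flash_hz
    print(f"{label}: copies {spacing:.1f} px apart")
```

At 48Hz the copies are 20 px apart and read as distinct double images; at 300Hz they're only about 3 px apart, close enough that they merge into something resembling ordinary motion blur, which is the point being made above.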
Nonetheless, I'm curious where you got the images. Would you mind sharing the link? Seems like there's probably an interesting article attached to them.
Sorry, I saved them to my PC years ago and I don't have a link to the original article.