ComputerNerd, I really hope you're a joke poster who's just messing with us.
ComputerNerd said:
Deinterlacing, IMO, is only proper if the data source is in 60 FPS. Otherwise you get more image distortion.
Deinterlacing is a technical matter. There are no opinions here, only facts.
Regardless, your opinion is wrong. If the original source is 60fps but is being output interlaced, exactly what is there to deinterlace? The deinterlacer will generally either weave successive fields together as though the content were video (30fps), creating major motion artifacts, or it will simply line-double each field. Even if it uses a clever interpolation algorithm for the line doubling, you can't pull data out of thin air. The image quality would be awful.
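To make those two failure modes concrete, here's a minimal sketch of weave and bob deinterlacing (Python with NumPy; the function names and the crude line-repetition shortcut are mine for illustration, not any particular deinterlacer's implementation):

```python
import numpy as np

def weave(field_top, field_bottom):
    """Weave: interleave two successive fields into one full frame.

    Fine for static content, but if the fields were sampled at different
    moments in time (e.g. a native 60fps source output interlaced), the
    result shows the combing/motion artifacts described above.
    """
    h, w = field_top.shape
    frame = np.empty((h * 2, w), dtype=field_top.dtype)
    frame[0::2] = field_top      # even lines from the top field
    frame[1::2] = field_bottom   # odd lines from the bottom field
    return frame

def bob(field):
    """Bob: line-double a single field into a full frame.

    Half the lines are simply invented (here by naive line repetition;
    real deinterlacers interpolate), so vertical detail is lost either way.
    """
    h, w = field.shape
    frame = np.empty((h * 2, w), dtype=field.dtype)
    frame[0::2] = field
    frame[1::2] = field          # repeat each line: data out of thin air
    return frame
```

Either way you lose something: weave sacrifices motion integrity, bob sacrifices vertical resolution.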
We have been talking about movie and video data, which is all 30fps or less, so I'm not even sure where you're pulling 60fps from. The only things that can produce a true 60fps are games, and in that case you are still completely wrong: native 60fps games output interlaced look like poo because of the deinterlacing. That is common knowledge here.
ComputerNerd said:
I don't see anything here about Blu-Ray disks being recorded in 1080i/60 (via VC-1).
Why is this statement being directed at me? When did I say that?
Protip: 1080i/60, in the nomenclature of that wiki page, does not exist.
ComputerNerd said:
I may be wrong, but are there any Blu-Ray movies recorded in 60 FPS?
No, and again: who stated that?
ComputerNerd said:
Edit: Nevermind, seems like Blu-Ray is strictly 24 FPS
Or 30fps.
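Since telecine comes up in the next quote, here's the pulldown arithmetic spelled out: 24fps film maps onto 60 interlaced fields per second via a 3:2 cadence. A toy sketch (Python; `pulldown_32` and the frame labels are illustrative, not any real tool's API):

```python
def pulldown_32(frames):
    """3:2 pulldown (telecine): map 24fps film frames onto 60i fields.

    Each pair of film frames yields 3 + 2 = 5 fields, so 24 frames/sec
    becomes 60 fields/sec (30 interlaced frames/sec). Inverse telecine
    undoes this by detecting the cadence and re-pairing the fields.
    """
    fields = []
    for i, frame in enumerate(frames):
        repeat = 3 if i % 2 == 0 else 2      # A gets 3 fields, B gets 2, ...
        for _ in range(repeat):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

# 4 film frames (1/6 second) -> 10 fields (also 1/6 second at 60 fields/sec)
print(pulldown_32(["A", "B", "C", "D"]))
```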
ComputerNerd said:
Edit 2: Oh, and to respond to something earlier, I do know what telecine is. I just don't think it makes a difference in this thread.
:lol OMFG :lol