Posted by Jukka Aho on 11/15/06 05:49
wdoe999@yahoo.com wrote:
> Joshua Zyber wrote:
>
>> Read this:
>>
>> http://www.hometheaterhifi.com/volume_7_4/dvd-benchmark-part-5-progressive-10-2000.html
> Thanks. Yes, I had seen that article before and it does seem to make
> the most sense. They are essentially confirming that progressive scan
> is NOT better than interlaced per se, rather a progressive scan image
> is better when the source is progressive scan (such as film).
That's a correct conclusion if we're talking about content that was
produced _natively_ as interlaced fields or _natively_ as non-interlaced
frames. Neither system benefits when it's being converted to the other
system.
But note that the refresh rate / frame rate matters, too. A 60 Hz (60 *
1000/1001 Hz) "progressive scan" display is not ideal for
film-originated content. 24 fps film-originated video would be best
displayed on a non-scanning display that updates the picture 24 times
a second, or on a scanning display that flashes each frame two times
(48 Hz) or three times (72 Hz) in a row, as movie projectors do.
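The arithmetic behind that can be sketched quickly. The snippet below (a rough illustration, not from the original post) computes how many display refreshes each film frame occupies at a given refresh rate; at 60 Hz the count alternates 2, 3, 2, 3 (the familiar 3:2 pulldown judder), while at 48 Hz or 72 Hz every frame gets the same number of flashes:

```python
from fractions import Fraction

def repeat_pattern(film_fps, refresh_hz, n_frames=4):
    """How many display refreshes does each successive film frame occupy?"""
    per_frame = Fraction(refresh_hz, film_fps)  # refreshes per film frame
    pattern, acc = [], Fraction(0)
    for _ in range(n_frames):
        nxt = acc + per_frame
        pattern.append(int(nxt) - int(acc))  # whole refreshes consumed by this frame
        acc = nxt
    return pattern

print(repeat_pattern(24, 60))  # [2, 3, 2, 3] -> uneven cadence (3:2 pulldown)
print(repeat_pattern(24, 48))  # [2, 2, 2, 2] -> every frame flashed twice
print(repeat_pattern(24, 72))  # [3, 3, 3, 3] -> every frame flashed three times
```

The uneven 2-3-2-3 cadence is exactly why 24 fps film looks less smooth on a 60 Hz display than on a 48 Hz or 72 Hz one.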
The article you were referred to above appears to be mostly
correct, but the animated tomato illustration and the animated depiction
of an interlaced scanning pattern give false impressions about
the topic. Both illustrations make the somewhat ludicrous (or at
least inaccurate) claim that your brain would somehow integrate _exactly
two adjacent fields at a time_ into a single picture. You will get a
better description of what is really happening here:
<http://lurkertech.com/lg/fields/fields.html>.
The "Interlace Scan" illustration also appears to suggest that "Field 1"
would be retained on the screen while "Field 2" is being drawn
in-between its lines. That's not true. The phosphors on modern CRT
screens fade away long before a single field refresh is complete. See,
for example:
<http://en.wikipedia.org/wiki/Image:Refresh_scan.jpg>
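To get a feel for how quickly the phosphors fade, here is a back-of-the-envelope sketch (my own illustration, assuming a simple exponential decay and a hypothetical ~1 ms persistence constant, which is in the right ballpark for typical short-persistence TV phosphors):

```python
import math

# One field period at 60 fields per second.
field_period_ms = 1000 / 60     # ~16.7 ms between successive fields
tau_ms = 1.0                    # assumed phosphor decay time constant (hypothetical)

# Fraction of the initial brightness left when the next field starts.
remaining = math.exp(-field_period_ms / tau_ms)
print(f"{remaining:.2e}")
```

Even with generous assumptions, essentially nothing of "Field 1" is still glowing by the time "Field 2" is drawn.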
> What had me confused is that a good percentage of the information on
> the web seems to be quite wrong (what else is new about the internet).
> Most articles make crazy statements about interlaced images having
> "half the resolution", or gaps between the lines in interlaced images
> (as if progressive scan images have fatter lines or something).
Most of the time the confusion arises because it is not clearly stated
what kind of progressive system the writer has in mind when making
these comparisons.
In my previous message to this thread, I gave a link to a Wikipedia
discussion page where I compared three different (but technically
related) "progressive scan" systems to a single interlaced system [1].
For example, if you're comparing an interlaced system to a "Progressive
variant A" system, as defined on that page, you _will_ get more visible
gaps between the scanlines (or rather, more discernible scanline
structure) - but note: the gaps are visible in the _progressive_ system,
not in the interlaced system. And, if you're comparing an interlaced
system to a corresponding "Progressive variant C" system (as defined on
that page as well), each field in the interlaced system has only half of
the vertical resolution when compared to the frames in the progressive
system. It all depends on what you're comparing to what.
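A crude numeric sketch of that comparison (my own illustration; the exact variant definitions are on the linked talk page, so the labels "variant A" and "variant C" below are paraphrased from the description above, and 480 active lines is just an assumed example raster):

```python
ACTIVE_LINES = 480                      # assumed active lines in the full raster

# Interlaced: each field carries every other line of the full raster.
lines_per_field = ACTIVE_LINES // 2     # 240 lines per field, 60 fields/s

# "Variant A" (as described above): progressive at the field rate with the
# line structure of a single field -- half the raster lit, so the scanline
# gaps are in the *progressive* picture.
variant_a_lines = ACTIVE_LINES // 2     # 240 lines per frame, 60 frames/s

# "Variant C" (as described above): full-raster progressive at the field rate.
variant_c_lines = ACTIVE_LINES          # 480 lines per frame, 60 frames/s

# Compared with variant C, a single interlaced field carries only half
# the vertical resolution:
print(lines_per_field / variant_c_lines)  # 0.5
```

So both common claims ("interlace has gaps" and "interlace has half the resolution") are each true against one progressive variant and false against another.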
_____
[1] Here's the link again: <http://en.wikipedia.org/wiki/Talk:Interlace#Comparing_interlace_to_progressive>
--
znark