Re: interlaced vs progressive scan question

Posted by Jukka Aho on 11/18/06 20:14

Bill's News wrote:

> From recollection:
> USA power cycle 60 Hz; human visual perception 40 Hz.
> Interlacing was the tradeoff between bandwidth and cost vs.
> watchability. Apparently, were humans 25% quicker at
> perception, the continent might have gone progressive scan from the
> beginning!

Hmm. I don't see how this follows. If we take the 40 Hz figure at face
value, 1.25 * 40 = 50 Hz. But the refresh rate - and the temporal
sampling rate - of "real" video shot with an interlaced camera is about
60 Hz in the US. What kind of progressive system are you suggesting for
these hypothetical humans with 50 Hz visual perception (however "visual
perception" would be defined in this context)?
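To make the arithmetic concrete, here is a minimal sketch using the rounded figures quoted in this thread (40 Hz perception threshold, 60 Hz field rate, 525 lines) rather than the exact NTSC values (59.94 Hz etc.):

```python
# Rough arithmetic behind the numbers in this thread. All figures are
# the commonly cited round approximations, not exact NTSC values.

perception_threshold_hz = 40              # flicker-fusion figure quoted above
hypothetical_hz = 1.25 * perception_threshold_hz  # "25% quicker" humans
print(hypothetical_hz)                    # 50.0 - still below the ~60 Hz US field rate

# NTSC-era interlace: ~60 fields per second, 2 fields per full frame
field_rate_hz = 60
frame_rate_hz = field_rate_hz // 2        # 30 full frames per second
lines_per_frame = 525
line_rate_hz = frame_rate_hz * lines_per_frame
print(line_rate_hz)                       # 15750 - the familiar ~15.75 kHz line rate

# A 60 Hz *progressive* system with the same 525 lines would need twice
# the line rate, and hence roughly twice the video bandwidth - which is
# the tradeoff interlacing was meant to dodge.
print(field_rate_hz * lines_per_frame)    # 31500
```

The point being: even "25% quicker" perception (50 Hz) is still below the 60 Hz refresh that interlace already delivers, so it is not obvious why it would have tipped the choice toward progressive scan.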

> Regarding TV phosphors:
> By the late 60's and very early 70's three methods of placing
> alpha-numerics onto video screens were:

"Placing alpha-numerics onto video screens" means computer monitors in
this context, right?

> Stroke writer (IBM - white), sawtooth (DataPoint - green), raster scan
> (aka Standard TV, by Hazeltine - gold). Of those, the latter
> was the most economical because of the mass production of the TV
> subassembly. However, standard TV bottles - as produced by Ball (the
> Mason Jar folk) at that time - had major problems due to the fast
> decay time of their white phosphor. The home TV was usually
> viewed under incandescent lighting, while office alpha-numeric
> monitors were typically viewed under fluorescent lighting. At
> that time a longer decay time was the solution and a goldish
> colored phosphor was mixed which satisfied all
> memory/refresh/lighting conditions except "smooth scrolling."
> Viewed under incandescent lights, the phosphor persistence was
> more noticeable.

What you appear to be saying is that in the late 60s (or early 70s),
when the first "glass terminals" started to appear, standard tv phosphors
were already fast-decaying and appeared a bit too flickery for computer
use - at least at typical tv refresh rates. Right?

But in the historical context, it would be more interesting to know how
fast or slow decaying the phosphors were in the 1930s when interlaced
scanning was invented.

> It was about 10 years before all the ingredients came together
> to render "smooth scrolling" on raster TVs with short
> persistence phosphors, highly repeatable deflection, and fast
> enough memory systems.

That's all true - for computer monitors. But if standard tv phosphors
were already too flickery (i.e. too fast-decaying) for computer use at
tv refresh rates by the late 60s or early 70s, does that not mean they
were already fast enough for scrolling or panning without smearing the
picture (in tv use, that is)?

--
znark
