Posted by Bill's News on 11/18/06 19:23
Jukka Aho wrote:
<snip>
> Do you assume those to be faster-decaying or slower-decaying
> than the modern phosphors? What is the problem you assume
> there being with 1930s phosphors? (This is not a trick
> question - there just does not seem to be a consensus about
> this. Some say the early CRT-based televisions had
> faster-decaying phosphors, and insist that interlaced scanning
> was designed, in part, to combat this problem. Others maintain
> that they had a longer afterglow. Go figure.)
>
<snip>
From recollection:
USA power-line frequency: 60 Hz; human flicker-perception
threshold: roughly 40 Hz.
Interlacing was the tradeoff between bandwidth and cost vs.
watchability: 60 fields per second keeps the refresh above the
40 Hz threshold while transmitting only 30 full frames.
Apparently, had human perception been 25% slower - a ~30 Hz
threshold - the continent might have gone progressive scan from
the beginning, since 30 full frames per second would have fit
the same bandwidth!
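Back-of-envelope, using nominal US numbers (my arithmetic, so
check it yourself):

    # The interlace tradeoff in round numbers: 525 lines, 60 Hz
    # mains, and a rough 40 Hz flicker-fusion threshold.
    FLICKER_THRESHOLD_HZ = 40
    POWER_LINE_HZ = 60          # scanning was locked to the mains
    TOTAL_LINES = 525

    # Interlaced: two half-frames (fields) per frame, so the eye
    # gets 60 refreshes a second while only 30 full frames are sent.
    field_rate = POWER_LINE_HZ                 # 60 > 40: no flicker
    frame_rate = POWER_LINE_HZ // 2            # 30 full frames/s
    interlaced_lines_per_s = frame_rate * TOTAL_LINES      # 15,750

    # Progressive at the same flicker-free 60 Hz would send every
    # frame whole, doubling the line rate and hence the bandwidth.
    progressive_lines_per_s = POWER_LINE_HZ * TOTAL_LINES  # 31,500

    print(interlaced_lines_per_s, progressive_lines_per_s)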
Regarding TV phosphors:
By the late 60's and very early 70's, three methods of placing
alpha-numerics onto video screens were in use: stroke writer
(IBM - white), sawtooth (DataPoint - green), and raster scan
(aka standard TV, by Hazeltine - gold). Of those, the last was
the most economical because of the mass production of the TV
subassembly.
However, standard TV bottles - as produced by Ball (the Mason
Jar folk) at that time - had major problems due to the fast
decay time of their white phosphor. The home TV was usually
viewed under incandescent lighting, while office alpha-numeric
monitors were typically viewed under fluorescent lighting. At
that time a longer decay time was the solution, and a
goldish-colored phosphor was mixed which satisfied all
memory/refresh/lighting conditions except "smooth scrolling":
viewed under incandescent lights, the phosphor's persistence
was more noticeable.
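To put rough numbers on the decay problem, here's a toy
exponential-decay model - the time constants are invented for
illustration, not Ball's actual phosphor specs:

    import math

    # Toy model: phosphor brightness after a refresh falls off
    # roughly as B(t) = B0 * exp(-t / tau).
    REFRESH_PERIOD_MS = 1000 / 60   # ~16.7 ms between refreshes

    def residual(tau_ms):
        """Fraction of brightness left when the next refresh hits."""
        return math.exp(-REFRESH_PERIOD_MS / tau_ms)

    # Invented time constants, for illustration only:
    print(f"fast white phosphor: {residual(2.0):.4f}")   # ~0.0002
    print(f"goldish long decay:  {residual(50.0):.4f}")  # ~0.72

    # ~0.02% left each cycle means visible flicker between
    # refreshes; ~72% left means a steady image - but anything
    # that moves drags a trail across several frames.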
It was about 10 years before all the ingredients came together
to render "smooth scrolling" on raster TVs: short-persistence
phosphors, highly repeatable deflection, and fast-enough memory
systems.
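For the curious, here is roughly what "smooth scrolling" asks
of the hardware - a toy sketch of the usual start-address
trick, which is my guess at the mechanism rather than what
Hazeltine or anyone else actually shipped:

    # Illustrative sizes, not any real terminal's. The display
    # reads its visible scanlines from a circular buffer; bumping
    # the start offset one scanline per field scrolls smoothly,
    # while jumping a whole character row gives the old jumpy
    # scroll. The catch: memory must feed every scanline on
    # schedule, and deflection must land each line in the same
    # place every field, or the illusion falls apart.
    VISIBLE_LINES = 240   # scanlines shown per field
    BUFFER_LINES = 256    # scanlines held in display memory
    CHAR_HEIGHT = 10      # scanlines per character row
    SMOOTH = True         # False reproduces the jumpy scroll

    start_line = 0
    step = 1 if SMOOTH else CHAR_HEIGHT

    for field in range(3):  # three consecutive 60 Hz fields
        start_line = (start_line + step) % BUFFER_LINES
        lines = [(start_line + i) % BUFFER_LINES
                 for i in range(VISIBLE_LINES)]
        print(f"field {field}: reads scanlines {lines[0]}..{lines[-1]}")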
But then, memory is the second thing to go - so my recollection
may be dimmer than I imagine ;-0)