Posted by Bill's News on 11/18/06 22:59
Jukka Aho wrote:
> Bill's News wrote:
>
>> From recollection:
>> USA power cycle 60 Hz; human visual perception 40 Hz.
>> Interlacing was the tradeoff between bandwidth and cost vs.
>> watchability. Apparently, were humans 25% quicker at
>> perception, the continent might have gone progressive scan
>> from the beginning!
>
> Hmm. I don't see how this follows. If we suppose the 40 Hz
> figure holds true, 1.25 * 40 = 50 Hz. The current refresh
> rate, and temporal rate, for "real" video shot with an
> interlacing video camera is about 60 Hz in the US. What kind
> of progressive system are you suggesting for these
> hypothetical humans with a 50 Hz visual perception (however
> "visual perception" would be defined in this context)?
>
Oh, I was being facetious - as human perception will always be
slower than 1/50 sec. But my guess is that, were it otherwise,
since Europe was already on a 50 Hz electrical standard, they
might have been able to implement progressive-scan TV instead
of interlaced. I guess I needed a smiley thingy there.
Sorry ;-0)
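For the curious, here's a back-of-the-envelope sketch of the
bandwidth side of that tradeoff. The 525-line count and the use
of "lines per second" as a bandwidth proxy are illustrative
assumptions on my part, nothing more:

# Back-of-the-envelope: the bandwidth/flicker tradeoff behind
# interlacing. All figures are illustrative assumptions.

LINES_PER_FRAME = 525   # NTSC-style line count, assumed for illustration
FLICKER_HZ = 60         # refresh rate needed to beat perceived flicker

# Interlaced: each field carries half the lines, so the eye sees
# 60 Hz flicker while only 30 full frames/s worth of lines are sent.
interlaced = LINES_PER_FRAME * FLICKER_HZ / 2

# Progressive at the same flicker rate: every refresh carries all lines.
progressive = LINES_PER_FRAME * FLICKER_HZ

# The hypothetical above: 40 Hz perception would let a 50 Hz
# progressive system (matching European mains) suffice.
hypothetical_50p = LINES_PER_FRAME * 50

print(f"interlaced 60 Hz:  {interlaced:>8.0f} lines/s")
print(f"progressive 60 Hz: {progressive:>8.0f} lines/s (2x the bandwidth)")
print(f"progressive 50 Hz: {hypothetical_50p:>8.0f} lines/s")

Even the hypothetical 50 Hz progressive system costs roughly
two-thirds more line rate than 60 Hz interlaced - which is the
cost the interlacing tradeoff avoided.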
>> Regarding TV phosphors:
>> By the late 60's and very early 70's three methods of placing
>> alpha-numerics onto video screens were:
>
> "Placing alpha-numerics onto video screens" means computer
> monitors in
> this context, right?
>
Yes! Though, as I recall, equipment to provide "real-time" text
overlays on video camera images was emerging at the same time -
using much larger fonts and aided by moving backgrounds. A 5x7
dot font in a 7x9 cell at 80x25 characters on a black background
made for "tight" specs back then.
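Just to unpack what "tight" meant in practice, here's a quick
sketch of the numbers those specs imply. The 25% blanking
overhead is an assumed round number on my part, not any
particular product's spec:

# What a 5x7 glyph in a 7x9 cell at 80x25 implies, roughly.
# The blanking overhead is an assumed round number, not a spec.

CELL_W, CELL_H = 7, 9     # character cell in dots (5x7 glyph plus spacing)
COLS, ROWS = 80, 25       # text grid
REFRESH_HZ = 60           # one full redraw per mains cycle

active_w = COLS * CELL_W  # 560 dots per scan line
active_h = ROWS * CELL_H  # 225 scan lines of text
BLANKING = 1.25           # assume ~25% extra time for retrace and borders

dot_rate = active_w * active_h * REFRESH_HZ * BLANKING
print(f"active area: {active_w} x {active_h} dots")
print(f"approximate dot rate: {dot_rate / 1e6:.1f} MHz")
# Nearly 10 MHz of dot clock -- demanding for late-60s memory and logic.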
By the way, the same lab at which I worked in the 60's had a
square sheet of metal-oxide-coated Mylar spinning on a platter
under a record/playback head. Their thought was sports-broadcast
"instant playback" and NOT what eventually became the vast
market of CD/DVD.
Video tape did not lend itself to "instant replay" or anything
like what we have come to know in contemporary sports-replay
selection and composition.
>> Stroke writer (IBM - white), sawtooth (DataPoint - green),
>> raster scan (aka Standard TV, by Hazeltine - gold). Of those,
>> the latter was the most economical because of the mass
>> production of the TV subassembly. However, standard TV
>> bottles - as produced by Ball (the Mason Jar folk) at that
>> time - had major problems due to the fast decay time of their
>> white phosphor. The home TV was usually viewed under
>> incandescent lighting, while office alpha-numeric monitors
>> were typically viewed under fluorescent lighting. At that
>> time a longer decay time was the solution and a goldish
>> colored phosphor was mixed which satisfied all
>> memory/refresh/lighting conditions except "smooth scrolling."
>> Viewed under incandescent lights, the phosphor persistence
>> was more noticeable.
>
> What you appear to be saying is that in the late 60s (or
> early 70s), when the first "glass terminals" started to
> appear, standard tv phosphors were already fast-decaying and
> appeared a bit too flickery for computer use - at least at
> typical tv refresh rates. Right?
>
Yes. But recall that this was not obvious until viewed in
ambient fluorescent lighting - not the TV watcher's typical
environment.
> But in the historical context, it would be more interesting
> to know how fast- or slow-decaying the phosphors were in the
> 1930s when interlaced scanning was invented.
>
My memory is not that good. I think you touched on it in
another post. Visibly decayed below perception in no more than
the vertical retrace time (less than 1/60th of a second, no?).
Otherwise, the image would have looked similar to today's
display of interlaced images on digital screens.
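As a rough illustration of what "decayed below perception"
demands, assuming the previous field must fade within one
1/60th-second field interval: simple exponential decay and a 1%
visibility threshold are both idealized assumptions here - real
phosphor decay curves are messier.

import math

# Rough sketch: how fast must a phosphor decay so the previous
# field is invisible by the time the next one is drawn? Assumes
# pure exponential decay and a 1% threshold -- both idealized.

FIELD_TIME_S = 1 / 60   # one interlaced field at 60 Hz
THRESHOLD = 0.01        # assume <1% residual brightness is invisible

# L(t) = L0 * exp(-t / tau); solve L(t)/L0 = THRESHOLD for tau.
tau_max = FIELD_TIME_S / math.log(1 / THRESHOLD)
print(f"decay time constant must be under ~{tau_max * 1000:.1f} ms")
# ~3.6 ms -- a "fast" tv phosphor, whereas a long-persistence
# data-display phosphor (tens of ms) would smear any motion.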
>> It was about 10 years before all the ingredients came
>> together to render "smooth scrolling" on raster TVs with
>> short persistence phosphors, highly repeatable deflection,
>> and fast enough memory systems.
>
> That's all true - for computer monitors. But if standard tv
> phosphors were already too flickery (too fast-decaying) for
> computer use at tv refresh rates by the late 60s, or early
> 70s, does that not mean that they were fast enough for
> scrolling or panning without smearing the picture (in tv use,
> that is)?
Yes and no! In that era, neither inexpensive memory systems nor
inexpensive deflection yokes were up to the task of repeatable
(hitting the same spot on the phosphor) dot drawing at high
rates (hmmm, perhaps any rate?). Don't forget that the FAA was
already using drum-memory-driven 1000x1000 monitors at higher
refresh rates, with better beam deflection, to present graphics
and alpha-numerics at air traffic control centers. Large-screen
projection was a reality too; schlieren - I think - was the art
of the day. And this is really stretching my recall - it was an
oil-based screen written to by a cathode ray. Later, Fresnel
got into the act.
Anyway, back to phosphor decay. When faster memory came along
and the "art" of controlling the beam economically advanced,
fast decay was necessary to implement "smooth scroll." This was
a huge leap forward in the human interface, as previously the
"moving" alpha-numerics were unreadable until the scrolling
stopped.
To some degree, Hazeltine was restrained by technology in which
they held patents and expertise - TV and magnetic memory among
them. When they finally accepted volatile memory as a
substitute for iron cores and LSI for discretes, they began to
move forward more quickly.
DataPoint, with their non-TV approach, implemented "smooth
scroll" earlier than Hazeltine - and perhaps spurred Hazeltine's
further efforts in raster control. IBM, at that time, only used
page presentation, with a limited number of characters, and
never attempted to scroll.
By the time we got to the 90's CRTs for PCs, 60 Hz was no longer
a constraint, and mass production of these newer designs allowed
for economies of scale similar to those realized in the 60's.
We've come a long way, as once upon a time we could be
sterilized by sitting too close to our GE color TV ;-0)
Admittedly, I tried - it didn't work :-(