Posted by Richard Crowley on 01/01/06 18:22
"Clive Tobin" wrote ...
> Does anyone know where the A to D converters in
> digital video stuff such as MiniDV, Digital8 and DVD
> run out of 1's and clip the signal? Does anything bad
> happen when it does, such as bearding or losing sync?
> This could happen with impedance mismatch, maladjusted
> camera clipping point, etc.
>
> I have seen consumer VHS machines that put out about
> 1.25 volts of video. Presumably if digital machines are
> designed to accept this (so the public can convert their
> home tapes to DVD), then this would be about the limit.
> I'm just wondering how close to clipping level normal
> video would be. Thanks.
I thought the NTSC standard was 1 V p-p. I assumed that
most video equipment uses AGC to normalize the amplitude.
Since the sync portion is a specified proportion of the
total (40 of 140 IRE units), measuring it tells us what the
amplitude of the video part (the top 100/140, about 0.71 V
of a nominal 1 V signal) should be.
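Just to put numbers on that proportion, here is a quick
back-of-the-envelope Python sketch. The names are mine and the
1.25 V case simply echoes the figure quoted above, so treat it as
illustration rather than anything out of a spec:

# Composite NTSC proportions: 40 IRE of sync, 100 IRE of active video.
SYNC_IRE = 40
VIDEO_IRE = 100
TOTAL_IRE = SYNC_IRE + VIDEO_IRE   # 140 IRE peak-to-peak

def expected_video_volts(total_pp_volts):
    """Active-video amplitude implied by a given p-p amplitude."""
    return total_pp_volts * VIDEO_IRE / TOTAL_IRE

def expected_sync_volts(total_pp_volts):
    """Sync depth implied by a given p-p amplitude."""
    return total_pp_volts * SYNC_IRE / TOTAL_IRE

print(expected_video_volts(1.0))    # ~0.714 V for a nominal 1 V signal
print(expected_sync_volts(1.0))     # ~0.286 V of sync
print(expected_video_volts(1.25))   # ~0.893 V if a deck really puts out 1.25 V p-p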
It doesn't surprise me if some consumer equipment puts out
1.25 V (assuming it was properly terminated with 75 ohms,
etc.). They probably rely on the AGC of the destination
equipment, since video has its own built-in reference (the
sync pulses).
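To make the AGC idea concrete: because sync is supposed to be
40/140 of the signal, the destination can measure the sync depth it
actually receives and derive a correction gain from it. A rough
Python sketch, with my own names, purely illustrative and not how
any particular deck actually does it:

NOMINAL_PP = 1.0                      # nominal composite amplitude, volts p-p
NOMINAL_SYNC = NOMINAL_PP * 40 / 140  # nominal sync depth, ~0.286 V

def agc_gain_from_sync(measured_sync_volts):
    """Gain that restores nominal amplitude, judged only from sync depth."""
    return NOMINAL_SYNC / measured_sync_volts

# A 1.25 V p-p source has ~0.357 V of sync, so the destination's AGC
# would apply roughly 0.8x gain to pull it back to 1 V p-p ahead of the A/D.
hot_sync = 1.25 * 40 / 140
print(agc_gain_from_sync(hot_sync))   # ~0.8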