Posted by PTravel on 01/12/59 11:52
"Richard Crowley" <rcrowley@xpr7t.net> wrote in message
news:12b4o3hjeatv0fb@corp.supernews.com...
> IMHO, Mr. Tauger and Mr. Heffels are both correct. The problem is that we
> are still using "analog" terminology
> even here in the digital age.
That's "Mr. PTravel." ;)
>>> You can dupe a D-25 tape 18 times and the 18th copy will be identical to
>>> the first.
>
> Not by engineering standards. It is highly unlikely that the low-grade
> error detection and correction used in the DV tape format could read and
> write 18 sequential dubs with 100% accuracy. OTOH, if you were to say
> that usually you can't *see* any anomalies from an 18-generation DV dub,
> that is a different matter.
Not to be confrontational, but do you have any stats for that? D-25 data on
a miniDV tape is also far less dense than data on a hard drive, which is one
of the reasons less robust error correction is required. Dropout on miniDV
isn't the result of read errors, but of magnetic media flaking off the
binder and leaving an empty spot with no data. As long as the integrity of
the magnetic media is maintained (and it isn't exposed to heat or magnetic
fields that would corrupt the alignment of the magnetic particles), the
data will be read without error.
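To put rough numbers on the disagreement, here is a back-of-the-envelope sketch. All of the figures in it (bit-error rate, block size, correction capability) are illustrative guesses, not actual DV25 code parameters: it just shows how "errors per block stay within what the code can correct" and "18 sequential dubs" interact.

```python
import math

def p_block_fail(n_bits, ber, t):
    """Probability that a block of n_bits suffers more than t bit
    errors, i.e. exceeds the code's correction capability."""
    p_ok = sum(math.comb(n_bits, k) * ber**k * (1 - ber)**(n_bits - k)
               for k in range(t + 1))
    return 1 - p_ok

# Illustrative numbers only -- not real DV25 parameters.
ber = 1e-5       # raw bit-error rate coming off the tape
n_bits = 680     # size of one inner-code block, in bits
t = 2            # assume the code corrects at most 2 bit errors per block

per_generation = p_block_fail(n_bits, ber, t)
after_18_dubs = 1 - (1 - per_generation) ** 18
print(per_generation, after_18_dubs)
```

Per block the odds of an uncorrectable error look tiny, but a full tape holds an enormous number of blocks, so multiplying through is what separates "bit-identical over 18 generations" from "no visible anomalies over 18 generations."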
>
>
>>> Yeah -- one is tape, one isn't. However, the statement stands: error
>>> correction is used for both media,
>
> Note that the error detection, and particularly the error
> correction mechanisms used for *data* are substantially
> more rigorous and effective than those used for *media*.
> This applies to Red-Book audio CDs (vs. data CD-ROM)
> as well as for DV tape vs. a DV-AVI computer file. This
> is, for example, the reason you cannot store as much info
> on a CD-ROM as you can on an audio CD. I have many
> examples of audio CDs I have made where the raw WAV
> tracks won't fit on the same disc because of the extra
> overhead from the (Orange Book) data format.
Agreed, but see above re: data density.
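The audio-vs-data capacity gap falls straight out of the sector layout: a Red Book audio sector delivers all 2352 bytes as payload, while a Mode 1 data sector keeps 2048 bytes and spends the rest on sync, header, EDC and ECC. A quick sketch for a standard 74-minute disc:

```python
SECTORS_PER_SECOND = 75
MINUTES = 74                    # a standard 74-minute disc

AUDIO_BYTES_PER_SECTOR = 2352   # Red Book: the whole sector is payload
DATA_BYTES_PER_SECTOR = 2048    # Mode 1: remainder is sync/header/EDC/ECC

sectors = SECTORS_PER_SECOND * 60 * MINUTES
audio_mb = sectors * AUDIO_BYTES_PER_SECTOR / 2**20
data_mb = sectors * DATA_BYTES_PER_SECTOR / 2**20
print(round(audio_mb), round(data_mb))  # 747 vs 650
```

Which matches the familiar figures: ~747 MB of raw audio payload vs the nominal 650 MB data capacity of the same disc.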