Posted by Spex on 11/10/07 10:32
Johan Stäck wrote:
> This thread "went astray" (basically my own fault..)
>
> But I am still hoping to get some feedback on my initial question:
> What is the reason for having interlaced camcorders *today* with
> virtually all end-user gear (TV sets, computer screens etc) needing to
> convert it to progressive (by means of de-interlacing) before displaying
> it?
>
> I emphasize *today*, because I am well aware of the historical
> background for interlace ..
>
> /JS
The reason why interlace is still necessary for 1080 HD broadcasts is
the sheer amount of bandwidth required to broadcast 1080p50 material.
Clever deinterlacing schemes come close to producing the same level of
quality for a much lower bandwidth requirement.
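To put rough numbers on that (my own back-of-the-envelope figures, raw
luma pixel rates only, before chroma and compression):

# Illustrative only: raw pixel rates, before any compression.
p50 = 1920 * 1080 * 50   # 1080p50: 50 full frames/s = 103,680,000 pixels/s
i25 = 1920 * 540 * 50    # 1080i:   50 fields/s of 540 lines = 51,840,000 pixels/s
print(p50 / i25)         # -> 2.0, i.e. progressive needs twice the raw data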
In tests, consumers were not able to distinguish 1080p50 material from
high-quality deinterlaced 1080i50 (more correctly written as 1080i25,
since there are 50 fields but only 25 full frames per second).
In static scenes there is very little for the deinterlacer to do, so
the result is absolutely indistinguishable from progressive material.
In scenes with high motion you'd think progressive material would
shine, but we humans don't register detail in fast motion at all well,
so again there is little pay-off in using 1080p50.
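For illustration, a crude motion-adaptive deinterlacer might look like
this (a sketch of my own in Python/NumPy; the function name, threshold
value and motion test are all mine, and real deinterlacers are far more
sophisticated):

import numpy as np

def deinterlace_field(curr_field, prev_field, top=True, threshold=12):
    """Rebuild a full frame from one field.
    curr_field, prev_field: (H/2, W) uint8 luma fields of opposite parity.
    top=True means curr_field carries the even (0, 2, 4, ...) frame rows."""
    h2, w = curr_field.shape
    frame = np.empty((h2 * 2, w), dtype=np.uint8)
    own = 0 if top else 1      # rows this field actually sampled
    missing = 1 - own          # rows we have to synthesize
    frame[own::2] = curr_field
    # Bob candidate: interpolate each missing line from its vertical
    # neighbours within the current field (direction depends on parity).
    if top:
        nxt = np.vstack([curr_field[1:], curr_field[-1:]])
        bob = ((curr_field.astype(np.uint16) + nxt) // 2).astype(np.uint8)
    else:
        prv = np.vstack([curr_field[:1], curr_field[:-1]])
        bob = ((prv.astype(np.uint16) + curr_field) // 2).astype(np.uint8)
    # Weave candidate: the previous (opposite-parity) field's lines sit
    # exactly where ours are missing, so in a static scene they are the
    # right answer.
    weave = prev_field
    # Crude motion test: where the woven line disagrees strongly with the
    # spatial interpolation, assume motion and fall back to bob.
    motion = np.abs(weave.astype(np.int16) - bob.astype(np.int16))
    frame[missing::2] = np.where(motion < threshold, weave, bob)
    return frame

In a static scene the motion term stays near zero, so the output is the
woven original, which is exactly why test viewers couldn't tell the
difference.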
The prosaic reason is that by the time the broadcaster has compressed
the f**k out of the material there is no point. If the broadcaster
provided enough bandwidth for a glorious 1080p50 picture, they wouldn't
be able to afford the bandwidth for the hundreds of shopping and poker
channels.
The consumer is generally quite ignorant about the level of picture
quality they should be receiving, and is generally happy with up-rezzed
SD material, thinking it to be HD. Consumers are so used to the
appallingly compressed SD of the past that the much more efficient
MPEG4 broadcasts look far superior to them. Sky in the UK regularly
broadcasts up-rezzed SD on its HD channels.