I'm sorry if this is a bonehead question - if it is, I'll go quietly...
I shoot in 60i with a Canon HV30, and have spent a fair amount of time rendering from Vegas into a variety of progressive formats. I understand why the deinterlaced video always looks worse than the original, either in terms of detail or motion smoothness. But when I play the original clip in, say, Windows Media Player, it looks great: smooth and sharp. Well of course it does - it's interlaced ... but doesn't it still have to be deinterlaced as it's being played on my LCD monitor? If so, why does it look so much better than any deinterlaced video I can render out of Vegas?
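Just so it's clear what I mean by "worse in detail or motion smoothness" - here's a rough sketch of the trade-off as I understand it, using made-up NumPy field arrays (the function names and shapes are just for illustration, not anything Vegas actually does):

```python
import numpy as np

def weave(field_even, field_odd):
    """Weave two fields into one 30p frame: full vertical detail,
    but the fields were captured 1/60 s apart, so anything moving
    shows comb artifacts."""
    h, w = field_even.shape
    frame = np.empty((h * 2, w), dtype=field_even.dtype)
    frame[0::2] = field_even   # even scanlines from the first field
    frame[1::2] = field_odd    # odd scanlines from the second field
    return frame

def single_field(field):
    """Throw away one field and line-double the other: no combing,
    but half the vertical detail and half the motion samples are gone."""
    return np.repeat(field, 2, axis=0)
```

Either way I end up at 30 progressive frames per second and something is lost.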
I suspect the answer is that Media Player can refresh my display faster than 30 times per second, and thus "deinterlace" without losing temporal information. But if computer displays can do this, why does everyone deinterlace their video "for the web"?
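If my guess is right, the player is doing something like "bob" deinterlacing - each field gets stretched to full height and shown as its own frame, so 60 fields per second become 60 frames per second and no motion information is discarded. A minimal sketch of that idea (again with hypothetical NumPy field arrays, naive line-doubling rather than whatever interpolation the player really uses):

```python
import numpy as np

def bob(fields_60i):
    """Bob deinterlacing: upscale each field to full frame height and
    display it as its own frame, so 60 fields/s -> 60 frames/s.
    Temporal smoothness is kept; each frame is a bit soft vertically."""
    return [np.repeat(field, 2, axis=0) for field in fields_60i]
```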