1080i to 480i: bad motion blur

john-beale wrote on 3/22/2006, 1:52 PM
I'm having a problem with motion blur on HDV footage downconverted to SD 480i for DVD. Here's an example: a small 1:1 cropped portion of an HDV 1080i frame, converted in Vegas to 720x480 (widescreen) and then expanded horizontally to an 853x480 frame for square-pixel display. Note the checkerboard appearance of the motion blur on the man's jacket. All the blurred edges show a repeating pattern of 4 scanlines of lighter grey and 4 lines of darker grey throughout the frame. http://www.bealecorner.com/D30/misc/1080-480-motion-blur-alias.jpg

The problem is quite visible during motion on the final DVD playback; this is not just a still-frame issue. I assume the problem is that the HDV 1080i scanlines were simply downsampled to 480i scanlines with no filtering. 1080:480 is a 2.25:1 ratio, which means there is a "beat frequency" alias every 4 scan lines (4 x 2.25 = 9, an integer, so the sampling pattern lines up with the source fields again every 4 output lines) unless you take measures to avoid it. Vegas 6d apparently does not take such measures, at least by default. Is there any way to do a better downconversion in Vegas?
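
The arithmetic behind that beat is easy to check. Here's a quick Python sketch (just illustrating the line mapping, not anything Vegas actually does internally) showing that a straight 1080-to-480 line decimation pulls runs of 4 consecutive output lines from the same source field:

```python
# Illustration only: naive nearest-line decimation from 1080i to 480i.
# The source field each output line comes from flips every 4 output lines.
for out_line in range(16):
    src_line = (out_line * 1080) // 480   # nearest lower source line (2.25:1 ratio)
    field = "top" if src_line % 2 == 0 else "bottom"
    print(out_line, src_line, field)
# Output lines 0-3 come from the top field, 4-7 from the bottom field, and so on.
# During motion the two fields capture different instants, hence the 4-line banding.
```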

If I select Properties / "Reduce Interlace Flicker", it seems to blend the fields after conversion to 480i, giving unacceptable image blurring. Maybe I could blend the fields and export as 1080i, then convert that to 480i afterwards, but that takes considerable time and disk space. Any better suggestions?
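
If that option really is a plain field average (just my guess about what it does), the blurring makes sense: averaging the two fields superimposes two different instants in time, so anything moving gets smeared. A rough numpy sketch of that guess:

```python
import numpy as np

def blend_fields(frame):
    # frame: interlaced frame as a numpy array, scanlines along axis 0.
    # Average the top and bottom fields and write the result back to both,
    # which is what a simple "blend fields" deinterlace amounts to.
    avg = (frame[0::2].astype(np.float32) + frame[1::2].astype(np.float32)) / 2
    out = np.empty_like(frame)
    out[0::2] = avg.astype(frame.dtype)
    out[1::2] = avg.astype(frame.dtype)
    return out
```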

Comments

GregFlowers wrote on 3/22/2006, 2:01 PM
Are you rendering using the "Good" or "Best" setting? I found I had to render at "Best" to achieve good results, and the difference was not subtle. I had tried everything else I could think of with no acceptable results, but when I set the rendering quality to "Best" it turned crystal clear. It takes forever to render, though.
John_Cline wrote on 3/22/2006, 2:41 PM
I was always under the impression that Vegas split the fields and processed them separately. In other words, it would take one HDV frame consisting of two fields, make two individual images of 1920x540 (or, more accurately, 1440x540), and then reinterleave them after it was done processing them (in this case, scaling). If this is indeed how Vegas deals with it, then there shouldn't be any "beat frequency" aliasing.
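
If that is how it works, the field-split scaling would look something like this (a rough numpy/Pillow sketch of the idea, assuming a top-field-first uint8 frame; obviously not Vegas's actual code):

```python
import numpy as np
from PIL import Image

def downscale_1080i_to_480i(frame):
    # Split a 1080-line interlaced frame into its two 540-line fields,
    # scale each field to 720x240 on its own, then re-interleave to 480i,
    # so lines from different instants in time never get mixed.
    top, bottom = frame[0::2], frame[1::2]              # 540 lines each
    top_sd = np.asarray(Image.fromarray(top).resize((720, 240), Image.LANCZOS))
    bot_sd = np.asarray(Image.fromarray(bottom).resize((720, 240), Image.LANCZOS))
    sd = np.empty((480, 720) + frame.shape[2:], dtype=frame.dtype)
    sd[0::2] = top_sd
    sd[1::2] = bot_sd
    return sd
```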

John
john-beale wrote on 3/22/2006, 2:52 PM
I did have rendering quality set to Best and was still getting the aliasing. What I'm doing now is blending fields (checking Properties / "Reduce Interlace Flicker") and exporting from Vegas as native 1440x1080i via the DebugMode FrameServer (RGB). I then use ProCoder to simultaneously downconvert to 720x480 and transcode to MPEG-2. This seems to work fine with no visible aliasing. I suspect there is a better way to do it, but this works for me for now.

OOPS. Update 3/25/06: The process described above yields a DVD that looks just fine on non-interlaced (computer) displays. However, it looks rather bad (jittery) on interlaced displays. It is not as bad as if the field order had been reversed, maybe about a quarter as bad, but the jitter is still noticeable and distinctly worse than a DVD generated from a native SD 720x480 interlaced camera. So ProCoder is letting me down on the conversion to SD.