Weird Horizontal Motion Glitch

MikeFulton wrote on 10/6/2008, 6:25 AM
I'm seeing a weird glitch... I'd appreciate comments and feedback.

I've shot a bunch of 1440x1080 AVCHD footage on my Canon HG10. When I import it into Vegas, I can place a clip on the timeline and it plays back just fine. If I render the timeline back out to the same resolution but with a different codec (like HDV) it works just fine and there are no glitches.

However, if I try to render the timeline to a lower-resolution format like DV-format 720x480, where it needs to down-sample the video to a lower resolution, then it introduces an odd horizontal motion glitch. Horizontal motion gets broken up into sections where some parts are drastically pushed either left or right.

It superficially resembles a scanline interleave problem with field ordering; however, it's not single scanlines that are out of position... it's more like a block of 8 or so scanlines at a time. There is a sample frame image at:

http://www.mikefulton.net/messagepix/videoglitch.jpg

You can see the glitch at the bottom, in the subject's hands. Please note that while the glitch does not appear to be scanline-based, I did experiment for a few hours with different settings relating to interlacing, just to be sure nothing was being overlooked. I must have rendered with at least a dozen different combinations of settings, but nothing I changed affected this glitch.

I tried doing essentially the same operation in two other programs that support AVCHD footage, and they both worked fine... no glitches at all. This is definitely a problem specific to Vegas.

Because the glitch appears to be in bands of about 8 scanlines at a time, roughly the height of an MPEG block (DCT blocks are 8x8, grouped into 16x16 macroblocks), I'm wondering if perhaps there is a bug in some rendering speed optimization in Vegas that involves taking shortcuts in decoding the video.

It looks like maybe it's NOT fully decoding the source video into complete, finished frames before downsampling and recompressing them. That would be a useful speed optimization if it actually worked, but if that's what's going on here, then there's definitely a problem.
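
To illustrate the difference between what I'm seeing and a fielding problem, here's a quick synthetic sketch (numpy only, nothing to do with Vegas internals): the first function fakes the blocky band displacement from my screenshot, the second fakes classic single-scanline field combing.

# Purely synthetic illustration: fake the glitch by sliding 8-row bands
# sideways, versus the single-line combing a field-order problem produces.
import numpy as np

def band_glitch(frame: np.ndarray, band: int = 8, shift: int = 12) -> np.ndarray:
    """Shift every other `band`-row stripe horizontally by `shift` pixels."""
    out = frame.copy()
    for y in range(band, frame.shape[0], band * 2):
        out[y:y + band] = np.roll(frame[y:y + band], shift, axis=1)
    return out

def field_comb(frame: np.ndarray, shift: int = 12) -> np.ndarray:
    """Shift only the odd scanlines: classic interlace combing."""
    out = frame.copy()
    out[1::2] = np.roll(frame[1::2], shift, axis=1)
    return out

What I'm seeing looks like band_glitch, not field_comb.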

Comments

MikeFulton wrote on 10/6/2008, 6:42 AM
I just thought to do one more test...

If I export the original footage back out at the same resolution as the original clip, but as UNCOMPRESSED, and then swap in the new clip, it renders down to the lower resolution just fine, with no glitches.

So... it appears that it only glitches when the source material is MPEG-based. To me, this strongly suggests that the problem is an error in the MPEG decoding and in how it's interleaved with the downsampling process.

My guess would be that the motion data in the MPEG source is not being handled properly in some fashion. Either it's grabbing some data from the wrong frame or the motion interpolation is not being processed correctly.
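
For what it's worth, motion compensation in MPEG is conceptually just a block copy from a reference frame. A highly simplified sketch (my own illustration; it ignores half-pel interpolation, residuals, and clipping at frame edges):

import numpy as np

def predict_macroblock(reference: np.ndarray, y: int, x: int,
                       mv: tuple) -> np.ndarray:
    """Predict a 16x16 macroblock by copying from `reference` at offset `mv`."""
    dy, dx = mv
    return reference[y + dy:y + dy + 16, x + dx:x + dx + 16]

If the decoder pairs that copy with the wrong reference frame, or with a reference that hasn't been fully reconstructed yet, the whole 16-line span lands displaced sideways, which is exactly the kind of banded horizontal error I'm seeing.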

Well, Sony, I've debugged the problem as much as I can on my end... now how about fixing it?
jrazz wrote on 10/6/2008, 9:02 AM
Did you send in a ticket via the "Support" tab up above? If not, I would recommend you do so.

j razz
MacVista wrote on 10/6/2008, 10:06 AM
I have seen this before when scaling interlaced material.

You should try rendering using "Best" quality and then it should be fine.

From the Knowledgebase:

"Best" uses bicubic scaling with integration, while "Good" uses bilinear without integration. You will have a hard time telling the difference, but must use "Best" if you are using high resolution stills or video that are getting scaled down to the final output size. "Good" may have bad artifacts on the near-horizontal edges, while "Best" will look great. More can be read on this subject in the Creative Cow forums here.
John_Cline wrote on 10/6/2008, 12:55 PM
Make sure you have a deinterlace method selected in the project's properties. I'm guessing that it's currently set to "none." Interpolate works best.
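
For reference, the "interpolate" method conceptually throws away one field and rebuilds its lines from the neighbors. A rough sketch of the idea (my own illustration, not Vegas's actual code):

import numpy as np

def deinterlace_interpolate(frame: np.ndarray) -> np.ndarray:
    """Keep the even (top) field; rebuild odd lines from their neighbors."""
    out = frame.astype(np.float64)
    out[1:-1:2] = (out[:-2:2] + out[2::2]) / 2.0  # average line above and below
    if frame.shape[0] % 2 == 0:
        out[-1] = out[-2]  # bottom line has no neighbor below; replicate
    return out.astype(frame.dtype)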

John
MikeFulton wrote on 10/6/2008, 1:51 PM
The source material is not interlaced, but I went and tried rendering with "Best" quality anyway, and in fact that does remove the problem.

I previously mentioned that I had tried a variety of options regarding interlacing, and you may be asking why, if the source material isn't interlaced. The answer is that I knew people would suggest changing those options, and I wanted to have already eliminated them.

Rendering with "Best" quality does the trick, however I'm sure it's got nothing to do with using Bicubic filtering instead of Bilinear... I'm a programmer and I've written bicubic and bilinear bitmap scaling functions before and there's nothing about either that would explain this glitch.

However, the phrase "with integration" from the knowledge base item is intriguing. It's not a phrase that has any standard meaning with regard to either type of filtering algorithm, so I'm guessing it has something to do with the way that the source material is decoded during the process. That's the only thing that would explain the glitches.
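
That said, one reading that would make sense for downscaling is area averaging: combining all of the source lines that map to each output line, instead of point-sampling just one of them. A quick numpy sketch of the distinction (my own interpretation, not anything from the knowledge base):

import numpy as np

def downscale_pointsample(frame: np.ndarray, out_h: int) -> np.ndarray:
    """No integration: pick one source line per output line. Aliases badly."""
    idx = np.arange(out_h) * frame.shape[0] // out_h
    return frame[idx]

def downscale_integrated(frame: np.ndarray, out_h: int) -> np.ndarray:
    """With integration: average every source line in each output line's span."""
    bounds = np.arange(out_h + 1) * frame.shape[0] // out_h
    sums = np.add.reduceat(frame.astype(np.float64), bounds[:-1], axis=0)
    counts = np.diff(bounds).reshape(-1, *([1] * (frame.ndim - 1)))
    return sums / counts

Going from 1080 lines to 480 is a 2.25:1 ratio, so the point-sampler skips one or two source lines between samples; that skipping is what produces artifacts on near-horizontal edges.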
RBartlett wrote on 10/6/2008, 4:17 PM
Additionally, Mike, could you describe your project settings (pixel size, aspect ratio, fielding) and whether you modified the DV output codec to be anything other than interlaced?

- Do the import/clip properties show non-interlaced for your source material, from the Vegas perspective?

- I'm not sure if the Sony Vegas DV encoder supports non-interlaced _output_; I'd guess it does, since progressive has been supported from the days of the progressive Panasonic camcorders.

- The custom video panel in the rendering engine may help you test/determine this.

Of course, you've already found a workaround, but I'm hoping that this post might in some way help you answer your outstanding question.

The question is not always about how footage is treated as a source file. Occasionally the pipeline (project settings), or any re-interlacing of non-interlaced footage, could be causing it. Although the stripe size would indicate that this is really a problem with the "order of events" in processing the scaling and encoding.

What you've shown in your screenshot I've seen referred to as "dog teeth." By contrast, interlaced footage previewed on a PC, which typically doesn't render on a field basis (without bob/weave/hybrid/motion-estimated processing, or DirectX fielded drawing as in NewTek SpeedEDIT), usually looks like "mice teeth": combing at single-scanline pitch.

If it isn't an interlace issue, then the bands were most likely created through scaling. I'm not sure, but perhaps the "Best" render setting takes more source lines into account when downscaling, whereas "Good" takes a point sample, and the step difference between 1080 and 480 lines creates the bands. I think I've seen the issue in your screenshot when outputting 576i DVD-Video from a 2K still-camera source. Although that's a different workflow, it was fixed by using "Best" (and, as it was an older version of Vegas when I last did this, also a supersampling envelope).

Your issue needs a better explanation than my assertions supply, but there's a chance they may steer you in one direction or another while you wait for someone else to chip in.

farss wrote on 10/6/2008, 11:02 PM
I've seen this problem a lot, and not just from MPEG-2 sources, so that's not the cause. As John points out above, it happens when you fail to specify a de-interlace method.
Now, I know you believe your footage is progressive, but:

a) Is it?
b) Does Vegas know it is?
c) All of the above, plus: does it have pulldown, and has that been removed? (There's a toy sketch of the pulldown cadence below.)
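
On (c): 2:3 pulldown spreads four progressive frames across ten fields, and two of the five resulting interlaced frames mix fields from different source frames. A toy Python sketch of the cadence (illustrative only, nothing Vegas-specific):

# Toy illustration of 2:3 pulldown: 4 progressive frames -> 10 fields ->
# 5 interlaced frames, two of which mix fields from different source frames.
frames = ["A", "B", "C", "D"]
cadence = [2, 3, 2, 3]  # fields emitted per source frame
fields = [f for frame, n in zip(frames, cadence) for f in [frame] * n]
pairs = [tuple(fields[i:i + 2]) for i in range(0, len(fields), 2)]
print(pairs)  # [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]

Those ('B', 'C') and ('C', 'D') frames comb under motion even though the source was progressive, so footage with unremoved pulldown isn't really progressive as far as the scaler is concerned.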

Bob.