After all these years of editing with Vegas... the default project setting for "de-interlace" has just hit me... and I have a sinking feeling that I may have been producing lower quality output as a result.
The issue is this. When you start a new project... it seems the default in project properties is a de-interlace method of Blend Fields.
My thought all this time was that if I am not rendering to a progressive format then this setting would be irrelevant... since a simple render to an interlaced format would not do any de-interlacing. WRONG!!!
I just rendered a project where I "letterboxed" the footage. The destination was DVD so I was rendering to MPEG2. Although my external preview was showing no problem, after rendering to MPEG and playing it out to the TV I thought I noticed it was a little softer than how it looked while previewing from the timeline.
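(For what it's worth, here's my rough mental model of what Blend Fields does to the picture... this is just my own guess at the idea, not Vegas's actual code, but if it averages the two fields together it would explain the softness I'm seeing.)

import numpy as np

def blend_fields(frame):
    # My guess at "blend fields": average the upper field (even lines) and
    # lower field (odd lines), then write the averaged picture back into
    # both fields. Half of the vertical/temporal detail is thrown away,
    # which would read as softness on the TV.
    # Assumes a grayscale H x W frame with an even number of lines.
    upper = frame[0::2, :].astype(np.float64)
    lower = frame[1::2, :].astype(np.float64)
    mix = (upper + lower) / 2.0
    blended = np.empty_like(frame, dtype=np.float64)
    blended[0::2, :] = mix
    blended[1::2, :] = mix
    return blended.astype(frame.dtype)

# Quick sanity check: a frame with alternating black/white scanlines
# (maximum interline detail) comes out flat gray... the detail is gone.
test = np.zeros((480, 720), dtype=np.uint8)
test[0::2, :] = 255
print(blend_fields(test)[:4, 0])   # -> [127 127 127 127]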
I decided to render to AVI and load that into the project above all the "real" tracks and then mute/unmute the track so I could compare frame-by-frame with the original.
Sure enough... there is a significant difference between the rendered/unrendered video.
I racked my brains trying to figure out what was going on... and then decided to mess with the de-interlace setting. I set it to None, rendered to AVI again, and repeated the comparison process. Sure enough... the video this time was perfect and I could see no difference between the pre-rendered and rendered video.
So... now I'll be sure to set my de-interlace settings to NONE on every project from now on.
Question is though.... under what circumstances would setting the de-interlace method to None start to cause me trouble?
Now I feel, with this personal revelation, that I really don't understand how this de-interlace property factors into the way the project renders.
Can anyone take a stab at explaining this to me?
How do you all set up your project properties as far as de-interlacing goes?