Hello. I have just done some experimental renders from Vegas 8 Pro to NTSC DV Widescreen, Windows .AVI, with the "Progressive" box checked on the Video tab, and when I load the render into GSpot it tells me they are still interlaced (LFF). What gives? Any ideas appreciated.
There is no such thing as a "progressive DV" file. I'm pretty sure what you have ended up with is an interlaced AVI file whose upper and lower fields of each frame are identical.
Ah. Okay. Hmmm. Makes sense. That's why the render takes so long. Do I now have twice as much material squeezed in? That sounds like bad news.
I am trying to create a widescreen (16x9) progressive .AVI file from my widescreen Vegas project, which consists of AVIs. The MPEG-2 and AC-3 renders are fine and work well for the DVD.
I am trying to fill a projection file request for a widescreen progressive AVI of the project. Any suggestions would be appreciated. I hope this isn't too confusing.
As John says, if you render using the DV codec, it always sets the interlace flag. If you want to see whether the video is really interlaced, the only way I know to do it for sure is to use a program that separates the even lines into one frame and the odd lines into the following frame, thus doubling the frame rate and halving the vertical resolution. (I use an AVISynth script to do this.) You then walk through the video one field at a time. If the video is progressive, the upper and lower fields should differ spatially (the frame should jump up and down slightly, and you should be able to discern a change in detail). However, you should see no temporal difference between them (i.e., a ball thrown left to right across the screen should not progress across the screen as you step between each pair of fields).
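The field-separation step the script performs (AVISynth's SeparateFields() does this natively) can be sketched roughly in Python with NumPy; the frame array here is hypothetical, just to show how the two fields interleave by scan line:

```python
import numpy as np

def separate_fields(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split an interlaced frame (rows x cols) into its two fields.

    Even scan lines (0, 2, 4, ...) form the upper field, odd scan
    lines the lower field. Each field has half the frame's vertical
    resolution, which is why the separated video looks squashed.
    """
    upper = frame[0::2]  # even scan lines
    lower = frame[1::2]  # odd scan lines
    return upper, lower

# Toy "progressive" frame: both fields sample the same instant, so
# adjacent scan lines carry the same content and the fields match.
progressive = np.repeat(np.arange(10, dtype=float), 2).reshape(20, 1)
upper, lower = separate_fields(progressive)
print(np.allclose(upper, lower))  # no temporal difference between fields
```

Stepping through the separated stream field by field is then exactly the ball-across-the-screen test described above: spatial jitter is expected, temporal motion between paired fields is not.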
John was slightly incorrect (or at least it seems to me) when he said that the upper and lower fields are identical. If that is what Vegas does, then you've lost half the information in your video, and that is a bad thing.
If you render progressive in Vegas from interlaced source material, Vegas will de-interlace the material (using either blend or interpolate, assuming you have chosen a deinterlace method). Technically, each frame will still be broken into two fields, but there will be no temporal difference between the two fields. The net result is essentially losing up to half your vertical spatial resolution and exactly half your temporal resolution.
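A rough sketch of what those two deinterlace methods do, with frames as NumPy arrays (illustrative only, an assumption about the general technique, not Vegas's actual filter code):

```python
import numpy as np

def deinterlace_blend(frame: np.ndarray) -> np.ndarray:
    """Blend: average the two fields and write the result back to both
    field positions. Detail from both moments is mixed together, but
    the two output fields end up identical (no temporal difference)."""
    upper, lower = frame[0::2], frame[1::2]
    blended = (upper + lower) / 2.0
    out = np.empty_like(frame, dtype=float)
    out[0::2] = blended
    out[1::2] = blended
    return out

def deinterlace_interpolate(frame: np.ndarray) -> np.ndarray:
    """Interpolate: keep one field (here the upper) and synthesize the
    discarded field's lines by averaging the kept lines above and
    below. The other field's information is thrown away entirely."""
    out = frame.astype(float).copy()
    kept = out[0::2]
    # Each interior odd line becomes the mean of its even neighbours.
    out[1:-1:2] = (kept[:-1] + kept[1:]) / 2.0
    if frame.shape[0] % 2 == 0:
        out[-1] = kept[-1]  # bottom line has no neighbour below
    return out
```

Either way the output frame's two fields carry no temporal difference, which is exactly why half the temporal resolution is gone.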
To my knowledge, there is no flag in the header of an AVI file that indicates whether it is interlaced or progressive, or, if it is interlaced, whether it is upper or lower field first. Vegas treats a DV file as interlaced not because of some non-existent flag, but because it "knows" (assumes) that all DV files are interlaced, lower field first. There is, however, a flag in a DV AVI file that indicates whether it is wide-screen or not.
If you render to an AVI file that is not a DV file, Vegas has no way of knowing interlaced vs. progressive or wide-screen vs. 4:3. You must know and tell Vegas manually by setting the clip properties. MPEG-2 files do have all these flags.