I was doing some tests to prepare a response over in this thread:
HDV to SD Workflow
In that thread -- and in many others in this forum -- people suggest that when rendering HDV footage to MPEG-2 in order to create a standard SD DVD, you MUST set the Vegas "Deinterlace Method" (found in the Project Properties) to something other than "none," or you will get horrible results.
This has never made sense to me, so I did some tests, and boy, did I ever get a surprise!!
I won't bore you with all the steps; I'll just cut right to the chase:
Setting deinterlace to "blend" or "interpolate" and then rendering to MPEG-2 using the standard DVD Architect Widescreen template (720x480) gives you interlaced footage. But if you set deinterlace to "none," you get progressive footage, and that footage has all sorts of bizarre artifacts that look like interlace combing, except that they are multiple scan lines thick!!
None of this makes sense to me, at any level. I used Vegas 7.0d for the test. Perhaps this behavior has changed in 8.x.
So, just to make sure you understand what I did, I captured some HDV footage directly from my FX1. I put it on the timeline, and set the Vegas project properties to the "HDV 1080-60i (1440x1080, 29.970 fps)" preset. I then set "Deinterlace Method" to none. I then selected Render As, and rendered using the "DVD Architect NTSC Widescreen video stream" MPEG-2 template, without any modifications.
I then did the same thing, except before rendering, I changed the Deinterlace Method (in the Project Properties) to "Blend."
I took the resulting two MPEG-2 files and put them into this AVISynth script, which I read into VirtualDub:

mpegsource("e:\test (HDV widescreen, BFF default SD, no deinterlace).mpg")
separatefields()
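By the way, if you want to compare the two renders directly, something like this should work (the second filename is just a placeholder; use whatever you named the blend render):

# "none" render on the left, "blend" render on the right (second filename made up)
a = mpegsource("e:\test (HDV widescreen, BFF default SD, no deinterlace).mpg")
b = mpegsource("e:\test (HDV widescreen, BFF default SD, blend).mpg")
StackHorizontal(a.separatefields(), b.separatefields())

Both renders are 720x480, so the two field streams stack side by side and you can step through both at once in VirtualDub.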
When I had deinterlace set to blend, each field was from a different instant in time than the previous field, i.e., it was interlaced. So, in fact, Vegas did NOT deinterlace anything!!!! At least not in the sense that most of us talk about when we use the word "deinterlace."
I can't show that in a still photo, but this photo does show the other thing, namely that each field looked smooth and free of artifacts:
In the photo above, each time I pressed the right arrow key in VirtualDub, which takes me to the next field (not frame, but field, because I'm reading the video via the AVISynth script), I got horizontal movement, because each successive field was from a different moment in time.
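If you want to confirm which field you're on while stepping, AVISynth can stamp the frame info right onto the picture; a quick sketch along the same lines:

mpegsource("e:\test (HDV widescreen, BFF default SD, no deinterlace).mpg")
separatefields()
info()  # overlays frame number, parity, etc. on each field

After separatefields(), each "frame" VirtualDub shows is really one field, and info() makes the count and parity visible as you arrow through.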
By contrast, when deinterlace was set to none, each pair of fields was from exactly the same moment in time (i.e., it had been deinterlaced!!!!), and it looked absolutely terrible:
This picture (above), like the previous picture, is one field of video, not a frame (that's why it appears vertically squished). Thus, we shouldn't be seeing interlaced "herring bones" at all. But also note that the herring bones are more than one scan line thick, so in fact I don't think they are interlace artifacts at all, but rather some sort of strange Vegas bug. If so, who knows how deeply this affects other video rendering situations.
Also, as I went from one field to the next, I got no movement between each pair of fields (other than the slight up/down bob you always get from the half-line vertical offset between fields). This is what you get with progressive video.
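You can even make that observation objective: subtract each field from the one after it, and a pair with no motion between them comes out flat mid-gray. A sketch, with the same filename as before:

c = mpegsource("e:\test (HDV widescreen, BFF default SD, no deinterlace).mpg").separatefields()
# pair each field with the next one and subtract; flat gray = no motion between them
Subtract(c.Trim(0, c.FrameCount() - 2), c.Trim(1, 0))

If the paired fields really are from the same moment in time, every other difference frame should come out nearly flat (apart from that half-line vertical offset).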
So I have no idea what is going on here, and this is one place where it sure would be wonderful if there were an actual live body in Madison who still cared about this stuff and could provide input and possibly a fix, if this is indeed a bug.
I think some of this was covered about a year ago in another thread, but the focus there was on getting rid of the horrible jaggies. I am not sure whether anyone at that point figured out that setting deinterlace to "blend" or "interpolate" actually results in interlaced output, whereas setting it to "none" results in progressive footage.
BTW, I redid these tests four times just to make sure I didn't screw up the first time.