Chienworks wrote on 8/17/2009, 7:49 AM
NTSC DV is 29.97fps. If you're editing NTSC DV then you should have your project properties and render set to match the same 29.97fps to avoid frame resampling and keep the motion smooth.
L_Town wrote on 8/17/2009, 9:50 AM
So if I am going to the web with the video, then I should change it to 30fps, correct?
Chienworks wrote on 8/17/2009, 10:06 AM
No. It's still 29.97fps material and you want to preserve that.
L_Town wrote on 8/17/2009, 10:10 AM
Does Vegas convert it to progressive then too? So is there such a thing as 656 X 480 29.970 fps progressive (if I create it that way)?

I'm creating a 4:3 video for non-web playback. By contrast, though, what if I create web content? I always thought material on the web was 30fps progressive...
johnmeyer wrote on 8/17/2009, 10:47 AM
I think the confusion here is that you may be thinking that setting Vegas to 29.97 (NTSC) will ALSO force Vegas to convert it to progressive or do other things that will make the video play poorly on the web.

However, each control in Vegas is generally independent of the other controls. Thus, when you set the fps setting to 29.97, as Kelly is correctly recommending, you are putting a flag into the WMV file that tells the WMV player to play back the video at exactly that rate. In addition, you are telling Vegas not to resample the framerate. This is actually the key thing. If instead you use 30 fps, Vegas will need to repeat one frame out of every thousand or -- and I don't know if it would do this -- convert virtually every frame to a blend of adjacent frames in order to achieve the frame rate you have specified. I honestly don't know which it would do, but if it does the second thing, you are going to end up with MUCH worse looking video.

So, you should definitely match the fps of your rendered file to the fps of your source material whenever possible.

Now, as to progressive vs. interlaced, if your playback is going to be via the web, then you should specify progressive for the output.

What about resolution? I always keep it the same as the source material, since most web systems now permit playback of relatively high-res video. However, this gets a lot trickier and I think depends on whether you are uploading to YouTube or whether you are hosting the file yourself. If you are going to host the file you create, then I would convert it to the same resolution and PAR (720x480 and 0.9091 for DV AVI, for instance) as the original video. Thus, the only change from the original would be to go from interlaced to progressive.
L_Town wrote on 8/17/2009, 12:44 PM
Hi John. Thanks for responding.

I am confused about setting Vegas to 29.97 in the "Render As" section. Vegas must convert it to progressive because 1) the video doesn't look interlaced -- I can't see the individual fields -- and 2) there is no interlaced/progressive option in the WMV settings.

I exported my project at both settings, 29.97fps and 30fps WMV at 656 X 480, and I can't tell the difference between the two clips. However, I am thinking that there should be a "correct" method in there somewhere, right?

John_Cline wrote on 8/17/2009, 1:38 PM
If the temporal resolution of the original source video was 29.97, then you should render at 29.97 and not 30. The WMV will be progressive and will be deinterlaced using whatever method is selected under "File" > "Properties." Personally, I prefer setting it to "Interpolate" in virtually all circumstances.
tumbleweed7 wrote on 8/17/2009, 4:05 PM

I think the confusion lies in not understanding that frame rate has no relationship to whether it's progressive or interlaced video.... no actual frames are really dropped in 29.97 either.... do a google search, as I'm beginning to reach the end of my experience now.. : )

& I believe the 29.97 setting would be correct for viewing on a CRT monitor... I could be wrong on this though ...
Chienworks wrote on 8/17/2009, 6:57 PM
29.97 (actually 29.97002997002997...) and 30 are physically different frame rates, so when converting from one to the other frames will be dropped or added. When rendering 29.97 material at 30, extra frames have to be added, about one every 33 1/3 seconds. Vegas can either do this by resampling, which blends adjacent frames together and can cause blurriness and doubling, or without resampling, in which case one frame every 33 1/3 seconds will be repeated.
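That "one extra frame every 33 1/3 seconds" figure is easy to check. A quick sketch in Python, using the fact that the NTSC rate is exactly 30000/1001 fps:

```python
from fractions import Fraction

# NTSC frame rate is exactly 30000/1001 fps, i.e. 29.97002997...
ntsc = Fraction(30000, 1001)

# Rendering NTSC material at an even 30 fps keeps the duration the same,
# so extra frames must be created at this rate:
extra_per_second = 30 - ntsc            # 30/1001 of a frame per second
seconds_per_extra = 1 / extra_per_second
print(float(seconds_per_extra))         # 1001/30, about 33.37 s between added frames
```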

Most computer CRTs refresh at much higher rates than typical video frame rates. Any computer monitor, whether CRT or LCD or plasma, will display 30, 29.97, 25, 24, etc. with equal ease. On the other hand, sending 30fps to a run-of-the-mill CRT television will probably cause some odd stuttering. This usually isn't an issue though because there will be some hardware necessary to convert the signal into something that the television can understand and this will probably adjust the frame rate to 29.97 anyway.
tumbleweed7 wrote on 8/17/2009, 9:14 PM

I have to respectfully disagree with "the old man"... : ) ...

a quote from Adobe I found:

"There are three fundamentally important things to remember about NTSC and drop-frame timecode:
• NTSC video runs at 29.97 frames/second.
• 29.97 video can be notated in either drop-frame or non-drop-frame format.
• Drop-frame timecode only drops numbers that refer to the frames, and not the actual frames."
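That third point is easy to demonstrate: drop-frame timecode skips frame *numbers* ;00 and ;01 at the start of each minute, except every tenth minute, so the labels stay in step with the 29.97 rate while every actual frame is kept. A sketch of the standard conversion (`frames_to_df` is just an illustrative name):

```python
def frames_to_df(frame):
    """Convert a zero-based frame count to drop-frame timecode (29.97 fps).

    No frames are discarded; the timecode numbers ;00 and ;01 are simply
    skipped at each minute boundary, except every 10th minute.
    """
    frames_per_min = 30 * 60 - 2          # 1798 numbered frames in a dropped minute
    frames_per_10min = 30 * 600 - 2 * 9   # 17982 per 10-minute block
    d, m = divmod(frame, frames_per_10min)
    if m > 1:
        frame += 18 * d + 2 * ((m - 2) // frames_per_min)
    else:
        frame += 18 * d
    return "%02d:%02d:%02d;%02d" % (
        frame // 108000, (frame // 1800) % 60, (frame // 30) % 60, frame % 30)

print(frames_to_df(1799))  # 00:00:59;29
print(frames_to_df(1800))  # 00:01:00;02 -- numbers 00 and 01 skipped, no frame lost
```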

This is the kind of stuff that hurts my head, so I'll stop with a final note.... I just don't want folks to leave with the wrong idea... 29.97 refers to NTSC timecode...

respectfully, tumbleweed
John_Cline wrote on 8/17/2009, 10:54 PM
Non-drop or drop-frame timecode does not alter the fact that NTSC video is 29.97 frames per second. PERIOD. It can either be 29.97 progressive frames per second or it can contain 59.94 interlaced fields per second.

There is ABSOLUTELY no reason to render a Windows Media File at 30 fps if your project and all of your source material is at 29.97 fps.
L_Town wrote on 8/18/2009, 5:09 AM
I appreciate your responses everyone! I think I understand it now. I was a little confused at first. I will just have to let this soak in for a second. Lol. Thanks again!

tumbleweed7 wrote on 8/18/2009, 7:11 AM

I have no disagreement with John here....

Looking at the original post, the reason there's a 29.97 option is likely that, if the final output is for an interlaced display, it will look smoother... I'll let it go now...
R0cky wrote on 8/18/2009, 7:49 AM
a related question - I have a point 'n shoot cam that takes 30 (not 29.97) fps progressive video. I want to render it at 29.97 by just slowing it down that tiny bit, without resampling, which can cause blurring or stuttering.

I have been doing this by setting the event properties to do not resample and then just rendering at 29.97 progressive. Is what I want to happen in fact what vegas does or is it dropping a frame (for real) every 100 seconds?
Chienworks wrote on 8/18/2009, 7:57 AM
Doing that will make Vegas drop 1 out of every 1000 or so frames, which works out to about one every 33 1/3 seconds. If you want to maintain all the frames then you need to slow the clip down to match. Find the length of the clip in seconds on the 29.97 timeline, divide by 29.97002997 and multiply by 30. For example, if the clip is 64 seconds 15 frames, or 64.5 seconds, you need to stretch it out to 64.5645 seconds, which is 64 seconds and 17 frames. Hold the Ctrl key down and drag the right edge out the extra 2 frames' worth. Now the frames in the clip physically line up with the frames of the timeline.
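The same arithmetic, worked as a sketch (using the exact NTSC rate of 30000/1001):

```python
from fractions import Fraction

NTSC = Fraction(30000, 1001)   # exactly 29.97002997... fps

def stretched_seconds(seconds_at_30fps):
    # Same frame count, played back at the slower NTSC rate:
    frames = seconds_at_30fps * 30
    return frames / NTSC

length = stretched_seconds(Fraction(129, 2))  # 64.5 s = 64 s 15 frames at 30 fps
print(float(length))  # ~64.5645 s, i.e. 64 s and ~17 frames on the 29.97 timeline
```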
Chienworks wrote on 8/18/2009, 8:08 AM
29.97fps can be interlaced or progressive.
30fps can be interlaced or progressive.
24fps can be interlaced or progressive.
25fps can be interlaced or progressive.
xx fps can be interlaced or progressive.

Frame rate and interlaced/progressive aren't tied together. Changing the frame rate doesn't change or preserve whether the material is interlaced or progressive.
R0cky wrote on 8/18/2009, 8:09 AM
thanks, that's what I thought and yes I meant 3 frames every 100 seconds.

johnmeyer wrote on 8/18/2009, 10:02 AM
Bastinado's question is different from the original post's, and I think the answer is different as well. The original poster wanted to know whether he should change from 29.97 to 30 fps, and the answer was definitely no, because the player (in this case another PC over the web) is quite capable of playing 29.97. You never want to change fps if you don't have to.

However, what if you want to play on a device (an NTSC TV set, for example) that only understands 29.97, and you have some video that was taken with a still camera that shoots 30 fps? My recommended solution in this case would be to find a utility which can patch the header of the video to "lie" about the framerate and tell the computer programs that play or edit the file (in this case Vegas) that the video was recorded at 29.97 fps. If you do this, Vegas will play back ALL the frames, without dropping, adding, or creating any frames. It will simply play back at a very slightly reduced speed. Since the change in speed between 29.97 and 30 fps is going to be undetectable, this approach provides a solution that is virtually perfect, with no downside.
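To put a number on "undetectable": relabeling 30 fps frames as 29.97 stretches the runtime by a factor of 30/29.97, about 0.1%. A quick sketch:

```python
# How much slower does the relabeled clip play?
factor = 30 / 29.97           # ~1.001 playback-time stretch
drift = 600 * (factor - 1)    # extra runtime over a 10-minute clip
print(round(drift, 2))        # about 0.6 extra seconds per 10 minutes
```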

I use a very crude hack called AVIFrate:


Hopefully, someone else can suggest a slightly more elegant tool.
R0cky wrote on 8/18/2009, 1:13 PM
Thanks John. I was hoping there was a way to get Vegas to do this without fiddling with the playback rate adjustment.

Rocky, aka bastinado