Comments

craftech wrote on 5/24/2014, 7:30 AM
I am not following this conversation very well.

24Peter (the OP) asked:
Is there any way to set Disable Resample as the default switch for my video events? (VP13)
--------------------------------------------------------
Kelly replied:
i've been doing exactly that in Vegas 9, 8, 7, 6, 5, 4 ... maybe even in 3. [...] I still maintain that there are a HUGE number of video quality issues brought up in the forums in which the answer is "disable resampling", yet the number of times where having resampling on would have helped is nearly non-existent.

------------------------------------------------------
Later Kelly said:
If you happen to be working with 29.97, remember that while almost all other software out there uses 29.970000000000 as the frame rate, Vegas is one of the exceptional few that uses the true value of 29.9700299700299700..., so this could very well qualify as a different frame rate.
-------------------------------------------------
SOUNDS BAD
-------------------------------------------------
John R. said,
It should be pointed out that you really don't want to have Disable Resample be the default... what you really want is for Smart Resample to get "smarter"!
-------------------------------------------------
Kelly then said,
The only time you need to disable it is if your source frame rate(s) don't match your output frame rate, and you don't like the ghosting that the resampling causes. ????
-----------------------------------------------------------------------------
OK, so if there are "a HUGE number of video quality issues brought up in the forums in which the answer is 'disable resampling', yet the number of times where having resampling on would have helped is nearly non-existent", why isn't having "disable resample" on by default a good idea? What is the downside, John?

What do you do each time, Kelly, to avoid the huge number of problems? I am confused by some of your responses (which is highly unusual, because you are usually so absolutely clear).

I have to admit that until I read this I never paid any attention to "resample", so now I am wondering if the ghosting I sometimes get when going HD to SD isn't a resampling issue.

John
Chienworks wrote on 5/24/2014, 7:42 AM
John, 99.9999% of what i deal with is 29.97002997i in -> 29.97002997i out. I imagine this is the case for many of our region 60 users. For region 25 users, i'm sure 99.9999% of what they deal with is 25i in -> 25i out. Therefore the whole resample-or-not issue is mostly moot.

For those cases where it isn't moot, such as people going to 24p, using 30p animation or screen-capture sources, or using low-grade cameras that don't shoot at a standard frame rate, a huge part of the quality problem is resampling. It's also affected by the "squeaky wheel" problem: a lot of those who come in asking for video quality advice are exactly these users. It's the problem we hear about here because it's the one that gets asked about.

Hopefully that helps you understand why i say "huge" even when it's a small problem. ;)
craftech wrote on 5/24/2014, 8:18 AM
Thanks Kelly. Really appreciate it.
Regards,
John
MikeLV wrote on 5/24/2014, 9:17 AM
"If you happen to be working with 29.97, remember that while almost all other software out there uses 29.970000000000 as the frame rate, Vegas is one of the exceptional few that uses the true value of 29.9700299700299700..., so this could very well qualify as a different frame rate."

Does that statement imply that camcorders shoot at 29.970000 but Vegas uses 29.9700299700299700, and that therefore the frame rates are different and resample should be disabled? Or are you only referring to other NLEs?
Chienworks wrote on 5/24/2014, 8:56 PM
I'd guess that probably most mid- to high-end camcorders use the right value. Certainly any pro equipment would. Cheaper stuff is probably more likely to use a flat 30 than 29.97000000...

This problem was noticed by people bringing in files rendered in Premiere and a few other NLEs and discovering that a frame was missing. One of the Sony programmers participating in the discussion was able to nail the issue down to those NLEs using the rounded-off value.
john_dennis wrote on 5/24/2014, 9:47 PM
Looking at the internal preferences, it appears that Vegas stores the frame rate to eight decimal places of precision. At least, that's the precision at which one can enter custom frame rates, and the precision at which they are displayed.

My first exposure to Disable Resample was with media from a camera that recorded at 15 fps.
musicvid10 wrote on 5/25/2014, 9:33 AM
I don't know what kind of clock Vegas uses to map frame rates, but assuming it is 90 kHz, the ongoing rounding differences would far outweigh (are coarser than) the difference between "true" and rounded NTSC.

We are always going to be dealing with non-terminating decimals here in NTSC land, so the realized frame rate is always a moving target. Hardly worth concerning oneself with, even if one frame in a million is adjusted.
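
To put rough numbers on that, here's a minimal Python sketch (purely illustrative, and the 90 kHz clock is only the assumption above):

from fractions import Fraction

tick = Fraction(1, 90_000)               # one 90 kHz clock tick, about 11.1 microseconds

true_frame    = Fraction(1001, 30000)    # frame duration at the true 30000/1001 fps
rounded_frame = Fraction(100, 2997)      # frame duration at a flat 29.97 fps

print(true_frame / tick)                 # exactly 3003 ticks per frame
print(float(rounded_frame / tick))       # ~3003.003 ticks per frame
print(float(rounded_frame - true_frame)) # ~3.3e-08 s (about 33 ns) per frame
# One tick (~11.1 microseconds) is roughly 330x coarser than that per-frame
# difference, so the clock granularity dominates.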
MikeLV wrote on 5/27/2014, 6:55 PM
Vidmus says his renders are much faster when resample is disabled. So, is there any actual harm done by disabling? In other words, are there instances when it must be left on? If so, is it only when you're changing the frame rate from the source footage to whatever you're encoding to from the timeline? If that's the case and all my footage is from one camera filmed at 29.97, then from now on, I will disable resample.
Chienworks wrote on 5/27/2014, 8:32 PM
Mike, you never NEED to have it enabled, not ever. You may WANT to use it when changing frame rates, or you might not; it's your choice (i'm most definitely in the do-NOT-want camp).

I've never noticed the slightest bit of speed difference in any of my normal rendering, but then i'm not changing frame rates, so Vegas won't resample anyway. If you do get a speed difference, then the source rate isn't matching the output rate.

Try it both ways and see what you think.
24Peter wrote on 5/27/2014, 10:20 PM
"I've never noticed the slightest bit of speed difference in any of my normal rendering, but then i'm not changing frame rates, so Vegas won't resample anyway. If you do get a speed difference, then the source rate isn't matching the output rate."

FWIW, I do experience a definite increase in render speed when disabling resampling, but I am often changing frame rates.
PeterDuke wrote on 5/28/2014, 8:07 PM
Coming from the PAL world, I am constantly reminded how the NTSC committee (I know, that's like saying "DC current") made a rod for its own back, and for so many others', by choosing those non-integer frame rates. Is it really too late to change? (And get rid of interlacing as well while we're at it!)
Chienworks wrote on 5/28/2014, 8:44 PM
Back in the day i had a host of various "NTSC" output devices with either video line out or RF out, ranging from video game consoles to Commodore computers to VGA->TV converters. Not a single one of them cost enough to have included true NTSC-compatible circuitry. Some of them claimed 30 fps while others claimed 15. Most of them were far from 525 lines. Yet all of them connected to my old analog CRT TV through the antenna or video line inputs, and the TV showed the image on the screen just fine (well, for various values of "fine").

And of course, this is all completely moot now with digital displays that either automatically sync to whatever input is given, or don't even need to sync as they build up a frame buffer from incoming data and generate the display from that buffer.

It's probably mostly fear of breaking backward compatibility that still keeps us at the weird frame rate. I'd say if the NTSC world tomorrow started broadcasting at precisely 30.0000000, no one would even notice except for a few station techs who would wonder why their 24 hours' worth of programming ended 2 minutes and 26 seconds early each day.

By the way, the repeating decimal value 29.9700299700299700... looks a little better as its fractional equivalent: 30,000/1001.
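
To make that concrete, a minimal Python sketch (purely illustrative) of how far the true fraction sits from the rounded value:

from fractions import Fraction

true_ntsc = Fraction(30000, 1001)   # the "true" NTSC rate
rounded   = Fraction(2997, 100)     # the flat 29.970000... most NLEs use

print(float(true_ntsc))                    # 29.97002997002997
print(float(true_ntsc - rounded))          # ~3.0e-05 fps difference
print(float((true_ntsc - rounded) * 3600)) # ~0.108 frames gained per hour
# At that rate the two clocks drift a full frame apart in roughly 9.3 hours,
# consistent with the missing-frame reports mentioned earlier.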
PeterDuke wrote on 5/29/2014, 12:05 AM
"no one would even notice except for a few station techs who would wonder why their 24 hours worth of programming ended 2 minutes and 26 seconds early each day."

Some of our commercial stations are so far behind that 2 minutes is nothing. I suspect that they do it so that their programs don't end when others do, to discourage channel switching.

Chienworks wrote on 5/29/2014, 7:41 AM
I must have been really tired last night. My mental math was off. 1 minute, 26 seconds.
rs170a wrote on 5/29/2014, 8:00 AM
Using Drop Frame timecode results in 108 dropped frame numbers per hour, which comes out to 2,592 frames, or 86.4 seconds, per day.
For more information, check out Timecode PDF, a white paper done by Leitch (now owned by Harris) several years ago.

Mike
Chienworks wrote on 5/29/2014, 8:06 AM
I believe Drop Frame doesn't change the speed with which the material plays. It merely changes how we count the frames.
Former user wrote on 5/29/2014, 8:14 AM
Right. Drop frame does not actually drop frames of video. It only skips frame numbers, much like leap year adjusts the calendar count.
rs170a wrote on 5/29/2014, 8:16 AM
That's correct, Kelly. Drop Frame was introduced when the NTSC world went from black and white (30 fps) to colour (29.97 fps). The PDF I referenced goes into a lot of detail if anyone is interested. In a nutshell, it skips the first two frame numbers (00 and 01) of every minute except every 10th minute.

Mike
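
If anyone wants to verify the numbers, here's a minimal Python sketch (purely illustrative) of that counting rule; it reproduces the 108-per-hour and 86.4-seconds-per-day figures from my earlier post:

# Count the frame numbers skipped by drop-frame timecode in one hour:
# two numbers (00 and 01) at the start of every minute, except minutes
# divisible by 10.
dropped_per_hour = sum(2 for minute in range(60) if minute % 10 != 0)

dropped_per_day = dropped_per_hour * 24
print(dropped_per_hour)      # 108
print(dropped_per_day)       # 2592
print(dropped_per_day / 30)  # 86.4 seconds (at the nominal 30 fps count)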
PeterDuke wrote on 5/29/2014, 8:49 AM
I rest my case :)
riredale wrote on 5/29/2014, 10:57 AM
Here's some more information about the whole 29.97 thing. It was done to make the new color TVs compatible with the millions of monochrome TVs that were already in use. They were worried about interference patterns, and they had to either change the location of the audio carrier (which would have killed sound for the existing monochrome sets) or slightly change the frame rate. They chose the latter.

An excerpt:

"When a transmitter broadcasts an NTSC signal, it amplitude-modulates a radio-frequency carrier with the NTSC signal just described, while it frequency-modulates a carrier 4.5 MHz higher with the audio signal. If non-linear distortion happens to the broadcast signal, the 3.579545 MHz color carrier may beat with the sound carrier to produce a dot pattern on the screen. To make the resulting pattern less noticeable, designers adjusted the original 60 Hz field rate down by a factor of 1.001 (0.1%), to approximately 59.94 fields per second. This adjustment ensures that the sums and differences of the sound carrier and the color subcarrier and their multiples (i.e., the intermodulation products of the two carriers) are not exact multiples of the frame rate, which is the necessary condition for the dots to remain stationary on the screen, making them most noticeable.

The 59.94 rate is derived from the following calculations. Designers chose to make the chrominance subcarrier frequency an n + 0.5 multiple of the line frequency to minimize interference between the luminance signal and the chrominance signal. (Another way this is often stated is that the color subcarrier frequency is an odd multiple of half the line frequency.) They then chose to make the audio subcarrier frequency an integer multiple of the line frequency to minimize visible (intermodulation) interference between the audio signal and the chrominance signal. The original black-and-white standard, with its 15750 Hz line frequency and 4.5 MHz audio subcarrier, does not meet these requirements, so designers had either to raise the audio subcarrier frequency or lower the line frequency. Raising the audio subcarrier frequency would prevent existing (black and white) receivers from properly tuning in the audio signal. Lowering the line frequency is comparatively innocuous, because the horizontal and vertical synchronization information in the NTSC signal allows a receiver to tolerate a substantial amount of variation in the line frequency. So the engineers chose the line frequency to be changed for the color standard. In the black-and-white standard, the ratio of audio subcarrier frequency to line frequency is 4.5 MHz / 15,750 = 285.71. In the color standard, this becomes rounded to the integer 286, which means the color standard's line rate is 4.5 MHz / 286 = approximately 15,734 lines per second. Maintaining the same number of scan lines per field (and frame), the lower line rate must yield a lower field rate. Dividing (4,500,000 / 286) lines per second by 262.5 lines per field gives approximately 59.94 fields per second."
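
For anyone who wants to check the excerpt's arithmetic, a minimal Python sketch (purely illustrative) reproducing that chain:

audio_subcarrier = 4_500_000         # Hz, kept from the black-and-white standard
lines_per_field  = 262.5             # 525 lines interlaced into two fields

line_rate  = audio_subcarrier / 286  # the 285.71 ratio rounded to the integer 286
field_rate = line_rate / lines_per_field
frame_rate = field_rate / 2

print(line_rate)    # ~15734.27 lines per second
print(field_rate)   # ~59.9401 fields per second
print(frame_rate)   # ~29.9700 frames per second (exactly 30000/1001)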