Audio levels for television

fongaboo wrote on 9/24/2002, 5:43 PM
I edited a commercial for a school project in Vegas, but I needed to deliver it to the store owner on MiniDV (they actually plan to run it as a local cable ad). Since I don't have a MiniDV deck or camera, I brought my FireWire drive to school and brought the finished AVI into Final Cut Pro to print to their MiniDV deck. FCP had a default level of -12 dB for the tone (as in bars and tone). My project varies between 0 and -6 dB but never clips. I questioned that default for the tone, and my teacher said that -12 dB in digital equated to 0 in analog. So he suggested that I lower the project audio by 12 dB before I print. This didn't sound right to me, but until recently he worked at the cable ads office that is receiving this for air, so I figure, who am I to argue? But I'm still curious, so I'm posting here. What do you all say?

Comments

bakerja wrote on 9/24/2002, 8:29 PM
Your teacher is right. The bars and tone are exactly that: a reference signal that the cable folks should set to 0 VU (analog) on their playback decks. If you give a 0 VU test tone and your actual levels exceed that tone, then your reference is useless.
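
If you want to see that in numbers, here's a rough Python sketch (the sample values are hypothetical, and "dBFS" just means dB relative to digital full scale, with 1.0 = full scale):

    import math

    def dbfs(amplitude):
        # Linear sample amplitude (full scale = 1.0) -> dB full scale.
        return 20 * math.log10(amplitude)

    tone_peak = 0.251      # a -12 dBFS reference tone
    program_peak = 0.9     # hypothetical program peak, roughly -0.9 dBFS

    print(round(dbfs(tone_peak), 1))     # -12.0
    print(round(dbfs(program_peak), 1))  # -0.9
    # The program peaks about 11 dB hotter than the tone, so a deck
    # calibrated to the tone will run the program way over expected level.
    print(program_peak > tone_peak)      # True -> the reference is useless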
vicmilt wrote on 9/24/2002, 8:39 PM
On the other hand, did you ever notice that TV commercials seem louder than the show?
Well... often they are.
fongaboo wrote on 9/24/2002, 9:26 PM
So should I be mastering everything to -12 dB? ...for TV, anyway?

Or could I have left my piece as-is and raised the tone to 0 dB?
Tyler.Durden wrote on 9/25/2002, 7:12 AM
Hi fongaboo,

While audio levels frequently exceed 0 VU in analog, it is good practice to let only percussive hits pop over the 0 VU level. Beyond +3 VU you are almost guaranteed distortion. The 0 VU to +3 VU range is what analog audio folks call "headroom".

You might master your program so peaks average around -12 dB (digital), with only occasional pops over -12 dB, but always under -6 dB (headroom). If you hit 0 dB in digital, that audio is guaranteed trashed.

The standard of 0 VU in *analog* broadcast was meant to keep audio at consistent levels from differing sources. If everyone masters programs to peak at 0 VU (analog) and sets tone on their tape at 0 VU, things stay pretty close. Not everybody does, and the results can be ugly (sounding).

Same for digital... if your tone (-12 dB) is representative of the peaks in your program, with only occasional pops over it, you should be fine. Tape operators will calibrate playback of your tone to 0 VU (analog) for air.

If your program peaks really are between -6 and 0 dB (digital), you really might want to bring the program level down so peaks sit at -12 dB and put tone on at -12 dB, 'cause your program is out of standard and you're livin' in the digital audio "danger-zone".
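
If it helps to see the arithmetic: a 12 dB cut in the digital domain is a linear scale factor of 10^(-12/20), about 0.25. A rough Python sketch (made-up sample values, not any particular editor's API):

    import math

    def gain_factor(db):
        # Convert a dB gain change into a linear amplitude multiplier.
        return 10 ** (db / 20)

    # Hypothetical program peaking near 0 dBFS (full scale = 1.0):
    samples = [0.0, 0.5, -0.95, 0.99, -0.3]

    # Pull the whole program down 12 dB so peaks land around -12 dBFS:
    adjusted = [s * gain_factor(-12.0) for s in samples]

    peak = max(abs(s) for s in adjusted)
    print(20 * math.log10(peak))  # about -12.1, down from about -0.1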


HTH, MPH



fongaboo wrote on 9/25/2002, 7:21 AM
OK, sounds good. Is -12 particularly significant? Is it considered the digital equivalent of 0 VU analog, as my teacher claimed? Just wondering where I should be mastering future projects...
Tyler.Durden wrote on 9/25/2002, 8:34 AM
Hi fongaboo,

It sorta depends, but yeah. The level can differ from device to device, but -12 dB is pretty common.

Remember, the signal has to get into the analog world eventually, so a reference can vary from system to system as long as the program content (audio) is consistent with the reference. For example: if tone is -12 dB from Vegas, the tape-op will set it to 0 VU... if tone from a DAT is -8 dB, the tape-op will set it to 0 VU... if tone from a D-1 is -20 dB, the tape-op will set it to 0 VU. If each program is edited or mastered properly against its own tone, the levels will be in accordance with accepted practice in each case.
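
To put toy numbers on that (the function name is mine, and the levels are just the ones above):

    # Whatever the digital reference is, the tape-op trims playback so tone
    # reads 0 VU; the program then keeps its position relative to the tone.

    def vu_after_calibration(level_dbfs, tone_dbfs):
        # Deck trimmed so tone_dbfs plays back at 0 VU; anything else
        # lands at its offset from the tone.
        return level_dbfs - tone_dbfs

    for source, tone in [("Vegas", -12.0), ("DAT", -8.0), ("D-1", -20.0)]:
        # A program mastered so peaks sit right at the tone level:
        print(source, vu_after_calibration(tone, tone))        # 0.0 VU every time
        # A peak 2 dB hotter than tone reads +2 VU, regardless of source:
        print(source, vu_after_calibration(tone + 2.0, tone))  # +2.0 VU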

HTH, MPH
SonyEPM wrote on 9/25/2002, 8:50 AM
This article does a pretty good job of covering this complex topic: http://www.creativecow.net/articles/spottedeagle_douglas/audio_mastering/index.html