OT: How does value of DV signal relate to analogue video

farss wrote on 10/10/2003, 9:27 PM
This is probably a question for BillyBoy or perhaps someone else can throw some light on this mystery.

I've always assumed that as long as I kept my video within the broadcast-safe colour range, it was impossible to generate video that's 'illegal'. A very good friend of mine who works for our national broadcaster tells me that the bane of their existence is indie film makers who submit material that gets rejected at the tech check because of illegal levels. Mostly it's video over 1 volt.

Now I've always assumed that DV, which can have values from 0 to 255 for each of the RGB channels, ends up with 0 mapped to 0 IRE (black) and 255 mapped to 100 IRE (white), which surely is not over 1 volt. So I just couldn't see how it's possible to do this.

Well, last night I was having a look at BillyBoy's website on how to calibrate a monitor, and I see that the PLUGE has black swatches at -4, 0 and +4 IRE. That makes sense for setting the brightness on the monitor, but it rather blows my assumptions about DV out of the water. It means that within DV I can create levels below black, another real no-no if you want to get your material broadcast.

Now I know the full story on this is going to be pretty technical and probably of not much interest to most of us. But then again, if getting it wrong is causing people to lose out on potential sales of their work, it's something we need to be very much aware of.

I do live in the wonderful world of PAL, which has no setup, by the way.

Anyway I'm hoping someone can illuminate this for all our benefit.

Comments

StormMarc wrote on 10/10/2003, 9:55 PM
I can't answer all of your question, but my experience is as follows.

If I fade a clip to the default background black in Vegas, it actually sits at about -7.5 IRE. If I then add a black media file and set the black to 16,16,16, it comes in at 0 IRE. Proper DV that falls between 0 IRE and 100 IRE should be limited to the equivalent of 16-235.
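Those numbers line up with the standard ITU-R BT.601 studio-range convention, where code 16 is black and code 235 is white. A minimal sketch of that mapping (my own illustration, assuming a PAL-style signal with no 7.5 IRE setup) shows where the below-black reading comes from: code 0 lands around -7.3 IRE, very close to the -7.5 figure above.

```python
# Sketch: map 8-bit Y' code values to IRE under the BT.601
# studio-range convention (16 = 0 IRE black, 235 = 100 IRE white).
# Assumes a PAL-style signal with no 7.5 IRE setup.

def code_to_ire(code: int) -> float:
    """Convert an 8-bit luma code value to IRE."""
    return (code - 16) / (235 - 16) * 100

print(code_to_ire(16))   # 0.0   -> broadcast black
print(code_to_ire(235))  # 100.0 -> broadcast white
print(code_to_ire(0))    # ~-7.3 -> below black, 'illegal'
print(code_to_ire(255))  # ~109.1 -> above white, 'illegal'
```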

If you scan a photo into Photoshop and bring it into Vegas, it will show illegal values on the waveform. If you first limit the output levels in PS to 16 and 235, it will be OK. You can shoot DV higher (up to the equivalent of level 255), but if you don't fix it in post it will technically be illegal for broadcast. Check out http://www.adamwilt.com for some additional good technical info.
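The output-levels fix described above amounts to a linear rescale of the full 0-255 range into the 16-235 studio range. A minimal sketch of that rescale (a hypothetical helper of my own, not what Photoshop or Vegas actually runs internally):

```python
# Sketch: rescale a full-range 0-255 sample into the 16-235 studio
# range, the same linear squeeze as Photoshop's Output Levels 16/235.

def full_to_studio(value: int) -> int:
    """Linearly map full-range [0, 255] into studio-range [16, 235]."""
    return round(16 + value * (235 - 16) / 255)

print(full_to_studio(0))    # 16  -> was below-black, now legal black
print(full_to_studio(255))  # 235 -> was above-white, now legal white
```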

Marc
farss wrote on 10/11/2003, 3:25 AM
I've had a good look through Adam's site; I had been there before but had forgotten about it. Heaps of very good info, but I didn't see anything on this issue. I'll keep looking, as it does seem important that we all understand this.
Former user wrote on 10/13/2003, 8:16 AM
I can't answer your question directly, but remember that luminance is not the only variable; you also have to be aware of chrominance. Sometimes the luminance can read 100 IRE while the chrominance still exceeds the standards. That is why you use both a vectorscope and a waveform monitor to check levels for broadcast.
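This point can be shown numerically. On a composite signal the chroma subcarrier rides on top of the luma, so a fully saturated colour can push the peak excursion well past 100% even when the luma alone is legal. A rough sketch using BT.601 luma and the classic PAL U/V weighting factors (my own back-of-envelope illustration, not a broadcast-grade measurement): saturated yellow peaks around 133%, which is exactly the kind of over-1-volt material tech checks reject.

```python
import math

def composite_peak(r: float, g: float, b: float) -> float:
    """Approximate peak composite excursion (1.0 = 100 IRE white) for
    normalised R'G'B' in [0, 1], using BT.601 luma and PAL U/V weights."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y + math.hypot(u, v)  # luma plus chroma subcarrier amplitude

print(composite_peak(1.0, 1.0, 0.0))  # saturated yellow: ~1.33, over 1 V
print(composite_peak(1.0, 1.0, 1.0))  # white: 1.0, no chroma at all
```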

Dave T2
BillyBoy wrote on 10/13/2003, 8:49 AM
The topic of why chrominance and brightness levels are limited to what they are, why something can be considered "illegal" or too "hot", and why it can cause 'noise' from a broadcast standpoint, coupled with why video frame rates are what they are instead of something else and why there are different broadcast standards instead of just one, is very interesting from a technical standpoint. But it quickly gets highly technical and requires some advanced math to explain fully. What may be surprising is that part of it deals with two separate and highly unexpected things: human perception of light and color, and the method your country uses for electrical power distribution.

If there is enough interest I suppose I could put a web page up, but I'd have to bone up on the topic; it's been a long while and I've forgotten most of the details. Some of the things really are fascinating, like why the colors yellow and purple don't show well on TV when fully saturated, and other oddball stuff like that.