Broadcast colors: is it always necessary?

Jayster wrote on 6/20/2006, 11:22 AM
I've been having a bit of confusion about this, being a hobbyist and not a pro. Suppose I want to make a slide show using stills from a digital camera. Will I ALWAYS need to limit the colors to broadcast range (equivalent of 16-235 RGB)???

Forget the case of displaying it on a PC. That's a no-brainer (no need whatsoever to limit the colors there). And forget the case where it's SD for general distribution (you MUST use broadcast colors then because you don't know the target environment).

Suppose I want to make the slide show exclusively for display on an HDTV at 1440x1080i via component video. (Source can be a set-top box like a LinkPlayer2 or could be a PC video card with HD output).

Does this medium (HDTV via component video) require the same IRE level limitations as a regular NTSC television?

And, finally, would it be any different if the target was an LCD-based HDTV?

Comments

Former user wrote on 6/20/2006, 11:27 AM
No, broadcast colors are primarily an issue when you are creating a project to be broadcast over TV (such as from a local TV station affiliate or a network).

Dave T2
Laurence wrote on 6/20/2006, 11:36 AM
I recently did a project for my wife's non-profit organization. Parts of it were broadcast on Univision in a segment they did covering her efforts. They used footage from a DVD she gave them. On TV the colors looked really washed out. Could I have improved this by giving them a DVD mastered with the broadcast colors filter applied?
Jayster wrote on 6/20/2006, 11:43 AM
Laurence - good question. I wonder if it is a case of colors being compressed twice:-(

Dave: Thanks for the reply! One of the things that confuses me about this... If a TV is designed to receive broadcasts, one would think (perhaps incorrectly) that it would expect black to be IRE 7.5 (equivalent of digital RGB 16,16,16) and that the TV would display its blackest color when it receives that level of signal. And likewise for whites at the equivalent of RGB 235,235,235, which apparently equates to IRE 100.

So if the TV is designed to display its full contrast at the range of IRE 7.5 to 100 (RGB equivalent of 16-235), what happens if it receives a signal from some non-broadcast source that starts at IRE 0? Do the blacks get crushed?

Or does the TV handle it differently when it gets the signal from a non-broadcast source? The set-top box will send HDTV out via component video, not the RF from an antenna or composite from a VCR / cable box. And if it's HDTV (which is NOT NTSC), do the rules change?

That could get even more confusing, because modern cable boxes have component video, too. But I suppose cable is going over a wire instead of air waves and wouldn't need the reduced color range.

Obviously I am confused about all this...
Former user wrote on 6/20/2006, 11:52 AM
I don't know all of the technical details, but the TV normally doesn't care. It is the TV transmission equipment that is looking for certain levels. Video that ran too hot used to cause a transmission to shut down. Also, in the past, if the video was hot it would bleed into the audio.

If the black was too low, the TV signal would have problems finding the sync reference so the picture would roll or flip.

7.5 is not true black. It is 7.5 units above true black. But that is part of the NTSC broadcast standard. Digital 601 uses 0 black.

Dave T2
Former user wrote on 6/20/2006, 12:01 PM
Laurence,

I am surprised they used footage from a DVD. Most TV stations will require some sort of tape source.

It could be that if the blacks were at 0, they were raised to 7.5 during the transfer of the DVD for broadcast, which will grey out the image some.

Chances are though, they just left the DVD in a unity playback situation and did not worry about the colors or contrast. If there were no color bars on the DVD, they would have no reference of what the color is supposed to look like.

Dave T2
GlennChan wrote on 6/20/2006, 9:03 PM
"Digital 601 uses 0 black."
ITU-R BT.601 uses Y'=16 for black (for 8-bit formats).


Some information about 7.5 IRE setup and levels:
here

A quick rundown of proper levels for various "formats":

RGB formats (digital)
studio RGB - colors use 16-235 range. Default for Vegas 5+. The MPEG2 encoder looks for studio RGB levels.
computer RGB - colors use 0-255 range. Still images import in this form... which is inaccurate unless you apply the "computer RGB to studio RGB" preset in the color corrector.
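As a numerical sketch, that "computer RGB to studio RGB" preset amounts to a simple linear remap (illustrative Python, not Vegas internals; the function name is made up):

```python
def computer_to_studio_rgb(v):
    """Map an 8-bit computer-RGB value (0-255) into studio-RGB range (16-235)."""
    return 16 + v * (235 - 16) / 255

# full-range black and white land exactly on studio levels:
# computer_to_studio_rgb(0)   -> 16.0
# computer_to_studio_rgb(255) -> 235.0
```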

RGB formats (analog component)
Unfortunately, more than one format. You don't have to concern yourself with this much though, unless working with something like betaSP or other analog gear.

Digital Video formats (that follow ITU-R BT.601, BT.709 ... includes DV, DVD)
The Y' luma values control black and white levels. Legal range is 16-235.
*Some camera manufacturers add fake digital setup to compensate for the camera not converting levels correctly.

Analog composite - NTSC (except for Japan)
Black level at 7.5 IRE.
White level at 100 IRE.
*Unfortunately, the majority of DV equipment does not follow this!! See the article.

Analog composite - NTSC (Japan)
Black level at 0 IRE.
White level at 100 IRE.

DVD players: Some Asian models convert black and white level incorrectly.
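To tie the digital and analog numbers above together, here is a sketch of how legal Y' maps to IRE, with and without 7.5 IRE setup (my own arithmetic, not quoted from any spec):

```python
def y_to_ire(y, setup=7.5):
    """Map legal-range luma Y' (16 = black, 235 = white) to IRE.
    setup=7.5 for NTSC with setup; setup=0 for Japan-style NTSC."""
    return setup + (y - 16) * (100 - setup) / (235 - 16)

# y_to_ire(16)     -> 7.5   (black with setup)
# y_to_ire(235)    -> 100.0 (white)
# y_to_ire(16, 0)  -> 0.0   (black, no setup)
```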

2- The TV station might be dubbing your footage onto another format... which could explain why the levels can get out of whack if the person dubbing isn't checking for levels. Or they could be "correcting" your levels. Just another possibility... I wouldn't know, just adding some speculation here. :D
Jayster wrote on 6/20/2006, 11:23 PM
Glenn:
Thanks so much for the reply. The link you posted was a wealth of information. I'll summarize my understanding, and hopefully I get it right:-)

I wrote earlier that IRE 7.5 is the equivalent of RGB 16,16,16. Now I think this statement was utter nonsense. It seems the question is more like: is the video in a digital domain or an analog domain?

If it's in the digital domain (i.e. while it's a file on your hard drive or a DVD or a mini-DV tape), and assuming it complies with the digital 601 spec, then it should be within the bounds of 16-235 as far as its Y' (luma) values go. In RGB terms that would be 16,16,16 for black and 235,235,235 for white. [But perhaps I should forget about saying "in RGB terms" because DV is not a JPEG.]

In the analog domain, the answer is "depends!" But evidently it is the job of the digital-to-analog equipment chain to translate the video into an analog signal with luminance levels (measured in IRE) that are acceptable for the display device and signal format (i.e. composite, etc.). "Acceptable for the display device" also depends on the region. In USA, that's IRE 7.5 to IRE 100.

Final question (I hope): you wrote on your web page that luminance in the digital domain is stored in the 16-235 range for DV and DVD. What about for HDTV? What if your target is ONLY for high def across component video? Does your video in the digital domain (HDV, WMVHD, etc.) still need to have its luminance restricted to the range of 16-235? (Forgive me if I missed the point somewhere)
farss wrote on 6/21/2006, 12:00 AM
Well I'm not Glenn but I might be able to help.
I think you've got it right.
Except that, from what I'm seeing on my scopes, the output from digital still cameras is, as Glenn said, computer RGB and needs to go through the color corrector set to the Computer RGB to Studio RGB preset. However, all the Sony DV cameras I've looked at seem to record from 16 to 255, so the above conversion gets it wrong at the black end. I use a custom curve in Color Curves to get it to 16-235 without upsetting the blacks.
WMV seems to be computer RGB; in other words, legal video might look a bit washed out. The HDV cameras also seem to run from 16 to 255, by the way.
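A linear version of that remap (scaling 16-255 down to 16-235 while pinning black at 16) would look something like this sketch; the actual custom curve isn't shown here:

```python
def camera_to_legal(y):
    """Remap camera luma recorded as 16-255 into legal 16-235,
    pinning black at 16 so only the top end is scaled down."""
    return 16 + (y - 16) * (235 - 16) / (255 - 16)

# camera_to_legal(16)  -> 16.0  (black untouched)
# camera_to_legal(255) -> 235.0 (superwhite pulled down to legal white)
```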

Bob.
Jayster wrote on 6/21/2006, 12:20 AM
Bob - your observation about the 16-255 luminance from HDV cameras (I have a Z1) makes me wonder what the Z1's color bars are sending out. (i.e. can I use it to calibrate my HDTV? Maybe not...)

Seems like the Levels FX does a nice job of squeezing the colors into legal range. You can regulate it pretty well by eyeballing the results on the waveform scope. I suppose I just have to make sure the frame I base it on has plenty of whites and blacks in it.

Speaking of eyeballs, I better go close mine. Sure is awesome having a global community on this forum!
DrLumen wrote on 6/21/2006, 7:14 AM
I'm not sure of all the tech details either but just something to consider...

As a hobbyist, I don't do anything for broadcast, not yet anyway. But, I have noticed that if the whites are too hot in contrast to the blacks (0-255), the whites will bloom and cause a buzz in the audio on some standard TVs. On an even older TV (1970ish) the picture tried to do a horizontal roll and get out of sync. Maybe it was just that TV. <shrugs> If I clamp them to broadcast colors, I don't have those problems.

Just a thought...

plasmavideo wrote on 6/21/2006, 8:31 AM
"The MPEG2 encoder looks for studio RGB levels"

Just curious about this statement, Glenn. What happens in the encoder if the video isn't studio RGB? Does it process improperly or is that just a "good practice" statement?

I understand the principles, I'm just a bit puzzled by that comment and want to learn more.

BTW - haven't had a chance to thank you yet for the great series on color correction, so consider yourself thanked!

and yes, overblown whites can do nasty things to audio in televisions and nasty things at the transmitter end of things as well.

We keep a tight clamp on whites with a special proc amp to make sure proper video levels go to the transmitter at all times. That sync buzz noise you hear is one result of improper white level.

Sorta off topic, but if you ever watch a station in stereo, and you hear a low frequency kind of growling noise on audio peaks, and a rumbling noise in quiet passages, call up the station and tell them to fix their ICPM problem and resweep their transmitter! That'll impress 'em. One station in this market has had that problem for over a year and hasn't bothered to fix it. They also have one Beta machine with the video cranked up so high that the whites clip and bloom and cause the buzz, too. I can't understand why no one there will take 10 minutes to fix THAT problem. Between the ICPM noise and the buzz and the lousy video, the Britcoms they show are ruined - at least every other one that hits that tape machine.

Enough rant
GlennChan wrote on 6/21/2006, 1:51 PM
"Except from what I'm seeing on my scopes the output from digital still cameras is as Glenn said Computer RGB and needs to go through the CC set to Computer RGB to Studio RGB preset. However all the Sony DV cameras I've looked at seem to record from 16 to 255 so the above conversion gets it wrong at the black end. I use a custom curve in the Color Curves to get it to 16 to 235 without upsetting the blacks."
A lot of DV cameras do record stuff over 235. These are sometimes called "superwhites". They are above maximum white level.

I'm not sure why cameras are implemented this way. It's useful in the sense that:
--It gives you an extra 0.3 or more stops of dynamic range. In post you can bring this into legal range... I have a color curves preset that does this.
http://www.glennchan.info/Proofs/dvinfo/color-curves.veg

--If you hook up your camera to a TV directly (via composite/RCA or S-video), many CRT TVs are actually able to display this extra information... albeit with distortion. But arguably, something is better than nothing. If the camera didn't record over Y'=235, you wouldn't have those values to begin with (i.e. you'd have nothing).

It's disadvantageous in that:
--These levels are illegal. These levels will either be clipped (which is ok), distorted, and/or cause other problems (like audio buzzing).
--If you hook up your camera to your VCR, and the VCR hooks up to the TV via coaxial, then you can get audio buzz.
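As a rough sanity check on the dynamic-range figure above: treating Y' crudely as a gamma-encoded light value with an assumed power-law gamma of 2.2 (an assumption; real camera transfer curves differ), the headroom from 235 up to 255 works out to roughly a quarter of a stop:

```python
import math

gamma = 2.2  # assumed power-law gamma; actual camera curves vary
# linear-light ratio between peak superwhite (255) and legal white (235)
extra_stops = math.log2((255 / 235) ** gamma)
# comes out around 0.26 stops of extra headroom under this assumption
```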

GlennChan wrote on 6/21/2006, 1:58 PM


If the levels are in the 0-255 range instead of 16-235, then the image will have:
--clipping in the blacks.
--increased contrast, saturation
--illegal levels

When the DVD player plays the material back, one of two things can happen:
A- The DVD player will clip the levels outside the 16-235 range. This means clipped whites and blacks.
B- It doesn't clip the levels outside the 16-235 range. So now you have superwhites and superblacks going to the TV.
The superwhites are as mentioned previously... CRTs can generally display them with (slight) distortion. Depending on the signal path, you might get audio buzz.
The superblacks can cause a rolling picture on old TVs... the TV can't differentiate between sync pulses and the superblacks. Newer equipment shouldn't really have problems with that.
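Behaviour A is easy to picture as a hard clip (a toy sketch, not any particular player's processing):

```python
def clip_to_legal(y):
    """Behaviour A: clamp Y' into the legal 16-235 range."""
    return max(16, min(235, y))

# superblacks and superwhites get flattened:
# clip_to_legal(5)   -> 16
# clip_to_legal(250) -> 235
# clip_to_legal(120) -> 120  (legal values pass through)
```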
farss wrote on 6/21/2006, 2:36 PM
I think the answer to why the cameras record video this way has to do with the D<->A conversion process and interpolation.
Feed the A->D converters legal white and they should produce constant values of 235. At the other end of the process, the D->A converters should produce the same 1 V signal.
However, feed the converters a 0V to 1V square wave at around the Nyquist frequency and they should produce sample values of 16,235,235,16,16,235 etc. The D->A converter will then apply interpolation and produce a sine wave that overshoots.
Now I have no hard knowledge of this, but it can be a problem in the audio realm, known as intersample clipping. It's pretty easy to avoid, though, by leaving a little headroom; of course there's way more dynamic range in audio, so turning things down 1dB is no big deal.
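That overshoot can be demonstrated numerically. A toy simulation (assuming ideal sinc reconstruction, which real converters only approximate): sample a square wave alternating between 16 and 235 every two samples, reconstruct it band-limited, and the peaks come out well above 255.

```python
import math

def sinc(x):
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

# square wave sampled at half the Nyquist rate: 16,235,235,16,...
samples = [16, 235, 235, 16] * 64

def reconstruct(t):
    """Ideal band-limited (sinc) reconstruction at continuous time t."""
    return sum(s * sinc(t - n) for n, s in enumerate(samples))

# evaluate one period in the middle of the signal, 16 points per sample
mid = len(samples) // 2
peak = max(reconstruct(mid + k / 16) for k in range(64))
# the peak lands near 280, far above both 235 (legal white) and 255
```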
Bob.