Color space clarification needed!

Richard V. wrote on 1/2/2012, 5:52 PM
Like many of the visitors to this forum I am interested in getting optimal results from Vegas. I have some questions, about color spaces and the behavior of encoding codecs, that are not adequately addressed by the Vegas help screens.
Let’s assume that our source data is HDV or AVCHD video or JPEG still images, and that our output will be Blu-ray or DVD. As I understand it the situation may be summarized as follows:
1. RGB color encoding typically involves using an eight-bit integer to represent the intensity of each of the primary colors red, green, and blue. Computer monitors distinguish the full range of such values from 0 through 255, but, for historical reasons, TVs accept only values in the range 16 through 235.
2. On the computer monitor, a value of (0,0,0) produces black and a value of (255,255,255) produces white. On the TV, a value of (16,16,16) produces black and (235,235,235) produces white. This means that there are 256 distinct gradations of red, green, and blue on the monitor and only 220 gradations on the television, but the range of perceived colors is otherwise identical. The only harm done by the historical limitation of 16-235, aside from the awkwardness of processing values with an origin of 16 rather than 0, is this reduction in the number of distinct steps, which makes the color gamut thinner and could contribute to banding.
3. Sony calls signals intended for a computer monitor, with values 0-255, "computer RGB" and signals intended for TV, with values 16-235, "studio RGB". But inside Vegas they look the same – Vegas can’t tell if an event with values all between 16 and 235 is that way because it was intended to be a studio RGB clip or if it was actually a computer RGB event that happened to have a narrow range.
4. If a computer RGB signal is passed to a TV through a Blu-ray or DVD player, it will usually be clipped to the range 16 through 235. All values less than 16 will be converted to 16 and all values greater than 235 will be converted to 235, with no changes to intermediate values. Therefore all detail in the "illegal blacks" and "illegal whites" of the original signal will be lost.
5. Even if a computer RGB signal has no illegal values it will produce an incorrect image when passed to a television. For example, a value of (16,16,16) will come out pure black, although intended to be a dark grey (exactly how dark depending on the gamma curves elsewhere in the signal chain, which we ignore here). A value of (235,235,235) will come out pure white, although intended to be a light grey. The image on the TV will appear more contrasty than intended, even if no information is lost.
6. Likewise, a studio RGB signal produces an incorrect image when displayed on a computer monitor. It will be flatter than intended, and will have impure blacks and whites.
7. Sony Vegas allows conversion between computer RGB and studio RGB as special cases of the "levels" filter. These conversions are linear mappings between [0,255] and [16,235] and vice versa, i.e. y = 16 + (219/255)x and x = (255/219)(y - 16), rounded. Except for the effect of rounding, they are one-to-one and preserve all the information in the data. (A sketch after this list illustrates both this mapping and the clipping described in item 4.)
8. The Vegas preview screen shows a computer RGB image. Therefore if the conversion to studio RGB is applied in the editing process it will make the image in the preview window appear flat. So it appears that Sony “expects” the user to use the computer RGB color space while editing.
9. The final output is often intended for TV viewing. To get the colors right on the TV – not too much contrast, all the expected shadow and highlight detail – the computer RGB must be converted, somewhere, to studio RGB.
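To make items 4 and 7 concrete, here is a minimal Python sketch of my understanding (numpy assumed; the function names are mine, and the formulas mirror the list above, not Vegas's actual internals). Note that the levels conversion rescales every value, while a clipping player leaves mid-range values alone and simply destroys anything outside 16-235:

    import numpy as np

    def computer_to_studio(x):
        """Item 7: linear map [0,255] -> [16,235], y = 16 + (219/255)x, rounded."""
        return np.round(16 + np.asarray(x, float) * 219 / 255).astype(np.uint8)

    def studio_to_computer(y):
        """Item 7: inverse map [16,235] -> [0,255], x = (255/219)(y - 16), rounded."""
        return np.clip(np.round((np.asarray(y, float) - 16) * 255 / 219), 0, 255).astype(np.uint8)

    def player_clip(x):
        """Item 4: a clipping player just forces values into [16,235]."""
        return np.clip(x, 16, 235)

    levels = np.array([0, 16, 128, 235, 255])
    print(computer_to_studio(levels))  # [ 16  30 126 218 235] - everything rescaled
    print(player_clip(levels))         # [ 16  16 128 235 235] - only out-of-range detail destroyed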
So far so good (but correct me if any of the above assertions are incorrect). HERE IS WHERE THE REAL QUESTIONS BEGIN!
One way to get the final output right is to apply a computer RGB to studio RGB conversion as a video output effect when rendering. Another way is to use a codec that does the conversion while encoding. We don’t want to do the filtering twice, so we need to know precisely when it is necessary and when it is not, and the information on this forum and elsewhere is conflicting. My questions:
• Is HDV and AVCHD source data (a) computer RGB, (b) studio RGB, (c) computer RGB except with values 0 and 255 illegal, or (d) studio RGB except with values 0 and 255 illegal, the rest of the super white and sub black levels giving headroom (and footroom?)? All of these choices seem possible from reading the online sources, or perhaps the format of the data differs from camera to camera. Note that, in case (d), if the headroom is actually used (by treating the data as computer RGB and doing a computer RGB to studio RGB conversion at the end), then the original 16-235 range will be compressed to about 30-218, so the contrast will be incorrect (see the worked numbers after these questions).
• Which codecs expect studio RGB source data and produce studio RGB output? Which codecs expect computer RGB source data and produce studio RGB output? Of these, which do it by clipping and which, if any, do it by a linear filter? Which codecs produce computer RGB output?
• What codec does Vegas use by default for Blu-ray? For DVD? How can we control the choice of codec? What if any difference does it make if we export a movie from Vegas in HDV format, with no recompression except for effects, titles, etc., and then do the rendering while exporting from DVDA to a Blu-ray image or a DVD folder? What if any difference (in color space) does it make if we export from DVDA for Blu-ray using MPEG2 versus AVCHD?
• Do any of these codec options make it unnecessary to prepare the data for TV viewing with a computer RGB to studio RGB conversion?
• In short, how can we organize our workflow to produce optimal output for TV viewing in the common situation where the source media come from HDV or AVCHD camcorders and the distribution is via Blu-ray and DVD?
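The 30-218 figure in the first question above is just the item-7 formula applied to the endpoints (plain arithmetic, nothing Vegas-specific):

    low  = 16 + 16  * 219 / 255   # = 29.7  -> rounds to 30
    high = 16 + 235 * 219 / 255   # = 217.8 -> rounds to 218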
I know there are color space gurus in this forum (e.g. Glenn) so I’m sure someone will be able to clear up any misconceptions on my part and answer these questions. And maybe someone from Sony can chip in. I hope Sony will include a more complete explanation of these matters in future help files.
Thanks in advance!
Richard Vaughan

Comments

amendegw wrote on 1/2/2012, 6:19 PM
Richard,

Welcome to the forum, and might I say, "That's one helluva first post!"

These issues have been cussed & discussed over and over in this forum and you will find many of your answers here.

Allow me to get you started on your first question, "Is HDV and AVCHD source data (a) computer RGB, (b) studio RGB,..." with a thread started by Nick Hope: Survey: What min/max levels does your cam shoot?

I'll let others chime in on your other questions.

I'm sure this thread will be a long one.

....Jerry

john_dennis wrote on 1/2/2012, 7:41 PM
"What codec does Vegas use by default for Blu-ray? For DVD?"

Blu-ray

Vegas Pro provides an MPEG-2 codec from Mainconcept and an AVC codec from Sony.

VC-1 is also legal for Blu-ray but is not supported by Vegas Pro out of the box.

DVD

Sony provides an MPEG-2 codec from Mainconcept. MPEG-2 is the only legal video codec for DVD.

"What if any difference does it make if we export a movie from Vegas in HDV format, with no recompression except for effects, titles, etc., and then do the rendering while exporting from DVDA to a Blu-ray image or a DVD folder?"

Rendering in Vegas Pro allows much better control of the codecs, via customizable templates, than DVD Architect does. In some versions of DVD Architect, the CPU cores were not used as effectively as in Vegas. Since I rarely render in DVD Architect anymore, I haven't checked that lately.

"In short, how can we organize our workflow to produce optimal output for TV viewing in the common situation where the source media come from HDV or AVCHD camcorders and the distribution is via Blu-ray and DVD?"

musicvid and others have been evangelists for using scopes to determine the range of colors in a work. From the Vegas Pro Menu, select View / Video Scopes. Select the RGB Parade from the drop-down to get started with a graphic display of the values in your video. With this information, you can determine where you need to apply the Computer RGB to Studio RGB filter.
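
If you prefer numbers to pictures, what the RGB Parade shows is just per-channel minima and maxima. Here is a rough Python sketch, assuming numpy and an 8-bit RGB frame (how you extract a frame from your footage is up to you):

    import numpy as np

    def channel_ranges(frame):
        """frame: H x W x 3 uint8 array in R, G, B order."""
        return {name: (int(frame[..., i].min()), int(frame[..., i].max()))
                for i, name in enumerate("RGB")}

    # If every channel already sits inside [16, 235], a computer RGB to studio RGB
    # filter would squeeze the levels a second time; if a channel spans the full
    # 0-255, the footage still needs conforming before a YUV render.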

Disclaimer:

Six months ago, I couldn't spell color space.

Richard V. wrote on 1/2/2012, 8:42 PM
Thanks for referring me to the thread on camera color ranges. I tried my old XH-A1 on black, with all the settings suggested except that my gain was at -3dB, and came up with minimum luminance 17, peak 21, maximum 29. I couldn't test white since it is night.

In a dark indoor scene with a few incandescent bulbs in the field of view I have seen luminance down to 6 (blue down to 0) and, on the white end, all the way to 255. Here the aperture was nearly wide open and I was using (I think) 12dB gain with the shutter at 1/60. Perhaps someone can explain why there would be a deeper black in such a scene than when shooting with the lens cap on.

musicvid10 wrote on 1/2/2012, 9:00 PM
"Let’s assume that our source data is HDV or AVCHD video or JPEG still images, and that our output will be Blu-ray or DVD."

I'll read the rest of your post time permitting. But for starters, HDV (MPEG-2) and AVCHD (h264) are both YUV codecs. That means their "legal" luminance range is 16-235. That doesn't mean footage isn't shot outside that range; it is. JPEG images cover the entire RGB range of 0-255. Care must be taken to conform the levels of all sources to 16-235 before rendering to a YUV codec (as for a DVD or BluRay), or suffer the results of clipped levels on playback.

musicvid10 wrote on 1/2/2012, 11:03 PM
Reading your questions at the end of your initial post, here are a few takeaway points (these are the basics; there are a lot of intricacies that do not directly relate to your questions):

YUV codecs, of which MPEG-2 (DVD, HDV, BluRay) and h264 (AVCHD, BluRay) are two, expect 16-235 input levels, and anything outside those legal levels will get clipped on playback. The most important thing to understand from your questions is that codecs do not "produce" the levels; you do!

It is quite possible to shoot or encode levels outside the legal range (up to 0-255) to YUV codecs, but they will be clipped by the player, not by the codec. This is an important distinction.

Likewise, RGB codecs expect and play back a range of 0-255 RGB levels, which you provide; again the codec itself does not "produce" anything you do not give it.

Obvious exceptions to "filling" the entire range of luminance are the proverbial "white cat in a snowstorm" and "black cat in a coal mine" situations.

Also important is that some codecs will operate in either YUV or RGB color space, so you again must provide the proper levels depending on the operative color space.

Applying a Computer RGB filter to the Preview, then removing it prior to YUV render, --OR--
Editing in native RGB Preview space, and then applying a Studio RGB filter prior to YUV render,
are both simplistic approaches for people who do not have scopes or have no interest in using them. I can promise you that those of us who use the scopes and our eyes to do the work learn something new every time we do it, no matter how many years or decades we have been in the imaging business.

For a basic understanding of the theory and philosophy behind exposure and levels, I highly recommend a read of Ansel Adams. The principles that were illuminated over sixty years ago are as applicable today as ever.

NickHope wrote on 1/3/2012, 1:07 AM
Welcome to the forum Richard. Very impressive first post!

In a dark indoor scene with a few incandescent bulbs in the field of view I have seen luminance down to 6 (blue down to 0) and, on the white end, all the way to 255. Here the aperture was nearly wide open and I was using (I think) 12dB gain with the shutter at 1/60. Perhaps someone can explain why there would be a deeper black in such a scene than when shooting with the lens cap on.

I have a few clips like that from my old DV Sony VX2000, e.g. illuminated critters shot at night against pitch-black water. It seems that in some cases one can get deeper blacks when there are various levels within a real scene than when one is trying to force the deepest black using the "black-out-across-the-whole-scene" method. Shooting some very high contrast scenes could verify this.

Some of your questions may be answered by my flawed and over-simplistic article, which requires updating. It at least splits various typical codecs into the two categories of "needs a fix" and "doesn't need a fix". As always, the best thing, if you have Vegas Pro, is (a) to know what min and max your camera shoots, and (b) to get the scopes up and see exactly what levels you've got in each shot, and grade accordingly.

Failing that, as a general guide, most "stills" cameras including dSLRs shoot nominally 0-255 and so require a filter that squeezes that to 16-235. Most "Camcorders" shoot nominally 16-255 and so require a filter that leaves 16 at 16 and maps 255 to 235.
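
As a Python sketch of those two fixes (numpy assumed; straight linear maps, though as noted below, curves are often the better tool):

    import numpy as np

    def stills_fix(x):
        """dSLR/stills case: squeeze nominal 0-255 down to 16-235."""
        return np.round(16 + np.asarray(x, float) * 219 / 255).astype(np.uint8)

    def camcorder_fix(x):
        """Camcorder case: leave 16 at 16 and map 255 down to 235."""
        return np.round(16 + (np.asarray(x, float) - 16) * 219 / 239).astype(np.uint8)

    print(camcorder_fix([16, 235, 255]))   # [ 16 217 235] - superwhites pulled into range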

BUT... having just graded, principally using color curves, 800 DV shots from a camera (Sony VX2000) that shoots nominally 16-255, I can say there are many exceptions, and there is no substitute for individual taste and proper monitoring of each clip. I have several clips that simply look better with the highlights clipped. The camera designers are not idiots. Sometimes highlight detail outside of 235 is not worth salvaging, and linearly mapping 255 to 235 can reduce the dynamic range of the important detail in the shot that lies between 16 and 235.

Another small detail that is rarely mentioned is that, although the broadcast-legal limit for luminance is 235, the limit (as I understand it... correct me if I'm wrong) for individual chroma is 240. In footage with a particular color bias I often map one or two of the channels to 240, e.g. if I have a very yellow scene I might simply clip the blue at 240, and get the overall luminance down to 235 just by rolling off the red and green channels. The following curves would be a theoretically good starting point before adjusting the slope of the individual curves. Pathetically small, fussy details, but worth a mention (p.s. my new year's resolution: to be less perfectionist... but I'm making a poor start). I've never tested whether YouTube/Vimeo permit chroma at 240, but it could easily be done.

amendegw wrote on 1/3/2012, 4:08 AM
"My new year's resolution: To be less perfectionist... "Nick,

I, for one, would be really disappointed if you kept that resolution!

...Jerry

musicvid10 wrote on 1/3/2012, 11:27 AM
"Most "Camcorders" shoot nominally 16-255 and so require a filter that leaves 16 at 16 and maps 255 to 235."

The camcorders Nick refers to are mainly HDV and AVCHD consumer/prosumer models, and 16-255 seems to be the norm without manual intervention.
The off-the-shelf solution here is the Studio RGB filter with "Output Start" adjusted to 0.

The other 85% of the market (DSLR, Pocket Cams, Phones, Point-and-Shoot) usually register 0-255 to some flavor of AVC, and require adjustment in post to reveal the full range on playback. There were between 250 and 300 million such units sold in 2010, depending on whose figures you believe. But even here you need to be careful -- a few of these models shoot RGB codecs, so the levels are correct, and only need to be adjusted if rendered to YUV.

musicvid10 wrote on 1/3/2012, 11:41 AM
"I, for one, would be really disappointed if you kept that resolution!"

Don't worry, Jerry, that's going to be impossible. And yes, we like Nick just the way he is.

GlennChan wrote on 1/3/2012, 2:26 PM
Computer monitors distinguish the full range of such values from 0 through 255, but, for historical reasons, TVs accept only values in the range 16 through 235.
Regarding the latter, the short answer is no.

2. On the computer monitor, a value of (0,0,0) produces black and a value of (255,255,255) produces white. On the TV, a value of (16,16,16) produces black and (235,235,235) produces white. This means that there are 256 distinct gradations of red, green, and blue on the monitor and only 220 gradations on the television
Again, no.

, but the range of perceived colors is otherwise identical. The only harm done by the historical limitation of 16-235, aside from the awkwardness of processing values with an origin of 16 rather than 0, is this reduction in the number of distinct steps, which makes the color gamut thinner and could contribute to banding.
The color gamut is sort of larger. See the section on mapping superwhites into legal range:
http://www.glennchan.info/articles/vegas/color-correction/tutorial.htm

Yes, one disadvantage is banding.

3. Sony calls signals intended for a computer monitor, with values 0-255, "computer RGB" and signals intended for TV, with values 16-235, "studio RGB".
That's kind of a misleading dichotomy. Some video codecs expect computer RGB. I know the logic is stupid (if there is any logic to all this), but that's just the way it is.

If you don't like it, I highly recommend filing a feature request because every other NLE on the market handles this for you. It's ridiculous that there are endless discussions here on this subject... and that they are always full of misinformation.

If a computer RGB signal is passed to a TV through a Blu-ray or DVD player, it will usually be clipped to the range 16 through 235. All values less than 16 will be converted to 16 and all values greater than 235 will be converted to 235, with no changes to intermediate values. Therefore all detail in the "illegal blacks" and "illegal whites" of the original signal will be lost.
This is not what happens in practice. Some players and some monitors will clip at 235 Y', others at 255 Y'.

At this point I will mention that it is INCREDIBLY IMPORTANT to specify whether you are talking about Y'CbCr (sometimes incorrectly referred to as YUV) or RGB. These are two DIFFERENT THINGS.

**************If you have a value of 16 Y', sometimes this will result in a RGB value of 0 and other times it will result in a RGB value of 16 when you decode it in Vegas. THIS IS WHY PEOPLE ARE CONFUSED AND WHY THEY ARE WRONG.***********************
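
In code form, a grayscale sketch of the two decode conventions (assuming Cb = Cr = 128; which one applies depends on the codec and the project settings):

    def decode_studio(y):
        """Studio-swing decode: Y' 16-235 expands to RGB 0-255, so Y' 16 -> RGB 0."""
        return max(0, min(255, round((y - 16) * 255 / 219)))

    def decode_passthrough(y):
        """Straight-through decode: Y' 16 stays RGB 16."""
        return y

    print(decode_studio(16), decode_passthrough(16))    # 0 16
    print(decode_studio(235), decode_passthrough(235))  # 255 235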

(Yes, I am sick and tired of these endless discussions...)

There are workarounds, and I suggest you use them.

But yes it is silly that Vegas will often display your image incorrectly.

We don’t want to do the filtering twice, so we need to know precisely when it is necessary and when it is not, and the information on this forum and elsewhere is conflicting.
In practice you will often convert twice.

• Is HDV and AVCHD source data (a) computer RGB, (b) studio RGB, ...
It is neither. It is Y'CbCr. Trust me on this one... it is neither. (But if you really want to know the answer... look at the formula that converts computer RGB to Y'CbCr.)
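
For reference, the BT.601 conversion as a sketch (full-range R'G'B' in, 8-bit Y'CbCr out). Note that white comes out at Y' = 235; the studio swing is built into Y'CbCr itself, which is why neither "computer RGB" nor "studio RGB" is the right answer:

    def rgb_to_ycbcr(r, g, b):
        """BT.601: full-range R'G'B' (0-255) to 8-bit Y'CbCr."""
        r, g, b = r / 255, g / 255, b / 255
        y  = 16  +  65.481 * r + 128.553 * g +  24.966 * b
        cb = 128 -  37.797 * r -  74.203 * g + 112.000 * b
        cr = 128 + 112.000 * r -  93.786 * g -  18.214 * b
        return round(y), round(cb), round(cr)

    print(rgb_to_ycbcr(0, 0, 0))        # (16, 128, 128) - black
    print(rgb_to_ycbcr(255, 255, 255))  # (235, 128, 128) - white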


• In short, how can we organize our workflow to produce optimal output for TV viewing in the common situation where the source media come from HDV or AVCHD camcorders and the distribution is via Blu-ray and DVD?
Read the example workflows here:
http://www.glennchan.info/articles/vegas/v8color/vegas-9-levels.htm

If that doesn't make sense, then I would file a feature request. Pretty much every other NLE on the market handles this stuff for you. Please do it... we've all been going around in circles since Vegas 8 or before that.

I hope Sony will include a more complete explanation of these matters in future help files.
IMO, they should make Vegas automatically handle all this stuff for you. That way the user doesn't need to know. That is what every other NLE does.

Richard V. wrote on 1/3/2012, 6:30 PM
Thanks to everyone for the replies so far! It is illuminating just sitting back and reading them. In particular, thanks to Glenn for correcting my misconceptions (I knew they would be there) in the list of items that tried to explain things as I understood them from reading the forum. I'm sure it is tiring for you to revisit this subject so often but you do a real service for the rest of us.

I should have been more precise and asked which RGB format resulted in Vegas from importing HDV and AVCHD content, rather than what the original format was, since the fact that these are Y'CbCr is already explained in forum posts that I had read. I shall study the mapping between Y'CbCr and RGB as it may help with some of my other questions. Also, I shall file a feature request with Sony if you let me know how to do it. I am just an ordinary person who enjoys filming my children's musical activities and my nieces' and nephew's weddings, so I don't know how much weight my request will carry.

Once again, thanks to all!

kcw wrote on 1/4/2012, 7:55 AM
Hello Richard,

Like you, I have been trying to figure this out for some time, and even though Glenn, musicvid, and others have given some great advice, I often find myself questioning what is the best/correct way of dealing with this issue. When I put HDV or AVCHD footage into Vegas and check the scopes, it is 16-255. But unlike many here, I render to WMV files, so the "computer" RGB of 0-255 is what I need.

Is it better to (a) use the levels filter to lower black to 0; (b) use a filter like the one on Glenn's site to clip off the superwhites, then apply the studio-to-computer RGB filter; or (c) again use a filter like the one on Glenn's site, but map the superwhites into legal range, then apply the studio-to-computer RGB filter? All three options look similar when viewed and checked with the RGB scopes, but they are definitely not the same. Is one way any more "correct" than the others? I don't know. Half of the suggestions out there seem to be to eliminate the superwhites, the other half to save them. But if 0-255 is the goal, I would guess that method (a) would be best. But Vegas offers no default filter to do this.

It is just frustrating, as the same footage looks so much better in other editors right off the bat. Premiere Pro shows the levels at 0-255 for the same footage that needs correcting in Vegas. In the last few months I have had four different people ask me which editing program to buy, and I have told them to download the trials of Vegas Movie Studio and Premiere Elements. All four wound up buying Elements because the video looked bad to them from the start in Movie Studio. If user frustration and lost sales are not enough to get Sony to fix this, I am not sure what else will.

musicvid10 wrote on 1/4/2012, 10:39 AM
WMV works in RGB space, so your first suggestion is the simplest.
Is it better to (a) use the levels filter to lower black to 0
Yes, and leave the whites at 255. But rolling them off a tiny bit or even dropping the gamma a tiny bit may look subjectively better than leaving a brickwall at the top end.
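
One possible shape for that rolloff, as a sketch (my own curve, not a Vegas preset): values below a knee pass through untouched, and values above are compressed smoothly instead of hitting a brickwall:

    import numpy as np

    def rolloff(x, knee=235.0, ceiling=255.0):
        x = np.asarray(x, float)
        span = ceiling - knee
        soft = knee + span * (1 - np.exp(-(x - knee) / span))  # approaches the ceiling asymptotically
        return np.where(x <= knee, x, soft)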

Negating the effects of one levels filter with another has the net effect of reducing bit depth, so that's one thing we need to pay particular attention to if / when stacking filters.
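
The bit-depth cost is easy to demonstrate with the two linear maps discussed earlier in this thread (numpy assumed):

    import numpy as np

    def squeeze(x):   # computer RGB -> studio RGB
        return np.round(16 + np.asarray(x, float) * 219 / 255)

    def stretch(y):   # studio RGB -> computer RGB
        return np.clip(np.round((np.asarray(y, float) - 16) * 255 / 219), 0, 255)

    x = np.arange(256)
    print(len(np.unique(squeeze(x))))           # 220 distinct levels survive the squeeze
    print(len(np.unique(stretch(squeeze(x)))))  # still 220; the stretch cannot recreate the lost 36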

GlennChan wrote on 1/4/2012, 11:43 AM
I don't think that there is a correct way of doing things.

What the cameras do isn't right in my opinion. It is a bad system in practice because sometimes the superwhites get clipped off and sometimes they don't. For example, a DP on set may see the superwhites, but due to a lack of communication the post house might throw them out.

Regardless, I would lean towards keeping the superwhites rather than throwing them out.

farss wrote on 1/4/2012, 3:31 PM
"All three options look similar when viewed and checked with the rgb scopes, but are definitely not the same."

That's because of changes in gamma. Avoid using the Levels FX; it can fool you badly. Use the Color Curves FX in conjunction with the waveform monitor so you can see what you're doing.

The reason there's a subtle difference between those three methods is that they change gain and gamma differently. There's no technically right way; you have to make a creative decision.

The closest to technically correct is to apply a studio RGB to computer RGB transform, but that will clip the superwhites. Everything between Y'=16 and Y'=235 keeps the same relative relationships, i.e. you preserve gamma.
If you shift the blacks from Y'=16 to Y'=0 while leaving 255 alone, you've altered the slope and how everything between Y'=16 and Y'=235 looks, but you will preserve the superwhites. This is probably not a bad choice; nothing gets lost.
Your final choice is the one you found on Glenn's site to remap the superwhites and then use a LevelsFX to adjust the range to suit what you are rendering to. This to me is the best approach. You make a "creative" decision using the Color Curves and then preserve that decision using the blunt tools provided by the Levels FX.
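
The first two options compared at the endpoints, for a nominal 16-255 source (a sketch; the third option depends on your curves and is not reducible to a single formula):

    def option1_studio_to_computer(x):
        """16 -> 0, 235 -> 255; anything above 235 clips."""
        return max(0, min(255, round((x - 16) * 255 / 219)))

    def option2_shift_blacks(x):
        """16 -> 0, 255 stays 255; the slope changes, but superwhites survive."""
        return max(0, round((x - 16) * 255 / 239))

    for x in (16, 235, 255):
        print(x, option1_studio_to_computer(x), option2_shift_blacks(x))
    # 16 -> 0 0 | 235 -> 255 234 | 255 -> 255 (clipped) 255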

As noted and shown above, you do run a very small risk of introducing banding; you are throwing away some dynamic range in the pipeline unless you use Vegas's 32-bit float modes, but that can open another, much bigger can of worms. Given the noise level of the cameras any of us here can afford, the probability of introducing a banding problem is very low.

Bob.

NickHope wrote on 1/4/2012, 11:44 PM
Vegas Pro needs a "Computer Levels / Studio Levels" toggle switch above the preview window that has no effect on the render, only on the preview. Vegas Movie Studio should probably just display computer levels by default.

kcw wrote on 1/7/2012, 6:55 AM
Hello Bob,

I have read conflicting articles about the benefit of keeping superwhites. One of them seemed to indicate that if the codec involved wanted 16-235, as AVCHD and HDV do, it is better to clip the superwhites, as the material over 235 recorded by such codecs might be problematic. But if 0-255 is the output goal, throwing them out does not seem to make sense. Would musicvid's suggestion - lowering the white levels a bit, to 250, for example - help take care of any such errors the camera might introduce?

GlennChan wrote on 1/9/2012, 10:59 AM
Break the problem down into 2 questions:

1- Do you want to keep the superwhites or throw them out? Either choice can make sense.

But really... it's not hard to run a test and do things both ways. Make up your own mind.

2- What do you need to do to make sure that your levels are correct? There is only one correct answer here.

One of them seemed to indicate that if the codec involved wanted 16-235, as AVCHD and HDV do, it is better to clip the superwhites, as the material over 235 recorded by such codecs might be problematic.
That person must be confused.

Remember... Vegas' MPEG2 codec expects studio RGB or computer RGB depending on your project settings. What levels the encoder expects has nothing to do with whether or not you wish to keep superwhites.