Like many of the visitors to this forum I am interested in getting optimal results from Vegas. I have some questions about color spaces and the behavior of encoding codecs that are not adequately addressed by the Vegas help screens.
Let’s assume that our source data is HDV or AVCHD video or JPEG still images, and that our output will be Blu-ray or DVD. As I understand it, the situation may be summarized as follows:
1. RGB color encoding typically uses an eight-bit integer to represent the intensity of each of the primary colors red, green, and blue. Computer monitors distinguish the full range of values from 0 through 255, but, for historical reasons, TVs accept only values in the range 16 through 235.
2. On the computer monitor, a value of (0,0,0) produces black and a value of (255,255,255) produces white. On the TV, a value of (16,16,16) produces black and (235,235,235) produces white. This means that there are 256 distinct gradations of red, green, and blue on the monitor and only 220 gradations on the television, but the range of perceived colors is otherwise identical. The only harm done by the historical limitation to 16-235, aside from the awkwardness of processing values with an origin of 16 rather than 0, is this reduction in the number of distinct steps, which coarsens the tonal resolution and could contribute to banding.
3. Sony calls signals intended for a computer monitor, with values 0-255, "computer RGB" and signals intended for TV, with values 16-235, "studio RGB". But inside Vegas they look the same – Vegas can’t tell if an event with values all between 16 and 235 is that way because it was intended to be a studio RGB clip or if it was actually a computer RGB event that happened to have a narrow range.
4. If a computer RGB signal is passed to a TV through a Blu-ray or DVD player, it will usually be clipped to the range 16 through 235. All values less than 16 will be converted to 16 and all values greater than 235 will be converted to 235, with no changes to intermediate values. Therefore all detail in the "illegal blacks" and "illegal whites" of the original signal will be lost.
5. Even if a computer RGB signal has no illegal values, it will produce an incorrect image when passed to a television. For example, a value of (16,16,16) will come out pure black, although it was intended to be a dark grey (exactly how dark depends on the gamma curves elsewhere in the signal chain, which we ignore here). A value of (235,235,235) will come out pure white, although intended to be a light grey. The image on the TV will appear more contrasty than intended, even if no information is lost.
6. Likewise, a studio RGB signal produces an incorrect image when displayed on a computer monitor. It will be flatter than intended, and will have impure blacks and whites.
7. Sony Vegas allows conversion between computer RGB and studio RGB as special cases of the "levels" filter. These conversions are linear mappings between [0,255] and [16,235] and back, i.e. y = 16 + (219/255)x, or x = (255/219)(y - 16), rounded to the nearest integer (a minimal sketch of both mappings appears just after this list). They fail to be one-to-one only because of the rounding, and, except for that effect, they preserve all the information in the data.
8. The Vegas preview screen shows a computer RGB image. Therefore if the conversion to studio RGB is applied in the editing process it will make the image in the preview window appear flat. So it appears that Sony “expects” the user to use the computer RGB color space while editing.
9. The final output is often intended for TV viewing. To get the colors right on the TV – not too much contrast, all the expected shadow and highlight detail – the computer RGB must be converted, somewhere, to studio RGB.
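Here is a minimal sketch, in Python, of the two per-channel mappings described in points 4 and 7 above. This is only my own illustration of the arithmetic on 8-bit values, not Vegas’s actual code:

# Illustration only: 8-bit values, applied to the R, G and B channels alike.

def clip_to_studio(v):
    # Point 4: a player/TV that simply clips illegal values to 16-235.
    return min(max(v, 16), 235)

def computer_to_studio(v):
    # Point 7: linear map of [0, 255] onto [16, 235], endpoints preserved.
    return round(16 + (219 / 255) * v)

def studio_to_computer(v):
    # Point 7: inverse linear map of [16, 235] back onto [0, 255].
    return round((255 / 219) * (v - 16))

for v in (0, 16, 128, 235, 255):
    print(v, clip_to_studio(v), computer_to_studio(v),
          studio_to_computer(computer_to_studio(v)))

For these sample values the round trip returns the original, which is what point 7 means by the conversions preserving the information apart from rounding; a few adjacent values inevitably collapse, since 256 input codes map onto 220 output codes.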
So far so good (but correct me if any of the above assertions are incorrect). HERE IS WHERE THE REAL QUESTIONS BEGIN!
One way to get the final output right is to apply a computer RGB to studio RGB conversion as a video output effect when rendering. Another way is to use a codec that does the conversion while encoding. We don’t want to do the filtering twice, so we need to know precisely when it is necessary and when it is not, and the information on this forum and elsewhere is conflicting. My questions:
• Is HDV/AVCHD source data (a) computer RGB, (b) studio RGB, (c) computer RGB except with values 0 and 255 illegal, or (d) studio RGB except with values 0 and 255 illegal, with the remaining super-white and sub-black levels providing headroom (and footroom)? All of these choices seem possible from reading the online sources, or perhaps the format of the data differs from camera to camera. Note that, in case (d), if the headroom is actually used (by treating the data as computer RGB and doing a computer RGB to studio RGB conversion at the end), then the original 16-235 range will be compressed to about 30-218, so the contrast will be incorrect (the arithmetic is spelled out after these questions).
• Which codecs expect studio RGB source data and produce studio RGB output? Which codecs expect computer RGB source data and produce studio RGB output? Of these, which do it by clipping and which, if any, do it by a linear filter? Which codecs produce computer RGB output?
• What codec does Vegas use by default for Blu-ray? For DVD? How can we control the choice of codec? What difference, if any, does it make if we export a movie from Vegas in HDV format, with no recompression except for effects, titles, etc., and then do the rendering while exporting from DVDA to a Blu-ray image or a DVD folder? What difference, if any (in color space), does it make if we export from DVDA for Blu-ray using MPEG2 versus AVCHD?
• Do any of these codec options make it unnecessary to prepare the data for TV viewing with a computer RGB to studio RGB conversion?
• In short, how can we organize our workflow to produce optimal output for TV viewing in the common situation where the source media come from HDV or AVCHD camcorders and the distribution is via Blu-ray and DVD?
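For the double-conversion worry in the first bullet, the numbers work out as follows (same endpoint-preserving map as in the sketch above; this is just my own arithmetic, not a claim about what any particular codec does):

def computer_to_studio(v):
    return round(16 + (219 / 255) * v)

# Data that is already studio range (16-235), if treated as computer RGB
# and converted again, gets squeezed further:
print(computer_to_studio(16))   # 30  - black is lifted
print(computer_to_studio(235))  # 218 - white is lowered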
I know there are color space gurus in this forum (e.g. Glenn) so I’m sure someone will be able to clear up any misconceptions on my part and answer these questions. And maybe someone from Sony can chip in. I hope Sony will include a more complete explanation of these matters in future help files.
Thanks in advance!
Richard Vaughan