Accuracy of Bars for HD alignment

farss wrote on 5/31/2006, 11:30 PM
Correct me anyone if I'm wrong but Vegas only has one set of bars and I assume they are for Rec 601 i.e. SD.
What should I use to calibrate a monitor for HD?
Or does it matter?

Actually I'm pretty confused (as always).

Vegas correctly handles 601 and 709 and will convert between them but how does it handle sending this to the 2nd monitor preview. Does it convert everything to 601 and therefore that's how I should calibrate the monitor?

Someone please save me before my brain explodes, well more of a pop than an explosion I think :)

Bob.

Comments

Spot|DSE wrote on 5/31/2006, 11:53 PM
Vegas sends the bars in an HD project, as 709. The secondary monitor should be 709.
Vegas 6 (I believe) does everything at 709 in an HD project.
farss wrote on 6/1/2006, 12:32 AM
I understand that Vegas 6 runs HD correctly at 709 for HD projects.

What I'm talking about is the bars from Generated Media.
Here's where I get confused (possibly incorrectly).

I would have assumed that the bars for HD should be true HD bars (created for 709), I suspect that Vegas is using 601 bars and converting them to 709. I could be right or wrong on this point.

Assuming I'm right on the above I don't know if this matters anyway.
To put it around another way.

If I have bars in say DV25 and then render that to HD will they be correct bars for 709?

OK, just realised there's a simple enough way to check this, record some bars to HDV in a Z1, duh!

Bob.


Chienworks wrote on 6/1/2006, 3:50 AM
That's assuming, of course, that the Z1's bars are correct.
GlennChan wrote on 6/1/2006, 8:58 AM
The Rec. 601 versus Rec. 709 color space should come into play when you convert from R'G'B' to Y'CbCr. In the case of Vegas, it only needs 1 set of color bars as long as the R'G'B'-->Y'CbCr conversion is done correctly.

The formulas for Y' are:
Rec. 601: Luma (Y’) = 0.299 R’ + 0.587 G’ + 0.114 B’
Rec. 709: Luma (Y’) = 0.2126 R’ + 0.7152 G’ + 0.0722 B’

As long as the codec (/Vegas) is applying the right formula, it should produce proper bars for you.
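The difference between the two formulas is easy to see with a quick sketch (this is an illustration of the standard coefficients, not Vegas's actual code):

```python
def luma_601(r, g, b):
    """Rec. 601 luma from normalized (0.0-1.0) R'G'B' values."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def luma_709(r, g, b):
    """Rec. 709 luma from normalized (0.0-1.0) R'G'B' values."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# Neutral gray (R' = G' = B') gives the same luma under either standard,
# since the coefficients sum to 1.0...
print(luma_601(0.5, 0.5, 0.5))  # 0.5
print(luma_709(0.5, 0.5, 0.5))  # 0.5

# ...but saturated colors don't, which is why decoding with the wrong
# matrix shifts the colors:
print(luma_601(0.0, 1.0, 0.0))  # 0.587
print(luma_709(0.0, 1.0, 0.0))  # 0.7152
```

So gray bars land in the same place either way; it's the colored bars that reveal which matrix was applied.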

All the processing in Vegas is done in 8-bit R'G'B'.

2- Technically speaking, the video scopes in Vegas should be switchable between Rec. 709 and Rec. 601. The luma values (measured by the waveform monitor) and the chroma values (vectorscope) will be different depending on which formula you use.

3- If you're used to Final Cut Pro, you'll know that it does something else.
FCP's color bars generators operate in Y'CbCr color space. So they need two sets of color bars, one for each formula / color space.
In HD mode, the vectorscope's targets will actually be in a different place. This is correct.
The waveform monitor and vectorscope are inaccurate, but for different reasons than Vegas.

4- Those funny prime ( ' ) symbols are to denote gamma correction. They don't really matter in this case.
Usually RGB actually refers to R'G'B', but R'G'B' is such a fingerful to type.
farss wrote on 6/1/2006, 3:13 PM
Thanks Glenn,
I think it makes sense now.
If I'm understanding this correctly then I was worrying about nothing.
In my case (an LCD monitor connected via DVI) everything stays in RGB: bars are specced in RGB and Vegas internally does its stuff in RGB. So I assume the generated media in Vegas also works in RGB.
End result is that in my case the 601 / 709 thing is irrelevant, 75% Blue or whatever will always be 75% Blue and once my monitor is calibrated to that the only possible source of error is how Vegas does the YUV <-> RGB conversions.

Looking at this another way around.

Vegas converts 601 and 709 to its own internal thing and does the reverse going back out. IF (big IF) everything stays in Vegas's RGB within my system then what the incoming / outgoing video is doesn't matter. If my system is calibrated entirely against Vegas's internal system then I'm good, assuming Vegas does its thing correctly; if it doesn't then there's not much one can do anyway.

Bob.
GlennChan wrote on 6/1/2006, 7:47 PM
Yeah.

But just to complicate things... :D

In Vegas, there are two different R'G'B' color spaces.... "studio RGB" and "computer RGB". Y'CbCr-based/affiliated codecs usually work in studio RGB. The Sony Vegas DV codec does, and presumably the SonyYUV codec as well.
*Vegas calls Y'CbCr "YUV", which is a little incorrect but doesn't really matter, because in this context "YUV" almost never refers to analog YUV encoding, but rather to Y'CbCr.

This matters in a few cases:

1- Ingest. If you ingest a .jpg for example, it uses computer RGB color space. You'll want to convert it to studio RGB via the color corrector preset.

2- Output. If you output to .wmv or whatever, you should probably convert from studio RGB to computer RGB.

3- If you use the video preview window in Vegas, the colors aren't necessarily accurate. This is because the preview window usually shows the video in whatever form it's in. If the video is studio RGB, then it shows it as studio RGB... with black level at 16 and white level at 235. Showing colors with black level at 0 and white level at 255 would be more representative.

In the case of color bars, the RGB values for the red bar are 180 16 16.
This does not give you a fully saturated red. What you want to see is 210 0 0 or something like that (I can't remember the right numbers). But the point is, X Red 0 Green and 0 Blue would give you 100% saturation, if X isn't 0.
The bars are supposed to be 100% saturated. They are correct, but just displayed "wrong".
*Arguably, displaying colors as studioRGB is a reasonable tradeoff in favour of performance + seeing superwhites.

4- A lot of filters are designed to work with the 0-255 computer RGB color space... and they don't necessarily give the right results if you want to output to 16-235 studio RGB.
i.e. the invert filter will give you the wrong levels. Or if you use the levels or color corrector filter, you have to 'juggle' the controls because changing one setting will effectively change another.

A potential solution would be to specify in the project settings what color space you want to work in (studioRGB or computerRGB). Then Vegas handles the levels on ingest and output, doing the appropriate conversions.
Each filter should have a setting to choose between what color space you want to work in. When you apply a filter, Vegas should automatically apply the default preset but with the setting for the right color space.
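The studio RGB ↔ computer RGB conversion described above is just a linear remap of each channel; a minimal sketch (not Vegas's actual code) shows the arithmetic:

```python
def studio_to_computer(v):
    """Expand one studio RGB (16-235) channel value to computer RGB (0-255)."""
    return round((v - 16) * 255 / 219)

def computer_to_studio(v):
    """Compress one computer RGB (0-255) channel value to studio RGB (16-235)."""
    return round(v * 219 / 255 + 16)

# Studio black and white land on computer black and white:
print(studio_to_computer(16))   # 0
print(studio_to_computer(235))  # 255

# The 180 16 16 red bar, expanded to computer RGB:
print([studio_to_computer(c) for c in (180, 16, 16)])  # [191, 0, 0]
```

Running the 180 16 16 red bar through the expansion works out to about 191 0 0, i.e. a fully saturated red once the studio range is mapped to full range.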

farss wrote on 6/1/2006, 8:20 PM
Some good points there.
Problem I see is many of us don't just do video, we're the graphics arts dept, CGI and sound and video editing depts all rolled into one!

So we bring stills from our DSC into PS to work on them and then take them into Vegas (or another NLE for that matter) and there could well be grief unless we really know what we're doing. Worse still our PC monitors need to be changed between those apps!

This isn't a trivial geek issue either, I'm reliably told the most common reason for our biggest network rejecting programs is caused pretty much by this, errant video levels!

I suspect things are going to get worse before they get better too, more and more devices seem to be working in computer RGB than studio RGB.

Bob.
GlennChan wrote on 6/2/2006, 11:27 AM
So we bring stills from our DSC into PS to work on them and then take them into Vegas (or another NLE for that matter) and there could well be grief unless we really know what we're doing.
Vegas could be better in this regard, in that it could apply the right levels conversion for you based on the project settings.

Worse still our PC monitors need to be changed between those apps!
That isn't entirely necessary. If you use any of the external preview options, they can display the right color.
DV/firewire will give the right color.
Windows secondary display will display the right color, but *make sure* you check the setting for studioRGB 16-235.

Coursedesign wrote on 6/2/2006, 11:41 AM
*make sure* you check the setting for studioRGB 16-235

Also when going to DVD?
GlennChan wrote on 6/2/2006, 11:52 AM
That setting only affects how things are displayed on Windows secondary monitor/display. So it's something else.

When going to DVD, your video levels should be 16-235 if using the MPEG2 encoder that comes with Vegas. One way to do that is by reading the video scopes... but the video scopes can be confusing because they have settings.
My levels tutorial explains one way of doing things:
in-depth levels
Coursedesign wrote on 6/2/2006, 2:53 PM
Good article!

I'm not clear on the "Windows screen monitoring" though (perhaps because I haven't had to do it, I prefer Decklink analog out to a pro NTSC video CRT monitor, in SD of course).

If the Windows screen can reproduce 0-255, but you only output 16-235, won't the visual appearance of that be different from outputting to a monitor that shows black at 16 and white at 235?

Wouldn't a correct pluge adjustment solve this, with Studio RGB or not?

GlennChan wrote on 6/2/2006, 8:22 PM
If the Windows screen can reproduce 0-255, but you only output 16-235, won't the visual appearance of that be different from outputting to a monitor that shows black at 16 and white at 235?
You can configure Vegas to convert from 16-235 to 0-255 on the Windows secondary display... so that's not really an issue.
Coursedesign wrote on 6/2/2006, 9:16 PM
You can configure Vegas to convert from 16-235 to 0-255 on the Windows secondary display.

Cool.

If I work in 0-255, should I still specify 16-235 StudioRGB for the monitoring and then ask Vegas to convert this back to 0-255 for my display?

Someday all this analog stuff will be a fond memory...
Jayster wrote on 6/2/2006, 10:09 PM
I'm getting quite frustrated with this myself...

I'm doing a project with some PIP effects, where each PIP is cut with the cookie cutter. One of the PIPs is a still with a gold oval frame inside a (legal) black border at RGB 16, so it didn't need a cookie cutter. None of the PIPs fill the entire screen.

When I look on my preview CRT (a consumer TV which I calibrated!), the "black" borders are not as black as the empty portions of the screen! It makes my PIPs stick out like a sore thumb! The only way I could make the PIPs blend into the background was to drop the blacks down to zero (i.e. illegal).

Thought maybe I was crazy, so I did some tests. I loaded the color bars and made sure they showed up on the monitor so that the middle pluge disappeared into the black pluge to the left. A snapshot to jpg checked in Photoshop proves the middle pluge is RGB 16, so if it disappears to black on the CRT it must be right. Then I did a similar snapshot on the real project and confirmed the PIP with the black border (which sticks out like a dark grey on the monitor) is ALSO RGB 16. Why does it stick out?

What I seem to be finding is that Vegas only regulates the black levels on the portions of the frame where you actually have content. Leave half the frame empty and you get blanketed with IRE 0.

What's more, the various color correction FXs only change the levels in the portions of the frame that have actual PIP content. They have no effect on the portions of the frame that were empty! And I was applying the FXs at project video scope (not at track or event scope). Again, the only way to make the PIPs blend into the background was by setting illegal blacks at 0 IRE/RGB. I even checked my A/D converter. It has an IRE 7.5 / 0 switch. Turning it up to IRE 7.5 made it worse!

I guess if I want to make the entire screen "legal" I will need to drop a black jpg (set to RGB 16) as the bottom track and make sure it fills the entire frame. THIS IS NUTS!!!
farss wrote on 6/3/2006, 4:47 AM
You are entirely correct, this I found out the hard way back in V4 days.
Think of it this way: Vegas has an invisible 'bottom track' which is at 0 IRE, so all fades to black etc. fade to 0 IRE. It used to drive my old consumer monitor nuts, total loss of sync. Funny thing was, when I started burning DVDs the problem went away because most DVD players clamp to legal levels.

So the answer is as you've noted to run a track of 16:16:16 right at the bottom. To be honest though I've pretty much stopped doing this for DVDs as the player will take care of it (hopefully, mostly).
Anything destined for tape is another matter.

Bob.
GlennChan wrote on 6/3/2006, 3:20 PM
farss:

Most *but not all* DVD players will clip some of the illegal colors. An argument can be made for clipping or not clipping, but in practice there are DVD players that do either.
I'm not sure what the standard is, but whatever it is, no one is following it.



Technically, you might want to avoid using IRE to describe digital levels since IRE is an analog unit, not a digital unit. This can get really confusing if you think about it.

But really, the way to look at it is:
Usually you want to work with 16-235 ("studio RGB"), where black level is at 16 16 16 RGB. The Vegas default black is 0 0 0 RGB, which is an illegal black and can cause problems.

If you add a media generator black set to 16 16 16 RGB, that's one way of dealing with it. The broadcast colors filter is another way of dealing with it, although not as well.

2- Really, I think we should ask for a project settings between studio RGB and computer RGB.
One use would be the default black level... studio RGB would automatically set the default black to 16 16 16 RGB.
farss wrote on 6/3/2006, 11:23 PM
Glenn,
I think where things get really sticky is having mixed black levels. The Sony cameras mostly set black (not whites though) to around the correct levels. Then if you're not careful you can have things from Vegas and other graphics with blacks set to 0 0 0. In this scenario I can't think of any overall way to correct the problem.

My approach, if I'm doing a stills-only project, is to work in computer RGB and apply correction on the video bus. If I'm mixing stills and video then I'll match the stills to the video's black and pull the superwhites down on the bus.

As you say though, it would be good if Vegas gave more control over this, but even then I don't think there's any way for it to avoid some stuff-ups unless every source has metadata to tell it what space it's using.

Bob.
GlennChan wrote on 6/3/2006, 11:28 PM
Well I think it would be reasonable to assume that all .jpg, .bmp, etc. are 0-255 computer RGB color space. Vegas would then have a switch to do the appropriate conversion, and automatically turn that switch on when importing.

DV and HDV just depend on the codec... the video always comes in as Y'CbCr, and that always uses 16-235 levels for Y'. Vegas by default will use its own DV codec, so hopefully that shouldn't be an issue. Users just have to know not to muck around... i.e. the "Use 3rd party DV codec" option should say "not recommended, advanced users only" or something to that effect.

I think it would be better than what we have now... and would save a lot of keystrokes.
Jayster wrote on 6/22/2006, 2:33 PM
Bob:
I know that it's an LCD you were calibrating for HD, but I found something interesting about calibrating when component video is used. This article, which is actually for calibrating a projector, says: "No hue adjustments should be required with HDTV signals as the PB and Pr color difference signals travel separately in the component video format." Here's the article. I have an HD RPTV and I'm thinking I'll just plug my Z1's component video cable into it and use the Z1's color bars for calibration. I hope that's good...

In your case, using an LCD as a secondary monitor which will be operating in computer RGB color space, I think you may be better off calibrating it with a colorimeter that creates an ICC/ICM profile for the video driver to use. I bought one of these for my primary monitor, and once you set it up it's really quite easy to use. The best thing is that it is not subjective and doesn't depend on using color bars. Colorimeter
farss wrote on 6/22/2006, 2:45 PM
Thanks for that, I'll look into it.
The other option from memory is the Spyder.
GlennChan wrote on 6/22/2006, 4:09 PM
It seems that the main use of colorimeters is to generate an ICC profile of the monitor. However, I don't think Vegas is actually able to take advantage of them. I did a quick (perhaps incorrect) test using drastically different ICC profiles and saw no change in the image. I took screenshots into Photoshop and used the difference composite mode + levels... I'm definitely not seeing a difference.

My guess is that ICC was designed for print work, and you can't apply it real-time. Most people doing color management for film/video are using 3D LUT solutions.

2- Some of the colorimeters don't work that well... see a review at the following site:
http://www.drycreekphoto.com/Learn/monitor_calibration_tools.htm

Jayster wrote on 6/23/2006, 9:15 AM
Glenn:
As I understand it, the ICC monitor profiles created by a colorimeter are not intended for use by Vegas. In fact, the monitor profiles are only intended for direct use by one application: the driver for your video card. Its goal is only to ensure that the output to the monitor is calibrated (in computer RGB terms). This would be irrespective of what other software is being displayed on the monitor.

I haven't done much experimentation with this in PhotoShop. Usually you load the color profile from a digital camera image or whatever you are using as the color management workspace (you decide which takes precedence). Usually seems to be Adobe RGB or sRGB. Probably you only get a fully color managed workflow (for stills in Photoshop) if you profile the digital camera, which is not simple.

In any case, I think the only benefit to Vegas is that the profile created for the LCD monitor ensures the video driver's output is calibrated. And it also includes a program that loads when you log in and applies gamma correction to the monitor. The Pro version of the Monaco Optix XR includes LUT solutions, too.

Also in the manual for the Monaco Optix it says that if you change a variable in your viewing environment (such as the amount of ambient light, the time of day, etc.) you should rerun the profiling routine and update the profile. I have one and it takes about 10 minutes, quite comparable to what it would take to do it by hand with color bars.

All of which is only useful for a computer monitor. These are not useful at all for a FireWire-connected television-type monitor.
GlennChan wrote on 6/23/2006, 12:54 PM
From what I know, to get accurate color you need to:

A- Calibrate your monitor so that it interprets its input signal correctly. Not a problem with DVI, since it's digital.

B- Calibrate your monitor to get perfect grayscale tracking (i.e. whether or not B&W images have color casts), white point (D65 is the standard), an ideal gamma (sort of like applying color curves in Vegas).
I believe this can be done via your video card's LUT... a colorimeter could be useful here.
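The video card LUT idea is essentially a per-channel lookup table; here's a toy sketch of the concept in Python (the gamma value is just an illustration, not a real calibration curve):

```python
def build_gamma_lut(gamma):
    """Build a 256-entry per-channel lookup table applying a gamma tweak."""
    return [round(255 * (i / 255) ** gamma) for i in range(256)]

# Hypothetical correction curve a calibration tool might load into the
# video card's LUT; real profiling software derives this from probe readings.
lut = build_gamma_lut(1.1)

# Every pixel channel is then looked up through the table on output:
pixel = (200, 128, 64)
corrected = tuple(lut[c] for c in pixel)
```

Black and white stay pinned at 0 and 255 while the midtones shift, which is how grayscale tracking and gamma can be nudged without touching the monitor's own controls.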

C- Gamut mapping. This is handled by an ICC-aware application (I don't believe Vegas is) or by another system of color management (i.e. 3D LUTs, Truelight).
Gamut mapping is necessary when the display's gamut differs from the output device's gamut. If the display's gamut is bigger, then the ICC-aware application needs to adjust the colors/saturation down to fit. If the display's gamut is smaller, then it's trickier. It can't reproduce colors as saturated/pure as the output device can. You either clip the colors, or apply some form of compression/mapping so that the colors look close.
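The "clip the colors" option can be illustrated with a naive per-channel clamp (a toy sketch only; real gamut mapping systems like 3D LUTs do something much smarter):

```python
def clip_to_gamut(rgb):
    """Naive gamut mapping: clamp each normalized channel to the displayable [0.0, 1.0] range."""
    return tuple(min(1.0, max(0.0, c)) for c in rgb)

# A color that falls outside the display's gamut after conversion
# (over-range and negative channels) simply gets clamped:
print(clip_to_gamut((1.2, -0.05, 0.5)))  # (1.0, 0.0, 0.5)

# In-gamut colors pass through untouched:
print(clip_to_gamut((0.3, 0.3, 0.3)))  # (0.3, 0.3, 0.3)
```

Clamping is cheap but distorts hue and saturation for out-of-gamut colors, which is why compression/mapping approaches exist.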

If your display and the output device's color gamuts match, then you don't need to do anything. This is the case when you use a broadcast monitor with standardized colorimetry (SMPTE C, EBU, 709).


For video, B can be a problem. There are color probes designed for video monitors that will calibrate the monitor (Sony, Ikegami, and Minolta make these). I don't believe there's any way to apply a LUT to a CRT monitor, so those probes can only fiddle with simple monitor controls (gain, bias, gamma) to achieve good grayscale tracking. As the monitor ages, its colors won't track that well and it should be replaced.

LCDs are a different story.
Jayster wrote on 6/23/2006, 7:46 PM
Thanks again for the informative discussion, Glenn. Color management is a tricky business. But really important, too. Can make a world of difference in the quality of our work.