Putting on that "final shine"

Comments

farss wrote on 8/7/2007, 8:39 PM
By my reading of the text, what it does is set the minimum value of each of the R, G & B channels to 0 and the peak value to 100, kind of like normalising audio but doing it for both the softest and the loudest values.

The theory is that in a typical shot the darkest thing will be black, which should be 0,0,0, and the brightest thing will be white, which should be 100,100,100 (all values in % of full scale). Of course, where that can fail is if the shot isn't typical. You can achieve the same thing manually, best done while watching scopes and a monitor.
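If it helps to picture it, that per-channel stretch is roughly the following (a hypothetical sketch of the idea Bob describes, not the actual plugin code):

```python
def auto_levels(channel, lo=0.0, hi=100.0):
    """Stretch one colour channel so its darkest value maps to `lo`
    and its brightest to `hi` (values in % of full scale)."""
    c_min, c_max = min(channel), max(channel)
    if c_max == c_min:          # flat channel: nothing to stretch
        return [lo for _ in channel]
    scale = (hi - lo) / (c_max - c_min)
    return [lo + (v - c_min) * scale for v in channel]

# R, G and B are each stretched independently, which is why a shot
# with no true black or true white in frame gets pushed too far.
red = [12.0, 40.0, 88.0]
print(auto_levels(red))   # darkest sample goes to 0, brightest to 100
```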

The biggest problem I see is that one of the sample shots is around 1 stop underexposed, and trying to pull that up is problematic: you start to get problems with noise and banding. Plus that shot looks soft; maybe that's because of how you captured the sample frame, but if not I suspect the camera was having a hard time of it. More light might have been a big help. You can always pull an image down exposure-wise, add noise (or 'grain'), soften it, do lots of things to it, but going back the other way is very, very difficult.

Bob.
Serena wrote on 8/7/2007, 10:17 PM
>>>You can always pull an image down exposure wise<<<

provided you don't clip whites (or bright areas)
GlennChan wrote on 8/7/2007, 10:34 PM
As far as monitoring goes, there are certain things you should watch out for:

--Overscan / cropping. The edges of the picture will be cropped off by TVs... for SD the standard is to assume 5%. The remaining edge is assumed to be distorted by the CRT TV... so for titles, the standard is to assume that the 10% closest to the edge will not have correct geometry. (Though nowadays this is less true, and there's no reason why you can't put titles outside the safe area if a particular effect is desired.) In Vegas you'd just turn the safe area markers on to check.
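In numbers, for a 720x576 SD frame those two margins work out like this (a quick sketch using the 5%/10% figures above):

```python
def safe_area(width, height, inset):
    """Return (x, y, w, h) of the rectangle left after cropping
    `inset` (a fraction, e.g. 0.05) from every edge."""
    x = round(width * inset)
    y = round(height * inset)
    return (x, y, width - 2 * x, height - 2 * y)

# 720x576 SD PAL frame, with the 5% action-safe and 10% title-safe
# margins mentioned above.
action = safe_area(720, 576, 0.05)   # (36, 29, 648, 518)
title  = safe_area(720, 576, 0.10)   # (72, 58, 576, 460)
```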

--Interlace flicker. Thin 1-pixel lines will flicker on a CRT.

--Cross-color artifacts. If a viewer is viewing the material through a composite connection, then that viewer will sometimes see those funky rainbow/moire patterns and crawling/hanging dots.
The composite decoder can trade off between sharpness and how much artifacting there is. Some TVs and most broadcast monitors let you toggle a comb filter that delivers sharper images along with lots of artifacts (I would prefer this option).
Not everyone monitors for this.

--A calibrated interface between the computer and monitor: is the signal being decoded correctly? This is what the blue gun feature plus calibration to SMPTE bars is for. It doesn't ensure accurate color, though it's part of the solution.

--Is the monitor itself close to displaying reference standard colors?
a- One factor is the exact color of the RGB elements (the more technical term is primary chromaticities, where primary refers to the primary colors red, green and blue, and chromaticity is color measured objectively). The standard colorimetry for SD NTSC is SMPTE C; for SD PAL (and Japanese NTSC) it's EBU; for modern Rec. 709 HD it's Rec. 709. In a CRT you'd simply look for a monitor with the right phosphors (e.g. SMPTE C phosphors). Though in practice you just need to be close... in a lot of professional situations, the differences between SMPTE C and Rec. 709 primaries are glossed over in up/downconversions and in grade A reference monitors.

b- Another factor is whether other aspects of the monitor are calibrated correctly... is the transfer curve what it should be? In CRTs, the bias calibration needs to be correct and the phosphors shouldn't be worn out. In LCDs, the default transfer curve is s-shaped compared to a CRT's (sort of like applying a mild s-shaped color curve in Vegas).

c- White point should be in the ballpark of D65. All light sources in the room should match.

One way to get all of the above is to buy a good broadcast monitor. Another potential method would be to profile the monitor (with a color probe) and then use a 3-D LUT to calibrate the monitor... but Vegas can't do this. I'm not sure how well it would work either.

d- The luminance (how much light is emitted by the monitor, per unit of surface area) should be at a particular level... though in practice it's not a huge deal. Sony's very expensive HD BVM CRTs can't hit that standard (at least not without problems). Consumer TVs are much brighter than the reference standard.
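For reference, here are the primary chromaticities (CIE x, y) behind the three colorimetry standards mentioned in (a), just the published figures collected in one place:

```python
# Primary chromaticities (CIE x, y) for the three colorimetry
# standards discussed in (a); all three share the D65 white point.
PRIMARIES = {
    "SMPTE C":  {"R": (0.630, 0.340), "G": (0.310, 0.595), "B": (0.155, 0.070)},
    "EBU":      {"R": (0.640, 0.330), "G": (0.290, 0.600), "B": (0.150, 0.060)},
    "Rec. 709": {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060)},
}
D65_WHITE = (0.3127, 0.3290)

# EBU and Rec. 709 share identical red and blue primaries; only the
# green point differs slightly, which is partly why the differences
# get glossed over in practice.
```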

--Is the monitor doing funky image processing to the image? Consumer TVs tend to do this a lot... the white point/color temperature is intentionally set (too) high, and they try to make the image look right by decoding red incorrectly.

2- Anyways, to answer the question: what can you do to calibrate a computer LCD? Not much. If it is hooked up via analog, there are some calibrations you can do to get the interface right.

You should, however, set the white point to be close to D65 and to match the other light sources in the room. This way your eyes' "white balance" won't drift around. You can generally do this on the monitor itself. Some LCDs don't let you change things, since it can be slightly better to manipulate the image before the DVI input (to make the most of the LCD's limited bit depth).
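The probe-plus-3-D-LUT approach mentioned above boils down to trilinear interpolation into a lattice of measured correction values. A minimal sketch of just the lookup step (using an identity LUT as a stand-in for what a profiling tool would actually generate):

```python
def identity_lut(n):
    """Build an n*n*n identity 3-D LUT: lut[r][g][b] -> (r, g, b), 0..1."""
    step = 1.0 / (n - 1)
    return [[[(ri * step, gi * step, bi * step)
              for bi in range(n)]
             for gi in range(n)]
            for ri in range(n)]

def apply_lut(lut, rgb):
    """Look up an (r, g, b) triple (components in 0..1), blending the
    8 surrounding lattice points with trilinear interpolation."""
    n = len(lut)
    pos = [min(c * (n - 1), n - 1 - 1e-9) for c in rgb]
    idx = [int(p) for p in pos]               # lower lattice index, per axis
    frac = [p - i for p, i in zip(pos, idx)]  # fractional part, per axis
    out = [0.0, 0.0, 0.0]
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((frac[0] if dr else 1 - frac[0]) *
                     (frac[1] if dg else 1 - frac[1]) *
                     (frac[2] if db else 1 - frac[2]))
                entry = lut[idx[0] + dr][idx[1] + dg][idx[2] + db]
                for k in range(3):
                    out[k] += w * entry[k]
    return tuple(out)
```

A real calibration LUT would be filled with values measured off the probe rather than identity entries; the lattice can stay small (17 or 33 points per axis is typical) precisely because the interpolation fills in everything between.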
farss wrote on 8/8/2007, 12:49 AM
Anyone got any thoughts on the Pantone Huey?
For AUD 99 it's certainly cheap enough but I have to wonder how it compares to the venerable Spyder.


When it comes to LCD monitors the rumours I've heard are that the Samsung panels are better than the LGs used in Dell and Apple monitors.

Bob.
Grazie wrote on 8/8/2007, 12:56 AM
Dunno about "better"? But just using a bit of "common sense" I've set my Samsungs to "look" like my JVC pro monitor? I don't know if this is THE correct way, but it sure makes for a more relaxed Grazie? My DVDs don't looked washed out and stuff on my 2nd Full Screen Samsung is viewable.

Grazie
farss wrote on 8/8/2007, 3:00 AM
Well in the context I used "better" it would mean able to display a wider color range, wider contrast ratio and perhaps better color tracking.

A hardware calibrator such as the Huey would let you get your LCD and CRT as close as possible to producing the same colors and the same brightness. Also the Huey will adjust the monitor to compensate for changes in ambient light. I'd guess as these things do the job automatically you're more likely to use them and keep things in calibration.
apit34356 wrote on 8/8/2007, 7:10 AM
Not to start a pissing war, but you're basically correct about the Samsung panels. Samsung is a big manufacturer of the panels and is a partner with Sony in a few big panel projects. Word is that Samsung and Sony have options on the "best" of each "run" to fill their retail needs.
GlennChan wrote on 8/8/2007, 2:31 PM
>>>Anyone got any thoughts on the Pantone Huey?<<<
I don't think it works with Vegas? You can change .icc profiles on the Windows secondary display, but it doesn't do anything (you can check with extreme profiles like the cinema ones).

It might be that you could change the 1-D LUTs on your video card to calibrate your monitor to some degree (this won't fix all the problems though). No idea how to do this however. You also get lower than 8-bit precision.
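To see why the 1-D LUT route costs precision: the card's table is typically built from 8-bit values, so a correction curve skips some output codes and doubles up others. A toy sketch of the effect:

```python
def gamma_lut_8bit(gamma):
    """A 256-entry, video-card style 1-D LUT applying a gamma curve,
    with the output quantized back to 8 bits."""
    return [round(255 * (v / 255.0) ** gamma) for v in range(256)]

lut = gamma_lut_8bit(1.0 / 2.2)   # a brightening correction curve
# Near black the curve is steep, so low output codes get skipped and
# (by pigeonhole) other codes get reused: visible banding.
```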

2- Vegas' de-interlacer is still pretty shoddy/rudimentary... a broadcast monitor will tend to implement something decent.
farss wrote on 8/8/2007, 4:43 PM
Don't have a secondary display on the PCs here but I can load an .icc into the nVidia driver and it certainly affects Vegas's internal preview monitor.

The driver also lets you specify if the profile is applied to All, Desktop, Overlay.

Bob.
Grazie wrote on 8/8/2007, 9:51 PM
I DO have a 2nd Monitor, so how does one "load an .icc into the nVidia driver"? nVidia here too!

Oh, and what IS an " .icc"? How do you load it? And from where? Are these the profiles that I have been loading from the supplied nVidia?

TIA

Grazie
farss wrote on 8/8/2007, 11:56 PM
ICC explained here:

http://www.color.org/iccprofile.xalter

I don't think you can get .icc profiles to work with an external firewire monitor; don't know about SDI-driven ones, the drivers might support that.

To load a profile, RClick the empty desktop.
Properties>Settings>Advanced. Then the Nvidia tab and then Color Correction, Color Profile.
Grazie wrote on 8/9/2007, 12:31 AM
"INTERNATIONAL COLOR CONSORTIUM"

Ah ha! Now I know - great site Bob! Thanks!

And yes I HAVE been playing with these. I tried the ICC that was ingested into my system, and I can't immediately see any difference though? Got PAL-SECAM too. Maybe I'm just about cooked anyway? Separately and within the 2nd Monitor setup window pane, should I be using the "Use Studio RGB (16 to 235)" option? Pros- Cons - ?

And NO, I AM speaking of a 2nd NON-firewire monitor. You don't get to fiddle with these types of settings anyway? Yeah?

Grazie
farss wrote on 8/9/2007, 1:37 AM
No way to edit the profiles from within the nVidia setup. I assume things like the Huey can write them. For calibrating CRTs we (this is way back) used a device made by Minolta. Although we were only displaying graphics, in a control room with 20 or more monitors getting them all looking the same was vital to customer satisfaction and to keeping the unions happy; not enough nits and there could be a strike. Can't say I blamed 'em either. If my job was watching a CRT all day, hoping for a core meltdown to relieve the tedium, I'd be worried about me nits too.

Bob.