Interesting,
my LCD reads 2.0, and some of those figures seem pretty low.
Where I'm getting even more confused is that at the top of the page they say NTSC specs gamma at 2.2 and PAL at 2.8. Macs use a gamma of 1.8 to match the printed page; is that why graded FCP output looks brighter?
I'm not going to lose any sleep over it; I thought the design of the test pattern itself was the most interesting part.
I wish I knew the answer to that too; heck, I'm not even certain what it should be. 2.2 is a number I hear bandied around a lot, but as I said above, from what I'm reading further up the page I'm far from certain.
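For what it's worth, the 1.8-vs-2.2 mismatch being discussed can be sketched with a simple power-law model (real transfer functions like sRGB have a linear toe near black, so treat this as approximate, not as what FCP actually does):

```python
# Simple power-law gamma model: material encoded for one display
# gamma, then shown on a display with a different native gamma.

def encode(linear, display_gamma):
    # Grading side: apply 1/gamma so the intended display undoes it.
    return linear ** (1.0 / display_gamma)

def show(signal, display_gamma):
    # Display side: the screen raises the signal to its native gamma.
    return signal ** display_gamma

mid_gray = 0.5                        # linear scene value
video = encode(mid_gray, 2.2)         # graded for a 2.2 display
print(round(show(video, 2.2), 3))     # 0.5   -- as intended
print(round(show(video, 1.8), 3))     # 0.567 -- brighter on a 1.8 Mac
```

So the same signal lands brighter on a 1.8 display than on the 2.2 display it was graded for, which is consistent with graded output looking brighter on a Mac.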
Unfortunately, it's not quite as simple as that. Monitor gamma is set by the software that's driving your video system, along with the Microsoft drivers, be they VfW (Video for Windows) or DirectX. Some people have "overlay" turned on, which changes the output depending on cues from the software, like Mozilla or Internet Explorer.
For the most reliable and repeatable screen performance, turn off all overlays in the video card manager and calibrate your screen with a colorimeter like the Monaco Spyder or GretagMacbeth's Eye-One. Most calibration systems will advise you to max out the contrast and set the brightness to a value determined by their measurement instrument. When we set our brightness according to taste, where we end up depends on the ambient room brightness. The recommended gamma of 2.2 is generally for a well-lit office space, not an editing suite, where the room ambience is fairly low.
Ah, I don't think it's that simple either, if I'm reading the first part of the article correctly. The recording gamma is designed, at least in part, to offset the display gamma; it gets really messy, though, if different recording systems use different gammas. Using a Spyder would adjust for print gamma, and that's probably a good thing: you can hold your printed page up to your monitor and actually see that they match. That's not so easy with video or film, unless you've shot test charts in the same lighting setup.
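The "recording gamma partly offsets display gamma" point can be shown with the commonly quoted numbers (a camera encoding exponent of about 1/2.2 and a native CRT exponent of about 2.5; both are assumed values here, not taken from the article). The two exponents multiply, and they deliberately don't cancel exactly:

```python
# End-to-end ("system") gamma: the camera's encoding exponent and the
# display's native exponent multiply. They deliberately don't cancel
# exactly; a net gamma a bit above 1.0 suits dim viewing conditions.
camera_gamma = 1 / 2.2        # typical video encoding exponent (~0.45)
crt_gamma = 2.5               # often-quoted native CRT exponent

system_gamma = camera_gamma * crt_gamma
print(round(system_gamma, 2))   # ~1.14
```

That leftover net gamma above 1.0 is why two systems with different recording gammas won't match even on the same calibrated screen.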
Setting gamma and black level matters a lot. For years I was totally ignorant of the issue, and the still photos I "corrected" from that era now look washed out.
There is an excellent description of the subject matter here and if you scroll about halfway down the web page there is a chart that you can use to easily and accurately set your black level and gamma.
The simplest way for me to adjust monitor gamma is with the driver that comes with my video card. I use an Nvidia card, so I right-click on the desktop and select Properties / Settings / Advanced / GeForce (or whatever your video card's name is) / Color correction. In that window you can manually adjust the gamma with a slider. Try different settings until the above-mentioned gamma checker shows a gamma of about 2.2. Then adjust your monitor brightness control (which really should have been labeled "black level control") until the bar on the left just extinguishes at 2.2. Then check your gamma setting again.
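A rough sketch of what that driver slider is doing behind the scenes, assuming a simple per-channel power curve (real driver LUTs are often 10-bit and may use other curve shapes):

```python
# The driver's gamma slider effectively loads a lookup table (LUT)
# into the video card: each 8-bit level is remapped by a power curve.

def gamma_lut(gamma, levels=256):
    # Normalize each level to 0..1, apply the 1/gamma power curve
    # (a higher gamma setting lifts the midtones), scale back to 0..255.
    return [round(255 * (i / (levels - 1)) ** (1.0 / gamma))
            for i in range(levels)]

identity = gamma_lut(1.0)     # leaves every level unchanged
brighter = gamma_lut(1.2)     # lifts midtones; endpoints stay fixed
print(identity[128], brighter[128])   # 128 144
```

Because only the midtones move while black and white stay pinned, nudging this slider is how you chase the checker pattern toward 2.2 without touching the monitor's own contrast control.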
The gamma chart works very well with CRT displays and also with my new 24" Soyo LCD display. It works horribly on my 14" Dell laptop. I think it may have to do with pixel size, but I don't know for sure. Also, there are several different LCD technologies out there. The type commonly found on laptops (and on many desktop LCD monitors) will have major shifts in black level if you move your head up and down. For these monitors, it is important to make sure you keep your head in the same relative position while doing critical image color corrections. Other LCD monitors don't have this issue. The monitor I just purchased does not have the black level shift effect, nor would the Dell 24" or any other monitor using the same panel.