Anyone using the Spyder2Pro to calibrate their computer monitors for Vegas use?
I think I understand what it's doing, but I want to be sure. I think it's building a calibrated gamma curve for the monitor itself, not changing the gamma curve in the video driver. These two gamma curves are then applied one after the other, and their combination is the resultant curve you actually see. I point this out because I noticed it didn't touch my video driver's gamma curve, which I had customized earlier with the driver's utilities.
So should I reset the video driver's gamma curve to a straight line before running the Spyder2Pro calibration? That way, if I later need to tweak brightness slightly to get my NTSC PLUGE correct, the adjustment would be applied on top of a linear curve rather than interacting with my old custom one?
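To make the question concrete, here's a tiny illustrative sketch (not Spyder2Pro code, and the gamma values are made up) of how two pure power-law gamma curves applied in sequence combine, and why setting the driver curve to a straight line (gamma 1.0) leaves only the calibration curve in effect:

```python
def apply_gamma(value, gamma):
    """Apply a pure power-law gamma curve to a normalized 0..1 value."""
    return value ** gamma

signal = 0.5  # an arbitrary mid-level pixel value

# Hypothetical custom driver curve (gamma 1.2) followed by a
# hypothetical calibration curve (gamma 0.9):
combined = apply_gamma(apply_gamma(signal, 1.2), 0.9)

# The two curves compose into a single power law whose exponent is
# the product of the two gammas -- they multiply, they don't add:
assert abs(combined - signal ** (1.2 * 0.9)) < 1e-12

# With the driver curve set to a straight line (gamma 1.0), the
# calibration curve alone determines the result:
linear_driver = apply_gamma(apply_gamma(signal, 1.0), 0.9)
assert abs(linear_driver - signal ** 0.9) < 1e-12
```

Real driver and calibration curves are arbitrary lookup tables rather than pure power laws, but the same point holds: whatever curve is left in the driver gets stacked on top of whatever the calibrator produces.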
Also, the software mentions that computer monitors are typically calibrated to 6500K, but it also offers profiles for NTSC and Rec. 709. I use both Rec. 709 video (HDV and XDCAM HD) and NTSC (DVCAM) in my projects, so I'm not clear which standard I should be calibrating my monitors to.
Any advice? Thanks.