Comments

OldSmoke wrote on 3/17/2015, 4:04 PM
I would suggest using the latest Spyder Pro for calibration.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

Peter100 wrote on 3/17/2015, 4:33 PM
OK. But what white point do you calibrate your monitor to?
If you prepare content for printing, it is quite easy: you adjust your monitor's white point to the ambient light.
But what is the white point standard for video editing? Do I have to adjust it to the color of the light in my room?
videoITguy wrote on 3/17/2015, 4:41 PM
Actually, tools like Spyder Pro can fine-tune your monitor just fine in the ordinary way. But if you are asking about managing the preview out of Vegas, and not a second-card output, then I question what you finally expect to accomplish. What you want to understand is how Vegas manipulates the video preview, and that is best done with the preview window left untweaked and absolutely flat.
Peter100 wrote on 3/18/2015, 3:53 AM
videoITguy - thank you for the information.
I have been participating in those long threads concerning Computer RGB <=> Studio RGB conversion, so I understand how to set the preview monitor correctly.
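(For anyone reading along: the Computer RGB <=> Studio RGB conversion mentioned here is just a linear rescale between the full 0-255 range and the 16-235 video range. A minimal Python sketch, with function names of my own choosing, not anything from Vegas:)

```python
# Computer RGB (full range, 0-255) <-> Studio RGB (video range, 16-235).
# The mapping is a simple linear rescale; function names are illustrative.

def computer_to_studio(c: int) -> int:
    """Map a full-range value (0-255) into the studio range (16-235)."""
    return round(16 + c * 219 / 255)

def studio_to_computer(s: int) -> int:
    """Map a studio-range value (16-235) back to full range, clipped to 0-255."""
    return min(255, max(0, round((s - 16) * 255 / 219)))

print(computer_to_studio(0), computer_to_studio(255))   # 16 235
print(studio_to_computer(16), studio_to_computer(235))  # 0 255
```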

What I do not understand is the white point calibration.
So if I understand you correctly, if I calibrate the preview monitor as an ordinary sRGB (computer RGB) monitor (white point = 6500K), I will get accurate colors.
I'm afraid of a situation where I shift the white point too much. Then, when I perform color grading, I may overcorrect the image. For example: I have interview footage that needs color correction, and I do the correction on a preview monitor calibrated to a white point of 3200 K (warm). As a result, an image that looks correct on my monitor will look very bad on a monitor calibrated to a white point of 6500 K: the interviewed person's skin tones will look very bluish, as if he had died two hours ago.

I simply do not know what white point should be set for industry standards like Rec. 601 (DVD) or Rec. 709 (HD).
musicvid10 wrote on 3/18/2015, 6:39 AM
You use the monitor's native white point.
This has no connection to white balance, a camera adjustment that is not static.
Apples and oranges; you're doing it again.
Peter100 wrote on 3/18/2015, 1:56 PM
@musicvid10
No, I'm not doing it again.
Below is a screenshot from Eizo Color Navigator. It is software for calibrating Eizo monitors.

The default setting of the white point is 6500 K. This is the default for most monitors. On the other hand, many modern TVs have a very bluish color temperature, above 9000 K.
So I was wondering if there is any broadcast standard for TV white point temperature.
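(Editor's note on the question above: Rec. 709, like sRGB, specifies the CIE D65 illuminant as its reference white, and D65's chromaticity corresponds to a correlated color temperature of roughly 6500 K. A quick Python sketch using McCamy's published CCT approximation illustrates this; treat it as a back-of-the-envelope check, not a calibration tool:)

```python
# Rec. 709 (and sRGB) define the reference white as CIE D65.
# McCamy's approximation converts a CIE 1931 (x, y) chromaticity into a
# correlated color temperature; for D65 it lands near 6500 K, matching
# the 6500 K monitor default mentioned above.

def mccamy_cct(x: float, y: float) -> float:
    """Approximate correlated color temperature (K) from CIE 1931 xy."""
    n = (x - 0.3320) / (0.1858 - y)
    return 437.0 * n**3 + 3601.0 * n**2 + 6861.0 * n + 5517.0

d65 = (0.3127, 0.3290)  # D65 chromaticity as given in Rec. 709
print(mccamy_cct(*d65))  # prints a value close to 6500
```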