0,
Color space/YUV: video signal structure.
Color space/definition: color gamut that hardware should match or tolerate (e.g. Rec.709/sRGB).
Color space/specification: hardware parameters representing its performance (e.g. 76% sRGB, 450 cd/m²).
1,
Video colors are saved in YUV, not RGB.
Our eyes, camcorders and display panels all work in RGB mode.
Why the hell must video transcode its signal into YUV and then back to RGB again for display?
pic01_videocolors
1.1,
Because the first video signal was black-and-white, and the TVs of that time were black-and-white, too. Both of them handled just one piece of information: Y, the lightness.
And that was enough.
1.2,
Engineers developed color TVs and color camcorders.
They quickly found out that this hardware could not deal with the whole RGB spectrum the way our human eyes can.
So they defined the area that a TV and a camcorder can capture/show as a color space.
This definition changed from time to time and finally arrived at a very famous one: Rec.709.
pic02_Rec.709
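For reference, that famous definition is essentially a triangle on the CIE 1931 chromaticity diagram: three primaries plus a white point. A minimal sketch with the published Rec.709 coordinates (the variable name is mine, purely for illustration):

```python
# The Rec.709 "area": three corner points (primaries) plus a white
# point, given as CIE 1931 xy chromaticity coordinates.
REC709 = {
    "red":   (0.640, 0.330),
    "green": (0.300, 0.600),
    "blue":  (0.150, 0.060),
    "white": (0.3127, 0.3290),  # D65
}
```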
1.3,
Engineers discovered from the very beginning that saving a color camcorder's captured information directly in its RGB space was almost impossible.
It required too much bandwidth; no equipment could transport such a signal.
And even if it could, this kind of signal was not compatible with the black-and-white TVs sitting in millions of homes at that time!
Engineers are geniuses.
They divided the video signal into 3 channels:
Y carries lightness as it always did; U and V carry the two color-difference signals Cb (blue minus luma) and Cr (red minus luma) (green can easily be recreated from these three).
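A minimal sketch of this split in Python, assuming gamma-encoded R'G'B' values in the 0..1 range and the Rec.709 luma weights (the function names are mine, purely for illustration):

```python
# Sketch: split non-linear R'G'B' (0..1) into Y' plus two
# color-difference channels, then rebuild green from them.
# Uses the Rec.709 luma weights (Kr = 0.2126, Kb = 0.0722).

KR, KB = 0.2126, 0.0722
KG = 1.0 - KR - KB  # 0.7152

def rgb_to_ycbcr(r, g, b):
    y  = KR * r + KG * g + KB * b          # lightness (luma)
    cb = (b - y) / (2.0 * (1.0 - KB))      # blue-difference chroma
    cr = (r - y) / (2.0 * (1.0 - KR))      # red-difference chroma
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 2.0 * (1.0 - KR) * cr
    b = y + 2.0 * (1.0 - KB) * cb
    g = (y - KR * r - KB * b) / KG         # green is recreated, not stored
    return r, g, b

print(ycbcr_to_rgb(*rgb_to_ycbcr(0.25, 0.5, 0.75)))  # ~ (0.25, 0.5, 0.75)
```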
1.4,
The YUV color space was born.
It uses a 4:4:4, 4:2:2 or 4:2:0 pattern to (sub)sample the captured RGB.
Its size is thus flexible, and the signal itself is free from any color-gamut definition or hardware specification.
But to translate the signal back into the right RGB, additional information must be appended.
pic03_videoINFO
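To make those patterns concrete: 4:2:0 keeps every Y sample but stores only one Cb and one Cr sample per 2x2 block of pixels. A minimal sketch in Python, with toy data and an illustrative function name:

```python
# Sketch: 4:2:0 chroma subsampling - keep every luma (Y) sample,
# but store only one chroma sample per 2x2 block of pixels.
# `plane` is a full-resolution chroma plane as a list of rows.

def subsample_420(plane):
    h, w = len(plane), len(plane[0])
    return [
        [
            # average each 2x2 block into a single chroma sample
            (plane[y][x] + plane[y][x + 1] +
             plane[y + 1][x] + plane[y + 1][x + 1]) / 4.0
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

cb = [[0.1, 0.2, 0.3, 0.4],
      [0.1, 0.2, 0.3, 0.4]]
print(subsample_420(cb))  # [[0.15, 0.35]] - 1/4 of the original samples
```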
2,
As you can see, the Rec.709 definition shows up under Color Primaries. For a long time, TVs and camcorders could only deal with the Rec.709 space, so it became the de facto standard.
But gradually, new hardware with better performance arrived, wave after wave: first for the professional branch, then broadly into the everybody-can-buy market.
The de facto standard tried to hold its position with LUT filters and the like in its editing environment.
Finally it collapsed.
As we know, such an editing environment doesn't allow producing anything other than what Rec.709 defines.
pic04_newspaces
2.1,
There's a "non-standard" editing mode in Vegas Pro called ACES: the Academy Color Encoding System.
This system treats the whole RGB area as a large logistics center.
Materials for editing, no matter where they come from, are placed into their correct (color) space.
pic05_logisticentre
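Vegas Pro hides this plumbing, but the same idea can be sketched with PyOpenColorIO, the open-source OpenColorIO library for which ACES configs are published. This is an illustration of the concept, not Vegas Pro's actual code; the config path is hypothetical and the color-space names depend entirely on the config file you load:

```python
# Sketch with PyOpenColorIO (OCIO v2). The color-space names below
# are examples from typical ACES configs, not fixed identifiers.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("aces_config.ocio")  # hypothetical path

# "Locate" an sRGB texture in the big logistics center: build a
# processor that converts it into the ACES working space.
processor = config.getProcessor("Utility - sRGB - Texture",  # input space
                                "ACES - ACEScg")             # working space
cpu = processor.getDefaultCPUProcessor()

pixel = cpu.applyRGB([0.5, 0.25, 0.1])  # one sRGB pixel -> ACEScg
print(pixel)
```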
2.2,
And logically, you need a monitor for monitoring.
pic06_monitoring
2.3,
No, not that way!
ACES doesn't touch the materials themselves. It cares about which color space/definition they belong to.
E.g. here is how an sRGB picture looks before and after ACES knows its definition.
pic07_sRGB-image
And here is how a Kodak DPX-format image looks before and after I told ACES it could be a C-Log file.
pic07_C-Log
3,
But where do I get the screen for the ACES show?
Isn't ACES independent of hardware? - Yes, it is.
But at the same time, it gives me a simulated output pipeline, too.
This is the View Transform.
Off means you are not using any simulation; you stay in a virtual (so-called 32-bit floating point) color space, and you can still only produce Rec.709 video.
pic08_V-Transform
Once you pick display hardware you own for the simulation, e.g. your normal computer monitor 〔View Transform set to sRGB (ACES)〕, you can edit any material in this simulated environment and output your edit to video of any color space.
pic09_fromoldtime2moderntime
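Under the hood, a view transform for an sRGB monitor ends in the sRGB transfer curve. A minimal sketch of just that final encoding step in Python (the real sRGB (ACES) view transform also tone-maps and converts the gamut first; this shows only the display-encoding idea):

```python
# Sketch: the last step of an sRGB view transform - encode a
# scene-linear value (clipped to 0..1) with the sRGB transfer curve.

def srgb_encode(linear):
    c = min(max(linear, 0.0), 1.0)             # clip to displayable range
    if c <= 0.0031308:
        return 12.92 * c                       # linear toe segment
    return 1.055 * c ** (1.0 / 2.4) - 0.055    # power-curve segment

print(srgb_encode(0.18))  # linear mid-gray -> ~0.46 on an sRGB monitor
```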
3.1,
If you own an HDR monitor, you can happily use the Vegas Pro default settings for HDR production.
Thank you very much.