Where vs Why: Color Space/Range; InputLUT/LookLUT; IDT/ODT

Yelandkeil wrote on 10/27/2022, 12:46 AM

In VP20b214, we can use the “8-bit (full range)” pixel format for conventional video editing.

Footage that has the limited 16-235 color range will be automatically lifted/expanded to the full 0-255 color range, so that our computer monitor displays the timeline pictures properly instead of washed out.

The lifting or expansion is easy to accomplish because the conventional video color space Rec709 and the computer color space sRGB share the same primaries, including the white point location.
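The lift itself is simple arithmetic. Here is a minimal sketch in Python (my own illustration, not VEGAS code), using the standard 8-bit limited-to-full mapping:

```python
def limited_to_full(code: int) -> int:
    """Expand an 8-bit limited-range (16-235) luma code to full range (0-255).

    This is the standard levels expansion; values outside 16-235
    (sub-blacks / super-whites) are simply clipped here for brevity.
    """
    code = min(max(code, 16), 235)
    return round((code - 16) * 255 / 219)

# Black and white points land exactly on 0 and 255:
# limited_to_full(16) -> 0, limited_to_full(235) -> 255
```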

Illustr-01 Rec709/sRGB primaries

 

Footage with different primaries can't simply be lifted or expanded; it must go through color transcoding to fit Rec709/sRGB.

In most cases this process is done with an input LUT or camera LUT (look-up table).
Here is an illustration of a Panasonic V-Log file taking a LUT to Rec709 and then into sRGB.

Illustr-02abc V-log-go-sRGB
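At its core, applying a LUT is just indexing and interpolation. A toy 1D sketch follows (real camera LUTs such as V-Log to Rec709 are usually 3D cubes; the table entries below are invented for illustration):

```python
def apply_lut_1d(value: float, lut: list[float]) -> float:
    """Apply a 1D look-up table to a normalized value in [0, 1],
    linearly interpolating between the two nearest table entries."""
    pos = value * (len(lut) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

# A made-up 5-entry LUT that lifts the shadows slightly:
toy_lut = [0.0, 0.30, 0.55, 0.78, 1.0]
```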

 

Be aware of the look-LUT.

It's not color transcoding; it's a kind of film-style imitation: “looks” from the ancient celluloid era that came from chemical developing, or from its limitations.

The fx plugin “AutoLooks” plays in the same theater.

If you have such nostalgia, here's the famous Crime City look from a look-LUT.

Illustr-03 crime city

The plugin “LUT Filter”, introduced to VEGAS only recently, does the job for everything, depending on what you feed it.

So, why different primaries?

Hm, because our cameras can capture a wider and wider color gamut in better and better quality thanks to improved technology.
And manufacturers want to make money from their know-how...

Does all this transcoding here and there impact your quality?

Sure!
Transcoding into Rec709, especially when editing in the 8-bit pixel format, whether legacy or full range, generates post artifacts, e.g. color banding.

Thus!
VEGAS has developed an editing mode in the 32-bit floating point format.

It is also called the single-precision floating-point format: a computer number format, usually occupying 32 bits in memory, that represents a wide dynamic range of numeric values by using a floating radix point, hence the name 32-bit floating point.
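A toy demonstration of why the intermediate precision matters (my own illustration, not VEGAS's actual pipeline): round-trip a gamma adjustment, once keeping the intermediate value in floating point and once quantizing it to 8-bit.

```python
def to8(x: float) -> int:
    """Quantize a normalized value in [0, 1] to an 8-bit code."""
    return round(x * 255)

gamma = 2.2
float_levels, eight_bit_levels = set(), set()
for code in range(256):
    x = code / 255
    # Intermediate kept in float: the round trip is effectively lossless.
    float_levels.add(to8((x ** gamma) ** (1 / gamma)))
    # Intermediate stored in 8-bit: shadow codes collapse onto each other.
    eight_bit_levels.add(to8((to8(x ** gamma) / 255) ** (1 / gamma)))

# The float path keeps all 256 distinct levels; the 8-bit path loses
# some of them, which shows up as banding in smooth gradients.
```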

Principles!
If you use the legacy 8-bit (video levels) mode, after editing you can switch your project to 32-bit floating point (video levels) and then render out your product.
This will reduce post artifacts thanks to the precision of the floating-point format, at least theoretically.

A project edited in 8-bit (full range) mode can likewise be switched to 32-bit floating point (full range) mode. Be careful: you must change its compositing gamma from “Linear” to “Video” at the same time.
Or you can use that as your project setting from the start, if your hardware is strong enough. Again: don't forget video gamma.

Illustr-04ab floating point Video Gamma

 

 

About the initial “32-bit floating point (full range)” setting with linear gamma in VP20b214, I will say:

  • it works great with footage from Rec709, sRGB and HLG; for footage from other spaces/primaries, I haven't found a way to deal with it;
  • it's an editing alternative for conventional video production with high precision thanks to floating-point technology. Not less, not more.

 

The 8-bit Rec709/sRGB world will endure for a long time thanks to the maturity of SDR displays and the interactivity of the worldwide network.

But with the breakthrough of HDR displays we are experiencing a new horizon of vision.
We focus on the HDR10 format.
It means High Dynamic Range in 10-bit color depth; its transfer function is defined in ST2084 and its primaries are based on Rec2020.
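The ST2084 (PQ) curve is an absolute transfer function: it maps a code value directly to display luminance in nits. A sketch of its EOTF using the published constants:

```python
def pq_eotf(signal: float) -> float:
    """ST 2084 (PQ) EOTF: normalized code value in [0, 1] ->
    absolute luminance in cd/m^2 (nits), peaking at 10000 nits."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

# pq_eotf(0.0) -> 0 nits, pq_eotf(1.0) -> 10000 nits;
# a code of 0.75 lands near 1000 nits, the common HDR10 grading peak.
```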

Illustr-05 Rec2020 primaries

 

In VP20b214, we reach HDR10 through ACES.
ACES is a color management system; it covers the color-space chaos of the film industry under one cap and gives those spaces a chance to interchange.

Illustr-06 color chaos

 

In VP20b214, ACES uses the 32-bit floating point (full range) interface to establish its “View Transform” and “Master” pipeline.
Under View Transform you can find your preview device, or the device compatible with your equipment, e.g. sRGB (ACES).

Illustr-07 view transform


As soon as the view transform is determined, ACES locates its default (ACES2065-1), the virtual AP0 master, onto the Rec2020 primaries.

Illustr-08 Rec2020primaries

Here you can edit your Log footage, whatever it is, and output it to another Log format, an intermediate, or an end product... as you like.

 

Let's go directly to HDR10.
It's the same as opening ACES, but this time VEGAS does all the work for you.
You only need to activate “HDR10” mode; then

  1. it switches the monitoring tools to HDR peak values;
  2. it selects the HDR10-compatible view device with a 1000-nit peak; if your equipment can go higher, change it to 2000 or 4000; if your equipment is a P3-D65 display with (at least) 1000 nits, select that.
    Don't fancy you can make HDR10 with an sRGB display!
  3. the AP0 master is ready. Attention: modify your master to fit your view transform if you change it.

Illustr-09ab HDR10mode master

 

Windows has its own color management for WCG and HDR.
But I think everybody knows how to adjust it.

illustr-10 windmanager

 

My AMD graphics card runs driver version 22.5.1.
The 10-bit pixel format in the driver panel should be enabled, otherwise your displays will flash madly when you launch an HDR10 project.

Illustr-11 AMD driver

 

If you have any external HDR10 preview equipment connected to the Windows graphics card or to special hardware, you should see “Enable HDR output” activated; check whether the signal comes through.

Illustr-12 extern display

 

Now to our HDR10 project.
VEGAS identifies HDR10 and HLG footage automatically.

Illustr-13ab HLG/HDR10 footage

Footage from other spaces must be transformed into ACES space through an IDT, manually.
As said, ACES is a virtual space, independent of all hardware; thus the transform is called Input Device Transform: in other words, transformed from which camera, with what specification, etc.

Illustr-14 IDT

 

 

This is how our V-Log file looks in the HDR10 project.
Perhaps your brow is furrowing as you watch it, and the HDR10/HLG scene too?
Do you believe an internet JPG picture can show HDR10 content?

Illustr-15 V-Log

 

Finally, the ODT: Output Device Transform.
ACES was developed for color-space interchange, and only in ACES do you have the chance to output your edit to different color spaces by rendering.
Where is the device?
Behind Rec709 come all the SDR TV devices.
In HDR10 I see OLED, QLED...

I remember someone saying:

HDR mode can be thought of as a specialization of 32-bit full-range (when a View Transform is specified). HDR mode preselects some options, enables the HDR preview, and revises the Render-As filter to make the HDR render templates visible. Other than those things HDR mode and 32-bit full-range are identical.

I really don't understand what that is supposed to express.

And, more to my shock, where do people get the bravado to show a 10000-nit future-vision picture, or to argue there's not much difference between HDR and SDR?

And all of this is presented on one and the same sRGB screen!

 

 

Thanks on all sides.

Comments

Wolfgang S. wrote on 10/27/2022, 3:14 AM

"And, more to my shock, where do people get the bravado to show a 10000-nit future-vision picture, or to argue there's not much difference between HDR and SDR?

And all of this is presented on one and the same sRGB screen!"

Why is HDR footage reviewed on an sRGB screen? That makes no sense to me. Sure, one can also grade HDR footage to sRGB, but I would prefer to grade it to HDR10/PQ. Then, hopefully, you will see a difference.

Desktop: PC AMD 3960X, 24 x 3.8 GHz * GTX 3080 Ti (12 GB) * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb LAN * Resolve Studio 18 * Edius X * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * internal HDR preview * i9 12900H with iGPU Iris Xe * 32 GB RAM * GeForce RTX 3070 Ti 8 GB * internal HDR preview on the laptop monitor * Blackmagic Ultrastudio 4K Mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX)-card in an old XP-System. Edius 4.6 and other systems

Wolfgang S. wrote on 10/27/2022, 3:29 AM

"In most cases this process is done with an input LUT or camera LUT (look-up table).
Here is an illustration of a Panasonic V-Log file taking a LUT to Rec709 and then into sRGB."

Given the limitations of LUTs, for grading V-Log (10-bit) I would tend to use the ACES workflow, using the ACES transformation to SDR/Rec709.

"If you use the legacy 8-bit (video levels) mode, after editing you can switch your project to 32-bit floating point (video levels) and then render out your product.
This will reduce post artifacts thanks to the precision of the floating-point format, at least theoretically.

A project edited in 8-bit (full range) mode can likewise be switched to 32-bit floating point (full range) mode. Be careful: you must change its compositing gamma from “Linear” to “Video” at the same time.
Or you can use that as your project setting from the start, if your hardware is strong enough. Again: don't forget video gamma."

Especially if the hardware does not have enough performance for the 32-bit/ACES mode, working in 8-bit is a nice workaround as long as we grade for SDR using LUTs. But if we go for HDR, I think the better approach in terms of quality is to apply the ACES transformation to HDR and use the wonderful color wheels. For sure, this requires better hardware, and even then a reduced preview quality.


Yelandkeil wrote on 10/27/2022, 6:45 AM

...if the hardware has not enough performance to use the 32bit/ACES mode...

Oh please don't! @Wolfgang S.

I wrote all this blablabla here just because of you; I'm trying to stop two false propositions:

  • a “32bit/ACES mode” doesn't exist!
  • use proper monitoring tools for your preview device. No fakes, no fantasy!

If you want to categorize: the 8-bit pixel format and the 32-bit floating point format belong to one world, the world of conventional video editing and producing (8-bit pixel / 32-bit floating mode);
and then there is the ACES/HDR10 world, with no limitation.

In which world you perform your editing, and with what filter or what method you think you do it better, that's quite another topic.

Is your noble display already in operation?
It costs almost as much as my whole PC. 😰😰

 

Wolfgang S. wrote on 10/27/2022, 7:06 AM

I have been editing 10- and 12-bit Log files to HDR for some years now, and I simply use the 32-bit floating point mode and ACES. So I don't get your point: why do you think you had to write that for me? But thank you anyway.

I wonder even more why you seem to be upset about somebody else's HDR display? Makes no sense to me, really.


Yelandkeil wrote on 10/27/2022, 7:18 AM

Pardon my English!
Not upset but jealous. I'm poor, so poor that you can't even find a mouse in my room.

Again: 32-bit floating point is just an interface with regard to ACES; as soon as we enter ACES mode, there's no 32-bit any more.
This is why I assumed you were confused at the beginning.

After all, English is not a perfect language either.

Wolfgang S. wrote on 10/27/2022, 12:31 PM

Sorry to hear that you are poor. I do not consider myself rich; for sure I can spend some money on video equipment, and I know that I tend to spend much too much here. And the funny thing is that I am poor in other terms: I do not have enough time these days to shoot or edit a lot of footage. So, poor is relative to me.

One issue with HDR monitoring is that it has been expensive from the very beginning. I purchased an Atomos Sumo, which was a great solution at the time, together with a Blackmagic Extreme 4K 12G. I am still not sure how well this combination works with the current Vegas version. In former days it was great, and it is still great with Resolve and Edius today. But my main NLE has always been Vegas, which I have used since version 4. That is why I am thinking about another HDR monitor.

Before I purchased a new system, Vegas required a lot of workarounds for HDR grading: working in 8-bit, switching later to 32-bit for rendering, or using LUTs for gamut and gamma transformation. So I know that quite well, but never found it satisfying.
It was me who suggested to Heman to allow the nit scale in the waveform monitor also in 8-bit projects, for example. With a Threadripper 3960X, using ACES in Vegas has become much easier, and that is why I like the 32-bit floating point mode and ACES.

 

 
