Vegas 32 (Full Range) Mode Does Not Correctly Decode Sony SLOG3

ALO wrote on 3/13/2023, 10:45 AM

For the purposes of this thread you need the following materials:

SLOG3 footage available for download in the video description here:

Sony SLOG3 conversion LUT available from Sony's pro site:

https://pro.sony/en_GB/support-resources/software/00263050

(you want S-Gamut3.Cine/S-Log3)

You should also download and install DaVinci Resolve on your machine if it isn't there already.

 

Don't get hung up on the sample footage -- you can use your own, provided it is set to SLOG3 with the color space set to either SGamut3 or SGamut3.Cine. I just wanted to give you something to play with that has skin tones in it, and this is what I found on the internet. For our purposes, footage with skin tones, yellows, and greens will give the most dramatic results.

 

Let's start with DaVinci Resolve. Open Resolve, import the clip, put it on the timeline, go to the color page, drag "Color Space Transform" onto the correction node, and set the appropriate input gamma & color space. I am assuming he shot these using SGamut3.Cine. There are other ways to handle log in Resolve, but this is the one I recommend.
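
For reference, here is what "decoding" S-Log3 means for the tone curve. This is a minimal Python sketch of Sony's published S-Log3 formula (code value to linear scene reflection); any correct input transform -- Resolve's Color Space Transform, the Sony LUT, or an ACES input transform -- effectively has to perform this mapping before the S-Gamut3.Cine-to-Rec709 gamut conversion:

```python
# Sony S-Log3 decode: normalized full-range code value -> linear scene reflection.
# Constants are from Sony's published S-Log3 technical summary.
def slog3_to_linear(v):
    if v >= 171.2102946929 / 1023.0:
        return (10.0 ** ((v * 1023.0 - 420.0) / 261.5)) * (0.18 + 0.01) - 0.01
    return (v * 1023.0 - 95.0) * 0.01125000 / (171.2102946929 - 95.0)

# Sanity check: 18% grey is defined to sit at code value 420/1023
print(slog3_to_linear(420.0 / 1023.0))  # -> 0.18
```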

Here's how Resolve decodes the footage:

That's a very nice result.

Now let's use the Sony LUT file in an 8-bit (or 32-bit video levels) Vegas project. I assume this is straightforward for you, but either add the LUT filter to your clip's fx chain, or directly add the appropriate Sony LUT in the Color Grading panel, and you should get this result:

It's not exactly identical to Resolve, but the colors match, and as a starting point for a grade, this is absolutely an acceptable result.
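
As an aside, a .cube LUT like Sony's is just a 3D lattice of output RGB values that the host app interpolates per pixel. Here is a rough Python sketch of what the LUT filter is doing under the hood, assuming a standard .cube file with a LUT_3D_SIZE line (the file name below is hypothetical):

```python
import numpy as np

def load_cube(path):
    # Parse a plain .cube file: one LUT_3D_SIZE line plus N^3 RGB triples
    size, table = None, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith(("#", "TITLE", "DOMAIN")):
                continue
            if line.startswith("LUT_3D_SIZE"):
                size = int(line.split()[1])
                continue
            parts = line.split()
            if len(parts) == 3:
                table.append([float(p) for p in parts])
    # .cube convention: red varies fastest, so reshaping gives axes [b][g][r]
    return np.asarray(table).reshape(size, size, size, 3), size

def apply_lut(img, lut, size):
    # img: float RGB in [0, 1], shape (..., 3)
    idx = np.clip(img, 0.0, 1.0) * (size - 1)
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, size - 1)
    f = idx - lo
    out = np.zeros_like(img)
    # Trilinear interpolation over the 8 surrounding lattice points
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                r = np.where(dr, hi[..., 0], lo[..., 0])
                g = np.where(dg, hi[..., 1], lo[..., 1])
                b = np.where(db, hi[..., 2], lo[..., 2])
                w = (np.where(dr, f[..., 0], 1 - f[..., 0])
                     * np.where(dg, f[..., 1], 1 - f[..., 1])
                     * np.where(db, f[..., 2], 1 - f[..., 2]))
                out += w[..., None] * lut[b, g, r]
    return out

# Hypothetical usage:
# lut, size = load_cube("slog3_sgamut3cine_to_rec709.cube")
# rec709_img = apply_lut(slog3_img, lut, size)
```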

 

Now let's switch to Vegas' 32-bit full range (ACES) mode**:

Clearly there's a big color shift, especially in the greens, yellows, and skin tones. If you are using this workflow, you're going to have to start by trying to address the color shifts and the weird boost in saturation.

As noted above, you can definitely come up with clips that will look a whole lot worse in 32FR than this. This is just something I found freely available on the internet to get you started.

**Be sure you right-click on the clip in 32-FR mode and set the correct color space -- in this case, SGamut3.Cine

 

I don't consider myself well-versed in Vegas' ACES mode, so it is possible there is something I'm missing here. But it sure looks like Vegas is not correctly decoding SLOG3 in 32-FR mode.

 

Comments

Adis-a wrote on 3/13/2023, 11:17 AM

Is GPU acceleration of video processing on or off?

Wolfgang S. wrote on 3/13/2023, 3:28 PM

And do you really see a huge difference here?

The major difference I see in your snapshots is that you are mixing legal range with full range. That comes from the project settings and from the LUT you selected.

Yes, there is still a difference - but be aware that it is minor, and that you are bringing a LUT into the system that will show slightly different results.

Here I have applied the Sony LUT from the webpage above. But you can also use the Vegas internal LUT - and then you will see that the luminance is spread quite differently. One of the LUTs seems to map the result to full range, the other to limited range. And that is one problem with LUTs - you can never be sure how they were built. Which is correct here - mapping the Rec709 output to full range or to limited range?
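
To put numbers on the legal/full mismatch: for 8-bit, legal (studio) range puts reference black at code 16 and white at 235, while full range uses 0 and 255. A small sketch of the two remappings, which is all a levels mismatch amounts to:

```python
# 8-bit studio (legal) range <-> full range remapping (no clipping applied here)
def legal_to_full(v):
    # 16..235 -> 0..255
    return (v - 16.0) * 255.0 / 219.0

def full_to_legal(v):
    # 0..255 -> 16..235
    return v * 219.0 / 255.0 + 16.0

print(legal_to_full(235.0))  # 255.0: legal white -> full white
print(full_to_legal(255.0))  # 235.0: full white -> legal white
```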

Desktop: PC AMD 3960X, 24x 3.8 GHz * RTX 3080 Ti (12 GB) * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb LAN * Resolve Studio 18 * Edius X * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * i9 12900H with i-GPU Iris XE * 32 GB RAM * GeForce RTX 3070 Ti 8GB * internal HDR preview on the laptop monitor * Blackmagic Ultrastudio 4K mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX)-card in an old XP-System. Edius 4.6 and other systems

Howard-Vigorita wrote on 3/13/2023, 3:33 PM

@ALO Did you disable the Sony camera LUTs you had loaded into the fx chain or CGP when you switched to 32-bit full ACES? I don't think you can mix those LUTs with ACES transforms. Also, I think slog3 is a full-range LUT meant to be applied to full-range log footage, so you might want to try using an 8-bit full project for comparison, making sure the media properties pick up the footage as full range. Curious whether it's possible to go from 8-bit full with LUTs to 32-bit full ACES without LUTs or re-grading.

Musicvid wrote on 3/13/2023, 4:21 PM

Currently, Vegas does not automatically recognize the Sony native camera profiles (among others).

In a 32-bit Full project, do just this, and then using the LUT is not necessary:

Adis-a wrote on 3/13/2023, 5:13 PM

GPU acc. off:

GPU acc. on:

Wolfgang S. wrote on 3/13/2023, 10:59 PM

@ALO Did you disable the Sony camera LUTs you had loaded into the fx chain or CGP when you switched to 32-bit full ACES? I don't think you can mix those LUTs with ACES transforms. Also, I think slog3 is a full-range LUT meant to be applied to full-range log footage, so you might want to try using an 8-bit full project for comparison, making sure the media properties pick up the footage as full range. Curious whether it's possible to go from 8-bit full with LUTs to 32-bit full ACES without LUTs or re-grading.

I think he has not applied the LUT together with ACES mode. That really would make no sense - I agree with you.

What he has done is

A) use the ACES transformation for the slog file

B) use the legacy 8-bit mode with the Sony LUT

and compare the two. That delivers misleading results, I agree with you here.

So see my post above - if you use the 8-bit full project settings plus the LUT, and compare that to the ACES transformation, you get the same results.

So ACES works here, which is what he wanted to test. But one has to know how to apply ACES - and also which 8-bit mode must be used to deliver the same results.

Wolfgang S. wrote on 3/13/2023, 11:02 PM

Currently, Vegas does not automatically recognize the Sony native camera profiles (among others).

In a 32-bit Full project, do just this, and then using the LUT is not necessary:

He has applied the input transformation in the correct way. See the snapshot he posted above.

Wolfgang S. wrote on 3/13/2023, 11:05 PM

GPU acc. off:

GPU acc. on:

GPU acceleration on or off should not produce a different result. However, given that an ACES transformation requires a lot of computing power, and that work is done by the GPU too, you may not be happy using the CPU only.

Adis-a wrote on 3/13/2023, 11:31 PM

GPU acc. off:

GPU acc. on:

GPU acceleration on or off should not produce a different result. However, given that an ACES transformation requires a lot of computing power, and that work is done by the GPU too, you may not be happy using the CPU only.

The Difference:

The Difference+gamma push:
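
For anyone who wants to reproduce a composite like this outside the NLE, here is a rough Python sketch, assuming two same-sized screenshots and using imageio for file I/O (the file names are hypothetical):

```python
import numpy as np
import imageio.v3 as iio

# Load the two screenshots as float RGB in [0, 1]; they must be the same size
a = iio.imread("gpu_on.png")[..., :3].astype(np.float32) / 255.0
b = iio.imread("gpu_off.png")[..., :3].astype(np.float32) / 255.0

diff = np.abs(a - b)          # per-pixel absolute difference
pushed = diff ** (1.0 / 2.2)  # "gamma push": brightens small differences

iio.imwrite("difference.png", (pushed * 255.0).astype(np.uint8))
```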

Wolfgang S. wrote on 3/14/2023, 1:03 AM

And you think that this small difference is relevant? For log footage that you have to grade anyway?

Have you ever tried disabling GPU support for playback, in Vegas and also in Resolve?

RogerS wrote on 3/14/2023, 1:27 AM

Looks like a difference in saturation affecting skin tones significantly, though it's certainly correctable. I wonder why that would be (the only difference here is GPU on or off?)

fr0sty wrote on 3/14/2023, 2:02 AM

He has applied the input transformation in the correct way. See the snapshot he posted above.

I'm also curious whether the output transform might be where the difference is, as he isn't using the default sRGB but setting it to Rec709. It should be similar to sRGB, but it might be enough to create the discrepancy when comparing side by side with (non-ACES) Resolve.

He isn't using ACES in Resolve either, so it isn't an apples-to-apples comparison... It might not be that Vegas is decoding SLog "incorrectly"; it might just be displaying it differently due to the color space differences and output transforms at play. In either case, to my eye the difference lies only in saturation, which is an easy adjustment if you don't like it.

Changing the default sRGB to Rec709 output transform in project settings does indeed slightly change the look of the image.

sRGB

Rec709 (ACES)

Difference Composite (8 bit full range)

That said, the difference appears to affect the highlights/gamma more than the skin tones/saturation.
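
That pattern is consistent with how the two transfer functions differ: the sRGB EOTF has a short linear toe and a 2.4 exponent above it, while Rec709 delivery is commonly displayed as a pure 2.4 gamma (the BT.1886 ideal). A small Python sketch comparing the two curves; treating Vegas' Rec709 (ACES) output as a pure-gamma display model is my assumption here:

```python
import numpy as np

def srgb_eotf(v):
    # sRGB display code value -> linear light (linear toe + 2.4 exponent)
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def gamma24(v):
    # Pure power-law display, the common Rec709/BT.1886 idealization
    return np.asarray(v, dtype=np.float64) ** 2.4

for v in (0.05, 0.18, 0.50, 0.90):
    print(f"code {v:.2f}: sRGB -> {srgb_eotf(v):.4f}, gamma 2.4 -> {gamma24(v):.4f}")
```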

Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64 GB 3000 MHz DDR4

Geforce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

Wolfgang S. wrote on 3/14/2023, 2:46 AM

Looks like a difference in saturation affecting skin tones significantly, though it's certainly correctable. I wonder why that would be (the only difference here is GPU on or off?)

Since you have to grade log footage anyway - is that really relevant?

Wolfgang S. wrote on 3/14/2023, 2:52 AM

I'm also curious whether the output transform might be where the difference is, as he isn't using the default sRGB but setting it to Rec709. It should be similar to sRGB, but it might be enough to create the discrepancy when comparing side by side with (non-ACES) Resolve.

Rec709 should be the preferred setting, I think. The difference is small again, anyway.

The major difference comes from the project settings (8-bit legal vs 8-bit full) in the LUT case. You can see that very clearly in the scopes, which I would really recommend everyone use.

... It might not be that Vegas is decoding SLog "incorrectly"; it might just be displaying it differently due to the color space differences and output transforms at play. In either case, to my eye the difference lies only in saturation, which is an easy adjustment if you don't like it.

The ACES transformation is not incorrect at all. To show that it were, you would have to measure the input/output transformation for both gamma and gamut - and he has not measured that at all.

Musicvid wrote on 3/14/2023, 4:59 AM

He has applied the input transformation in the correct way. See the snapshot he posted above.

OIC

Devising a level playing field in Vegas presents a paradox, I'm pretty sure. There may be a way to simulate a test environment; I'll think on it.

 

 

Wolfgang S. wrote on 3/14/2023, 6:16 AM

There may be a way to simulate a test environment; I'll think on it.

Theoretically this is simple. Take a set of log test charts, put them through the ACES transformation, and measure the outcome on a calibrated display - both gamma and gamut.

Only then can you be sure that the transformation works correctly.
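
A sketch of that idea in Python: synthesize known reflectances as S-Log3 code values using Sony's published encode formula, render them as a chart, feed it through the input transform, and measure whether the expected display values come out. The expected code values themselves are easy to verify - 18% grey must land at 420/1023:

```python
import math

# Sony S-Log3 encode: linear scene reflection -> normalized code value.
def linear_to_slog3(x):
    if x >= 0.01125000:
        return (420.0 + math.log10((x + 0.01) / (0.18 + 0.01)) * 261.5) / 1023.0
    return (x * (171.2102946929 - 95.0) / 0.01125000 + 95.0) / 1023.0

# Classic chart patches: 2% black, 18% grey, 90% white
for refl in (0.02, 0.18, 0.90):
    cv = linear_to_slog3(refl)
    print(f"{refl:.0%} reflectance -> S-Log3 code value {cv * 1023.0:.1f}/1023")
```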

Musicvid wrote on 3/14/2023, 11:39 AM

It will be interesting to see your results.

Adis-a wrote on 3/14/2023, 11:54 AM

And you think that this small difference is relevant? For log footage that you have to grade anyway?

Have you ever tried disabling GPU support for playback, in Vegas and also in Resolve?

- It's relevant enough for me to go "ugh" and try to correct the skin tones, which I don't have to do with GPU enabled, since those look pleasing to me.

- Of course. I don't use Resolve, though...

Wolfgang S. wrote on 3/14/2023, 12:36 PM

But you do know that you have to grade log footage anyway?

Adis-a wrote on 3/14/2023, 1:06 PM

Huh?

Grading ≠ correcting for a wrong color transfer function. What's wrong with a healthy starting point?

fr0sty wrote on 3/14/2023, 3:30 PM

We haven't even verified that it is "wrong" yet, only that it is different from what Resolve's color system produces. If we set Resolve to use ACES, match the input and output transforms, and there's still a difference, then we can say one or the other is "wrong".

Adis-a wrote on 3/14/2023, 5:17 PM

OK.

Resolve aside (the OP referred to it, not me!), what have we "verified" so far? That we have two correct ACES pipelines in Vegas? Is that what you're suggesting? :)

ALO wrote on 3/14/2023, 5:59 PM

I trust Vegas in 8-bit mode and I trust the Sony LUT. So that look is what I would consider "correct" (meaning, the log file has been accurately decoded as a starting point for your grade/look).

Resolve is too widely used to get this wrong, so you can trust that also (albeit with Resolve's slightly different color space handling).

The Vegas 32/FR implementation is definitely askew - again, especially if you work with footage that has the right shades of green in it.

Have no idea why GPU on/off makes a difference, but clearly it does. That would confirm to me that you are seeing a bug rather than just a "small difference."

If you're going to use SLOG3 in Vegas, my recommendation would be to use Vegas 32/Video Levels and the Sony LUT as your starting point.

I'm not a fan of ACES even when it's working correctly. :)

 

RogerS wrote on 3/14/2023, 7:41 PM

Have no idea why GPU on/off makes a difference,

Agreed - it isn't expected that Vegas would process ACES differently with GPU on or off. That said, I'd never use it with the GPU off.