Vegas 32 (Full Range) Mode Does Not Correctly Decode Sony SLOG3

Comments

Adis-a wrote on 3/14/2023, 7:51 PM

I trust Vegas in 8-bit mode and I trust the Sony LUT. So that look is what I would consider "correct" (meaning, the log file has been accurately decoded as a starting point for your grade/look).

Resolve is too widely used to get this wrong, so you can trust that also (albeit with Resolve's slightly different color space handling).

The Vegas 32/FR implementation is definitely askew, esp again if you work with footage with the right shades of green in it.

Have no idea why GPU on/off makes a difference, but clearly it does. That would confirm to me that you are seeing a bug rather than just a "small difference."

That wasn't always the case; the last build where it didn't make a difference was v18.0.0.527.

If you're going to use SLOG3 in Vegas, my recommendation would be to use Vegas 32/Video Levels and the Sony LUT as your starting point.

I'm not a fan of ACES even when it's working correctly. :)

 

I am. I use ACES to get DPX Image sequence out of Vegas, needed for DCP creation. DCP player then perfectly matches Vegas's preview. Export In 32 bit float doesn't. With TIFF there's also a mismatch, and 8 bit for me is a meh anyway...😂

ALO wrote on 3/14/2023, 8:00 PM

I am. I use ACES to get DPX Image sequence out of Vegas, needed for DCP creation. DCP player then perfectly matches Vegas's preview. Export In 32 bit float doesn't. With TIFF there's also a mismatch, and 8 bit for me is a meh anyway...😂

I know there are people who have to use ACES for industry reasons/collaboration, but if you have a choice, I think there are better/more modern options

fr0sty wrote on 3/14/2023, 9:53 PM

@VEGASDerek may want to keep an eye on how this progresses, since we've identified a version and build where such unwanted changes took place with GPU acceleration.

I am. I use ACES to get DPX Image sequence out of Vegas, needed for DCP creation. DCP player then perfectly matches Vegas's preview. Export In 32 bit float doesn't. With TIFF there's also a mismatch, and 8 bit for me is a meh anyway...😂

This is kinda confusing, as you must enable 32 bit float in VEGAS in order to use ACES? Mind clarifying?

Last changed by fr0sty on 3/14/2023, 9:54 PM, changed a total of 1 times.

Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64GB 3000mhz DDR4

Geforce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

Adis-a wrote on 3/14/2023, 10:54 PM

I am. I use ACES to get DPX Image sequence out of Vegas, needed for DCP creation. DCP player then perfectly matches Vegas's preview. Export In 32 bit float doesn't. With TIFF there's also a mismatch, and 8 bit for me is a meh anyway...😂

I know there are people who have to use ACES for industry reasons/collaboration, but if you have a choice, I think there are better/more modern options

Hm, like what? Resolve? No, thanks! Tried once the free version and never again. Can't edit for sh** with it. :)

@VEGASDerek may want to keep an eye on how this progresses, since we've identified a version and build where such unwanted changes took place with GPU acceleration.

I am. I use ACES to get DPX Image sequence out of Vegas, needed for DCP creation. DCP player then perfectly matches Vegas's preview. Export In 32 bit float doesn't. With TIFF there's also a mismatch, and 8 bit for me is a meh anyway...😂

This is kinda confusing, as you must enable 32 bit float in VEGAS in order to use ACES? Mind clarifying?

True, but you don't have to enable ACES in order to use 32-bit float. With ACES enabled, the DPX export, after DCP creation, matches Vegas's preview; without ACES it doesn't.

I wrote about it a couple of years ago:

https://www.vegascreativesoftware.info/us/forum/can-vegas-be-made-epic--116584/

:)

Former user wrote on 3/14/2023, 11:55 PM

and 8 bit for me is a meh anyway...😂

I know there are people who have to use ACES for industry reasons/collaboration, but if you have a choice, I think there are better/more modern options

Hm, like what? Resolve? No, thanks! Tried once the free version and never again. Can't edit for sh** with it. :)

 

@Adis-a Use Resolve for color grading, not editing. There's all these 'influencers' talking about how they changed to Resolve.... I haven't worked out what that's about yet, either stealth marketing Resolve or trying to renew or force a new contract with Adobe/Apple after negotiations failed right before publicly declaring their love of Resolve for editing and changing NLE's.

*Maybe they really do love editing with Resolve, but seems hard to believe

Wolfgang S. wrote on 3/15/2023, 5:16 AM

I trust Vegas in 8-bit mode and I trust the Sony LUT. So that look is what I would consider "correct" (meaning, the log file has been accurately decoded as a starting point for your grade/look).

I have shown you that using the 8-bit legacy mode instead of the 8-bit full mode was the reason for the difference from the ACES mode. A clear error, because S-Log3 footage does not follow the legal range but the full range in most cases. And even more important: the Sony LUT has also been designed for full range!

Please see the technical documentation "Technical Summary for S-Gamut3.CineS-Log3 and S-Gamut3S-Log3.pdf" for the Sony LUT: FAQ Q4 states very clearly that there is no option to "select legal range to prevent miss operation" - so the LUT is for full range.

So if you apply a LUT because you "trust" the LUT, then you have to make sure that you get the settings in Vegas right. And that can be complex, because you have to get the project settings and the media properties right, and they must fit the selected LUT. The Vegas-internal LUT seems to be designed in another way.

ACES is much simpler here. Regardless of whether you choose undefined, legal, or full range in the media properties, you will always receive the same outcome. That is one reason why I tend to work with ACES.
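The legal-vs-full range issue at the heart of this argument is just a linear rescaling of code values. As a rough numeric illustration (my own sketch, not from the thread): if a decoder assumes legal (studio) range for a file that is actually full range, every code value gets stretched, which is exactly the kind of level shift visible on the scopes.

```python
# Sketch (illustration only): what happens when full-range 8-bit code
# values are wrongly expanded as if they were legal (studio) range.
def legal_to_full(cv):
    """Expand a legal-range (16-235) 8-bit code value to full range (0-255)."""
    return (cv - 16) * 255.0 / (235 - 16)

# S-Log3 files use the full 0-255 range. A legacy project that assumes
# legal range stretches the levels: blacks pushed down, whites pushed up.
for cv in (16, 95, 235):
    print(cv, "->", round(legal_to_full(cv), 2))
```

Any S-Log3 code value below 16 or above 235 would even be clipped by such a misinterpretation, which is why the waveform ranges differ between the project modes discussed here.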

 

Resolve is too widely used to get this wrong, so you can trust that also (albeit with Resolve's slightly different color space handling).

Resolve is a great grading application, but here you have the same problem: you can decide to "trust" Resolve, and agree that both Vegas and - even more - Resolve are trustworthy tools. However, Resolve is more complex than Vegas, and you can get a lot of things wrong in Resolve by choosing the wrong settings.

The Vegas 32/FR implementation is definitely askew, esp again if you work with footage with the right shades of green in it.

Have never seen that.

Have no idea why GPU on/off makes a difference, but clearly it does. That would confirm to me that you are seeing a bug rather than just a "small difference."

It is a minor difference, and anyone willing to shoot log must also be willing to grade the footage.

If you're going to use SLOG3 in Vegas, my recommendation would be to use Vegas 32/Video Levels and the Sony LUT as your starting point.

Sorry, but that again contradicts the clear technical documentation: the Sony LUT *must* not be used with legal-range project settings. If you do that, you will end up with wrong scopes again. So this recommendation is not consistent.

What one can do is set the project properties to 8-bit full, set the media properties to undefined, and apply the Sony LUT in the CGP, ideally at the track level. For rendering, switch to the 32-bit floating point full range mode and disable the LUT at the track level. This can be clever if playback performance is not high enough on the machine used to grade the footage.

I'm not a fan of ACES even when it's working correctly. :)

That is up to you. But there is no reason for this prejudice.

Desktop: PC AMD 3960X, 24x 3.8 GHz * RTX 3080 Ti * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb LAN * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED (i9 12900H with Iris Xe iGPU, 32 GB RAM, GeForce RTX 3070 Ti 8GB) with internal HDR preview on the laptop monitor. Blackmagic Ultrastudio 4K Mini

HDR monitor: ProArt Monitor PA32 UCG, Atomos Sumo

Howard-Vigorita wrote on 3/15/2023, 10:31 AM

Agree with @Wolfgang S. Also, you cannot visually compare a limited-range project with a full-range project without applying a view transform to the limited-range project, because Vegas now uses full-range internal normalization. I believe that change was implemented with VP18. If you don't attend to this in all your limited-range projects, your renders will never agree with your Vegas previews.

Btw, the Vegas Levels FX preset for Studio to Computer levels is incorrect... Input End needs to be changed from .920 to .922 to nail it. It's implemented correctly, however, in the SeMW plugin, which also automates it for renders.
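The .920 vs .922 point can be checked with simple arithmetic. Assuming the Levels FX "Input End" is a 0-1 fraction of the full 8-bit code range (my reading of the preset, not stated in the thread), the exact studio-white point is 235/255:

```python
# Studio white (235) expressed as a fraction of the full 8-bit range (255).
# If Input End is meant to map studio white to 1.0, this is the exact value.
input_end_exact = 235 / 255
print(round(input_end_exact, 4))  # 0.9216
```

0.9216 rounds to .922 at three decimal places, so .922 is the closer setting, while .920 undershoots slightly and leaves whites marginally too bright after expansion.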

Since both your footage and Vegas are intrinsically full range, you can avoid jumping through those hoops by sticking to 8- and 32-bit full-range projects. The only complication is the need to make 3 adjustments to turn off ACES in 32-bit full. This other thread discusses how to do that... in summary, this project setting does it for me:

If you do it that way, I think you should be able to make a valid comparison between 32-bit full range with manufacturer-supplied LUTs versus using ACES instead of those LUTs.

Wolfgang S. wrote on 3/15/2023, 10:54 AM

And if one looks at the discussion Howard linked to, there is also a table in the last posting where I analysed the different combinations of project settings and media property settings, to show the outcome in terms of changes in luminance and/or gamut. Yes, the full/legal range topic is complex in Vegas, but that is true for Resolve too. I am afraid we have already shown you how to match ACES with the LUT you suggested. As free help for you.


Adis-a wrote on 3/15/2023, 11:47 AM

and 8 bit for me is a meh anyway...😂

I know there are people who have to use ACES for industry reasons/collaboration, but if you have a choice, I think there are better/more modern options

Hm, like what? Resolve? No, thanks! Tried once the free version and never again. Can't edit for sh** with it. :)

 

@Adis-a Use Resolve for color grading, not editing. There's all these 'influencers' talking about how they changed to Resolve.... I haven't worked out what that's about yet, either stealth marketing Resolve or trying to renew or force a new contract with Adobe/Apple after negotiations failed right before publicly declaring their love of Resolve for editing and changing NLE's.

*Maybe they really do love editing with Resolve, but seems hard to believe

I use BCC Continuum and Saphire in occasion where I can't get what I want from Vegas directly, and stay inside Vegas. Works for me.

Just don't start me with those "every frame is a painting" YT colorists, makes me wonder how they ever get the movie finished... 😂

ALO wrote on 3/15/2023, 7:12 PM

I trust Vegas in 8-bit mode and I trust the Sony LUT. So that look is what I would consider "correct" (meaning, the log file has been accurately decoded as a starting point for your grade/look).

I have shown you the use of the 8bit legacy mode instead of the 8bit full mode was the reason for the difference to the ACES mode. A clear error, because the slog3 footage does not follow the legal range but the full range in most cases. And even more important: the Sony-LUT has also been designed for full

edited

Musicvid wrote on 3/15/2023, 9:55 PM

Nope, no color shift. By applying a stock Studio->Computer RGB filter to your LUT/legacy 8 bit preview image (for the obvious reason), and a slight saturation boost (because a LUT is a static filter and ACES is dynamic), an exact match is pretty easy to obtain...

One half is LUT, other half is ACES. Which is which? Where is the dividing line? Why does this matter?😴

RogerS wrote on 3/15/2023, 10:32 PM

I don't get the questions about 8-bit full vs video. It's full range footage so Vegas doesn't do anything to it in legacy mode or full mode. Video range footage would be a different story.

Pre-LUT

Post LUT

This is an excellent starting point for grading, nice and flat while well exposed.

32-bit full (view transform off) is also a great starting point and looks the same as 8-bit full.

or if you prefer Photoshop difference mode, there is no difference:

ACES looks acceptable, though to me this is more of a finished look given the higher contrast and saturation. I'd rather be able to add contrast myself.

a few steps later in CGP and I'm back to a similar starting point in ACES.

Bonus: Leeming LUT for S-log3 (with exposure adjustment as it's expecting a much brighter image)

Custom PC (2022) Intel i5-13600K with UHD 770 iGPU with latest driver, MSI z690 Tomahawk motherboard, 64GB Corsair DDR5 5200 ram, NVIDIA 2080 Super (8GB) with latest studio driver, 2TB Hynix P41 SSD, Windows 11 Pro 64 bit

Dell XPS 15 laptop (2017) 32GB ram, NVIDIA 1050 (4GB) with latest studio driver, Intel i7-7700HQ with Intel 630 iGPU (latest available driver), dual internal SSD (1TB; 1TB), Windows 10 64 bit

VEGAS Pro 19.651
VEGAS Pro 20.411
VEGAS Pro 21.208

Try the
VEGAS 4K "sample project" benchmark: https://forms.gle/ypyrrbUghEiaf2aC7
VEGAS Pro 20 "Ad" benchmark: https://forms.gle/eErJTR87K2bbJc4Q7

ALO wrote on 3/16/2023, 9:32 AM

@Wolfgang S. I misunderstood the point you were trying to make regarding levels. I regret the error.

Wolfgang S. wrote on 3/16/2023, 9:34 AM

There was the interesting approach by the user ALO, who tries to prove that the 32-bit floating point ACES transformation from S-Log3 to Rec.709 does not work correctly.

As proof he brings one S-Log3 file and assumes that this file is S-Gamut3.Cine/S-Log3. Maybe this is true; let us assume that it is.

Then he applies a Sony LUT to an 8-bit legacy video-levels project, even though the technical instruction for the LUT states very clearly that it is for full range only.

Picture 1: Technical Instruction of Sony, how to apply the LUT

 

So in picture 3 of ALO's initial posting, he shows a picture with the 8-bit legacy video project settings - but does not tell us how he had adjusted the media properties for the file in that project. Only with the snapshot in the ACES mode does he show us media properties, adjusted to S-Log3/S-Gamut3.Cine. And to full.

In all the following tests, the S-Log3 file was set to "Sony S-Log3-S-Gamut3.Cine". Another question mark is which project settings were really combined with which media file properties in the 8-bit legacy mode.

 

TEST 1:

If you bring the file into an 8-bit legacy project, the media properties default to "Undefined". I keep that as it is; the waveform then shows a luminance range from 20-70% (for all measurements I take frame 208, since that was used by Roger):

Picture 2: slog3 file in 8bit legacy project without LUT - media properties in default mode

 

TEST 2:

If I apply his recommended Sony LUT “1_SGamut3CineSlog3_To_LC-709”, we receive a luminance range from 6% to 89%.

Picture 3: slog3 file in 8bit legacy project with LUT applied - media properties in default mode

 

The media properties are unchanged here, since he did not tell us anything else - they are still set to "undefined".

Picture 4: Media properties in the default mode are "undefined". Tests indicate that this mode is identical to the legal mode

 

TEST 3:

If we disable the LUT again, keep the media properties as "undefined", and enable the ACES transformation, we find the luminance between 4% and 94% (instead of 6% to 89%).

Picture 5: 32bit floating mode (full range) with ACES transformation to rec709 (ACES). LUT is disabled

Conclusion 1: the original S-Log3 file, brought into 8-bit legacy project settings with the LUT applied, shows a different luminance range than the 32-bit floating point (full range) ACES transformation to Rec.709.

 

OK. Knowing that the stone age of legal-range projects is over for S-Log3, since S-Log3 is typically full range, there are several possibilities to continue:

Possibility 1:

TEST 4: we apply 8-bit full range project settings and apply the LUT. Then we see a luminance range between 4% and 94% immediately (still with the media properties unchanged at "undefined"):

Picture 6: 8bit full range project settings, media properties as "undefined", and LUT applied

Conclusion 2: the original slog3 file, brought into the 8bit full range project settings, and with applied LUT, shows the same luminance range as the 32bit floating point (full range) ACES transformation to rec 709.


Possibility 2:

TEST 5: we assign the S-Log3 file full-range media properties, instead of running it as "undefined" - since we can safely assume that S-Log3 files are full range. And apply 8-bit video-level project settings. So, going to the 8-bit video-level project settings and enabling the LUT, we now see a luminance range from 6% to 89%. This is not dramatically different from the ACES result, but it is a difference.

Picture 7: 8bit video level project settings, and the media properties set to full range. LUT applied

 

Picture 8: as picture 7, here the project settings

 

Conclusion 3: the original S-Log3 file, with media properties set to full range, with 8-bit video-levels project settings and with the LUT applied, shows a similar but not exactly the same luminance range as the 32-bit floating point (full range) ACES transformation to Rec.709.


Possibility 3: TEST 6: since the 8-bit video-level project settings show a small difference in the luminance, we can try the 8-bit full range project settings, set the media properties to full range too, and apply the LUT.

We see a very slight increase in the luminance level, to 5% to 90%.

Picture 9: 8bit full range as project settings. Media properties full range too. LUT applied

 

Conclusion 4: the original S-Log3 file, with media properties set to full range, with 8-bit full range project settings and with the LUT applied, shows a similar but not exactly the same luminance range as the 32-bit floating point (full range) ACES transformation to Rec.709.

 

Overall conclusion: the ACES transformation to Rec.709 shows a difference of about 4 percentage points in the luminance, shown in frame 208 of this file, compared with the 8-bit project settings. Since one can expect that a LUT will show a small difference from the ACES color management approach in Vegas, a difference of 4 percentage points in the luminance is something the user has to accept, in my opinion.

But the conclusion that the ACES mode is wrong cannot be supported by these tests and findings at all. It could be the other way around too - the LUT may not be as precise as it should be! The approach here is at best one based on comparison, nothing more. The tests presented here are not an appropriate way to draw that conclusion. All we see is that there is a difference in the levels.
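The luminance ranges quoted in the tests above (6-89%, 4-94%, and so on) come from the waveform monitor, but the same numbers can be approximated directly from pixel data. A minimal sketch (my own illustration, with made-up sample values, not from the thread) using Rec.709 luma weights:

```python
# Approximate a waveform min/max readout: compute Rec.709 luma for each
# pixel and express the extremes as a percentage of the 8-bit range.
def luma_range_percent(pixels):
    """pixels: iterable of (r, g, b) tuples in 0-255. Returns (min%, max%)."""
    lumas = [0.2126 * r + 0.7152 * g + 0.0722 * b for r, g, b in pixels]
    return (min(lumas) / 255 * 100, max(lumas) / 255 * 100)

# Hypothetical flat, log-like frame patch (values invented for the demo):
patch = [(24, 24, 24), (200, 210, 190), (120, 118, 119)]
lo, hi = luma_range_percent(patch)
print(round(lo, 1), round(hi, 1))
```

Run over two decoded versions of the same frame, a comparison like this would quantify the "about 4 percentage points" difference described above instead of reading it off the scopes by eye.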

Besides the more academic approach there is also the practical one: S-Log footage must be graded anyway, after performing the color correction to a neutral view. So, from the perspective of a colorist it does not matter whether ACES or a LUT is applied - the true work of color grading, the alignment of different takes in terms of both gamut and gamma, has to be performed AFTER the transformation to Rec.709 anyway. For reasoning on why it has to be graded, see for example here:

http://www.beyondthesight.com/color-grading-importance/

Last changed by Wolfgang S. on 3/16/2023, 9:42 AM, changed a total of 1 times.


ALO wrote on 3/16/2023, 9:36 AM

Here is another clip from the YT channel with more greens in it. I am testing this with my GPU on/off. I believe you'll find this is a better example of what I'm describing:

32 FR w/GPU off:

That's unacceptable.

32 FR w/GPU on:

That looks properly decoded to me.

8-bit (legacy) with the Sony LUT applied and a studio-to-computer levels correction:

The colors match. There are some differences, but you could absolutely work with this in 8-bit mode if you wanted to.

 

Howard-Vigorita wrote on 3/16/2023, 11:53 AM

@ALO I think I can replicate your conclusion. However, I don't have your Sony camera, so I used my Canon XF605 with the Clog3-CinemaGamutRec709-WideDR LUT that came with the camera. Shooting for that LUT is strictly full range, just like S-Log3. And I am able to get identical previews which exactly match LUT renders with 8-bit full and 32-bit full (ACES disabled). But I was not able to get a match with ACES: all my previews and renders with ACES come out noticeably darker, though they do match each other. I tried all the combinations Vegas offers for Clog3. I think maybe the reason nothing matches is that none of the options match my LUT or my camera: all of Vegas's Clog3-CinemaGamut options have either Daylight or Tungsten wired in and the LUT does not, the straight Clog3 choice in the media properties color space is not CinemaGamut, and there is no WideDR option. My inclination is to stick with the manufacturer-supplied Clog3 LUT, if only because it's more general purpose. And it matches my camera. And I can't find a way to grade at 8-bit full and switch to ACES without regrading. It might be workable if I were able to convert an OFX LUT to the format Vegas uses and drop it into the Vegas ACES LUT folder:

C:\Program Files\VEGAS\VEGAS Pro 20.0\OpenColorIO\configs\aces_1.2\luts

Btw, my 65-point manufacturer-supplied OFX 3D LUT appears to have a similar balance of mathematical accuracy/precision as the Vegas ACES LUTs... the ACES LUTs have 12-decimal-place mantissas plus exponent on 65,535 pre-calculated data points. The OFX Clog3 LUTs have 8 decimal places, fixed point, on 823,875 pre-calculated data points. Probably a toss-up between precision and accuracy.
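The two data-point counts quoted here are consistent with common LUT layouts, which a quick bit of arithmetic shows (my own check; the interpretation of each count as a 65-node cube and a 16-bit 1D table is an assumption, not stated in the post):

```python
# 823,875 matches a 65x65x65 3D cube with an RGB triplet at each node:
cube_points = 65 ** 3 * 3
print(cube_points)       # 823875

# 65,535 matches a 1D table with 2**16 - 1 entries (a 16-bit shaper):
one_d_points = 2 ** 16 - 1
print(one_d_points)      # 65535
```

So the OFX file stores far more raw numbers, but as a 3D cube it samples color space much more coarsely per axis than a dense 1D shaper samples its single axis, which is why "toss-up between precision and accuracy" is a fair summary.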

Wolfgang S. wrote on 3/16/2023, 4:15 PM

Here is another clip from the YT channel with more greens in it. 

What type of footage is that? Which camera? slog?

I have here some slog3 footage from the FS7:

A) ACES 32bit floating point full range - transformation to rec709 - GPU enabled

Looks ok

 

B) 8bit full range + LUT - GPU enabled

Looks ok

 

C) 8bit video range (because that is so beloved here) + LUT - GPU enabled

Looks ok

 

The LUT shows lower luminance figures, but more so in the upper midtones.

 

D) 32bit floating point full range - transformation to rec 709 - GPU disabled

Shows a higher color saturation.

OK, but who really wants to work with the GPU disabled? Especially with the 32-bit mode and ACES?


ALO wrote on 3/16/2023, 7:23 PM

 

OK, but who really wants to work with the GPU disabled? Especially with the 32-bit mode and ACES?

I don't myself, but in my case, I was using hybrid (Intel) mode so that my fx would update in real time when I make adjustments, which is why I noticed the issue.

In any case, I think it's for the developers to deal with (or not) at this point. The clips I linked to are all from the same shoot -- SLOG3 on a Sony A7SIII, I believe. Don't think it matters -- should be replicable with any 10bit SLOG3.

Wolfgang S. wrote on 3/17/2023, 1:01 AM

I don't myself, but in my case, I was using hybrid (Intel) mode so that my fx would update in real time when I make adjustments, which is why I noticed the issue.

That is not unusual. I also have two GPUs in my laptop - the RTX 3070 Ti is used as the main GPU, but the Intel GPU inside the processor is used for the I/O settings, i.e. for playback.

In any case, I think it's for the developers to deal with (or not) at this point. The clips I linked to are all from the same shoot -- SLOG3 on a Sony A7SIII, I believe. Don't think it matters -- should be replicable with any 10bit SLOG3.

The clip I used above was shot with the FS7 and is 10-bit S-Log3. I took that clip because it has large green areas in it.

The findings are funny: all three settings deliver results that seem visually acceptable.

What I see in the first three settings is that the luminance range becomes a little broader, and the midtones seem to be raised, both compared to the 8-bit settings + LUT. It is not dramatic, but it is there. But which is now the "correct" transformation? The one with the LUT or the one with the ACES approach? I think with this approach nobody can say. This is a visual comparison; we measure nothing here.

Disabling the GPU in the ACES mode seems to result in an increase in color saturation. OK. But the GPU cannot really be disabled, unless one accepts such a significant drop in playback performance that one cannot work in ACES anyway. So the only advice can be to keep the GPU enabled and correct everything in the manual grading.
