Switching from 8-bit full range to 32-bit full range before rendering

JaredF wrote on 12/27/2021, 2:34 PM

I have found that editing in 8-bit (full range) mode has advantages for my workflow over 32-bit floating point (video levels) mode. I use Vegas Pro 19.

There’s just one problem with 8-bit full range mode. I shoot and edit 10-bit 422 footage and I definitely see more noise and banding in MP4s rendered out in 8-bit mode vs. the same exact footage rendered out in 32-bit floating point mode.

So that suggests I might switch from 8-bit (full range) to 32-bit floating point (full range) just for rendering final projects. I have done this successfully (without unexpected changes in the final rendered video) by making sure I set the compositing gamma to 2.222 and the view transform to "off" when making the switch.

However, several strange behaviors I've noticed in 32-bit floating point (full range) mode make me wonder whether I'm setting myself up for unpredictable trouble. Here's the odd behavior I've noticed so far:

I change a project from 8-bit full range to 32-bit full range and everything appears fine. The jagged edges on the histogram smooth out, confirming I'm now working at the higher bit depth. However, if I then open the color grading panel, the image in the preview window will at that moment show an unexpected color shift. This shift doesn't happen to all images, but it does happen to any image on which I've used the "color curves" adjustment in the color grading panel.

So then I close the color grading panel and the unexpected color shift remains. If I go to the Edit menu to explore, I see that "Undo ColorSpace Updated" is now an option. If I click "undo," the colors go back to normal. So apparently the color space is being updated when I open the color grading panel in 32-bit full range mode?

This is all mysterious to me. Does anyone understand what’s happening and why? More importantly, is my method of switching from 8-bit full range to 32-bit full range, touching nothing else, and then rendering, a reliable way to go? Or might other unexpected program changes be lurking?

Comments

RogerS wrote on 12/28/2021, 7:07 AM

8-bit full and 32-bit full aren't equivalent. 32-bit full triggers ACES color management so things like view transforms and file color spaces become important.

Personally I'd go 8-bit video to 32-bit video (and handle levels conversions between limited and full manually just like we did <VP 18). The colors won't change at all and as far as I've seen all the tools work properly in 32-bit video mode.

I'm hoping Vegas will come out with an intermediate mode (16-bit?) which will yield better results with >8-bit footage without the performance penalty of 32-bit video or requiring ACES color management.

Yelandkeil wrote on 12/28/2021, 7:45 AM

Repeat:

32-bit floating point edit mode has nothing to do with ACES.

 

-- Hard&Software for 5.1RealHDR10 --

ASUS TUF Gaming B550plus BIOS3202: 
*Thermaltake TOUGHPOWER GF1 850W 
*ADATA XPG GAMMIX S11PRO; 512GB/sys, 2TB/data 
*G.SKILL F4-3200C16Q-64GFX 
*AMD Ryzen9 5950x + LiquidFreezer II-240 
*XFX Speedster-MERC319-RX6900XT <-AdrenalinEdition 24.12.1
Windows11Pro: 24H2-26100.3915; Direct3D: 9.17.11.0272

Samsung 2xLU28R55 HDR10 (300CD/m², 1499Nits/peak) ->2xDPort
ROCCAT Kave 5.1Headset/Mic ->Analog (AAFOptimusPack 6.0.9403.1)
LG DSP7 Surround 5.1Soundbar ->TOSLINK

DC-GH6/H-FS12060E_HLG4k120p: WB=manual, Shutter=125, ISO=auto/manual
HERO5_ProtuneFlat2.7k60pLinear: WB=4800K, Shutter=auto, ISO=800

VEGASPro22 + XMediaRecode/Handbrake + DVDArchi7 
AcidPro10 + SoundForgePro14.0.065 + SpectraLayersPro7 
K-LitecodecPack17.8.0 (MPC Video Renderer for HDR10-Videoplayback on PC) 

JaredF wrote on 12/28/2021, 7:53 AM

For many people I believe your solution (8-bit full range to 32-bit video levels before rendering) will work fine, but my tests have found that when using input LUTs (which I always do), the LUTs apply differently to full range vs. video range footage. So changing from full range to video levels before rendering doesn't give me results that accurately match what I graded.

So I guess what I need to hope for in a future version is a 32-bit full range mode that has nothing to do with ACES?
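The full-vs-video range mismatch with LUTs can be sketched numerically. This is a toy illustration only — a hypothetical 33-point gamma curve standing in for a real .cube LUT, not Vegas's actual LUT engine: a LUT built for full-range input normalizes over 0-255, so studio-range (16-235) data samples the wrong part of the curve unless it's expanded first.

```python
import numpy as np

# Toy 33-point 1D "LUT": a gamma curve indexed by normalized input in
# [0, 1]. (Hypothetical stand-in for a real .cube LUT, purely for illustration.)
grid = np.linspace(0.0, 1.0, 33)
lut = grid ** (1 / 2.2)

def apply_lut(code_value, full_range=True):
    """Look up an 8-bit code value in the LUT.

    A full-range LUT expects input normalized over 0-255; feeding it
    studio-range (16-235) data without expanding it first samples the
    wrong part of the curve.
    """
    if full_range:
        x = code_value / 255.0                     # full-range normalization
    else:
        x = (code_value - 16.0) / (235.0 - 16.0)   # studio-range normalization
        x = min(max(x, 0.0), 1.0)
    return float(np.interp(x, grid, lut))

# The same pixel value lands on different parts of the curve:
print(apply_lut(16, full_range=True))    # studio black treated as dark grey (~0.28)
print(apply_lut(16, full_range=False))   # studio black treated as true black (0.0)
```

Feeding code value 16 (studio black) through the full-range path lifts it well above zero, while the studio-range path treats it as true black — the kind of crushed-or-lifted-blacks difference described above.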

RogerS wrote on 12/29/2021, 12:11 AM

I never recommended 8 bit full to 32 bit video. I recommended 8 bit video to 32bit video.

Conforming levels properly is up to you. I also use input correction LUTs.

JaredF wrote on 12/29/2021, 8:27 AM

Thank you, and apologies that I misunderstood your previous comment.

I see now that your method will work. In my case I'll need to:

1. Start a new project with video levels in project settings.
2. First apply studio levels to computer levels transform to log footage.
3. Then apply input correction LUT.
4. Before rendering, make sure there's a 0 black solid color layer as bottom track.
5. Then apply computer levels to studio levels transform to entire project.
6. Then render.

You can see why a properly functioning 32-bit full range mode would be helpful. It would eliminate the need for steps 2, 4 and 5. I can't tell you how many projects I've rendered out in which I forgot step 5 and then had to do it all over again.
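The two levels transforms in steps 2 and 5 are simple linear remaps (which is, as I understand it, what the Vegas Levels presets do). A minimal sketch with hypothetical helper names, in 8-bit math for illustration:

```python
import numpy as np

def studio_to_computer(y):
    """Step 2: expand studio levels (16-235) to computer levels (0-255)."""
    return np.clip((y.astype(np.float64) - 16.0) * 255.0 / 219.0, 0.0, 255.0)

def computer_to_studio(y):
    """Step 5: compress computer levels (0-255) back to studio (16-235)."""
    return y.astype(np.float64) * 219.0 / 255.0 + 16.0

frame = np.array([0, 16, 128, 235, 255], dtype=np.uint8)
print(studio_to_computer(frame))  # 16 -> 0, 235 -> 255
print(computer_to_studio(frame))  # 0 -> 16, 255 -> 235
```

Forgetting step 5 means the render goes out with 0-255 data where a studio-range (16-235) file was expected, which matches the "do it all over again" experience.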

RogerS wrote on 12/29/2021, 11:05 PM

Is this log footage studio levels to begin with? That's slightly unusual. There's a user database of min/max levels here.

While the order you apply steps 2 and 3 in shouldn't matter, I assume this is so your FX chain has the levels transform last.

I'm curious what the purpose of #4 is. You are doing some compositing or have transparency somewhere?

Totally agree I wish we had a higher bit version of 8-bit full. Maybe a GPU-accelerated "16-bit full" could become a new default and work with high-bit log footage and apply automatic levels corrections, giving the best of both worlds.

JaredF wrote on 12/30/2021, 8:09 AM

1. The official Canon LUT I use for Clog2 footage is labeled full range to full range. If I start with my CLog2 footage in a video levels project, the result I get from applying the LUT is different than if I first apply the studio levels to computer levels transform to the footage and then apply the LUT. I can get both methods to look the same eventually, but it's less grading work to start with full range footage.

2. In my experience the order you apply the levels transform (before or after the LUT) definitely makes a big difference. If you apply the levels transform after the LUT, the blacks get crushed much more quickly after you bring down midtones and there's a lot more wrestling to get the footage where you want it. Conversely, with the levels transform applied first (or in full range project mode) the footage grades very easily.

3. If you render out a "fade to black" that just fades to an empty bottom Vegas track and then apply the computer levels to studio levels transform, that black will still render as 0 black, making it broadcast illegal. But if you put a 0 black solid color track under everything and apply the same computer levels to studio levels transform, it renders as 16 black, broadcast legal. Interestingly, do the same exact test in a full range project and the results are different: both the empty bottom track and the solid black track will render as 16 black when opened in a video levels project.

I've been wrestling with the intrigues of Vegas video levels vs. full range for what feels like a very long time. Full range 32-bit mode without any strange behaviors is still what I'm hoping for. (I do see a performance penalty using 32-bit over 8-bit, but it's not very dramatic. A 3-minute 4K project I rendered recently takes 5 minutes to render in 32-bit mode vs. 4 minutes in 8-bit.)

 

Howard-Vigorita wrote on 12/30/2021, 10:41 PM

@JaredF I only started working with Canon log the other day, but I noticed that whenever I shoot log it defaults to capturing as full range, and the metadata shows that way in MediaInfo. Pulling the clip into Vegas, it makes no difference whether I set the project property to 8-bit full or video levels; Vegas does not alter the level range either way. Same if I change the project property to 32-bit limited. But if I change it to 32-bit full there's a dramatic change in contrast. So I won't be doing that.

As an aside, I also noticed that the Canon "C-Log3 to Rec 709" built into the Input LUT of the Vegas 19 Color Grading panel looks pretty different from the official one I got with my camera. My camera came set to use BT.709 with c-log3 which shows as the matrix coefficient in MediaInfo so I think the official LUT I'm supposed to use is the one named "BT709_CanonLog3-to-BT709_WideDR_65_FF_Ver.2.0.cube" ... the Canon LUT pack also has an analogous one like that for Log2 which I imagine would apply to your footage. I think I have a setting to try the BT.2020 version of log3 but I haven't spotted a CinemaGamut setting yet. In any event, matching the color matrix to the official WideDR 65-point Canon LUT looks quite a bit better to me than the 33-point one built into Vegas. It also causes the scope range in Vegas to look pretty much like the range of the scope in my camera which I used to set the exposure when I shot the clip.

JaredF wrote on 12/31/2021, 9:35 AM

I do see a difference with switching between Full Range and Video levels after the Canon LUT is applied to a clip, although the difference is subtle. I'm looking at one clip now in which the bottom of the histogram is at 0 and the top at 248 in full range. I switch over to video levels and the bottom moves up to 1 and the top goes down to 238. So it's a small but meaningful difference. Once you try to change the midtones though you'll notice a much bigger difference. So I believe best practice is either:

1. If you don't care about increased banding and blockiness, just edit in 8 bit full range and be done with it.
2. If you do, use the video levels project process I described earlier in this post.

I don't use the built-in Vegas LUTs; I believe the Canon-supplied ones are superior. It looks to me like you've got the right LUT for C-Log3, although I use the Cinema Gamut setting, which you set in camera (with the appropriate Cinema Gamut LUT from Canon).

Howard-Vigorita wrote on 12/31/2021, 2:15 PM

Does your color range metadata read like mine?

Color range                              : Full

My understanding is that Vegas 18 and later will not remap the color range when it sees that metadata and you set the project to 8-bit full range in project properties. Full discussion is here. I have only run into trouble when the metadata is missing or worded slightly differently, so Vegas assumes it's limited and remaps accordingly, throwing off the downstream transform LUT. Setting the project to limited or rewriting the metadata can fix that.

JaredF wrote on 12/31/2021, 3:20 PM

I'm not seeing that piece of metadata one way or the other, but perhaps I'm looking in the wrong spot. What steps are you doing to check the metadata?

RogerS wrote on 12/31/2021, 6:34 PM

Use MediaInfo, it's a separate program.

Video
ID                             : 1
Format                         : AVC
Format/Info                    : Advanced Video Codec
Format profile                 : High@L4.1
Format settings                : CABAC / 2 Ref Frames
Format settings, CABAC         : Yes
Format settings, Reference fra : 2 frames
Format settings, GOP           : M=1, N=6
Codec ID                       : avc1
Codec ID/Info                  : Advanced Video Coding
Duration                       : 13 s 13 ms
Bit rate mode                  : Variable
Bit rate                       : 48.2 Mb/s
Maximum bit rate               : 60.0 Mb/s
Width                          : 1 920 pixels
Height                         : 1 080 pixels
Display aspect ratio           : 16:9
Frame rate mode                : Constant
Frame rate                     : 23.976 (24000/1001) FPS
Standard                       : NTSC
Color space                    : YUV
Chroma subsampling             : 4:2:0
Bit depth                      : 8 bits
Scan type                      : Progressive
Bits/(Pixel*Frame)             : 0.970
Stream size                    : 74.8 MiB (97%)
Encoded date                   : UTC 2021-07-10 00:20:42
Tagged date                    : UTC 2021-07-10 00:20:42
Color range                    : Full
Codec configuration box        : avcC

In Vegas in media properties you can see what color range Vegas thinks the file is.

(This is an example showing Sony S-log 2)


Custom PC (2022) Intel i5-13600K with UHD 770 iGPU with latest driver, MSI z690 Tomahawk motherboard, 64GB Corsair DDR5 5200 ram, NVIDIA 2080 Super (8GB) with latest studio driver, 2TB Hynix P41 SSD and 2TB Samsung 980 Pro cache drive, Windows 11 Pro 64 bit https://pcpartpicker.com/b/rZ9NnQ

ASUS Zenbook Pro 14 Intel i9-13900H with Intel graphics iGPU with latest ASUS driver, NVIDIA 4060 (8GB) with latest studio driver, 48GB system ram, Windows 11 Home, 1TB Samsung SSD.

VEGAS Pro 21.208
VEGAS Pro 22.239

Try the
VEGAS 4K "sample project" benchmark (works with VP 16+): https://forms.gle/ypyrrbUghEiaf2aC7
VEGAS Pro 20 "Ad" benchmark (works with VP 20+): https://forms.gle/eErJTR87K2bbJc4Q7

JaredF wrote on 12/31/2021, 7:43 PM

I will have to mess around with MediaInfo later, but within Vegas the color range on my Canon Clog2 footage is listed as "undefined."

RogerS wrote on 1/1/2022, 3:20 AM

Feel free to upload a few second clip somewhere and I'm happy to test and give you MediaInfo. Vegas interprets undefined as video levels and does a transform.

As far as what levels the file actually is (regardless of wrong or missing metadata), try the test here and shoot with the lens cap on and at a light and report back histogram values.

JaredF wrote on 1/2/2022, 8:42 PM

In MediaInfo, the color range of my Clog2 Cinema Gamut footage reads: "Min: 0, Max: 1023, Chroma range: 1023"

On the next line it says: "colour_range_Original: Full"

RogerS wrote on 1/2/2022, 9:17 PM

So that is full range.

10-bit full range should use the whole 0-1023 scale (limited range means no data below 64 or above 940). Vegas may be misreading the metadata.

Next question is what the footage actually is. I'd try the test I referenced in the last post and see if you get values above 940 for brights or below 64 for darks when looking at the histogram in 32-bit video mode (or the 8-bit equivalents if you're in an 8-bit project, you get the idea).
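The decision rule behind that test can be written out as a tiny sketch (hypothetical helper name; real footage would need a margin for sensor noise):

```python
# Classify 10-bit footage from the measured histogram min/max of the
# lens-cap (black) and bright-light test. Hypothetical helper, for
# illustration only.
def guess_range_10bit(hist_min, hist_max):
    if hist_min < 64 or hist_max > 940:
        return "full"      # data outside the legal limited-range bounds
    return "limited"       # everything sits within 64-940

print(guess_range_10bit(0, 1023))   # full
print(guess_range_10bit(64, 940))   # limited
```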


JaredF wrote on 1/2/2022, 10:15 PM

I did the black test, with the lens cap on and gain all the way down. After opening the footage in a 32-bit video levels project, the black registered as 95 on the histogram. Interestingly, when I then apply the official Canon LUT, the black jumps down to 0, with just a little spread up to about 8. If I apply the studio levels to computer levels transform first, then the LUT, it registers as perfect black.

fr0sty wrote on 1/3/2022, 4:17 AM

What format are you rendering to? If rendering back to an 8 bit format, it will re-introduce color banding, as you're only encoding 1/4 of the levels present in your source video. If you're trying to get around that, you want to render to HDR, ProRes, or another 10 bit format.

Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64GB 3000mhz DDR4

Geforce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

RogerS wrote on 1/3/2022, 7:26 AM

Hmm, 95? That's high. What about on the highlights side? Does it go into superwhites?
So it is video levels with incorrect metadata?

8-bit is fine as an output format for HD as far as I can tell; it's what our TVs display, after all. Blu-rays are 8-bit. All the 14-bit raw pictures I process are output as 8-bit. The banding comes from heavy editing of 8-bit images.
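The "banding comes from heavy editing" point can be demonstrated with a quick sketch (hypothetical values, not Vegas's actual pipeline): stretch a subtle gradient 5x and count the distinct output levels when the intermediate math is 8-bit integer vs. float.

```python
import numpy as np

# A low-contrast gradient, one value per pixel across an HD row.
ramp = np.linspace(0.4, 0.6, 1920)

# 8-bit pipeline: quantize to 8-bit first, then grade (5x contrast boost)
ramp8 = np.round(ramp * 255).astype(np.uint8)
graded8 = np.clip((ramp8 / 255.0 - 0.5) * 5 + 0.5, 0, 1)
out8 = np.round(graded8 * 255).astype(np.uint8)

# Float pipeline: grade at full precision, quantize once at the end
gradedf = np.clip((ramp - 0.5) * 5 + 0.5, 0, 1)
outf = np.round(gradedf * 255).astype(np.uint8)

# The 8-bit path ends up with far fewer distinct levels (visible banding)
print(len(np.unique(out8)), len(np.unique(outf)))
```

The 8-bit-first path spreads its few surviving levels across the full output range, which is exactly what a banded sky looks like; grading in float and quantizing once at the end keeps a smooth ramp.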

JaredF wrote on 1/3/2022, 8:32 AM

What format are you rendering to? If rendering back to an 8 bit format, it will re-introduce color banding, as you're only encoding 1/4 of the levels present in your source video. If you're trying to get around that, you want to render to HDR, ProRes, or another 10 bit format.

Sometimes I render to ProRes HQ, sometimes to a high bitrate MP4. In either case, I've noticed that the final result looks better if I stick with a 32-bit project.

Howard-Vigorita wrote on 1/3/2022, 11:41 AM

In MediaInfo, the color range of my Clog2 Cinema Gamut footage reads: "Min: 0, Max: 1023, Chroma range: 1023"

On the next line it says: "colour_range_Original: Full"

@JaredF The red flag I see is that the exact syntax Vegas looks for to identify full range media is not there. The presence of the word "Original" suggests this metadata is from a clip that's been processed by some other software which has already converted it to limited range. In that case Vegas is correctly identifying it as limited range and is remapping it back to full range, costing you some color detail in round-trip level conversions. You might do better if you can get that other software out of the picture, or configure it not to remap the levels. If you go with the original clips from your camera in Vegas, just make sure they have the full range syntax Vegas expects in the metadata if the project property is set to full range. If you're sure they're full range clips, you can always set the project to either limited 8- or 32-bit, which I think will disable metadata-based level remapping by Vegas. If they're full range and marked the way Vegas expects, you should see no level changes with 8-bit full or either of the limited settings.

JaredF wrote on 1/3/2022, 2:18 PM

Hmmm, there's no other software involved. The info provided by MediaInfo is on original clips straight from the C70.

JaredF wrote on 1/3/2022, 3:51 PM

Thank you Howard and RogerS for helping me think through this. Complicated stuff I don't always fully understand.

Updated conclusion: I think the footage is truly full range. And it seems my Vegas is treating it as video range.

RogerS wrote on 1/3/2022, 10:17 PM

Well, this is confusing as heck, but hopefully once you understand what the footage truly is you can process it properly, regardless of the metadata.

For me it was like a lightbulb went off when I finally did these tests with my Canon DSLR (full range) and then newer Sony mirrorless (video range) and understood why they looked so different on my VP timeline (pre 8-bit full).