Your preferred 32-bit Full Range Pixel Format Workflow / Pipeline?

set wrote on 1/26/2023, 4:31 PM

I have been using 8-bit (Full Range) the whole time, but I believe there is much more potential for better results with the 32-bit (Full Range) pixel format. However, 32-bit also means more settings to customize, like the ACES color space, the media color space input, or even the choice of whether to use the VEGAS-supplied IDT LUTs or the manufacturer's IDT LUTs through the Color Grading Panel. (I also wonder about the 'Default' color space selection in the media settings. What is 'Default'?)

These customizations can improve your results, but on the other hand they can also become a pitfall and a source of a lot of confusion.

The target video is to be published on YouTube or social media.

 

Last week I had a chance to work with 10-bit source recordings (XAVC-HS 4K 4:2:2), and this is what I tried:

  1. Set the project to the 32-bit full range pixel format.
  2. Set the View Transform to sRGB (ACES), since my everyday monitor is a basic sRGB display.
  3. Set the media color space input to match my recording settings, which are Sony S-Log3 and S-Gamut3.Cine (see the sketch after this list).
  4. Do the color fixes and grading with the Color Grading Panel.
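To make step 3 concrete: the media color space setting is what lets the ACES IDT linearize the footage with the right curve before any grading happens. As a rough illustration (a minimal Python sketch of Sony's published S-Log3 decode, not VEGAS's actual code; the function name is mine):

    # Minimal sketch of Sony's published S-Log3 decode: map a normalized code
    # value (0..1) back to linear scene reflectance. Not VEGAS code.
    def slog3_to_linear(code):
        if code >= 171.2102946929 / 1023.0:
            return (10.0 ** ((code * 1023.0 - 420.0) / 261.5)) * (0.18 + 0.01) - 0.01
        return (code * 1023.0 - 95.0) * 0.01125 / (171.2102946929 - 95.0)

    # 18% gray is recorded around code 420/1023 in S-Log3:
    print(slog3_to_linear(420.0 / 1023.0))  # ~0.18

The full IDT also applies the S-Gamut3.Cine-to-ACES matrix on top of this, which is why the setting has to name both the curve and the gamut; picking the wrong one feeds mis-linearized values into the whole pipeline.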

 

I'd like to hear your opinions on this 32-bit pipeline workflow and the reasons behind your own choices.

Last changed by set on 1/27/2023, 12:51 AM, changed a total of 2 times.

Setiawan Kartawidjaja
Bandung, West Java, Indonesia (UTC+7 Time Area)

Personal FB | Personal IG | Personal YT Channel
Chungs Video FB | Chungs Video IG | Chungs Video YT Channel
Personal Portfolios YouTube Playlist
Pond5 page: My Stock Footage of Bandung city

 

System 5-2021:
Processor: Intel(R) Core(TM) i7-10700 CPU @ 2.90GHz   2.90 GHz
Video Card1: Intel UHD Graphics 630 (Driver 31.0.101.2127 (Feb 1 2024 Release date))
Video Card2: NVIDIA GeForce RTX 3060 Ti 8GB GDDR6 (Driver Version 551.23 Studio Driver (Jan 24 2024 Release Date))
RAM: 32.0 GB
OS: Windows 10 Pro Version 22H2 OS Build 19045.3693
Drive OS: SSD 240GB
Drive Working: NVMe 1TB
Drive Storage: 4TB+2TB

 

System 2-2018:
ASUS ROG Strix Hero II GL504GM Gaming Laptop
Processor: Intel(R) Core(TM) i7 8750H CPU @2.20GHz 2.21 GHz
Video Card 1: Intel(R) UHD Graphics 630 (Driver 31.0.101.2111)
Video Card 2: NVIDIA GeForce GTX 1060 6GB GDDR5 VRAM (Driver Version 537.58)
RAM: 16GB
OS: Win11 Home 64-bit Version 22H2 OS Build 22621.2428
Storage: M.2 NVMe PCIe 256GB SSD & 2.5" 5400rpm 1TB SSHD

 

* I don't work for the VEGAS Creative Software team. I'm just a voluntary moderator on this forum.

Comments

fr0sty wrote on 1/27/2023, 12:16 AM

For HDR:

Enable Windows HDR mode on my external HDR monitor. Set the project to HDR mode, verify VEGAS can talk to my HDR TV in Preferences > Display Devices, and make sure the HDR output check box is checked.

Import Media (Panasonic S1 UHD 24p VLOG/VGamut)

Right-click all media files at once, navigate to Properties, and set the color space to V-Log/V-Gamut.

Most of the work is done at this point; from there I open the Color Grading Panel, check my levels on the scopes, and start coloring.

For SDR:

I'm a bit unsure whether I should be using the sRGB or the Rec709 view transform in the project settings here; I want to say I've tried both with similar results...

Same as above, import media, set color space.

Color grade to my liking.

Apply a levels filter to the master bus, and load the studio levels preset.

Render

For SDR, I'm not sure I'm getting the levels exactly right on output doing it this way, but it seems to work. I do know the HDR method has produced some amazing results, though.

Because 32-bit takes so long to render, I haven't used it for much other than HDR, which I rarely render out... usually I just use LUTs and keep everything 8-bit for non-HDR work... but in my experience 32-bit with view transforms definitely produces far superior results compared to LUTs.

Last changed by fr0sty on 1/27/2023, 12:18 AM, changed a total of 3 times.

Systems:

Desktop

AMD Ryzen 7 1800X, 8 cores / 16 threads at stock speed

64GB 3000 MHz DDR4

GeForce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

Wolfgang S. wrote on 1/27/2023, 3:14 AM


Executive summary: Use 10- or 12-bit footage with the appropriate IDT and ACES. Everything else may lower your quality.

 

In more detail:

Footage and shooting: It simply makes no sense to shoot S-Log in 8-bit, since S-Log forces intense grading before the footage is usable at all. Without 10-bit, your results can fall apart dramatically, even if you try to "save" 8-bit footage in a 32-bit pipeline.

Shoot EITHER in 8-bit with Rec709 curves if required, for example if your customer is not willing to pay for the grading time or you simply do not need it (hypergamma curves, as they were called on the FS7; the FX6 also delivers really great footage straight out of the camera that can be used professionally without any grading, which is what professionals still do in many, many cases).

OR shoot in 10-bit log and grade it, but then you have to spend the additional time on grading and will need suitable hardware to be able to do that (see below).

If you wish to go for HDR, shoot in 10-bit or higher; shooting log or raw is still the best way in terms of quality. Then apply a good color management system in post. The same is true if you grade to Rec709 from log/raw.

Be aware that 8-bit footage only gives you a reduced amount of latitude in post, but it allows a much easier workflow that can still be done with 8-bit project settings in Vegas! Working in 32-bit mode with 8-bit footage can have the advantage of avoiding banding generated during grading, but it has the huge disadvantage of increased calculation time, possibly with no quality improvement your eyes can detect.
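To put a rough number on the banding point (my own illustration, with an arbitrary 0.8 gamma lift): if every grading step is rounded back to 8-bit, a simple curve already merges neighbouring code values, while a float pipeline keeps them all distinct until the final output quantization.

    # Count distinct output levels after a mild gamma lift, applied once with
    # 8-bit rounding after the operation and once in floating point.
    codes = range(256)                                        # all 8-bit input levels
    lifted_8bit  = {round(((c / 255.0) ** 0.8) * 255) for c in codes}
    lifted_float = {(c / 255.0) ** 0.8 for c in codes}
    print(len(lifted_8bit))   # fewer than 256 levels survive -> visible steps in gradients
    print(len(lifted_float))  # all 256 stay distinct; quantization happens only once, at output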

You may decide to accept all of that and shoot in 8-bit anyway. But then I would not shoot S-Log3; I would use extended Rec709 profiles to keep grading to a minimum so the footage does not break.

Today it is cheaper and easier than it was a few years ago to use 10-bit. Cameras like the GH5/GH6 or the EX3, using long-GOP as a compromise, make it much easier. But do not overlook the time and equipment you will need in post.

 

Time requirement: It takes significant time to grade in the Color Grading Panel. Spend the time to really learn how to grade your footage, both for SDR and for HDR. Some general grading guidelines are shown in the BRAW tutorial published here in the tutorial section, and they can be useful for S-Log too.

 

Input LUTs: Is there something wrong with using input LUTs in the Color Grading Panel, regardless of whether you use the official Sony LUTs or the Vegas internal LUTs? Well, it depends. The disadvantage is that if you come from S-Log footage in the wide S-Gamut3.Cine color space and apply a LUT to Rec709, you go from a huge color space down to the small Rec709 color space, and only then do you grade. That is never a good idea; the smaller color space can cause trouble. The better approach is to stay in the huge ACES color space, do the grading there, and reduce the graded output to Rec709 only in the last step.
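A quick numerical illustration of that point (my own sketch, using the published BT.2020-to-BT.709 matrix as a stand-in for any wide-to-narrow conversion; the S-Gamut3.Cine numbers differ, but the effect is the same): a saturated wide-gamut color produces negative Rec709 components that are clipped before you ever touch a grading control, whereas a wide working space such as ACES keeps those values until the final output transform.

    # Wide gamut -> Rec709 conversion clips saturated colors (illustration only).
    M = [
        [ 1.6605, -0.5876, -0.0728],
        [-0.1246,  1.1329, -0.0083],
        [-0.0182, -0.1006,  1.1187],
    ]

    def to_rec709(rgb_wide):
        return [sum(M[r][c] * rgb_wide[c] for c in range(3)) for r in range(3)]

    saturated_green = [0.1, 0.9, 0.1]                 # perfectly legal in the wide gamut
    print(to_rec709(saturated_green))                  # red channel goes negative
    print([min(max(v, 0.0), 1.0) for v in to_rec709(saturated_green)])  # ...and gets clipped away

Once clipped, no amount of grading brings the original hue back; that is the information loss the ACES working space avoids.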

Using such input LUTs may be attractive if you need quick results, if output quality is not so important, or if the grading requirements are modest.

I would always tend to use the high-quality color management systems we have today, and in Vegas that is ACES. The advantages are the more precise 32-bit calculations in a huge color space, and that different cameras will deliver matching results.

 

IDT settings for the ACES workflow: Do not mix up these settings with input LUTs. All you are telling the software here is how the footage should be transformed into the huge ACES color space. Defining this in the media properties is a MUST for the ACES workflow. I am not sure what 'Default' really is here, so set your input footage to the appropriate source.

 

Grading in the ACES workflow: Yes, this is a calculation-intensive process, both for grading and rendering. Be prepared to need a powerful machine (both GPU and CPU) to get decent preview speed, even at reduced preview quality. In Resolve you can edit 12-bit BRAW 6K 50p footage even on a high-end laptop, using a high-quality color management system like ACES or their own, at full 50p with a UHD preview. So I think there is headroom left to improve the performance of Vegas further.

But the quality is the best you can bring out of your footage.

 

Grading hardware: You may also need an appropriately calibrated monitor and an UltraStudio 4K Mini or DeckLink 4K 12G as an I/O device. Both are supported by Vegas too, even for HDR (I will check that; it was broken for HDR for a long time).

An appropriate but expensive HDR monitor is this one

https://www.asus.com/displays-desktops/monitors/proart/proart-display-pa32ucg-k/

but one may choose a significantly cheaper solution like this one

https://www.philips.de/c-p/27B1U7903_00/professional-monitor-4k-uhd-mini-led-thunderbolt-tm-4-monitor#see-all-benefits

both in combination with a Blackmagic UltraStudio 4K Mini or DeckLink 12G 4K.

Using the GPU for the HDR preview: well, Vegas offers that. It is great to have an internal HDR preview picture, at least as a proxy for HDR. Funnily enough, even Resolve offers that, but only in the Mac version. In my opinion it should not really be recommended, though, because the GPU signal is influenced by both the GPU and the OS and is not really reliable.


 

 

 

Last changed by Wolfgang S. on 1/27/2023, 4:49 AM, changed a total of 2 times.

Desktop: PC AMD 3960X, 24 x 3.8 GHz * RTX 3080 Ti (12 GB) * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb LAN * Resolve Studio 18 * Edius X * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * internal HDR preview * i9 12900H with iGPU Iris Xe * 32 GB RAM * GeForce RTX 3070 Ti 8GB * internal HDR preview on the laptop monitor * Blackmagic UltraStudio 4K Mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX) card in an old XP system. Edius 4.6 and other systems

Howard-Vigorita wrote on 1/28/2023, 10:17 AM

I've been working my projects in Limited Range since I started using Vegas long ago, but I'm trying to make the switch to Full Range for a more streamlined workflow that avoids view transforms and level helpers like ACES or SeMW. I've found it optimal to edit in 8-bit mode and switch to 32-bit for the final render because it always looks better, no matter what my mix of media might be. And I do mix it up in multicam projects with a variety of cameras. Here's what I came up with that seems to be working for me:

The left 2 panes are my 8-bit full default settings. Notice I tried to save a Full32 template but that didn't work. For some reason Vegas only saves stuff above the blue line in templates saved this way. But if the box at the bottom is checked, everything is saved including the Audio tab.

The right-hand pane is what I want to end up with for 32-bit full. The view transform is off, which is exactly what I want. Unfortunately, I also have to change the gamma, which sometimes turns the view transform back on, so it can take some fiddling.

This setup seems to suit my purposes with the Canon- and Zcam-provided LUTs, which are higher quality than those provided in the Vegas Color Grading Panel. I never use the Vegas built-in LUTs, which are typically medium quality. It would be nice if Vegas had a place for users to put manufacturer-supplied LUTs, perhaps in the Documents folder next to custom scripts, but for now I've been dropping them here:

C:\Program Files\VEGAS\VEGAS Pro 20.0\OFX Video Plug-Ins\Vfx1.ofx.bundle\Contents\Resources\AutoLooks

 

Wolfgang S. wrote on 1/29/2023, 8:47 AM

Your workflow 8bitfull-cut >32bitfull-render is the true and correct one.
But, your render-out won't get correct Rec709 productions unless you change the gamma from 1.000 linear to 2.222 video: The signal levels ain't the same.

I do not know what a "true and correct workflow" is. The workflow has to be consistent with the type of footage, i.e. whether it was shot with full or limited range. That is a purely technical decision.

Depending on whether the footage was shot with limited or full range, the settings must be adapted accordingly. For example, last year I shot with my older GH4 to Rec709 with limited range (which can be confirmed in MediaInfo). The correct setting in Vegas would be "32-bit floating point (full range)", with the media properties set to either "limited" or "undefined". Otherwise the luminance range would be compressed, showing a flat picture. Setting them to "full" in the media properties would show the wrong luminance, too.

Using "32-bit floating point (video levels)" would show the wrong luminance with my type of footage, no matter which setting is chosen in the media properties. So it depends on how Howard shot his footage.
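The level math behind that "flat picture" is easy to check by hand (a minimal sketch, assuming plain 8-bit code values; the function name is mine):

    # Limited-range media decoded correctly vs. treated as full range.
    def limited_to_full(code):            # what the decoder should do for limited-range media
        return (code - 16) * 255.0 / 219.0

    black, white = 16, 235                # limited-range black and white points
    print(limited_to_full(black), limited_to_full(white))   # 0.0 255.0 -> full contrast
    print(black, white)                   # treated as full range: 16 and 235 -> washed-out picture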

Last changed by Wolfgang S. on 1/29/2023, 8:49 AM, changed a total of 1 times.


Howard-Vigorita wrote on 1/29/2023, 11:02 AM

Your workflow 8bitfull-cut >32bitfull-render is the true and correct one.
But, your render-out won't get correct Rec709 productions unless you change the gamma from 1.000 linear to 2.222 video: The signal levels ain't the same.

@Yelandkeil I just tested some footage shot in HEVC 10-bit 60 fps Rec709, and you are right. I did a light grade at 8-bit full, switched to 32-bit full for a MainConcept render, and put the render side by side with the Vegas 8-bit preview screen. I also had to set the compositing gamma to 2.222 (Video) on the 32-bit full render to get a match. I must have messed up when I tried this earlier. Oh, well. Gotta change 2 things to go back and forth... maybe 3, because messing with the gamma turns the view transform back on. Ha, ha, like playing whack-a-mole. It would be nice if I could just save it all as a preset. Will correct my earlier post to avoid misleading anyone.
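For anyone wondering why the compositing gamma matters at all: any operation that mixes pixels (crossfades, opacity, blurs, and so on) gives different numbers in linear light than in gamma-encoded space. A minimal sketch with two arbitrary values:

    # A 50/50 blend of two pixel values under the two compositing gammas.
    a, b = 0.2, 0.8                       # normalized pixel values (arbitrary example)

    blend_video  = (a + b) / 2.0          # gamma-space math, i.e. 2.222 "video" compositing
    blend_linear = (((a ** 2.222) + (b ** 2.222)) / 2.0) ** (1 / 2.222)  # linear-light math

    print(round(blend_video, 3))          # 0.5
    print(round(blend_linear, 3))         # ~0.598 -> a visibly brighter mix

So an 8-bit project and a 32-bit full project only preview and render identically when both use the same compositing gamma, which is what the 2.222 setting restores here.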

Here's a capture of the 32-bit full render next to the 8-bit full preview:

ALO wrote on 1/29/2023, 12:31 PM

I think Vegas' 32-Full implementation is a bit strange and (I may be wrong here) unnecessary for what you're thinking.

You should be able to work in 8-bit full, and then for rendering switch to 32-video levels, add a computer-to-studio levels fx on the output bus, and be done.

If all goes well (ie, you've chosen 32-bit compatible plugins for your project), that should let you edit in 8-bit mode and then render out with the advantages of 32-bit math without having to readjust everything when you switch project properties. Fingers crossed!
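For reference, the computer-to-studio levels step mentioned above is just a linear remap of the full 0..255 range onto 16..235 before the encoder writes a limited-range file. A minimal sketch of that mapping (my own function name, not the plug-in's):

    # Computer RGB (full range) -> studio RGB (limited range) on the output bus.
    def computer_to_studio(code):
        return 16 + code * 219.0 / 255.0

    for c in (0, 128, 255):
        print(c, "->", round(computer_to_studio(c), 1))   # 0 -> 16.0, 128 -> 125.9, 255 -> 235.0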

Wolfgang S. wrote on 1/29/2023, 12:55 PM

@Howard-Vigorita
What type of footage do you edit? I understood Rec709, but is the luminance legal or full range?

 

@Yelandkeil

I do not know what you mean here. If you have detected another bug, fine. But isn't it acceptable that the settings you suggest do not fit every kind of footage?


Wolfgang S. wrote on 1/29/2023, 2:00 PM

If you post something here in this thread, then the relevant information should be here. So, what do you mean?


Wolfgang S. wrote on 1/29/2023, 3:59 PM

You should be able to work in 8-bit full, and then for rendering switch to 32-video levels, add a computer-to-studio levels fx on the output bus, and be done.

If all goes well (ie, you've chosen 32-bit compatible plugins for your project), that should let you edit in 8-bit mode and then render out with the advantages of 32-bit math without having to readjust everything when you switch project properties. Fingers crossed!

ALO,

if you work in 8-bit full, you can switch to 32-bit full without any change in the scopes. What is important is that you set the media properties to the correct value - so to limited if you shot in the legal range. Then the scopes stay fine.

But it makes no sense to switch from full to video range, since that will compress the luminance range - and thus the contrast - a second time, and doing it twice makes no sense. Due to the change in contrast, you would be forced to grade the footage again if you switched the project settings from full to video.

File from the GH4, shot limited (set in the camera).

 

Set to limited in the media properties (it can also stay at default, since default here is limited):

 

Scopes in 8bit full project settings

 

are the same as in 32bit full project settings

 

If you render with 32bit full, your output shows the same levels (rendered file below, and original track muted, so you see the rendered file in the scopes):

 

and media info confirms that the output is limited in range

 

Rendering was done with the Magix AVC encoder, with the range defaulting to limited.

 

Last changed by Wolfgang S. on 1/29/2023, 11:37 PM, changed a total of 1 times.


Howard-Vigorita wrote on 1/29/2023, 4:02 PM

What type of footage do you edit? I understood Rec709, but is the luminance legal or full range?

@Wolfgang S. When I shot the original footage on the day after Christmas, I set the two 4K cameras to limited range and edited in a limited-range project to keep it simple. I just re-graded in full range without a view transform and it worked.

Wolfgang S. wrote on 1/29/2023, 4:28 PM

Why should it not work? The only issue could be the change in contrast if one switches between video and full range.
Another issue is that a lot of cameras also record superwhites (so >235). That is confusing when one sees the histogram going up to 255, even with files flagged as legal range.
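The arithmetic behind those superwhites (a small sketch of the standard limited-range decode): anything the camera records above code 235 decodes to more than 100% once the legal range is expanded, which is why the histogram still reaches 255 on a file flagged as limited.

    # Where "superwhite" camera codes land after limited-range decoding.
    def decode_limited(code):             # 16..235 -> 0.0..1.0; codes above 235 exceed 1.0
        return (code - 16) / 219.0

    for code in (235, 245, 255):
        print(code, "->", round(decode_limited(code), 3))   # 1.0, 1.046, 1.091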


Howard-Vigorita wrote on 1/29/2023, 4:37 PM
You should be able to work in 8-bit full, and then for rendering switch to 32-video levels, add a computer-to-studio levels fx on the output bus, and be done.

@ALO The SeMW extension or the Levels FX is only needed if you want to edit in limited range and have the render resemble what you saw in the Vegas preview. SeMW uses the Vegas Levels FX but disables it automatically when rendering. You could use the Vegas Levels FX without SeMW when you edit, but you'd have to turn it off manually before rendering. Otherwise the render won't match what you looked at while editing.

Neither the Levels FX nor SeMW is needed if you work with a full-range project instead. My motivation for editing in full range is to get rid of them both. And be more done than merely done, with no other FX, grading changes, or view transforms. I think I've got it.

Howard-Vigorita wrote on 1/30/2023, 4:13 AM

@Yelandkeil I think the scopes monitor the preview screen. If the preview stays the same when switching from 8 to 32, the scopes would too. And vice versa. However, I'm more interested in being certain that what I saw in the edit is what I got in the render... so in my mind, the acid test is to look directly at those two things. The sun's about to rise where I am right now, but I don't want to see it for at least 8 hours. So I'm going to sleep.

Wolfgang S. wrote on 1/30/2023, 4:34 AM

VEGAS must be crazy that its scopes do NOT monitor the output-signal but the input ones:

?

It is as @Howard-Vigorita said: it is always the output signals that are shown.

 


Wolfgang S. wrote on 1/30/2023, 6:57 AM

Just to answer you, @Howard-Vigorita - no comment - how can it be the same after you edit your timeline (output signal) in 8-bit full with 2.222 gamma and switch to 32-bit full with 1.000 gamma?

You can simply test that. Use a subject that shows you enough on the scopes and switch between the project settings, then observe whether your preview changes or not for the two project settings I posted, as I have done above.

Try to be polite and rethink what you have stated here.

VEGAS must be crazy that its scopes do NOT monitor the output-signal but the input ones

 

Last changed by Wolfgang S. on 1/30/2023, 7:16 AM, changed a total of 1 times.


john_dennis wrote on 1/30/2023, 9:15 AM

I don't currently have a 32-bit workflow of any kind.

Howard-Vigorita wrote on 1/30/2023, 3:33 PM

how can it be the same after you edit your timeline (output signal) in 8-bit full with 2.222 gamma and switch to 32-bit full with 1.000 gamma?

@Yelandkeil It only looks the same in a shrunken Vegas preview screen, which mostly reflects the overall color balance, contrast, and brightness - things that also model easily in scopes. Not so much the sense of detail, crispness, flow, and dimension, which scopes don't portray either. On my 32-inch full-screen 4K monitor, 32-bit renders look noticeably better in every respect. I wouldn't bother otherwise.

You're absolutely right about the gamma... I corrected my original screenshot to reflect the fact that the 32-bit gamma needs to match the 8-bit one in order to switch back and forth. I originally thought that the gamma setting was ignored when the ACES view transform was off. Thanks again for catching that. Saved me a lot of regrading before I would eventually have noticed it on my own.

ALO wrote on 1/30/2023, 7:10 PM
You should be able to work in 8-bit full, and then for rendering switch to 32-video levels, add a computer-to-studio levels fx on the output bus, and be done.

@ALO The SeMW extension or the Levels FX is only needed if you want to edit in limited range and have the render resemble what you saw in the Vegas preview. SeMW uses the Vegas Levels FX but disables it automatically when rendering. You could use the Vegas Levels FX without SeMW when you edit, but you'd have to turn it off manually before rendering. Otherwise the render won't match what you looked at while editing.

Neither the Levels FX nor SeMW is needed if you work with a full-range project instead. My motivation for editing in full range is to get rid of them both. And be more done than merely done, with no other FX, grading changes, or view transforms. I think I've got it.

The 32-video levels workflow I describe has the advantage of being easy to understand and shouldn't (I think) result in less image quality compared to going directly from 8-bit full to 32-bit full.

I stopped rendering out in 32 FR a few years ago because I got tired of all the strange issues that kept popping up. I agree, if the current version of Vegas does a better job going from 8FR to 32FR, that would be the better workflow (although it does involve either turning off the view transform or getting your ACES properties set correctly on a per-clip basis).

ALO wrote on 1/30/2023, 7:53 PM

Ok, out of curiosity I tried grading some 8-bit source (just a simple color correction) with the Color Corrector (Secondary) plugin which is listed as 32-bit compatible.

Here's 8-bit FR:

Here's the same project switched to 32VL:

Easy peasy -- looks exactly the same. I can add my levels FX to correct the output and render out with no problems.

Now here's 32FR with view transform off:

Uh-oh. Clearly not the same.

How about 32FR with view transform set to sRGB and clip properties set to sRGB to match:

Nope. Doesn't match, either.

I say no thanks. I'll stick with the 32VL workflow when I need 32-bit math. Life is too short for ACES...

RogerS wrote on 1/30/2023, 8:03 PM

Of course ACES is different: it uses info about the media and transforms it into its own color space, which no other mode does.

View transform needs to be kept off to stay in 32 bit full mode. I assume you changed gamma to 2.2 to match the 8-bit full project?

Last changed by RogerS on 1/31/2023, 2:06 AM, changed a total of 2 times.

Custom PC (2022) Intel i5-13600K with UHD 770 iGPU with latest driver, MSI z690 Tomahawk motherboard, 64GB Corsair DDR5 5200 ram, NVIDIA 2080 Super (8GB) with latest studio driver, 2TB Hynix P41 SSD, Windows 11 Pro 64 bit

Dell XPS 15 laptop (2017) 32GB ram, NVIDIA 1050 (4GB) with latest studio driver, Intel i7-7700HQ with Intel 630 iGPU (latest available driver), dual internal SSD (1TB; 1TB), Windows 10 64 bit

VEGAS Pro 19.651
VEGAS Pro 20.411
VEGAS Pro 21.208

Try the
VEGAS 4K "sample project" benchmark (works with VP 16+): https://forms.gle/ypyrrbUghEiaf2aC7
VEGAS Pro 20 "Ad" benchmark (works with VP 20+): https://forms.gle/eErJTR87K2bbJc4Q7

Wolfgang S. wrote on 1/31/2023, 2:02 AM

I say no thanks. I'll stick with the 32VL workflow when I need 32-bit math. Life is too short for ACES...

@ALO, ACES is different. But the discussion started by @Howard-Vigorita was whether it is possible to work in 8-bit full and switch to 32-bit (without any ACES transformation):

The right-hand pane is what I want to end up with for 32-bit full. The view transform is off, which is exactly what I want.

And that is possible.

 


Wolfgang S. wrote on 1/31/2023, 2:08 AM

You're absolutely right about the gamma... I corrected my original screenshot to reflect the fact that the 32-bit gamma needs to match the 8-bit one in order to switch back and forth. I originally thought that the gamma setting was ignored when the ACES view transform was off. Thanks again for catching that. Saved me a lot of regrading before I would eventually have noticed it on my own.

At least I am not able to see any difference between the "8-bit full" and "32-bit full" project settings, as shown in my posting above. No difference in the waveform, the vectorscope, or the visual impression.

But if you switch from full to video levels, sure, there will be a difference, because the luminance will be converted from 0..255 to 16..235.

Last changed by Wolfgang S. on 1/31/2023, 2:14 AM, changed a total of 1 times.


Wolfgang S. wrote on 1/31/2023, 2:31 AM

Grading hardware: You may also need an appropriately calibrated monitor and an UltraStudio 4K Mini or DeckLink 4K 12G as an I/O device. Both are supported by Vegas too, even for HDR (I will check that; it was broken for HDR for a long time).

An appropriate but expensive HDR monitor is this one

https://www.asus.com/displays-desktops/monitors/proart/proart-display-pa32ucg-k/

but one may choose a significantly cheaper solution like this one

https://www.philips.de/c-p/27B1U7903_00/professional-monitor-4k-uhd-mini-led-thunderbolt-tm-4-monitor#see-all-benefits

both in combination with a Blackmagic UltraStudio 4K Mini or DeckLink 12G 4K.

Testing this combination in some more detail, I see the issue that Vegas still does not deliver an HDR flag to the DeckLink interface, as currently implemented in Vegas.

For this specific ASUS monitor, that means the monitor can only be run in what ASUS calls "HDR simulation". The monitor runs in SDR mode here, and HDR settings like PQ or HLG can be accessed. However, since the monitor only reaches 400 nits in this mode, that is of limited use.

With other monitors like the Atomos Sumo, this works with the DeckLink 4K 12G Extreme. For the ASUS, using it as a Windows monitor is the better approach.

 


ALO wrote on 1/31/2023, 12:26 PM

Of course ACES is different: it uses info about the media and transforms it into its own color space, which no other mode does.

View transform needs to be kept off to stay in 32 bit full mode. I assume you changed gamma to 2.2 to match the 8-bit full project?

Roger you're right -- if you change the gamma to 2.2 to match the 8-bit project's, then it looks like my experiment with the CC (Secondary) plugin gives the same results.

But again: why do I want to do that? At that point, I've just gone through several clicks to get to where I was with the 32-bit video levels option.

Yes, maybe I'd like to use 32FR's linear 1.0 gamma (I would, actually), but there's no way I'm aware of to go from 8-bit FR to 32FR (1.0) without having to regrade my clips.

Wolfgang, am I wrong about that?