[VP16_248] 'Autolooks' and 'Film Effects' vs HDR Colour Grading

AVsupport wrote on 8/26/2018, 8:33 PM

Was hoping for a few more 'pro' grading tools in this edition, given the ever-increasing need for sophisticated HDR and LOG colour grading found elsewhere to deal efficiently with SLOG/Cine profiles. IMO 'Autolooks' with unchangeable presets, 'Film Effects' noise and 'Sepia' seem better suited to a lower-tier NLE.

my current Win10/64 system (latest drivers, water cooled):

Intel Coffee Lake i5 Hexacore (unlocked, but not overclocked) 4.0 GHz on Z370 chipset board,

32GB (4x8GB Corsair Dual Channel DDR4-2133) XMP-3000 RAM,

Intel 600-series 512GB M.2 SSD system drive running Win10/64 Home with automatic driver updates,

Crucial BX500 1TB 3D NAND SATA 2.5-inch SSD (edit drive),

2x 4TB 7200RPM HGST NAS data drives,

Intel HD630 iGPU, currently disabled in BIOS,

nVidia GTX1060 6GB, always on latest [creator] drivers, nVidia HW acceleration enabled.

main screen 4K/50p 1ms scaled @175%, second screen 1920x1080/50p 1ms.

Comments

ahshing wrote on 8/26/2018, 11:17 PM

Yes, I really need help with editing HDR/HLG footage. Premiere Pro CC works great on HLG with Lumetri Color, but I can find no information about grading HLG in the VP16 Help contents; even the VP16 webpage just mentions that VP16 supports HDR, with no further detail.

ahshing wrote on 8/26/2018, 11:24 PM

Switching to 32-bit full range with the sRGB/Rec709 view transform makes preview/render performance unacceptably slow on my XPS9550 (i7-6700HQ/GTX960M) laptop.

fr0sty wrote on 8/26/2018, 11:30 PM

Premiere doesn't do real HDR... it has no way of outputting it properly (every time I try, it screws up the levels, and the colors are Rec709). It also lacks an HDR preview option.

32-bit mode requires serious processing power, more than you have for sure. HDR editing in Vegas (or Resolve, in my experience) isn't really suited to mid-range laptops.

To the OP: if you could have any specific color features Vegas is currently missing, what would they be? I ask because I tried grading an HDR timelapse in Resolve, then in Vegas 16... while there seems to be a levels bug in there currently, the process of getting an HDR image I really liked was far easier for me in Vegas. To be specific, Resolve has great tools and they can indeed be more accurate, but its performance on my GTX970 wasn't very good at all in HDR mode; it is far better in Vegas. And when I attempted even the most basic noise reduction, Resolve simply refused, citing insufficient VRAM. With Vegas I am able to run Neat Video with GPU acceleration while Vegas is also using NVENC, no problems.

I work with VLOG in Vegas often and haven't had many issues there either, so I was curious.

Last changed by fr0sty on 8/26/2018, 11:36 PM, changed a total of 2 times.

Systems:

Desktop

AMD Ryzen 7 1800x, 8 cores/16 threads at stock speed

64GB 3000MHz DDR4

Geforce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

ahshing wrote on 8/26/2018, 11:52 PM

Premiere doesn't do real HDR... it has no way of outputting it properly (every time I try, it screws up the levels, and the colors are Rec709). It also lacks an HDR preview option.

32-bit mode requires serious processing power, more than you have for sure. HDR editing in Vegas (or Resolve, in my experience) isn't really suited to mid-range laptops.

To the OP: if you could have any specific color features Vegas is currently missing, what would they be? I ask because I tried grading an HDR timelapse in Resolve, then in Vegas 16... while there seems to be a levels bug in there currently, the process of getting an HDR image I really liked was far easier for me in Vegas. To be specific, Resolve has great tools and they can indeed be more accurate, but its performance on my GTX970 wasn't very good at all in HDR mode; it is far better in Vegas. And when I attempted even the most basic noise reduction, Resolve simply refused, citing insufficient VRAM. With Vegas I am able to run Neat Video with GPU acceleration while Vegas is also using NVENC, no problems.

I work with VLOG in Vegas often and haven't had many issues there either, so I was curious.


So you edit HLG in 32-bit mode? And do you use a laptop or a desktop for your work?

fr0sty wrote on 8/27/2018, 1:14 AM

I use a desktop; my laptop is far too weak to edit even 8-bit 4K. My desktop is barely powerful enough for Vegas, and isn't really powerful enough to do much in Resolve without running out of VRAM, and it is a Ryzen 7 1800x with 64GB RAM and a GTX 970 (the GPU being the main bottleneck). I edit VLOG, as the research I've done has shown it grades better than HLG, but I was recently persuaded to try HLG by a friend of mine, a pro colorist I have a lot of respect for, who says he prefers the way its highlights roll off vs. VLOG, so I am going to give it a shot on my next HDR shoot.

Last changed by fr0sty on 8/27/2018, 1:15 AM, changed a total of 1 times.

AVsupport wrote on 8/27/2018, 1:15 AM

Not that I like PP, but I think Lumetri is quite capable; Resolve has LOG grading options and lets you save your own LUTs. In VP I feel I always have to resort to 3rd-party plugins (Hitfilm Ignite, FilmConvert, NewBlue ColorFast...) to colour correct effectively, which is sad after so many years. I would also love to create and save custom 'chains' of FX plugins, have a Favorite FX folder I can copy plugins into, and a search bar that doesn't take 20 seconds to search.
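For context on 'save your own LUTs': the .cube files Resolve exports (and which most NLEs can load) are plain text, so you can even generate one yourself. A minimal Python sketch; the file name and the toy shadow-lift 'grade' are made up purely for illustration:

```python
# Minimal sketch: write a tiny 3D LUT in the plain-text .cube format
# that Resolve (and most NLEs) can load. The file name and the toy
# shadow-lift "grade" below are made up for illustration only.
SIZE = 17  # a common 3D LUT grid size

def grade(r, g, b):
    """Toy grade: a mild lift applied equally to each channel."""
    lift = 0.02
    return tuple(min(1.0, c * (1.0 - lift) + lift) for c in (r, g, b))

with open("my_grade.cube", "w") as f:
    f.write('TITLE "toy shadow lift"\n')
    f.write(f"LUT_3D_SIZE {SIZE}\n")
    # .cube convention: red varies fastest, then green, then blue
    for b in range(SIZE):
        for g in range(SIZE):
            for r in range(SIZE):
                rr, gg, bb = grade(r / (SIZE - 1), g / (SIZE - 1), b / (SIZE - 1))
                f.write(f"{rr:.6f} {gg:.6f} {bb:.6f}\n")
```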

Peter_P wrote on 8/27/2018, 1:44 AM

Switching to 32-bit full range with the sRGB/Rec709 view transform makes preview/render performance unacceptably slow on my XPS9550 (i7-6700HQ/GTX960M) laptop.


It is the same on my i7-8700k desktop system. Probably a powerful additional graphics card could help.

RogerS wrote on 8/27/2018, 2:56 AM

Are you all commenting on the performance of VP16?

Peter_P wrote on 8/27/2018, 3:04 AM

Yes, I did.

ahshing wrote on 8/27/2018, 3:18 AM

Not that I like PP, but I think Lumetri is quite capable; Resolve has LOG grading options and lets you save your own LUTs. In VP I feel I always have to resort to 3rd-party plugins (Hitfilm Ignite, FilmConvert, NewBlue ColorFast...) to colour correct effectively, which is sad after so many years. I would also love to create and save custom 'chains' of FX plugins, have a Favorite FX folder I can copy plugins into, and a search bar that doesn't take 20 seconds to search.


Do you use the 32-bit or 8-bit pixel format to edit with FilmConvert?

I have always wondered: if I have already chosen the camera profile (e.g. Slog3/HLG) inside FilmConvert, should I use 32-bit and the sRGB view transform, or just leave it at 8-bit?

Wolfgang S. wrote on 8/27/2018, 3:20 AM

The issue is that HDR grading works mainly with ACES 1, and ACES 1 is available in the 32-bit floating point mode only, which makes HDR grading very slow. In my opinion it is impossible on a laptop today. Even with my 8-core desktop I have to switch to Preview/Quarter to get enough fps in the timeline preview if I enable the conversion to rec2020 or rec709.
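For reference, the rec709-to-rec2020 conversion that costs so much performance inside ACES is, at its core, a single 3x3 matrix applied to linear-light RGB. A minimal sketch with the standard ITU-R BT.2087 coefficients; it deliberately ignores the transfer functions, which the ACES pipeline handles around it:

```python
import numpy as np

# Standard linear-light BT.709 -> BT.2020 primaries matrix (ITU-R BT.2087).
# Transfer functions (gamma/PQ) are deliberately left out of this sketch.
M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def rec709_to_rec2020(rgb_linear):
    """Convert one linear-light Rec.709 pixel to Rec.2020 primaries."""
    return M_709_TO_2020 @ np.asarray(rgb_linear)

# Pure Rec.709 red maps inside the wider Rec.2020 gamut:
print(rec709_to_rec2020([1.0, 0.0, 0.0]))  # -> [0.6274 0.0691 0.0164]
```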

For me, a workaround is to use the Atomos Sumo or Sumo M as a grading monitor.

https://www.atomos.com/sumo19m

I run the Sumo either from my R9 390X or my Decklink 4K Extreme 12G. The Decklink works only if you have enabled the conversion to rec2020/rec709 in ACES, which seems to be a bug in Vegas. But the good news is that you can use a GPU like the R9 390X if you use the Sumo, simply because you can switch the Sumo to perform the gamma and gamut conversion itself. So either you use the ACES 1 conversion in the Project Settings, which will bring down your performance, or you use the Sumo for the conversion and set it to rec2020/PQ/rec2020. With the latter you can do the HDR editing in 8-bit mode too, which is great in terms of performance, and switch the Project Settings only for the final render. And you get a great preview as well, because you can set the Sumo to vlog/V-Gamut/rec2020 if you use footage from the GH5 or EVA1, or to slog if you use a Sony camera.

The Sumo sounds expensive to some of you? Maybe. But be aware that you will need a 1000-nit unit for HDR editing anyway. The Sumo also has the disadvantage of being HD only, but the advantage that you can use its internal waveform monitor, which has a better nit scale than what is currently available in Vegas (the newly implemented 10-bit waveform/RGB parade has no nit scale yet; you need to know the relationship between levels and nits to grade to a specific nit level). A nit scale may come later. Another advantage of the Sumo is that you can set its HDMI output to HLG and use a standard rec709 UHD monitor or even an HDR TV as a second control monitor behind the Sumo. So this unit offers a lot of advantages.
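The levels-to-nits relationship mentioned above is fixed by the PQ curve itself (SMPTE ST 2084), so it can be computed rather than memorized; a minimal sketch:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: the fixed mapping between absolute
# luminance in nits and the normalized 0..1 signal level on a waveform.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_level(nits):
    """Normalized PQ signal level (0..1) for a luminance in nits."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# 100 nits (SDR peak) sits near 51% signal; 1000 nits near 75%.
for nits in (100, 600, 1000, 4000):
    print(f"{nits:>5} nits -> {pq_level(nits):.3f}")
```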

Another unit that could be used is the Dell UP2718Q: https://www.tomshardware.com/reviews/dell-up2718q-hdr-pro-monitor,5231.html. But it is not cheaper than the Sumo, and it supports only PQ, not HLG.

From what I have seen, it is most appropriate to use ACES 1 with 1000 nits in the Project Settings for the ACES conversion. That will suit most OLED HDR TVs, but be aware that you should grade to 600-800 nits at best today (OLEDs do not achieve a higher nit figure yet).

At the moment they have implemented PQ only. HLG may come later.

What they have implemented is HDR metadata in the export function of the HEVC encoder. That allows you to create footage that should work on every HDR TV, which is a big advantage.

Desktop: PC AMD 3960X, 24x 3.8 GHz * RTX 3080 Ti (12 GB) * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb LAN * Resolve Studio 18 * Edius X * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * i9 12900H with iGPU Iris Xe * 32 GB RAM * GeForce RTX 3070 Ti 8GB * internal HDR preview on the laptop monitor * Blackmagic Ultrastudio 4K mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX) card in an old XP system. Edius 4.6 and other systems

AVsupport wrote on 8/27/2018, 6:02 AM

Do you use the 32-bit or 8-bit pixel format to edit with FilmConvert?

My source material is XAVC-S, which is 8-bit, shot in CINE4. That gives roughly 800% more dynamic range than vanilla 709. SLOG doesn't make much sense in 8-bit, as it will break apart because of the lack of information in the highlights/shadows. That said, you do get more stops with CINE, but it needs to be brought back to 709 if that's where you want to go (and yes, that is where I need it to go). Hence I like grading tools that can handle a CINE (or LOG) curve and can treat (compress/expand) highlights and shadows as needed. Grading in 32-bit doesn't seem to make much sense, as it cannot change the bit depth of the source. Looking ahead, 10-bit would be just fine for me. Everything else (including HDR delivery) I consider a waste of computing power for my purposes at this stage.
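To illustrate why 8-bit log/Cine curves 'break apart': a toy sketch using a generic log2 curve (not Sony's actual SLOG or CINE4 math) that counts how many of the 256 available code values each stop of a 14-stop scene receives:

```python
import math

# Toy sketch (a generic log2 curve, NOT Sony's actual S-Log math):
# spread a 14-stop scene range across 8-bit codes and count how many
# code values each stop receives. A log curve gives every stop the
# same thin slice of codes, which is why 8-bit log footage bands
# as soon as you stretch it back out in the grade.
STOPS = 14

def log_encode(linear):
    """Map a linear scene value in (2**-STOPS .. 1.0] to a 0..1 signal."""
    return 1.0 + math.log2(max(linear, 2.0 ** -STOPS)) / STOPS

for stop in range(STOPS):
    hi, lo = 2.0 ** -stop, 2.0 ** -(stop + 1)  # linear range of this stop
    codes = {round(255 * log_encode(lo + (hi - lo) * k / 999))
             for k in range(1000)}
    print(f"stop -{stop:>2}: {len(codes):>3} of 256 codes")
# ~18-19 codes per stop everywhere. 10-bit gives 4x as many codes per
# stop, which is the headroom that log/Cine profiles actually need.
```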

Wolfgang S. wrote on 8/27/2018, 8:53 AM

FilmConvert uses the OFX interface, and I really wonder whether it makes sense to use more than 8-bit in the project settings with it. While I love the results with FilmConvert, I am still not sure about that.

10-bit is not a waste of computing power; it is what is required to work with all kinds of log files.

ahshing wrote on 8/27/2018, 9:23 AM

I tried several combinations with FilmConvert, e.g.:

8-bit, with the HLG camera profile in FilmConvert

32-bit/sRGB, with the generic camera profile in FilmConvert

The latter seems to produce better skin tones/overall color for me.

AVsupport wrote on 8/27/2018, 5:21 PM

10-bit is not a waste of computing power; it is what is required to work with all kinds of log files.

If you read my post again, that's not what I said, Wolfgang. I said 10-bit would be fine; I think 32-bit is a waste, being 4x the data. I also think 4K is a bit of a waste (for me), also being 4x the data. If it were up to me, I'd rather shoot/edit 10-bit Cine or LOG at 2.5K (double PAL) 50p. That should easily be doable on my machine without having to go to a cumbersome proxy workflow. Unfortunately XAVC-S currently does neither 2.5K nor 10-bit. Waiting for the A7Siii. There will be more such cameras; we need good workflow options now.
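The '4x the data' figures are easy to sanity-check: uncompressed bytes per frame scale linearly with pixel count and with bytes per channel. A back-of-envelope sketch (real codecs compress, but the ratios hold):

```python
# Back-of-envelope check of the "4x the data" claims: uncompressed RGB
# bytes per frame scale with pixel count and bytes per channel.
# (Real codecs compress, but the relative ratios are the point.)
def frame_bytes(width, height, bits_per_channel, channels=3):
    return width * height * channels * bits_per_channel / 8

fmts = {
    "1080p  8-bit": frame_bytes(1920, 1080, 8),
    "2.5K  10-bit": frame_bytes(2560, 1440, 10),
    "UHD    8-bit": frame_bytes(3840, 2160, 8),
    "UHD   32-bit": frame_bytes(3840, 2160, 32),  # 32-bit float pipeline
}
base = fmts["UHD    8-bit"]
for name, size in fmts.items():
    print(f"{name}: {size / 2**20:6.1f} MiB/frame ({size / base:.2f}x UHD 8-bit)")
# UHD has 4x the pixels of 1080p, and a 32-bit float pipeline carries
# 4x the bytes per channel of 8-bit: the two "4x" factors above.
```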

@ahshing, HLG is designed as an 'instant delivery' format for an HLG-capable playback device, and not really as an intermediate acquisition format prior to grading, like LOG. And with FilmConvert, it will always grade your camera towards the selected film stock; there's no 'plain vanilla 709' option. I asked them once, but they don't want to do that (there might be a different software package down the track...). My guess is that, since you could then save the LUTs, it might make them redundant.
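The 'instant delivery' property follows from the HLG curve itself (the BT.2100 OETF): the lower half of the signal range is an ordinary square-root gamma that an SDR display renders acceptably, and only the highlights switch to a log segment. A minimal sketch:

```python
import math

# BT.2100 HLG OETF: an ordinary square-root gamma below the knee
# (so SDR displays show something reasonable), a log segment above it
# for highlights. This hybrid is what makes HLG an "instant delivery"
# format, unlike pure log curves that always need a grade.
a = 0.17883277
b = 1 - 4 * a                    # 0.28466892
c = 0.5 - a * math.log(4 * a)    # 0.55991073

def hlg_oetf(e):
    """Scene-linear light e in [0, 1] -> HLG signal in [0, 1]."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # gamma segment
    return a * math.log(12 * e - b) + c  # log segment

for e in (0.0, 1 / 12, 0.25, 0.5, 1.0):
    print(f"linear {e:.4f} -> signal {hlg_oetf(e):.4f}")
```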
