HDR projects render bad results when Windows is in SDR display mode

Erkki wrote on 5/24/2023, 11:28 PM

Hi.

Version used: Vegas Pro 20, build 403. Display adapter: NVIDIA RTX 2080 Super. The clips used were shot in S-Log3/S-Gamut3.Cine.

I started experimenting with HDR10 video and did the following:

The Windows 11 display mode was SDR (I had earlier tested on Windows 10 as well).

  1. Set the project mode to HDR10.
  2. Imported the S-Log3 clips and set their media properties to Sony S-Log3.Cine.
  3. Set the preview mode to Rec.2020 Rec.709 limited.
    1. Some colors showed artifacts/color errors.
  4. Rendered the video (NVENC).
    1. The same artifacts appeared on the same computer screen. --> Not the expected quality.
    2. Checked the same video on a calibrated television. --> Same problem.
    3. When rendering the same material as an SDR/8-bit project (using an input LUT for the color transformation), the output was OK with no artifacts.

Then, after some random mouse clicks here and there, I switched Windows to HDR mode by accident and tried the same HDR project again.

  1. No artifacts, and the rendered output looked as good as expected. HDR-powered flowers on the large TV screen looked great.

My questions are:

  1. Is it expected that, while working on 32-bit projects (SDR/HDR), the Windows 11/10 display mode should always be HDR?
    1. If this is not the case, could this somehow be related to the NVIDIA display adapter, i.e. different output depending on display mode?
    2. If this is the case, is this requirement documented somewhere in the Vegas docs?

My expectation was that the display mode should not affect the render output; it would just make color grading more dependent on the video scopes.

Or am I doing something fundamentally wrong in the first place?

Comments

Wolfgang S. wrote on 5/25/2023, 12:59 AM

I started experimenting with HDR10 video and did the following:

The Windows 11 display mode was SDR (I had earlier tested on Windows 10 as well).

The display settings in Windows have to be set to HDR, not SDR, before you start up Vegas. Otherwise Vegas will not enable the HDR preview, since Vegas has to see an HDR monitor connected to your graphics card. In other words, you need an HDR monitor in your system, also because an accurate HDR preview is the only way to see what your grading is doing. It should be a unit capable of 1000 nits.

For the various HDR settings, you can also follow the BRAW tutorial. See here:

https://www.vegascreativesoftware.info/de/tutorials/hdr-grading-of-6k-50p-braw-footage-in-vegas-pro-20-english--138440/

Without an HDR monitor in your system, you will not be able to enable HDR output for the preview - see Fig. 3 in the tutorial.

Set the project mode to HDR10.

That enables the correct HDR settings; the tutorial shows it in Fig. 2.

Please be aware that this puts you into the 32-bit floating point mode in full range, enables ACES as the color management, and sets the view transform to "Rec2020 ST2084 1000 nits (ACES)". These are the correct settings for HDR10, i.e. PQ - you could also use HLG.
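As background on the "ST2084 1000 nits" part: PQ (SMPTE ST 2084) is an absolute encoding of display luminance up to 10,000 cd/m². A minimal sketch of the PQ encoding curve using the published ST 2084 constants (illustrative only, not how Vegas implements its view transform):

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits):
    """Absolute display luminance in cd/m^2 -> non-linear PQ signal (0..1)."""
    y = np.clip(np.asarray(nits, dtype=float) / 10000.0, 0.0, 1.0)
    return ((c1 + c2 * y**m1) / (1.0 + c3 * y**m1)) ** m2

print(pq_encode(100.0))   # SDR reference white -> ~0.508
print(pq_encode(1000.0))  # the 1000-nit target -> ~0.752
```

A 1000-nit highlight already sits at roughly 75% of the PQ signal range, which is why a 1000-nit view transform covers most practical HDR10 grading.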

Imported the S-Log3 clips and set their media properties to Sony S-Log3.Cine.

This is correct. ACES always needs the input transform, in your case S-Log3.Cine.
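For reference, the input transform conceptually decodes the S-Log3 code values back to scene-linear light. A sketch of that decode step using Sony's published S-Log3 formula (the full ACES input transform additionally applies an S-Gamut3.Cine-to-ACES gamut matrix, omitted here):

```python
import numpy as np

def slog3_to_linear(code):
    """Sony S-Log3 code value (0..1) -> scene-linear reflectance,
    per Sony's published S-Log3 formula."""
    code = np.asarray(code, dtype=float)
    return np.where(
        code >= 171.2102946929 / 1023.0,
        10.0 ** ((code * 1023.0 - 420.0) / 261.5) * (0.18 + 0.01) - 0.01,
        (code * 1023.0 - 95.0) * 0.01125 / (171.2102946929 - 95.0),
    )

print(slog3_to_linear(420.0 / 1023.0))  # 18% grey decodes from code 420/1023 -> 0.18
```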

  1. Set the preview mode to Rec.2020 Rec.709 limited.
    1. Some colors showed artifacts/color errors.

These are the wrong settings. You do not want to combine Rec.2020 with 709... you want to combine the Rec.2020 gamut with the transfer curve specified for PQ. So use the settings mentioned above.

  1. Rendered the video (NVENC).
    1. The same artifacts appeared on the same computer screen. --> Not the expected quality.

Maybe a result of the wrong settings, or maybe you pushed the material into clipping, I do not know. With ACES, clipping sets in later and is less likely than with an input LUT, thanks to the wide gamut.

    1. Checked the same video on a calibrated television. --> Same problem.
    2. When rendering the same material as an SDR/8-bit project (using an input LUT for the color transformation), the output was OK with no artifacts.

You would need an input LUT from S-Log3.Cine to HDR10. Such a LUT is available, but not among the LUTs built into Vegas; you could, however, import one. I would not recommend that approach anyway, and certainly not with 8-bit project settings (HDR should be 10-bit, which means working in the 32-bit floating point mode in Vegas). In addition, ACES as color management is superior to such LUTs: input LUTs may give worse results than grading in the wide ACES gamut - that is why I would recommend ACES.
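For illustration, applying such a .cube LUT boils down to a 3D table lookup with interpolation; a self-contained sketch (assuming NumPy/SciPy, and not the engine Vegas itself uses):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def read_cube(path):
    """Parse a .cube 3D LUT. Red varies fastest in the file,
    so the reshaped table is indexed [blue][green][red]."""
    size, rows = 0, []
    for line in open(path):
        line = line.strip()
        if line.upper().startswith("LUT_3D_SIZE"):
            size = int(line.split()[1])
        elif line and (line[0].isdigit() or line[0] in ".-"):
            rows.append([float(v) for v in line.split()])
    return np.asarray(rows).reshape(size, size, size, 3)

def apply_lut(rgb, table):
    """rgb: float array of shape (..., 3) in 0..1 -> transformed rgb."""
    size = table.shape[0]
    axes = (np.linspace(0.0, 1.0, size),) * 3
    interp = RegularGridInterpolator(axes, table)
    return interp(np.clip(rgb, 0.0, 1.0)[..., ::-1])  # table is indexed (B, G, R)

# e.g. graded = apply_lut(frame, read_cube("SL3SG3Ctos709.cube"))
```

The practical difference stands either way: a 3D LUT bakes the whole transform into a fixed-resolution grid, while ACES keeps the image in wide-gamut 32-bit floating point until the output stage.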

Then, after some random mouse clicks here and there, I switched Windows to HDR mode by accident and tried the same HDR project again.

  1. No artifacts, and the rendered output looked as good as expected. HDR-powered flowers on the large TV screen looked great.

I assume that you now had the correct project settings, as described above. That is why everything worked.

My questions are:

  1. Is it expected that, while working on 32-bit projects (SDR/HDR), the Windows 11/10 display mode should always be HDR?
    1. If this is not the case, could this somehow be related to the NVIDIA display adapter, i.e. different output depending on display mode?
    2. If this is the case, is this requirement documented somewhere in the Vegas docs?

Yes, this is expected.

My expectation was that the display mode should not affect the render output; it would just make color grading more dependent on the video scopes.

Or am I doing something fundamentally wrong in the first place?

Summary: Vegas assumes that you have a true HDR preview available, which will only be the case if you have an HDR monitor in your system. Grading to HDR without an HDR monitor means that you cannot see what you are doing. The scopes are an important indication, but they will not really show you the picture.

As said, it should be an HDR monitor with a technical specification of 1000 nits. HDR400 is not really enough, even if you may be able to switch such a monitor to HDR mode. You will not really be able to see the full grading result, since such a unit will either clip or use tone mapping to squeeze the 1000 nits into what it is capable of showing. For example, I work with the ASUS ProArt monitor PA32UCG-K, which can sustain 1000 nits over about 100% of the screen surface and reaches 1600-1700 nits peak luminance. But this monitor is quite expensive, and you can surely find cheaper solutions on the market, for example this one:

https://www.philips.de/c-p/27B1U7903_00/professional-monitor-4k-uhd-mini-led-thunderbolt-tm-4-monitor#see-all-benefits
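To make the clipping-versus-tone-mapping distinction concrete, here is a toy comparison for a 400-nit display (illustrative only; real displays use standardized curves such as the BT.2390 EETF rather than this exact rolloff):

```python
import numpy as np

def hard_clip(nits, peak):
    """What a naive display does: all detail above `peak` is lost."""
    return np.minimum(nits, peak)

def soft_rolloff(nits, peak, knee_frac=0.75):
    """Toy tone mapping: pass values up to a knee unchanged,
    then compress everything above it into the remaining headroom."""
    nits = np.asarray(nits, dtype=float)
    knee = knee_frac * peak
    over = np.maximum(nits - knee, 0.0)
    return np.minimum(nits, knee) + (peak - knee) * (1.0 - np.exp(-over / (peak - knee)))

highlights = np.array([300.0, 600.0, 1000.0])
print(hard_clip(highlights, 400.0))     # [300. 400. 400.] - 600 and 1000 nits look identical
print(soft_rolloff(highlights, 400.0))  # ~[300. 395. 400.] - highlight separation survives
```

Either way, what you see on a sub-1000-nit unit is not the signal you are actually grading.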

 

Last changed by Wolfgang S. on 5/25/2023, 5:57 AM, changed a total of 1 times.

Desktop: PC AMD 3960X, 24 x 3.8 GHz * RTX 3080 Ti (12 GB) * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb LAN * Resolve Studio 18 * Edius X * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * i9 12900H with iGPU Iris Xe * 32 GB RAM * GeForce RTX 3070 Ti 8 GB * internal HDR preview on the laptop monitor * Blackmagic Ultrastudio 4K Mini

HDR monitor: ProArt Monitor PA32UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX) card in an old XP system. Edius 4.6 and other systems

Erkki wrote on 5/25/2023, 10:54 PM

Hi. Thanks for the clarification and for confirming the need for HDR mode! 🙂

I used Rec.709 to check whether there was something wrong with the clips themselves.

Turning on Windows HDR mode seems to be the key, but I still don't understand how it affects the render output. When testing the clips in an 8-bit project (using Sony's SL3SG3Ctos709.cube as the input LUT), the output is OK (no color artifacts) whether Windows is in SDR or HDR display mode. No color grading was done before rendering.

When doing the same with an HDR project and the clip properties set to S-Log3.Cine while Windows is still in SDR mode, I would expect a roughly similar rendered result without artifacts. But with the view transform set to Rec.2020 Rec.709 limited or Rec.709, artifacts are visible in the preview and in the rendered output.

And when turning on Windows HDR mode and repeating the above, the output is valid.

--> Your display settings alter the rendered output even though the project is not altered in any way.

Based on your recommendation, I will consider replacing my current display, a BenQ EW3270U with a peak of 308 nits (according to the Windows display properties dialog), so that my HDR preview is up to date as well.

Wolfgang S. wrote on 5/26/2023, 12:30 AM

You can also grade the log footage to Rec.709, i.e. SDR, to use the higher dynamic range of log for SDR, and then use a Rec.709 display for the preview. That works, but it is not grading to HDR.

The combination Rec.2020/Rec.709 really does not make much sense. Even a workflow targeting Rec.709 does not use it.

Desktop: PC AMD 3960X, 24 x 3.8 GHz * RTX 3080 Ti (12 GB) * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb LAN * Resolve Studio 18 * Edius X * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * i9 12900H with iGPU Iris Xe * 32 GB RAM * GeForce RTX 3070 Ti 8 GB * internal HDR preview on the laptop monitor * Blackmagic Ultrastudio 4K Mini

HDR monitor: ProArt Monitor PA32UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX) card in an old XP system. Edius 4.6 and other systems