Version used: VEGAS Pro 20, build 403. Display adapter: NVIDIA RTX 2080 Super. The source clips were recorded in S-Log3/S-Gamut3.Cine.
I started experimenting with HDR10 video and did the following:
The Windows 11 display mode was SDR (I had tested the same earlier on Windows 10 as well).
- Set project mode to HDR10.
- Imported the S-Log3 clips and set their media properties to Sony S-Log3/S-Gamut3.Cine.
- Set the preview mode to Rec. 2020 709 limited.
- Some colors showed artifacts/color errors in the preview.
- Rendered the video (NVENC).
- The same artifacts appeared on the same computer screen. --> Not the expected quality.
- Checked the same video on a calibrated television. --> Same problem.
- When rendering the same material from an SDR/8-bit project (using an input LUT for the color transformation), the output was fine, with no artifacts.
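As a side note, one way to rule out a metadata problem (as opposed to a pixel-level problem) in the rendered file is to probe it with ffprobe. A minimal sketch, assuming ffprobe is installed and the NVENC output file is named `render.mp4` (a placeholder name):

```shell
# Show the video stream's pixel format and color metadata
# ("render.mp4" is a placeholder for the rendered output file).
ffprobe -v error -select_streams v:0 \
  -show_entries stream=pix_fmt,color_space,color_transfer,color_primaries \
  -of default=noprint_wrappers=1 render.mp4

# For a proper HDR10 file the fields would typically read:
#   pix_fmt=yuv420p10le
#   color_space=bt2020nc
#   color_transfer=smpte2084
#   color_primaries=bt2020
```

If those fields are correct but the picture still shows artifacts, the problem is in the rendered pixels themselves rather than in the signaling.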
Then, after some random clicking around, I accidentally switched Windows into HDR mode and tried the same HDR project again.
- No artifacts, and the rendered output looked as good as expected. HDR flowers on a large TV screen looked great.
My questions are:
- Is it expected that for 32-bit (SDR/HDR) projects the Windows 11/10 display mode should always be HDR?
- If not, could this somehow be related to the NVIDIA display adapter, i.e. different output depending on the display mode?
- If it is, is this requirement documented somewhere in the VEGAS documentation?
My expectation was that the display mode should not affect the rendered output; it should just make color grading more dependent on the video scopes.
Or am I doing something fundamentally wrong in the first place?