I hesitate to start another thread about pixel format, but I've run into an issue that I haven't been able to find addressed elsewhere on the forum.
Using Vegas Pro 18, Build 482. I shoot and edit 10-bit video. Up until now, I've been editing my projects in 8-bit full range for best system performance. Then, when it's time to render, I switch over to 32-bit full range for better quality and less banding.
PROBLEM: Sometimes when I switch the pixel format setting from 8-bit full range to 32-bit full range, the image in the preview window stays exactly the same, and the final render matches the preview. Success. But other times, with the same project and the same footage, the image in the preview window unpredictably gets dimmer when I switch over to 32-bit full range, and the histogram changes as well (a rough numeric sketch of the kind of shift I'm seeing follows the settings list below). I'm being very careful to keep my settings the same between 8-bit and 32-bit floating point, which are:
Compositing gamma: 2.2
ACES version: 1.2
ACES color space: Default
View transform: Off
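For context on what I mean by "dimmer": here's a rough numeric sketch, in Python, of what a single extra gamma-2.2 decode does to 8-bit code values. This is purely my assumption about the kind of mismatch that could be happening somewhere in the 32-bit pipeline, not a claim about Vegas's actual internal math, but the result looks like what I see: mid-tones drop hard while black and white stay pinned, and the histogram shifts toward the shadows.

# Hypothetical illustration only: one unwanted gamma-2.2 decode applied to
# already-encoded, normalized 8-bit values. Not Vegas's actual pipeline.

def extra_decode(v, gamma=2.2):
    # v is a normalized [0..1] code value
    return v ** gamma

for code in (0, 64, 128, 192, 255):
    v = code / 255.0
    print(f"{code:3d} -> {round(extra_decode(v) * 255):3d}")

# Prints: 0 -> 0, 64 -> 12, 128 -> 56, 192 -> 137, 255 -> 255.
# Mid-gray dropping from 128 to ~56 is a very visible dimming.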
Is there something I’m not understanding in this process? Or is it possible there’s some sort of glitch here? Has anyone else experienced this?
PROBLEM 2: If, before rendering, I instead switch the pixel format from 8-bit full range to 32-bit video levels, the image in the preview window gets dimmer and less contrasty, as is to be expected. However, when I render out the video, the final video looks ALMOST the way I want it to, except some of the very darkest black detail is crushed and lost. So this isn't a good workaround.
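To make the black-crushing concrete, here's the generic studio-range (16-235) to full-range expansion for 8-bit luma, again only as an illustration of the levels math and not a claim about what Vegas actually does. If a full-range signal ends up being expanded as though it were studio range somewhere in the chain, everything at or below code 16 collapses to the same black:

# Hypothetical illustration only: expanding values as if they were studio
# range (16-235) when they are actually full range (0-255).

def studio_to_full(code):
    v = (code - 16) * 255.0 / (235 - 16)
    return max(0, min(255, round(v)))

for code in (0, 8, 16, 24, 32, 235, 255):
    print(f"{code:3d} -> {studio_to_full(code):3d}")

# Prints: 0 -> 0, 8 -> 0, 16 -> 0, 24 -> 9, 32 -> 19, 235 -> 255, 255 -> 255.
# All the shadow detail below code 16 lands on the same black, which looks
# exactly like the crushed blacks I'm getting in the render.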