I was trying to wrap my mind around what Vegas Pro is doing when you set the project bit depth to 32-bit and, maybe like some others, found myself confused and probably misinformed about what is going on under the hood. I found the article below, and even though it references a software program I will not mention by name, I think it explains the concept well, and I assume a lot of the same processes apply to Vegas. The scenario I was initially trying to fully understand is what happens when you import 10-bit source footage, such as Sony's own AVC intra-frame codec, set the project bit depth to 32-bit before rendering, and then render out to Magix Intermediate XQ.
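Just so it's clear how I'm picturing it, here is a rough sketch of my mental model of the pipeline in Python. This is only an illustration; the full-range normalization and the rounding are my own assumptions, not anything I actually know about Vegas internals.

    # Rough mental model only -- not actual Vegas code.
    # A 10-bit source sample (0-1023) is promoted to a 32-bit float for
    # processing, then quantized to whatever bit depth the encoder writes.

    def to_float32(code_value_10bit: int) -> float:
        # Normalize a 10-bit code value to 0.0-1.0 (assuming full range).
        return code_value_10bit / 1023.0

    def quantize(value: float, bits: int) -> int:
        # Quantize a normalized float back to an integer code value.
        max_code = (1 << bits) - 1
        return round(max(0.0, min(1.0, value)) * max_code)

    sample_10bit = 517                  # an arbitrary 10-bit source value
    f = to_float32(sample_10bit)        # full precision while the project is 32-bit
    print(quantize(f, 8))               # what an 8-bit encoder would store: 129
    print(quantize(f, 10))              # what a 10-bit encoder could store: 517

If the renderer only writes 8 bits, the extra precision from the 32-bit stage gets thrown away at that final quantize step, which is really what my question comes down to.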
My question was: is the rendered video file 8-bit or 10-bit? Would it ever be 10-bit, for instance, if you rendered to an MXF wrapper? When I check Magix Intermediate XQ in MediaInfo, it doesn't list a Bit Depth at all, whereas for the other (non-Magix Intermediate) codecs it shows a Bit Depth of 8. The scenario I'm asking about most closely matches Scenario #5 at the bottom of the linked article.
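For what it's worth, this is roughly how I've been checking the rendered files, using the pymediainfo Python wrapper around the MediaInfo library (the file name is just a placeholder):

    # Sketch of how I check the bit depth MediaInfo reports for a render.
    # Assumes the pymediainfo package (and the MediaInfo library) is installed;
    # "render.mov" is just a placeholder file name.
    from pymediainfo import MediaInfo

    media_info = MediaInfo.parse("render.mov")
    for track in media_info.tracks:
        if track.track_type == "Video":
            # bit_depth comes back as None when MediaInfo doesn't report it,
            # which is what I see with Magix Intermediate XQ.
            print(track.format, track.bit_depth)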
I'm not sure of the answer, but I do want to understand it. I want to thank those on this forum who take the time to share their knowledge and explain these things. I do try to figure things out on my own before posting here, and I don't mind researching and testing things myself.
http://blogs.adobe.com/VideoRoad/2010/06/understanding_color_processing.html