10-, 12-, and 16-bit DPX sequences

vtxrocketeer wrote on 8/18/2016, 12:08 PM
My FS-700 outputs a 12-bit 4k RAW stream to my Shogun, on which I record cDNG sequences that I subsequently grade in Resolve after cutting HD proxies in Vegas.

Trying out a potential new workflow to preserve as much color bit depth as possible until final compression (within Vegas on my round trip), I decided to export DPX sequences from Resolve. My choices are 10, 12, and 16 bits. Vegas has no problem opening 10- and 16-bit DPX sequences, but it doesn't recognize (and can't open) 12-bit sequences. What accounts for this apparent discrepancy?
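
In case it helps with diagnosis, here's a minimal Python sketch that dumps the two header fields that distinguish DPX variants (bit size and packing), using the field offsets from the published SMPTE 268M header layout. My unconfirmed hunch is that the packing variant is what trips Vegas up, since 12-bit DPX can be written either tightly packed or padded to 16 bits per component:

```python
import struct
import sys

def dpx_element_info(path):
    """Read the first image element's descriptor, bit size, and packing
    fields from a DPX header (SMPTE 268M offsets: descriptor at byte
    800, bit size at 803, packing at 804)."""
    with open(path, "rb") as f:
        header = f.read(808)
    magic = header[:4]
    if magic == b"SDPX":
        endian = ">"   # big-endian file
    elif magic == b"XPDS":
        endian = "<"   # little-endian file
    else:
        raise ValueError(f"not a DPX file: bad magic {magic!r}")
    descriptor = header[800]   # 50 = RGB
    bit_size = header[803]     # 10, 12, or 16 here
    (packing,) = struct.unpack(endian + "H", header[804:806])
    # packing: 0 = packed tightly, 1 = filled to 32-bit words (method A),
    #          2 = filled to 32-bit words (method B)
    return descriptor, bit_size, packing

if __name__ == "__main__":
    desc, bits, pack = dpx_element_info(sys.argv[1])
    print(f"descriptor={desc} bit_size={bits} packing={pack}")
```

Running it on a 10-bit frame Vegas opens and a 12-bit frame it rejects, then comparing the packing values, would at least show whether the 12-bit files differ in more than just bit size.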

And why do I care? My thinking is that because the source video is 12 bits, I want to keep a 12-bit workflow: rendering from Resolve at 10 bits throws away information, and rendering at 16 bits generates unnecessarily large files.
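
As a back-of-the-envelope check on the file-size half of that argument, here's a sketch assuming 4096x2160 RGB frames, 10-bit filled to 32-bit words (the common method A layout), and 12-bit padded to 16 bits per component; Resolve's actual packing choices may differ:

```python
# Approximate DPX frame payload sizes for 4096x2160 RGB, ignoring the
# ~8 KB header. The 10-bit and 12-bit layouts below are assumptions.
pixels = 4096 * 2160
mib = 1024 * 1024

size_10 = pixels * 4 / mib      # 3x10 bits filled into one 32-bit word/pixel
size_12 = pixels * 3 * 2 / mib  # 12 bits padded to 16 per component
size_16 = pixels * 3 * 2 / mib  # 16 bits per component

print(f"10-bit: {size_10:.1f} MiB/frame")   # ~33.8
print(f"12-bit: {size_12:.1f} MiB/frame")   # ~50.6
print(f"16-bit: {size_16:.1f} MiB/frame")   # ~50.6
```

If that 12-bit assumption holds, a 12-bit frame takes the same space on disk as a 16-bit one, so 16 bits might not actually cost me anything over 12 anyway.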

I'm OK with someone pointing out flaws in my proposed workflow and thinking. In addition, I do not have a 10-bit or higher display, so I'm cognizant of the fact that I might not be able to discern visual differences between varying bit depths of DPX sequences. Mainly, I want to know why Vegas does not recognize 12-bit DPX sequences whilst 10- and 16-bit sequences are fine.
