Assuming I can capture 10-bit video over SDI, how will Vegas deal with it? Will it read all 10 bits of the data and do its internal calculations using all 10 bits, or will it truncate to 8 bits and then perform the calculations?
I see no way to output 10-bit video, so I guess we're still stuck with a pretty serious loss in the process; at least knowing all the available data is being used would be something. I was hoping that V6 would free us of the 8-bit limit, but it seems not.
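Just to illustrate what I mean by the loss (a rough Python sketch with made-up values, not Vegas's actual pipeline): if an app truncates to 8 bits before doing something like a 2x gain on shadow detail, you end up with noticeably fewer distinct output levels than if the math is done at 10 bits and only the final result is reduced to 8 bits.

```python
import numpy as np

# Hypothetical illustration only: apply a 2x gain to shadow detail,
# comparing full 10-bit math against truncating to 8 bits first.

# Simulated 10-bit shadow ramp: 32 distinct code values (0..31 of 0..1023)
src_10bit = np.arange(0, 32, dtype=np.float64)

# Path A: calculate at 10-bit precision, convert to 8 bits at the end
gained_10 = np.clip(src_10bit * 2.0, 0, 1023)
out_a = np.round(gained_10 / 1023 * 255)

# Path B: truncate to 8 bits first (as an 8-bit-internal app would), then gain
src_8bit = np.round(src_10bit / 1023 * 255)   # 32 input levels collapse to ~9
out_b = np.clip(src_8bit * 2.0, 0, 255)

print("distinct levels, 10-bit math then convert:", len(np.unique(out_a)))
print("distinct levels, truncate to 8 bits first:", len(np.unique(out_b)))
```

Path A keeps roughly twice as many distinct levels as Path B in this toy case, which is exactly the kind of banding I'm worried about on film scans.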
Yes, I understand that 8 bits are 'adequate' for most things, but I'm looking at working with 35mm film scans for digital projection in cinemas.
And please don't anyone give me yet another lecture about how much storage space and bandwidth I'll need!
Bob.