I have googled and googled till I cannot google no more!
I am trying to find the low-down on x.v.Color - I know that it uses the full 0-255 values in an 8-bit word to define color (as opposed to 16-235 in BT.601 and BT.709) - but what I am trying to figure out is why Vegas needs to be set to a 32-bit project in order to 'expose' all the values in the video stream.
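For reference, the range difference I mean is just a rescale - a rough sketch of the 8-bit mapping (the 16/235/219 constants come from BT.601/709; the function name is my own):

```python
def limited_to_full(y):
    """Map a BT.601/709 limited-range luma sample (16-235)
    onto the full 0-255 range that x.v.Color uses."""
    return round((y - 16) * 255 / 219)

print(limited_to_full(16))   # black: 16 maps to 0
print(limited_to_full(235))  # white: 235 maps to 255
```

So in limited range, 219 distinct code values carry the black-to-white span that full range spreads across all 256.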
My camera is a simple Sony AVCHD cam... it offers the x.v.Color option, and of course compresses to an AVC stream... but isn't AVC still only an 8-bit codec? If so, why would Vegas need to be changed from 8-bit to 32-bit? If I import an 8-bit MPEG2 stream or an 8-bit DV stream, changing the project from 8 to 32 bit produces no visible change. However, with my x.v.Color streams you can see a change when switching the project's bit depth. So does that mean the stream from my cam is more than 8 bits (i.e. a 10-bit stream)?
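My (unconfirmed) guess at what is going on: an 8-bit pipeline that assumes studio range clips anything outside 16-235, while a 32-bit float pipeline can carry those out-of-range values through untouched. Sketched with made-up sample values (numpy just for convenience):

```python
import numpy as np

# Full-range 8-bit samples, as an x.v.Color stream might deliver them
samples = np.array([5, 100, 250], dtype=np.uint8)

# Hypothetical 8-bit studio-range pipeline: values outside 16-235 are clipped
clipped = np.clip(samples, 16, 235)

# Hypothetical 32-bit float pipeline: normalize around studio range without
# clipping, so below-black and above-white values survive as <0.0 and >1.0
as_float = (samples.astype(np.float32) - 16.0) / 219.0
```

If something like this is right, the stream would still be 8-bit; the 32-bit project would just be preserving values the 8-bit path throws away.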
Just trying to get my head around how Vegas deals with x.v.Color - even more so if it is supposed to still be 8-bit.
Chris