I have noticed that Sony Vegas Pro 10 detects the wrong colour depth for AVI clips encoded with the Lagarith lossless codec, the x264vfw codec, and the Xvid codec. For instance, when I import a 1280x720 24-bit RGB Lagarith lossless clip into Vegas, the Media tab under File Properties reports the clip as 32-bit RGB.
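To narrow down whether Vegas is misreading the header or the file itself actually declares 32 bit, you can check the bit depth stored in the AVI's own stream header. Below is a minimal Python sketch (my own diagnostic, not anything Vegas-related) that walks the RIFF structure and prints the biBitCount field from the video stream's BITMAPINFOHEADER; it assumes the AVI headers fit in the first 1 MiB, which they normally do:

```python
import struct
import sys

def chunks(data, pos, end):
    """Yield (fourcc, payload_offset, payload_size) for RIFF sub-chunks."""
    end = min(end, len(data))
    while pos + 8 <= end:
        cid, size = struct.unpack_from("<4sI", data, pos)
        yield cid, pos + 8, size
        pos += 8 + size + (size & 1)  # chunks are word-aligned

def video_bit_depth(path):
    # AVI headers live at the start of the file; 1 MiB is plenty
    with open(path, "rb") as f:
        data = f.read(1 << 20)
    # layout: RIFF('AVI ' LIST('hdrl' ... LIST('strl' strh strf ...) ...) ...)
    for cid, off, size in chunks(data, 12, len(data)):
        if cid == b"LIST" and data[off:off + 4] == b"hdrl":
            for sid, soff, ssize in chunks(data, off + 4, off + size):
                if sid == b"LIST" and data[soff:soff + 4] == b"strl":
                    is_video = False
                    for tid, toff, tsize in chunks(data, soff + 4, soff + ssize):
                        if tid == b"strh":
                            # fccType is the first 4 bytes of the stream header
                            is_video = data[toff:toff + 4] == b"vids"
                        elif tid == b"strf" and is_video:
                            # BITMAPINFOHEADER: biBitCount is the WORD at offset 14
                            return struct.unpack_from("<H", data, toff + 14)[0]
    return None

if __name__ == "__main__":
    print("declared bit depth:", video_bit_depth(sys.argv[1]))
```

In my case the script prints 24 for the Lagarith clip, so the file itself declares 24-bit RGB and Vegas appears to be misreporting it.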
Can anyone confirm whether this is also the case in other versions of Vegas?
The reason I am pointing this out is that this issue seems to prevent smart rendering (passing matching segments through without recompression) for these types of clips.
The odd thing is that if you render the clip out as an AVI with the Lagarith lossless codec and then import the rendered clip back into Vegas, the correct colour depth is detected. It seems as if Vegas only detects the correct colour depth for Lagarith clips that were produced by Vegas itself.
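Using the sketch above, you could compare what the original clip and the Vegas-rendered clip each declare (the file names here are placeholders): if both print 24, the files are equivalent at the header level and the difference is purely in how Vegas reads them.

```python
# hypothetical paths; substitute your own source and Vegas-rendered files
for name in ("original_lagarith.avi", "vegas_rendered_lagarith.avi"):
    print(name, "->", video_bit_depth(name))
```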