This got me stumped...
I've been checking some of the files I rendered in Vegas with a little utility called AVIcodec (it checks MPEG files too). Its main purpose is to identify which codec was used, but it also gives some other info like the compression, bitrate, etc.
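A rough way to cross-check the codec and bitrate independently of AVIcodec is with ffprobe (part of ffmpeg). This is just a sketch, assuming ffprobe is installed and on the PATH, and "render.avi" is only a placeholder filename:

```python
import json
import subprocess

def probe_video(path):
    """Report codec, bitrate, and frame size for the first video stream.

    Illustrative helper only; assumes ffprobe (from ffmpeg) is installed.
    """
    cmd = [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=codec_name,bit_rate,width,height",
        "-of", "json",
        path,
    ]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    stream = json.loads(out.stdout)["streams"][0]
    print(f"{path}: codec={stream.get('codec_name')}, "
          f"bitrate={stream.get('bit_rate')} bps, "
          f"{stream.get('width')}x{stream.get('height')}")

probe_video("render.avi")  # placeholder filename
```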
Anyhow, whether I check a Vegas-rendered project made with the NTSC DV AVI template or the DV MPEG template, and even though the quality slider was definitely all the way to the right when it was created, this program reports the video at 98%. I've checked twenty files so far and they're all at 98% quality, regardless of whether they're AVI or MPEG.
What am I missing? Maybe it's because the templates are set to a max of 8 Mbps?