Render Quality

AlanC wrote on 5/21/2004, 6:01 AM
I've just rendered a short video, 1 min 2 secs, made up of 62 BMP images.

The first render was done at Best quality (Windows AVI) and produced a file of 230,295 KB.
The second render was done at Good quality (Windows AVI) and produced a file of 230,295 KB.
The third render was done at Preview quality (Windows AVI) and produced a file of 230,295 KB.

I would have expected a much smaller file size with each degradation in quality.

Am I missing something obvious?

Alan

Comments

logiquem wrote on 5/21/2004, 6:06 AM
Yup! Render quality has no relation to data rate. It refers to the degree of precision Vegas uses to calculate frames.

If you want to adjust encoding data rate, go to the video tab in the render settings window.
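A rough size check shows why all three files come out identical. This sketch assumes the "Windows AVI" template used here is Vegas's default DV template (a constant-data-rate codec) at PAL 25 fps with 48 kHz 16-bit stereo PCM audio; the exact codec and frame rate are assumptions for illustration, but the point is that the render-quality setting never enters the calculation.

```python
# Back-of-the-envelope file-size estimate for a DV AVI render.
# DV has a fixed data rate, so the size depends only on clip length,
# never on the Best/Good/Preview render-quality setting.

DV_FRAME_BYTES = 144_000              # one PAL DV frame (NTSC is ~120,000 at 29.97 fps)
FPS = 25                              # assumed PAL frame rate
AUDIO_BYTES_PER_SEC = 48_000 * 2 * 2  # 48 kHz, 16-bit samples, stereo PCM
DURATION_SEC = 62                     # 1 min 2 sec project

total_bytes = (DV_FRAME_BYTES * FPS + AUDIO_BYTES_PER_SEC) * DURATION_SEC
print(f"~{total_bytes / 1024:,.0f} KB")   # ~229,600 KB -- close to the 230,295 KB reported
```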
AlanC wrote on 5/21/2004, 6:11 AM
Thanks for the very quick response.

So why would anybody choose to render at anything less than Best? (Render times didn't seem any different between the three settings, but that could just be because it was only a 1-minute video.)

Alan
jetdv wrote on 5/21/2004, 6:14 AM
Because Good really is faster than Best, and Best is only useful in a few situations (such as working with photos).
AlanC wrote on 5/21/2004, 6:16 AM
Thanks guys, sorry to sound so thick!

Alan