Comments

johnmeyer wrote on 9/20/2004, 3:50 PM
Is there a big loss in quality as the bitrate is lowered?

Yes. Lower bitrate = lower quality.

If so, how noticeable is it?

It depends on the material you are encoding and on your tolerance for the artifacts created by lower bitrates. Noisy video (such as footage shot in low light, or transferred from low-quality sources like VHS tape) requires a higher bitrate to look as good as the original, because lots of bits are needed to encode all that random "snow" overlaying the real picture (which is why professional encoders apply noise reduction to poor footage prior to encoding). Fast-moving video also requires a higher bitrate to maintain quality than talking-head footage does.

What is the lowest bitrate setting that is recommended for a good quality DVD?

See the "it depends" paragraph above. Everyone will give you a different answer, even after writing their own version of "it depends." For me (I am fussy about quality), I do not like to encode DV video at an average bitrate below 6,000 kbps, although I have gotten reasonably acceptable results going as low as 4,500 kbps (using an external MainConcept encoder with lots of tweaked settings and its two-pass feature). Some people say that you get compatibility problems if you encode at average rates above 7,000 kbps, but I often encode at an 8,000 kbps average bitrate and have not had problems with the discs I have sent out.

Another way of putting this: when using DV video as the source, I try never to put more than 90 minutes on one DVD (single-sided, single-layer, 4.7 GB).
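The 90-minute rule of thumb falls out of simple arithmetic: disc capacity in bits divided by runtime gives the total average bitrate you can afford. Here is a minimal sketch of that budget calculation; the 192 kbps audio stream and the 4% muxing/menu overhead are my assumptions for illustration, not figures from the post above.

```python
# Rough DVD bitrate budget check.
# Assumptions (not from the original post): a 192 kbps AC-3 audio
# stream and ~4% of the disc reserved for muxing overhead and menus.

DISC_BYTES = 4_700_000_000   # DVD-5 capacity (decimal gigabytes)
AUDIO_KBPS = 192             # assumed audio bitrate

def max_video_kbps(minutes: float, overhead: float = 0.04) -> float:
    """Highest average video bitrate (kbps) that fits the runtime."""
    usable_bits = DISC_BYTES * 8 * (1 - overhead)
    total_kbps = usable_bits / (minutes * 60) / 1000
    return total_kbps - AUDIO_KBPS

for mins in (60, 90, 120):
    print(f"{mins} min -> ~{max_video_kbps(mins):.0f} kbps average video")
```

Under these assumptions, 90 minutes leaves roughly 6,500 kbps of average video bitrate, which is consistent with not wanting to drop below a 6,000 kbps average for DV source material.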