Rendering an MPEG-2 master for DVD. To avoid nasty dropouts, I am forced to use a constant bitrate. 8 Mbps constant looks good; 9 Mbps a bit better. Is 9 too much? Will it cause compatibility issues? The DVD will be replicated at a professional house.
It may be with an older DVD player. If you're using a Blu-ray player to play your DVD (which IMO everyone should), then definitely not. If your player is oldish, and if you've got the time, then as mentioned above try a two-pass variable bitrate. Something like 9.5 max, 6 average, 4 min. Don't be tempted to go too low on the min; it produces poor fades, etc.
Thanks OldSmoke and Arthur for the advice. Unfortunately, in my case, only a constant bitrate avoids blocky dropouts (I have a pretty crazy shot that makes the variable bitrate encoder go haywire even with the settings mentioned here). However, the movie is 63 minutes, and a 1-minute file @ 8 Mbps is 62,562 KB, so when combined with uncompressed PCM audio, I think it will go over the 4.37 GB limit of a single-layer DVD...
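A quick back-of-the-envelope check bears this out. The sketch below assumes standard DVD audio of 16-bit/48 kHz stereo PCM (1,536 kbps) and takes the 62,562 KB per-minute figure at face value (with KB = 1024 bytes):

```python
# Rough size check: measured video size plus uncompressed PCM audio
# vs. single-layer DVD capacity. Assumes 16-bit / 48 kHz stereo PCM.

minutes = 63
video_per_min_bytes = 62_562 * 1024        # measured 1-minute render @ 8 Mbps CBR
pcm_bytes_per_sec = 1_536_000 // 8         # 192,000 bytes/s of PCM audio

video_bytes = video_per_min_bytes * minutes
audio_bytes = pcm_bytes_per_sec * 60 * minutes
total_bytes = video_bytes + audio_bytes

dvd5_bytes = 4_700_372_992                 # single-layer DVD-5 (4.37 GiB)

print(f"total: {total_bytes / 2**30:.2f} GiB, "
      f"capacity: {dvd5_bytes / 2**30:.2f} GiB")
```

Even before muxing overhead, the total comes out roughly 60 MiB over capacity, so the concern is justified.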
My average is never higher than 8, peak 9, minimum 1.
In the past I've done a 10-second DVD at 7, then 8, then 9, to see if I could actually notice any blockiness or mosquito noise in taxing scenes (tiny waves on the water, lots of tiny leaves on a tree during a pan). To me 8 looks pretty good, and some years ago I had fast-play issues with a higher bitrate on one of my DVD players.
As others have brought up, what's going on with this "blockiness"? VBR isn't necessary for projects shorter than roughly one hour anyway, but it shouldn't be causing any artifacts.
And bitrate calculation is a horse that has been beaten to death on this board for years, but the quickie formula I use is: 600 / running time in minutes = total bitrate in Mbps. Then subtract your audio bitrate from the total (usually 0.2 Mbps for AC-3) and you get the average video bitrate. Really simple stuff.
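Applied to the 63-minute program in question, that rule of thumb works out as follows (the 0.2 Mbps AC-3 allowance is the figure mentioned above):

```python
# The quickie DVD bitrate rule of thumb:
# total Mbps = 600 / running time in minutes, then subtract audio.

minutes = 63
audio_mbps = 0.2                  # typical AC-3 track, ~192 kbps

total_mbps = 600 / minutes        # ~9.52 Mbps combined budget
video_mbps = total_mbps - audio_mbps

print(f"total {total_mbps:.2f} Mbps, video {video_mbps:.2f} Mbps")
```

Note that for a short program like this the formula lands above what players can actually sustain, so in practice the peak would still be capped around 9 Mbps as discussed above.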
... when combined with uncompressed PCM audio, I think it will go over the 4.37 GB limit on a single layer DVD.
The bitrate calculator I use says you're right :(
A 63 min. video with PCM audio comes out to a CBR of 7,552,000 bps.
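As a sanity check on that calculator result, the raw capacity math gives the theoretical ceiling; calculators then reserve room for muxing overhead and a safety margin (an assumption on my part, not the calculator's documented method), which is why they land lower:

```python
# Theoretical upper bound on video CBR for 63 min with PCM audio on a DVD-5.

seconds = 63 * 60
dvd5_bits = 4_700_372_992 * 8
pcm_bps = 1_536_000                   # 16-bit / 48 kHz stereo PCM

max_total_bps = dvd5_bits / seconds   # ~9.95 Mbps combined
max_video_bps = max_total_bps - pcm_bps

print(f"theoretical video ceiling: {max_video_bps / 1e6:.2f} Mbps")
# A real calculator lands lower (e.g. 7,552,000 bps) once muxing
# overhead and a safety margin are reserved.
```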
Is there a reason you're not using AC-3? You can use a very high bitrate, still use a CBR of 8,000,000 and it will sound just fine.
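To put numbers on the AC-3 suggestion: assuming 448 kbps AC-3 (the highest rate the DVD-Video spec allows, and still transparent for most material) versus 1,536 kbps PCM, the savings over 63 minutes are substantial:

```python
# Space reclaimed over 63 minutes by switching from PCM to high-bitrate AC-3.

seconds = 63 * 60
pcm_bps = 1_536_000
ac3_bps = 448_000        # assumed rate; DVD-Video allows AC-3 up to 448 kbps

saved_bytes = (pcm_bps - ac3_bps) // 8 * seconds
print(f"saved ~{saved_bytes / 2**20:.0f} MiB")
```

That's roughly 490 MiB freed up, i.e. about 1 Mbps of the mux budget handed back to the video stream.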
@OldSmoke: The dropouts are visible in the actual file when played back on Vegas' timeline. And yes, turning the GPU off doesn't help. The shot is very complex for the encoder; only CBR works, and I've tried everything. It's a glowing white angel disappearing in a flash of white that eclipses the entire frame, over a background of blurry moving smoke. It's this apparent loss of detail (the angel becomes harder to see as the white takes over) that makes the VBR encoder think it can use a lower bitrate, which results in severe dropouts. However, I don't mind CBR; it makes other shots look better overall, too.
@musicvid10: Even with the min bitrate set to 4,500,000 it still glitches.
@rs170a: Thanks for checking that. I think a CBR of 7 Mbps should be a good compromise, unless I want to compress the audio with AC-3. I know www.gearsoftware.com recommends 7 Mbps for widest compatibility (not sure if they're referring to VBR or CBR).
Why does encoding always turn into a discussion of variable bitrate? Variable is only beneficial when you're trying to "fit" a certain amount of time onto a disc or type of media.
Constant also renders quicker than any two-pass.
Constant at the highest bitrate you can sustain is preferred.
Zelkien69, you have a point, but as I mentioned in my earlier post I found that running flat out at 9 would freak out one of my DVD players when in fast-play mode. Okay, so that implies running at a slightly lower bitrate. But then there are scenes that need every ounce of bitrate to render without artifacts (I'd mentioned water waves and tree leaves). So that was how I concluded that I would use a peak of 9, but not run continuously at 9.
On the bottom end I found that if I set the minimum super low, then sometimes the DVD player had a hard time spinning the disc up quickly enough to keep the bit buffer filled when the scene changed. So I picked 1 Mb/sec.
One-pass requires the encoder to guesstimate the quantization as it goes. Two-pass lets the encoder know precisely which scenes need a high bitrate and which can do with less, so you get the best overall image quality while also precisely meeting the target file size. But you're right, it does slow down the process.
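The allocation idea behind two-pass can be sketched in a few lines. This is a toy illustration (invented complexity scores, not a real encoder): pass 1 measures per-scene complexity, pass 2 hands out bits in proportion to it while spending exactly the budget implied by the average target:

```python
# Toy two-pass rate allocation: distribute a fixed bit budget across
# scenes in proportion to measured complexity. Hypothetical numbers.

scene_complexity = [3.0, 8.0, 4.0, 7.0]   # pass-1 "difficulty" scores
scene_seconds = [30, 10, 40, 20]
avg_target_mbps = 6.0

total_bits = avg_target_mbps * sum(scene_seconds)   # overall budget (Mb)
weights = [c * s for c, s in zip(scene_complexity, scene_seconds)]
alloc = [total_bits * w / sum(weights) for w in weights]

for secs, bits in zip(scene_seconds, alloc):
    print(f"{secs:3d}s scene -> {bits / secs:.2f} Mbps")
```

Complex scenes come out well above the 6 Mbps average and easy scenes well below it, yet the file size matches a 6 Mbps CBR exactly; a real encoder additionally clamps each scene to the min/max limits and the decoder's buffer model.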
Which is why I was delighted to discover (courtesy of this board) that while I had to suffer the initial render time, subsequent repairs to the project were a breeze, since Vegas could race through the no-recompress portions of the timeline and deliver a fixed MPEG-2 file in a matter of minutes.
Under closer examination, I've discovered that Vegas 8.1 produces a cleaner file (less visible macroblocking and fewer glitches) than Vegas 13 when rendering to the same MPEG-2 format with identical settings, even at CBR. Vegas 13 is sharper, yes, but it still glitches slightly on frames that Vegas 8.1 doesn't. Maybe that's because it renders faster than 8.1 and doesn't analyze the video as thoroughly.
EDIT: It seems to depend on the source video. Vegas 13 is pretty good for the most part @ CBR but will glitch slightly (visible macroblocking) when one shot cuts to another.
The answer to your question depends entirely on what problem you are actually seeing. You use the word "dropouts," but that usually means an absence of picture, either an entirely blank frame, or portions of a frame missing. It can also mean really large spots, like those you used to see all the time with old or cheap videotape. They'd show up as brief horizontal white spots.
Dropouts are not caused by insufficient bitrate; they are caused by bad encoding. You say you have turned off the GPU, but when it is on, it can cause blank frames.
Can you post a few seconds of what the video looks like when there are dropouts present?
Also, have you actually tried the two-pass variable bitrate encoding? Try it on a problem section; two-pass works quite differently from the one-pass encoder. Try these settings with two-pass enabled:
I've done many thousands of MPEG-2 encodes using source material ranging from 10 fps 1st-generation cell phone video to 4K video. I've never had dropouts and, using the settings I just gave you, I've not seen any major artifacts, certainly nothing I would ever characterize as a dropout.
So, I think you have some other problem going on, but until you post what the "dropout" video looks like, it will be difficult to offer accurate advice. Even a still image of the screen, taken with a digital camera, would help.
Sorry, John, "dropout" is probably the wrong word; "blocky artifact" is closer to the point. And yes, I've tried many bitrate combinations in the VBR encoder with 2-pass on... only CBR worked for this shot. Thanks for taking the time to share your thoughts.
EDIT: Actually, 2-pass SEEMED to work when only that one problem shot was rendered. However, when rendering it with the surrounding video (for a total of 1 min.), even 2-pass would glitch.