Does the bitrate affect DVD compatibility with set-top players? Is 6,500 better than 8,000? Which is more compatible? Does burn speed affect DVD compatibility with set-top players? Is 1x better than 4x? Which is more compatible?
Bitrate certainly does, but burn speed is more a matter of media brand and error-free disc authoring.
If the bitrate is too high, the disc may not play on some systems. Keep 8 Mbps as your maximum and you should be compatible with all but the oldest computer DVD players. All set-top players will play this with no problem.
I have had problems with some older set-top players at 8 Mbps; as a result I had to use 6 as a maximum (they were pretty poor excuses for DVD players, by the way).
The advantages of burning at low speed are largely mythical. With modern burners you may well get more errors at 1x than at 4x: the faster the disc spins, the more stable it should be, so there is less jitter.
The only problem I have experienced with too high a bitrate came when I tried an AVERAGE of 9.5! The disc would occasionally freeze, and refused to fast-play at 2x (duh).
This isn't a bitrate problem; more likely the laser is too weak to cope with the lower reflectivity of R/RW media. A new player will cost around $100. These things don't last forever: the lasers do wear out, and the first sign is when they stop playing R/RW media.
I have read hundreds of posts and tried to research this topic for almost a year. There is a lot of urban legend surrounding the idea that high bitrates cannot play on certain players, but I have seen absolutely no testing to confirm this. However, I admit I have nothing to disprove the idea either.
First, some facts. The DVD spec states that all players must be able to play a video bitrate of up to 9,800 Kbps (9.8 Mbps), continuously.
Obviously it is possible that some DVD players cannot do this, but I doubt it. With commercially created DVDs, I would bet that every DVD player meets this spec.
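If it helps to see the numbers, here is a rough sketch in Python of the headroom involved. It assumes the commonly quoted DVD-Video ceilings of 9.8 Mbps for video and 10.08 Mbps for the total multiplexed stream, plus a typical 224 Kbps AC-3 stereo audio track; adjust to whatever your project actually uses.

```python
# Rough headroom check against the commonly quoted DVD-Video ceilings.
# The 9.8 Mbps video and 10.08 Mbps total-mux figures are the usual spec
# numbers; the 224 Kbps AC-3 audio rate is just a typical assumption.

MAX_VIDEO_KBPS = 9_800     # maximum video bitrate
MAX_MUX_KBPS = 10_080      # maximum total (video + audio + subpictures)

def check_headroom(video_kbps, audio_kbps=224):
    total = video_kbps + audio_kbps
    print(f"video {video_kbps} Kbps + audio {audio_kbps} Kbps = {total} Kbps")
    if video_kbps > MAX_VIDEO_KBPS or total > MAX_MUX_KBPS:
        print("  over the ceilings -- expect trouble")
    else:
        print(f"  {MAX_MUX_KBPS - total} Kbps of mux headroom left")

check_headroom(8_000)   # the 8 Mbps maximum discussed above
check_headroom(9_500)   # the 9.5 Mbps average mentioned earlier: in spec on paper
```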
The real question -- and the one that I have never seen confirmed -- is whether the same is true with DVDs that are recorded on DVD-R or DVD+R media. The problem is that recordable media have a lower reflectivity than commercially replicated DVDs. This lower reflectivity may result in a higher error rate on some players, and at some point the player cannot correct the errors fast enough and playback fails.
Another fact: All of us have had the experience of creating a DVD that plays on one player but not another. The question, however, is always: What caused this to happen? Is it the media, the burner, the authoring software, the encoding software, the bitrate, the DVD player, or a secret plot by al-Qaida?
If you look at this post, you'll see links to the only tests I know of that attempted to sort out some of this:
If you read everything in this test, you will find his comment on bitrate:
"Use low to moderate bitrate encoding for video if the final product will be delivered on a recordable DVD. Although most DVD players and DVD-ROM drives can read recordable DVD discs, these discs often have a higher error rate during playback than replicated discs do, causing the player to reread some data sectors. Using a low to moderate (less than 7Mbps) data rate for video encoding will make the recorded disc a little easier to reread without a visible pause in the video playback, or audio dropout."
I have never been able to trace a problem related to bitrate. This is not to say that such a problem doesn't exist; I am only saying that even LaBarge's testing didn't actually verify the problem. Also, note that he recommends 7Mbps (7,000 Kbps, to put it in Vegas terms) average bitrate as a threshold, not something lower. Since I can easily see artifacts even at 7,000, I certainly would never intentionally create a DVD at 6,000 or 5,000 -- which would significantly degrade the quality -- unless I was forced to fit a long project on one DVD or unless I was using 24p source material (which can be encoded at a far lower rate, both because of the lower frame rate and because progressive is easier to encode).
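For anyone weighing quality against running time, the arithmetic is straightforward: average total bitrate times runtime has to fit on the disc. Here is a rough sketch, assuming about 4.38 GiB of usable space on a single-layer DVD-R and a 224 Kbps audio track (menus and overhead are ignored, so the results are optimistic):

```python
# Rough capacity math: how many minutes fit on a single-layer DVD-R at a
# given average bitrate. The ~4.38 GiB usable capacity and the 224 Kbps
# audio figure are assumptions; menus and filesystem overhead are ignored.

DISC_BITS = 4.38 * 1024**3 * 8   # usable capacity in bits

def minutes_that_fit(avg_video_kbps, audio_kbps=224):
    total_bps = (avg_video_kbps + audio_kbps) * 1000
    return DISC_BITS / total_bps / 60

for rate in (8_000, 7_000, 6_000, 5_000):
    print(f"{rate} Kbps average -> about {minutes_that_fit(rate):.0f} minutes")
```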
I recently created a DVD at 7,800 with peaks at 8,800 on Ritek G04 media and distributed it to fourteen clients. No problems. Hardly scientific, but if the bitrate problem were widespread, I should have had someone complain.
This is an interesting topic to me. I can burn an image on Ritek G04 DVD-R with my Pioneer A04 at 2x and it will play perfectly on my two Panasonic desktop DVD players. If I take the same MPEG file and use my Rimage duplicator with Pioneer A06s burning at 4x, when I try to play it on the same desktop DVD players I get a message that it will not play. I have yet to try forcing the Rimage to burn the DVDs at 2x to see if that helps. The DVD players are two years old or less.
One component of the problem is the specific chips in the specific player.
We had a project that was having a number of problems. Encoded at about 8.5 Mbps we were well within spec, but we had about a 5% return rate on the initial deliveries.
We studied the problem DVD players and there did not seem to be a pattern by manufacturer, age, price of player, etc. Finally we were able to identify and buy three players that had problems. Each contained the same chipset from the same manufacturer. Taking the bitrate down solved the problem.
The most interesting fact was that one manufacturer's model 1200 player worked fine. Model 1201 had the cheaper chipset and would not work.
Sounds like an Apex. I have an Apex AD-1500 alongside a fancy Toshiba player here at my desk. My Apex plays virtually anything (probably that first chipset?).
It is an Apex. An Apex 600A (with the loophole menu for disabling Macrovision). Not that I ever use that anyway.
So do I dare lower the bitrate? I know that at lower bitrates I can see artifacts very easily. How low is too low? Is 7800 good? Should I keep it at 8000? What should I do?
I believe we took the rate down to a maximum of 5,000 to solve the problem. There is a little loss of quality, but most customers do not notice. They are just happy to get one that plays.
Using this rate I have only ever had one customer who could not be helped. I believe she had three Fisher-brand DVD players. They really do not like DVD-Rs.
So do I dare lower the bitrate? I know that at lower bitrates I can see artifacts very easily. How low is too low? Is 7800 good? Should I keep it at 8000? What should I do?
The previous posts pretty much cover the various ideas about lowering the bitrate to achieve (maybe) better compatibility. You are raising a separate question: at what bitrate are you going to notice picture degradation compared to the original? There are many, many answers.
1. Do a search on this forum for "bitrate quality." I am sure you will get dozens of posts. This has been discussed many times. Lots of good ideas and answers.
2. It depends on the source material. Some things (like smoke and crossfades) drive MPEG-2 encoding absolutely crazy. You often see artifacts in these types of scenes even in Hollywood movies, and you always see them in these scenes on DirecTV.
3. It depends on you. Some people can't tell the difference between speakers; others hear something like fingernails on a chalkboard when they listen to certain speakers. Some people notice video artifacts; others are unfazed. If you go to the AVS Forum and look at the discussion of DLP projectors, you will find some folks who cannot stand the fringing that occurs around fast-moving objects, and others who swear they've never seen it (some don't see it until they've had the equipment for a while, then suddenly notice it for the first time and are never happy again -- ignorance truly is bliss in this regard).
4. It depends on how you encode. At lower bitrates (below 7,000 Kbps), VBR definitely helps, and depending on the material, 2-pass may also help. However, there are many other controls that are not available in the MainConcept encoder included with Vegas 4/5. The standalone MainConcept encoder has all sorts of additional controls which, if used properly, can improve quality at low bitrates. They can also screw things up, though.
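To tie these points together, here is one way you might sketch the VBR math: derive the average from the runtime you need to fit, then cap the average and the peak for safety. The disc capacity, audio rate, margin, and the caps themselves are just assumptions to adjust to taste, not anything from the spec:

```python
# One way to pick VBR settings: derive the average from the runtime you
# need to fit, then cap both the average and the peak. The capacity,
# audio rate, margin, and the 8,000 Kbps caps are assumptions, not spec
# values -- adjust them to your own comfort level.

DISC_BITS = 4.38 * 1024**3 * 8   # assumed usable capacity of a DVD-R, in bits

def vbr_settings(runtime_min, audio_kbps=224,
                 max_avg_kbps=8_000, max_peak_kbps=8_000):
    budget_kbps = DISC_BITS / (runtime_min * 60) / 1000 - audio_kbps
    avg = min(int(budget_kbps * 0.95), max_avg_kbps)   # keep ~5% margin
    peak = min(avg + 1_500, max_peak_kbps)             # modest room for peaks
    return avg, peak

print(vbr_settings(60))    # short project: limited by the caps, not the disc
print(vbr_settings(120))   # two hours: limited by disc capacity
```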
With all due respect, the comparisons some of you are drawing between bitrates are somewhat invalid, because you cannot compare variable bitrate settings against one another.
With variable bitrate settings, the footage will determine the bitrate at any given moment. If a player starts skipping at a certain point, do you actually know the bitrate at that particular spot?
The only way you can compare bitrates is to compare constant bitrates, and even then you have to use the same material for the testing. But at least then, if you want to go on at length about bitrates in a discussion, your premises will carry more meaning in a recommendation to a poster. At the very least you will be comparing apples with crabapples.
So when you talk about lowering the bitrate for compatibility, do you lower the max, the average, or both? I've only lowered the bitrate to get more time on a disc. When doing that, I think I leave the max at 8,000,000 and adjust the average to fit more video.
So when you talk about lowering the bitrate for compatibility, do you lower the max, the average, or both?
The average bitrate determines the average quality, although, as craftech points out, without the ability to manually control the bitrate at certain critical points (something the really expensive encoders can do), you cannot tell for certain what the bitrate will be at any given spot.
If compatibility is your ONLY concern, and you believe that bitrate affects compatibility (you've read all the opinions on this earlier in this thread), then you should use constant bitrate (CBR) and reduce the bitrate to whatever level you have decided will ensure compatibility. I have already indicated that I am skeptical as to whether bitrate affects compatibility, but only because I have not seen any definitive, scientific testing that proves it. There is plenty of anecdotal evidence that bitrate DOES affect compatibility, and the most reliable of those anecdotes (such as those from LaBarge) seem to say that 7,000 is the number to stay below. Thus, my recommendation is for you to use 7,000 or below, and to use CBR.
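Putting that recommendation into one small sketch: pick the lower of the 7,000 Kbps compatibility ceiling and whatever rate actually fits your runtime (same assumed capacity, audio rate, and margin as in the earlier examples):

```python
# The thread's bottom line in one function: use CBR at the lower of the
# 7,000 Kbps compatibility ceiling and whatever fits your runtime.
# Capacity, audio rate, and margin are the same assumptions as before.

DISC_BITS = 4.38 * 1024**3 * 8
COMPAT_CEILING_KBPS = 7_000

def recommended_cbr(runtime_min, audio_kbps=224):
    fits_kbps = DISC_BITS / (runtime_min * 60) / 1000 - audio_kbps
    return min(COMPAT_CEILING_KBPS, int(fits_kbps * 0.95))  # ~5% safety margin

print(recommended_cbr(90))   # e.g. a 90-minute program
```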