This is sort of a continuation from another thread, but I find it quite interesting and felt it deserved its own thread.
People tend to say that a 2-pass VBR setting will yield better quality than a comparable CBR setting. This may or may not be true... but there are a few things that just aren't making much sense to me.
If we use HDV as an example, which we all know is recorded at 25 Mbps, and we also know that you can't get better quality out of a 25M recording by rendering it at, say... 30M, then what is the sense in using the default settings in the Blu-ray template (20M/25M/30M)?
If the bitrate average is set at the original recording's maximum bit rate (25M), then the VBR technique has no room to maneuver at the high end, and even if it does render much over 25M, those are just wasted bits because you CAN'T get better quality than was originally recorded. Logically speaking, the only thing a VBR setting would do in this case is act as a 'space saver' by removing bits where they aren't needed. In other words, if the average setting is set at the original recording's maximum, doesn't the high setting become redundant and useless? Conversely, if I use a CBR setting of 28M (the extra few Mbps over 25M to take care of any spikes above 25 that MAY exist), I may be wasting space by delivering more bits to a particular scene than it needs... but I'm NOT going to get better or worse quality because I'm already OVER HDV's max recording level anyway.
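To put rough numbers on that 'wasting space' point, here is a back-of-the-envelope sketch (plain Python, nothing encoder-specific; the 2-hour runtime is just an assumed example). For long-GOP encodes the finished file size is driven almost entirely by the average bitrate, so a CBR encode padded up to 28M eats roughly 12% more disc space than a 2-pass VBR encode averaging 25M, and neither can exceed the quality already baked into the 25 Mbps source.

```python
# Rough file-size arithmetic: size ~= average bitrate x duration,
# regardless of whether the encoder runs CBR or 2-pass VBR.
# Assumptions: 2-hour runtime, video stream only, no audio or muxing overhead.

def video_size_gb(avg_mbps: float, duration_s: float) -> float:
    """Approximate video stream size in gigabytes."""
    bits = avg_mbps * 1_000_000 * duration_s
    return bits / 8 / 1_000_000_000  # bits -> bytes -> GB

duration = 2 * 3600  # assumed 2-hour project

cbr_28 = video_size_gb(28, duration)   # CBR padded above the 25M source
vbr_25 = video_size_gb(25, duration)   # 2-pass VBR averaging the source rate

print(f"CBR 28M:     {cbr_28:.1f} GB")
print(f"VBR avg 25M: {vbr_25:.1f} GB")
print(f"Extra space burned by CBR: {100 * (cbr_28 / vbr_25 - 1):.0f}%")
```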
To me, it would make more sense when using VBR to set the average bit rate LOWER than the HDV max (25M), so that the 2-pass VBR system not only has room to maneuver at the low end... but also at the high end. If, for example, we set the low at 15M, the average at 20M, and the high at 28M, then the VBR system would actually be able to add/subtract bits in BOTH directions and not just at the low end. But at the same time (in the back of my head) I feel as though I'm cheating myself by setting a LOWER average bitrate!?! Am I ripping myself off by setting a lower average bitrate?
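For intuition on what the encoder actually does with that headroom, here's a toy sketch of a 2-pass allocator (hypothetical complexity numbers, not any real encoder's rate control): pass 1 scores how hard each scene is, pass 2 hands out bits in proportion, clamped to the min/max ceilings. The point it illustrates is that the average fixes the total spend, while the min and max only bite on the easiest and hardest scenes.

```python
# Toy 2-pass VBR allocation: spread an average-bitrate budget across scenes
# in proportion to their pass-1 complexity, clamped to the min/max limits.
# Purely illustrative; real encoders use far more elaborate rate control
# and redistribute the bits lost to clamping, which is skipped here.

def allocate(complexities, avg_mbps, min_mbps, max_mbps):
    total = sum(complexities)
    budget = avg_mbps * len(complexities)              # total "Mbps slots" to spend
    raw = [budget * c / total for c in complexities]   # proportional split
    return [min(max(r, min_mbps), max_mbps) for r in raw]

# Hypothetical per-scene complexity scores from a first pass
scenes = [0.4, 0.6, 1.0, 2.0, 0.8, 1.2]  # static shot ... fast pan, etc.

for label, (avg, lo, hi) in {
    "template 20/25/30": (25, 20, 30),
    "proposed 15/20/28": (20, 15, 28),
}.items():
    print(label, [f"{r:.1f}M" for r in allocate(scenes, avg, lo, hi)])
```

Run it and you'll see the easy scenes land near the floor and the hardest one at the ceiling in both presets; what changes when you lower the average is the total number of bits spent, not the shape of the allocation.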
I ran a sample test with a LOWER average bit rate for HDV (at 15M), and this is what I got:
=======================================================
The test clip was indoors with average lighting
VBR was set for 15M average, 30M high, and 10M low (2-pass) and took 4:29 to render
CBR was set at 15M and took 2:09 to render
My results were as follows:
VBR file size = 235,794,432 bytes
CBR file size = 237,525,104 bytes
When burned to disc and played back on the PS3:
CBR varied in bit rate between 14.6 and 15.6 Mbps
VBR varied in bit rate between 9.3 and 28.3 Mbps
As for quality... my wife and I couldn't tell the difference between the two.
The VBR file is indeed smaller (though only by about 1.7 MB)... BUT it produces the same quality (or at least no difference that I could see)
==========================================================
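One sanity check on those numbers: since both encodes targeted the same 15M average, near-identical file sizes are exactly what the arithmetic predicts, and the VBR file comes out only fractionally smaller (quick check below, using the sizes reported above):

```python
# File sizes reported above (bytes); both encodes targeted a 15M average.
vbr = 235_794_432
cbr = 237_525_104

diff = cbr - vbr
print(f"VBR is smaller by {diff:,} bytes ({100 * diff / cbr:.2f}%)")
# -> about 1.7 MB, well under 1%. With matching averages the two files
#    have to come out about the same size; the difference is in WHERE
#    the bits were spent (9.3-28.3 Mbps swings vs. a flat ~15 Mbps),
#    not in how many were spent.
```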
Anyone else's opinion would be greatly appreciated.