Vegas Pro 11 not fully AVCHD 2.0 compliant

njdiver85 wrote on 2/8/2012, 11:34 PM
I just paid a lot of money for Vegas Pro, as I was hoping it would make it easier to render AVCHD 2.0 files that retain my camcorder's recorded quality of 1920x1080 60p at 28 Mbit/s.

The recent Vegas Pro 11 build added "Sony AVC format render template customization to support AVCHD 2.0".

The revised AVCHD 2.0 spec allows for 59.94p at 28 Mbit/s. Yet despite Vegas's claim to support AVCHD 2.0, the customized Sony AVC template is limited to a bit rate lower than the spec allows. In fact, the highest bitrate it accepts is only 25,999,360.

Seems like a false claim to me! Any thoughts?

Comments

John_Cline wrote on 2/9/2012, 12:34 AM
I just checked an AVCHD 2.0 file (1920x1080-60p) using this handy bitrate viewer and the average bitrate was 25,567,232 bps and the peak bitrate was 27,139,072 bps.

The 28 Mbps figure in the AVCHD 2.0 spec is the peak bitrate. I believe the Sony AVC encoder is a VBR encoder and the bitrate you set in it is the average, so the output is likely AVCHD 2.0 compliant after all.
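For anyone who wants to check their own files without a separate utility, here is a rough Python sketch of the same measurement (untested, and it assumes ffprobe from FFmpeg is installed; the file name is just a placeholder). It sums the video packet sizes second by second to get an average and a peak bitrate:

    import subprocess
    from collections import defaultdict

    SRC = "00001.MTS"   # placeholder path; substitute your own clip

    # Ask ffprobe for the timestamp and size of every video packet.
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "packet=pts_time,size", "-of", "csv=p=0", SRC],
        capture_output=True, text=True, check=True).stdout

    bits_per_second = defaultdict(int)
    for line in out.splitlines():
        if not line:
            continue
        pts_time, size = line.split(",")[:2]
        if pts_time == "N/A":       # some packets may lack a timestamp
            continue
        bits_per_second[int(float(pts_time))] += int(size) * 8

    rates = list(bits_per_second.values())
    print("average video bitrate: %.1f Mbps" % (sum(rates) / len(rates) / 1e6))
    print("peak video bitrate:    %.1f Mbps" % (max(rates) / 1e6))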
[r]Evolution wrote on 2/9/2012, 3:54 AM
njdiver85 - are you seeing differences in the file you import vs what Vegas exports?

Bring in a 60p clip, do nothing to it, and render it back out.
Compare the 2 clips and see if you can see a difference. Also, run them through something like VLC and see if the properties/bit rate have changed.

I shoot but don't edit AVCHD. I transcode when ingesting from the camera. I have no reason or desire to export back to this acquisition format, so I'm curious about your disappointment.

Please do tell: why is it important in your workflow to be able to go back to AVCHD 2.0?
PeterDuke wrote on 2/9/2012, 5:27 AM
You could compare a before and an after with Mediainfo and see if there is any significant change in the bit rates it reports.
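If you want to script that comparison rather than eyeballing two MediaInfo windows, something like this would do it (a sketch only; it assumes the pymediainfo wrapper is installed, the file names are placeholders, and the maximum-bit-rate field name can vary between MediaInfo versions):

    from pymediainfo import MediaInfo

    # Placeholder paths: the camera original vs. the Vegas render.
    FILES = {"original": "camera_clip.MTS", "rendered": "vegas_render.m2ts"}

    for label, path in FILES.items():
        for track in MediaInfo.parse(path).tracks:
            if track.track_type in ("Video", "Audio"):
                print(label, track.track_type,
                      "bit rate:", track.bit_rate,
                      "max:", getattr(track, "maximum_bit_rate", "n/a"))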
TheHappyFriar wrote on 2/9/2012, 6:22 AM
Total bitrates normally include the audio (e.g., DVD supports 8 Mbps total). Does the 28 Mbps AVCHD 2.0 figure you stated include the audio (which would be roughly 2 Mbps)?
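For a rough sense of the arithmetic (the audio figures below are assumptions for illustration: a Dolby Digital track at 256 kbps, or 16-bit/48 kHz stereo LPCM):

    system_cap = 28_000_000                 # AVCHD 2.0 maximum system bitrate, in bps

    ac3_audio = 256_000                     # assumed Dolby Digital audio track
    lpcm_audio = 48_000 * 16 * 2            # 1,536,000 bps for 16-bit/48 kHz stereo LPCM

    print((system_cap - ac3_audio) / 1e6)   # ~27.7 Mbps left for video plus mux overhead
    print((system_cap - lpcm_audio) / 1e6)  # ~26.5 Mbps left for video plus mux overhead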
njdiver85 wrote on 2/9/2012, 10:37 AM
Using MediaInfo:

Original Footage: max overall bit rate 28.0 Mbps, video stream 24.0 Mbps, audio stream 256 Kbps

I rendered using Sony AVC 1920x1080, 59.94 frame rate, bit rate 25,999,360.

If I render to the m2ts format, the result is: max overall bit rate 18.0 Mbps, video stream 14.7 Mbps, audio stream 192 Kbps.


UPDATE: Checking the rendered file using Bitrate Viewer shows an average bitrate of 21.8 and a peak of 26.1. Still not the original source bitrate, but closer. Why are the results different?

I also noticed back in MediaInfo that the original file shows (CABAC / 4 Ref Frames) whereas the rendered file shows (CABAC / 2 Ref Frames), so that's another difference.
[r]Evolution wrote on 2/9/2012, 4:16 PM
I'm still confused as to WHY you feel the bitrate needs to be the same?

I have done transcodes using the same settings coming from Vegas, Premiere, & FCP, &/or used Adobe Media Encoder, Sorenson Squeeze, and Compressor - They do not all come out the same bitrate yet visually you can't tell the difference.

Maybe the Sony transcode doesn't need all that bitrate to deliver the same quality. Do you not notice when you 'unwrap' AVCHD into ProRes, mpg, avi, etc... the file size/bitrate is larger but you can't visually tell the difference if set up correctly?

The point of thinking the bitrates should be the same sounds trivial and moot if the visual quality is the same.
John_Cline wrote on 2/9/2012, 4:24 PM
MediaInfo seems to just report the bitrate that is embedded in the header of the file, whereas Bitrate Viewer actually reads the video data in the file to determine the average and peak bitrates. Also, keep in mind that Vegas deals in bits per second and Bitrate Viewer reports bitrates in kilobits per second, which means you will have to multiply the values reported by Bitrate Viewer by 1024 to get the actual bits per second.
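As a quick check of the arithmetic (a throwaway Python snippet; the 1024 factor is just a working assumption about Bitrate Viewer's convention):

    vegas_max_setting = 25_999_360          # highest average bitrate Vegas will accept, in bps

    print(vegas_max_setting / 1_000_000)    # ~26.0 "Mbps" if a megabit is 1,000,000 bits
    print(vegas_max_setting / 1_048_576)    # ~24.8 "Mbps" if a megabit is 1,048,576 bits

So a tool that reports in 1024-based units will show a noticeably smaller number for exactly the same stream.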
[r]Evolution wrote on 2/10/2012, 4:03 PM
I'm still confused as to WHY you feel the bitrate needs to be the same?

I have done transcodes using the same settings coming from Vegas, Premiere, & FCP, &/or used Adobe Media Encoder, Sorenson Squeeze, and Compressor - They do not all come out the same bitrate yet visually you can't tell the difference.

Maybe the Sony transcode doesn't need all that bitrate to deliver the same quality. Do you not notice when you 'unwrap' AVCHD into ProRes, mpg, avi, etc... the file size/bitrate is larger but you can't visually tell the difference if set up correctly?

The point of thinking the bitrates should be the same sounds trivial and moot if the visual quality is the same.
LSHorwitz wrote on 2/13/2012, 9:43 AM
Quoting [r]Evolution (2/10/2012): "I'm still confused as to WHY you feel the bitrate needs to be the same? ... The point of thinking the bitrates should be the same sounds trivial and moot if the visual quality is the same."

Since the question above has been posted three times by the same person, and the original poster has decided not to reply explicitly, I want to offer my own reasons for sharing the concerns of the original post.

If Sony offers to provide a compliant AVCHD 2.0 product, and if the AVCHD 2.0 original footage contains higher bitrate data than Vegas can ultimately produce through rendering, then the buyer can and will, in some cases, feel cheated.

Moreover, since the original content has now been needlessly subjected to a re-rendering with the content redone in fewer bits, there is both a rendering time penalty as well as a quality penalty. You cannot take an original higher bitrate file, downsample it, creating new and different GOPs, and then expect that the result is either equivalent or superior. You waste a lot of processing time, and ultimately wind up with something less.

Smart rendering has never been done well or properly in Vegas, and AVCHD 2.0 further widens the gap between what is promised and what is actually provided.

Perhaps the editor cannot see much if any difference on his/her monitor between the original and the rendered, lower bitrate file. That in no way, however, ensures that the ultimate viewer / consumer of the content will not have better display capabilities wherein the differences DO become visible.

I personally reject the comment made several times about the differences being "trivial" and also reject the notion that visual differences to one observer are the entire reason why a professional editing package should be sold and advertised in a misleading way. I, for one, am very tired of Sony's use of weasel wording and buggy updates to keep re-selling us update after update of Vegas which promise and then over-state their actual capabilities.

Larry

Laurence wrote on 2/13/2012, 10:16 AM
When I started out, I remember feeling that smart-rendering was really important; after all, I wanted "full quality" on my masters. Then as I got a little further into it, I found myself grading color on all my "important stuff" but still wanted to be able to smart-render quick things like family events. Now, some years into this, I color correct and touch up everything, and smart-rendering is something I rarely miss.
[r]Evolution wrote on 2/14/2012, 11:15 AM
I am interested in an answer from the OP as to why HE feels the need to have the exact same bit rate, as I reject the notion that the same bit rate is synonymous with the same quality. Naturally, everyone else can and will chime in as to what they THINK is HIS reason, but more than likely it's NOT his original reason for posting. I value everyone's opinion... especially HIS, since he felt the issue significant enough to open the conversation. I'm particularly interested in this because I too shoot AVCHD (60i/60p, which is what I think the AVCHD 2.0 spec deals with) but have never thought to Edit or Deliver it. Nor have I noticed any significant quality change from rendering out to lower bit rates, which we must do to deliver video.

... if the AVCHD 2.0 original footage contains higher bitrate data than Vegas can ultimately produce through rendering, then the buyer can and will, in some cases, feel cheated.
- If the bit rate is still within the spec and the quality is retained, I don't see how one could feel cheated unless they misunderstand the relationship between quality and bit rate.

rendering time penalty
- I would need actual numbers for comparison to put this into perspective.
(adding 5 minutes to a 1-hr render {insignificant} -vs- adding 30 minutes to a 1-hr render {significant})

quality penalty
- Scopes can show that they differ, but the difference has to be more than 'marginal' to be considered significant.

You cannot take an original higher bitrate file, downsample it, creating new and different GOPs, and then expect that the result is either equivalent or superior.
- My low bit rate AVCHD gets transcoded to higher bit rate ProRes, DNxHD, Animation, MPG2, or even Uncompressed for editing - but that higher bit rate does NOT make it any better quality.

ultimately wind up with something less.
- 'Less' is in the eye of the beholder, NOT the bit rate. Compare renders from all NLEs & Encoders exporting to the exact same specs and you will find differences in file size as well as bit rate and probably quality.

Perhaps the editor cannot see much if any difference on his/her monitor between the original and the rendered, lower bitrate file. That in no way, however, ensures that the ultimate viewer / consumer of the content will not have better display capabilities wherein the differences DO become visible.
- This is why we edit to a Standard using Properly Calibrated Equipment & Monitors. We can't control what the end user views it on, but we can control the fact that we are delivering per Standards. I wouldn't want to deliver video edited to look good on a monitor set with too much red, green, or blue, as it would then look bad for everyone else. Deliver per Standards and you know that any weirdness on the end-user's side is because of the end-user's setup, and therefore everything they view is off kilter.

I personally reject the comment made several times about the differences being "trivial"
- 'Trivial' in the sense that we can't control the spec. So long as the render is within spec, it's in spec. It does not have to match exactly what was given; it just has to remain within spec on output. It may even be that the Vegas output is more "In Spec" than the original (I'm not sure of the spec, so don't take me as saying Vegas is more "In Spec"). In no way do I mean the topic is 'Trivial', nor am I saying anyone's position is 'Trivial'. I'm saying that the minor difference in bit rates may be trivial and bear no true weight on the actual quality.

and also reject the notion that visual differences to one observer are the entire reason why a professional editing package should be sold and advertised in a misleading way.
- I fail to follow, as I would only see it as 'misleading' (not my word choice) if it were outputting something that is NOT within spec.

I, for one, am very tired of Sony's use of weasel wording and buggy updates to keep re-selling us update after update of Vegas which promise and then over-state their actual capabilities.
- Either Vegas fits your flow, for whatever reason, or it doesn't, for whatever reason. There are way too many professional grade NLE's to try/choose/use to continue using one that does not fill your needs, wants, or desires.

When I deliver to Broadcast, DVD, the web, etc., my quality is oftentimes visibly lower than my original AVCHD or transcoded ProRes, Animation, DNxHD, or other edit codec, but that can't be avoided, as I have to meet their standards of delivery. Because of this, I don't consider the difference to be significant. It's just how the game is played. I'm trying hard to see the significance of this argument because it may be pointing in the direction of something that is affecting me, my workflow, and my end product. I'm truly confused as to whether I'm missing something that a clearer understanding in this thread could correct for me, thus making my end product significantly better.

The only 2 reasons I can think of for the OP to feel 'cheated' by a lower bit rate are:
- Re-Edit the footage and worried about loss of resolution?
- Archive the footage for later Re-Editing thus worried about loss of resolution?
Former user wrote on 2/14/2012, 11:22 AM
The only 2 reasons I can think of for the OP to feel 'cheated' by a lower bit rate are:
- Re-Edit the footage and worried about loss of resolution?
- Archive the footage for later Re-Editing thus worried about loss of resolution?



I am sure you know this, but to clarify, Bitrate and Resolution are not related. There is no change of resolution caused by bitrates.

Dave T2
[r]Evolution wrote on 2/14/2012, 11:59 AM
I am sure you know this, but to clarify, Bitrate and Resolution are not related. There is no change of resolution caused by bitrates.

I misspoke/mistyped:

The only 2 reasons I can think of for the OP to feel 'cheated' by a lower bit rate are:
- Re-Edit the footage and worried about loss of QUALITY?
- Archive the footage for later Re-Editing thus worried about loss of QUALITY?
rmack350 wrote on 2/14/2012, 3:12 PM
Seems to me that if you re-render AVCHD, or any lossy codec, you'd have generation loss even if the bitrate stays exactly the same.

However, maybe the OP's example situation could be improved on. What if you had a pristine, never-been-compressed video stream and wanted to encode it as AVCHD? Can Vegas deliver the compressed video at the bitrate he wants?

Suppose you're delivering to someone who's equally retentive about their AVCHD spec. They don't care what it looks like; they just want the bits they contracted for. Can you deliver it? Can they tell if you delivered it?

I tend to agree that for almost any situation the important factor is the visual quality. Bitrate is a proxy for that.

Aside from A/B-ing tracks and watching the scopes change, another good way to see how a render has changed vs. the original is to render a small region to a new track using your codec of choice, then set the track's transfer mode to Difference. If the new track is identical to what's below it, your preview should be black, indicating that there is no difference.
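A rough command-line analogue of the same idea, if you'd rather not set up tracks (just a sketch, assuming FFmpeg is installed, the two clips have matching resolution and frame count, and the file names are placeholders):

    import subprocess

    # Compare the Vegas render against the source clip; PSNR stats are printed to stderr.
    result = subprocess.run(
        ["ffmpeg", "-i", "original.MTS", "-i", "render.m2ts",
         "-lavfi", "psnr", "-f", "null", "-"],
        capture_output=True, text=True)

    for line in result.stderr.splitlines():
        if "PSNR" in line:
            print(line)   # higher PSNR means a smaller difference; "inf" means identical frames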

Rob
Former user wrote on 2/14/2012, 3:18 PM
Seems to me that if you re-render AVCHD, or any lossy codec, you'd have generation loss even if the bitrate stays exactly the same.

That is right. You are starting with a lossy codec, so it will never get any better, and will only deteriorate more. That deterioration can be lessened by higher bitrates, but it will happen. Your best approach is to render first to an uncompressed or lossless codec and do your editing from that, and then render to your final AVCHD format when all is done.
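In Vegas that means picking an uncompressed or lossless render template for the intermediate. As a sketch of the same idea done with FFmpeg instead (assuming it is installed; FFV1 and PCM are chosen here purely as examples of lossless codecs):

    import subprocess

    # One possible lossless intermediate: FFV1 video plus PCM audio in a Matroska container.
    subprocess.run(
        ["ffmpeg", "-i", "camera_clip.MTS",
         "-c:v", "ffv1", "-level", "3",   # lossless video codec (FFV1 version 3)
         "-c:a", "pcm_s16le",             # uncompressed 16-bit audio
         "intermediate.mkv"],
        check=True)

The intermediate will be several times larger than the AVCHD original, but repeated edits and re-renders from it won't stack up additional AVC generation loss.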

Part of the problem is that we are using delivery-format codecs for acquisition. They weren't designed to be used that way.

Dave T2
LSHorwitz wrote on 2/15/2012, 9:03 AM
Using 2 AVCHD 2.0 compliant cameras as I do, I would have expected that 28 Mb/sec content could be edited and then rendered by the latest Vegas without reducing the bitrate or diminishing the file size, particularly if I take a piece of video and re-render it without making changes. After all, the Vegas upgrade offers AVCHD 2.0 compliance. One of my cameras, ironically also a Sony product, an a77 alpha DSLR, makes really nice videos at that bitrate, as does my Canon camcorder.

Is it unreasonable to expect then that I could edit these videos in Vegas latest version while maintaining this bitrate? I think not.

Do I feel cheated? I think so.

The fact that both the a77 and Vegas have bugs which Sony has yet to either acknowledge or fix is yet another source of unhappiness.

No wonder to me they are sustaining huge multi-billion dollar losses.

Larry

Marco. wrote on 2/15/2012, 12:21 PM
It's surely not unreasonable, but maybe a different view of H.264 could help in understanding why what Vegas Pro is doing here is not unreasonable either. The efficiency of H.264 is not based mainly on the data rate. With the real-time encoding used in cameras, it often happens that, for simplicity, the H.264 encoder inserts filler (zero) bits here and there. A more precise encoder wouldn't need those zero bits, so a different encoder could simply be more efficient while using the same or lower data rates.
LSHorwitz wrote on 2/15/2012, 3:59 PM
Given infinite computer resources, I have no doubt that more efficient and compact representations of the data could be achieved; indeed, H.264 encoding done by my 6-core processor with tons of RAM could make an equivalent, lower-bitrate video from the comparatively uncompressed version produced by my camera's codec and encoding engine.

Nonetheless, if a product is released which claims to achieve AVCHD 2.0 compliant encoding, yet fails to offer a template for the speed / bit rate which AVCHD 2.0 supports, then I, for one, and the original poster as well, can quite legitimately question whether the product is indeed what it claims to be.

We can debate endlessly whether more efficient coding or a less revealing monitor circumvents this issue, since the implication is that "better encoding" or an imperfect monitor simply would not show whether a difference is present or absent.

I think both of these arguments miss the essential point, namely, that a 28 Mb/sec standard agreed upon by the entire community of equipment makers, software developers, etc. should be offered as an option when the software developer states that their product is AVCHD 2.0 specification compliant. Either it complies or it doesn't. In my estimation (and admittedly I have been an anal-type design engineer since the 1960s), this Vegas 11 does NOT meet the specification and is thus misrepresented by Sony.

I also entirely reject the "guess" by some that the difference lies either in max versus average or in 1024-based versus 1000-based units, both of which are totally 'red herrings' and specious arguments IMHO.

Larry
John_Cline wrote on 2/15/2012, 10:00 PM
"Is it unreasonable to expect then that I could edit these videos in Vegas latest version while maintaining this bitrate? I think not. Do I feel cheated? I think so."

The 28 Mbps figure stated for AVCHD v2.0 compliance is the peak bitrate, not the average bitrate. The average is around 26,000,000 bps, or 24.8 Mbps (26,000,000 divided by 1,048,576). As I stated earlier in this thread, the Sony AVC encoder is variable bitrate and the bitrate figure you set in the encoder is the average, not the peak. The maximum value you can set is 25,999,360, which is right at the real-world average of an AVCHD v2.0 file, so you are not being cheated in any way.
LSHorwitz wrote on 2/16/2012, 9:16 AM
Thank you John for your clarification. Can you link me to your reference for stating that 28.0 Mb/s standard is indeed a peak rather than an average value?

This thread, BTW, disagrees with your statement:

http://forum.doom9.org/archive/index.php/t-161842.html

Thanks.

Larry
larry-peter wrote on 2/16/2012, 9:56 AM
The AVCHD information website http://www.avchd-info.org/format/index.html gives a system bitrate of (not going to figure out how to input the symbol) "Greater than or equal to 28Mbps." From my math background I would interpret that as "It ain't goin' no faster than this." Which, according to most definitions, would imply "peak bit rate." Everyone is complaining that it isn't spelled out more clearly, but from what I've seen, it appears peak bit rates are the standard in codec specs.

Edit: Yes, Less than. Thank you. There go my math creds.
vkmast wrote on 2/16/2012, 10:17 AM
In case atom12 did not have a chance to edit his post, surely he means "Less than or equal to 28mbps" like he clearly states in the following sentence.
LSHorwitz wrote on 2/16/2012, 10:17 AM
As I said earlier, this interpretation is a "guesstimate", and in the electrical and computer engineering world that I come from the spec does NOT define the bitrate as peak, although some without such a background might interpret it that way.

It would be reassuring to me to see that Vegas, when given full speed AVCHD 2.0, delivers the same bitrate from rendering as I feed it from either of my two cameras which create AVCHD 2.0 content.

Since Vegas obviously lowers the bitrate during rendering, I stand by my original comment, and entirely reject the notion of peak versus average unless John Cline or someone else can show me factual evidence to the contrary.

Sorry,
Larry
larry-peter wrote on 2/16/2012, 10:37 AM
If I interpret correctly, you would feel more confident in Vegas if it allowed you to set a CBR of 28Mbps when rendering AVCHD. I can understand that. It would perhaps produce better results. But as was posted earlier, just because a camera is writing in real-time at a CBR of 28, does not mean you're getting the efficiency of encoding you would from a software VBR. And if the spec says "less than or equal to" instead of "must be equal to" and Vegas is not exceeding that bit rate, I can't agree that they're not in compliance. Maybe they could be BETTER, but I don't think they've misrepresented.