AVCHD shooting questions

coasternut67 wrote on 2/18/2010, 10:39 AM
I would like opinions/experiences from everyone in this forum who has shot video in AVCHD *AND* output AVCHD, on what has worked best for you. I know the software has its limitations with AVCHD - that is NOT what I am concerned about... the quality of the output is what I am concerned about.

I own a Canon Vixia HF20 AVCHD camcorder and have 5 setting options: 24 & 17 Mbit/sec at full HD (1920x1080) and 12, 7 & 5 Mbit/sec at 3/4 HD (1440x1080).

I have tested all 5 modes - even the lowest one looks decent, though I would never use anything below 12 Mbit/sec.

What I have noticed is that there is only a very small difference in video sharpness between the 1440 and 1920 modes - it's only visible under certain conditions and through an HDMI connection on a large 1080p monitor.

This is my dilemma - I am trying to justify using the 17 Mbit mode over the 12 Mbit mode and having a real hard time deciding. With natural images I see NO REAL DIFFERENCE in sharpness or compression artifacts between the top 3 modes. AVCHD disks are limited to 18 Mbit/sec, which rules out the 24 Mbit mode, so the choice is between the 17 and 12 Mbit modes.

The advantage of 12 Mbit is faster editing and more recording time on memory and on the AVCHD-DVD disk.
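To put rough numbers on that trade-off, here is a minimal back-of-the-envelope sketch (the 32 GB card and 4.7 GB single-layer DVD capacities are assumptions, and it counts the video bitrate only - AVCHD audio and container overhead shave the real figures down a little):

```python
# Rough recording-time estimates from capacity and video bitrate alone.
# Real times run slightly shorter because audio and container overhead
# also consume space.

def recording_hours(capacity_gb: float, bitrate_mbps: float) -> float:
    """Capacity in decimal gigabytes, bitrate in megabits per second."""
    total_bits = capacity_gb * 1e9 * 8
    return total_bits / (bitrate_mbps * 1e6) / 3600

for mbps in (24, 17, 12):
    card_h = recording_hours(32, mbps)            # 32 GB SDHC card
    dvd_min = recording_hours(4.7, mbps) * 60     # single-layer DVD
    print(f"{mbps} Mbit/s: {card_h:.1f} h per 32 GB card, "
          f"{dvd_min:.0f} min per 4.7 GB AVCHD-DVD")
```

At 12 Mbit/sec that works out to roughly 5.9 hours per 32 GB card and about 52 minutes per single-layer disk, versus roughly 4.2 hours and 37 minutes at 17 Mbit/sec.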

Please comment on your experiences - especially regarding the 1440x1080 mode versus 1920x1080... are the extra pixels really visible??

Regards,

Rob

Comments

Mad Pierre wrote on 2/18/2010, 10:49 AM
What subjects are you shooting?

My cam handbook recommends the higher modes for fast-moving objects (which is mostly what I shoot). I admit to just taking their advice and not experimenting, but......?
coasternut67 wrote on 2/18/2010, 12:21 PM
I have been experimenting... trying to provoke artifacts with fast motion and detailed scenes... it is really hard to see the difference between the 12 and 17 Mbit modes. The extra pixels don't buy much, if anything... That's the dilemma. I have shot resolution test patterns with panning and zooming in all 5 modes... the resolution difference is very slight between 1440 and 1920... you really have to look for it!

My camcorder apparently has a really good H.264 encoder. I see minimal to no artifacts until I get down to the 7 or 5 Mbit settings... but even there the quality is still pretty good... you really have to look for the artifacts to see them. I am leaning towards 1440x1080 @ 12 Mbit as the best compromise... since I get almost 6 hours on the 32 GB internal memory and another 6 hours on a 32 GB SDHC card.

Rob
david_f_knight wrote on 2/18/2010, 1:38 PM
I have the Canon Vixia HF200 camcorder (essentially the same as yours, except mine has no built-in flash memory). I've decided to shoot all my video at 24 Mbps. Here's my rationale:

1) Flash memory is fairly cheap, and hard disk storage is really cheap, so the cost of capturing and retaining the video at 24Mbps is nearly negligible. Certainly, if what I'm shooting is important to me, it's worth whatever small additional amount of money is required to have the best I can get. Why pay the extra money for a quality camcorder, which the HF20 is, unless you want the very best you can get from it?

2) The more you edit your video, the more significant the quality of the source material may be. For your comparison tests, you may have been comparing first generation video (i.e., without having edited or rendered it). But if you edit your video, then what is rendered will be second generation, and artifacts will build on artifacts, possibly making the small difference in quality you have observed more pronounced. Processing your video during editing, such as color grading, contrast adjustment, or other effects, may also make small differences in quality more pronounced.

3) If you ever crop your video footage and enlarge it to fill the frame, then that will amplify whatever artifacts and blurriness that exist in it.

4) As a matter of principle, I want the best I can get (within my budget)!

I have to question some of your apparent assumptions. You wrote "AVCHD disks are limited to 18 MBit so that rules out use of the 24 MBit...." It is true that DVD disks are limited to 18 Mbps transfer rates, but that doesn't mean you can't record your video at 24 Mbps and then render it at a lower bitrate. (Vegas MS Platinum offers 16 Mbps and 10 Mbps bitrate renders for 1920x1080 resolution using the Sony AVC video format and Blu-ray template.) (However, if you want to transfer your video directly from your camcorder to an AVCHD-DVD without editing and rendering it, then you are indeed prevented from recording at 24 Mbps.)

Another apparent assumption I question is your claim that "The advantage of 12 MBit is faster editing..." Is that really true? I haven't tested it, but I can't see why it should be true to any significant degree. In order to edit video, it has to be decompressed first anyway.

The last apparent assumption I question is your claim that "The advantage of 12 MBit is ... more recording time on ... the AVCHD-DVD disk." That is only true if you transfer your video directly from your camcorder to an AVCHD-DVD. If you edit it and render it, then you are limited to the bitrate choices your editor provides. Once again, with VMSP, at 1920x1080 for Blu-ray, your choices are 16 Mbps and 10 Mbps. (I use the Blu-ray template for making AVCHD-DVDs, which is why I have cited its options in VMSP.)

Ultimately, though, it also depends on what you're shooting... if you are shooting a multi-hour event like a baseball game from start to finish, then I'd opt for a lower bitrate, especially if you shoot a lot of events like that. That isn't the type of thing I shoot, though. I'm more into shooting vacation footage and short pieces not very far from home, so quality is far more important than capacity for me.
musicvid10 wrote on 2/18/2010, 6:02 PM
1920x1080 (at 1.0 PAR) has 1/3 more pixels than 1440x1080 (at 1.3333 PAR).
17 Mbps is just over 1/3 higher bitrate than 12 Mbps.

So the visible quality between the two should be essentially identical. The 1920x1080 may appear slightly sharper on very large screens, but that entirely depends on the equipment and the person viewing it.
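That ratio argument is easy to check numerically - a minimal sketch, assuming 30 frames/sec and counting the video bitrate only:

```python
# Bits allocated per pixel per frame for each mode. Roughly equal
# bits-per-pixel suggests roughly equal compression quality.

def bits_per_pixel(bitrate_mbps: float, width: int, height: int,
                   fps: float = 30.0) -> float:
    return (bitrate_mbps * 1e6) / (width * height * fps)

print(f"1440x1080 @ 12 Mbit/s: {bits_per_pixel(12, 1440, 1080):.3f} bits/pixel")
print(f"1920x1080 @ 17 Mbit/s: {bits_per_pixel(17, 1920, 1080):.3f} bits/pixel")
```

The two modes land within about 6% of each other (roughly 0.26 vs. 0.27 bits per pixel), which is why the visible quality should be essentially a wash.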
coasternut67 wrote on 2/19/2010, 2:43 AM
Let me clarify a few points here -

Yes, I mainly shoot vacation footage and so far have done so at 24 Mbit/sec and full HD 1920x1080.

Now, I am not using VMS for my editing because of its serious limitations with AVCHD files - I use another package that is faster and can do 2-pass VBR rendering and smart rendering (no re-encoding of unaltered footage)... this keeps the output 1st generation AVCHD. That is why I want to stay AVCHD disk compliant - the majority of the footage I shoot is kept as-is; I only use transitions, a few titles, and background music. I shoot the video in such a way as to get the best image right from the start so I do not need to adjust it later.

As for rendering speed - fewer pixels to process means faster renders with any software program: 1920x1080 has a third more pixels than 1440x1080, so the 1440 mode renders in roughly three-quarters of the time. This has been borne out in my testing with VMS and other programs as well.

My single biggest question is: does the 1920 resolution offer anything over 1440 on my camera?? (I am leaning towards NO here.)

The only time I have been able to tell any difference is with a resolution test pattern viewed as 1st generation via HDMI on a 47" 1920x1080 monitor... and the difference is minimal - you really have to look for it!! With natural images - even ones with high detail - I am unable to see any difference. This could be because the camera's sensor is not capable of resolving it; a 1/3 increase in sharpness *SHOULD* be noticeable... but it's not. The sensor's pixel count does not exactly match either mode, so there is scaling done in the camera, and that could be the reason... it would make sense.

Yes, memory is cheap and I can buy more SDHC cards; however, I will be on vacation for 3 weeks this summer and away from my editing PC, so recording time is an issue... as is the time it takes to edit the footage. 1/3 more recording time and faster rendering are a big deal to me... provided there is no visual quality loss, because quality is also very important.

So, has anyone shot AVCHD at 1440x1080 at about 12 Mbit, and were there any issues compared to 1920x1080 at 16 or 17 Mbit??? Does the 1920 mode really offer a sharper picture?? In theory it should, but I am not seeing that on my camera with natural footage.

Maybe the image stabilization is hurting my resolution here?? It is optical, not EIS, but maybe I need to test further. So far it has always been on - but I have a very steady hand, so I really do not need it.

I am leaning towards the 12 Mbit 1440 mode but am on the fence with this decision. I may just opt to go with full HD and buy an additional 32 GB SDHC card (camera + 2 cards)... 12 hours should be enough time for my vacation shooting.

I should also add that I shoot in 30p mode, not 60i, because there is less motion blur based on my testing and online reviews of my camera. Also, you don't get interlacing artifacts in this mode when editing or if I render out a DVD-resolution version.

Regards,

Rob
musicvid10 wrote on 2/19/2010, 3:14 PM
Trust your eyes. Unless you are going to sell your footage to Columbia Pictures, 1440 is fine.
If you can't see much of a difference, as I suggested, there is a real advantage to saving 1/3 of your overhead.
ritsmer wrote on 2/19/2010, 9:45 PM
... one day, not far away, our current 50 inch plasmas will be exchanged for 100+++ inch screens - and then we might wish we had done everything in at least 25 Mbit back in 2010 ...
coasternut67 wrote on 2/20/2010, 7:38 AM
I tested my camera in both modes (12 Mbit 1440 res and 17 Mbit 1920 res) using an ISO 12233 resolution chart... and there *IS* a difference in the measured lines per picture height. I used the highest value at which I could still see individual lines - there was also slight movement since I held the camera by hand (no tripod). I also turned off the (optical) image stabilization.

1440 mode = about 600 vertical lines
1920 mode = about 800 vertical lines or 33% sharper!
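Those readings track the horizontal pixel counts of the two modes almost exactly - a minimal sanity-check sketch (both figures are approximate chart readings):

```python
# Compare the measured resolution gain against the raw pixel-count ratio.

measured_ratio = 800 / 600    # resolvable lines: 1920 mode vs. 1440 mode
pixel_ratio = 1920 / 1440     # horizontal pixel counts of the two modes

print(f"measured gain: {measured_ratio:.2f}x, pixel ratio: {pixel_ratio:.2f}x")
# both print 1.33x, i.e. about 33% more resolvable detail in the 1920 mode
```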

I also noticed the artifacts were gone in the 1920 mode but slight in 1440 - probably due to pixel re-mapping / scaling effects. The 1920 mode was a very clean image - only line aliasing artifacts from the test pattern were observed, and this is normal for a sampled image.

Strange that this is not very noticeable with natural images, but that settles it for me. All of my shooting will be 1920x1080 at 17 Mbit so it's AVCHD disk compliant.

Thanks for all of your input :-)

Rob
musicvid10 wrote on 2/20/2010, 8:53 AM
"1440 mode = about 600"

Absolutely correct. Exactly what one would expect from a vertical-line test, except you would do better to shoot it on a tripod and compare static clips with a slow horizontal pan.

"Strange that this is not very noticeable with natural images"

Not at all. That's why horizontal pixel stretching works in almost all real-world situations. There is lots of good reading on the internet about why the human eye is less sensitive to this.

But there is no argument that the highest resolution gives the best theoretical advantage, if the 1.3x increase in storage requirements and processing overhead is not important to you. I guess it was your statement that this was vacation footage that made me think it was.
Melachrino wrote on 3/4/2010, 2:50 PM
""... one day, not far away, our current 50 inch plasmas will be exchanged for 100 +++ inches screens - and then we might wish we had done everything in at least 25 Mbit back in 2010 ...""

You are absolutely right, except that you need not wait. Current properly made LCD and plasma displays (even old, well-designed CRTs) can clearly show full 1920x1080 resolution. Of course, future mega-inch displays will make it much easier to tell the difference from far away ...

And yes, the advice to shoot and archive at the highest possible quality and resolution available is very valid. I grumble at those in my family who originally filmed in VHS at the slowest speed to save tape... aaarrrgghh. Are they ever sorry now, but it is too late.

Shoot for the best and be happy tomorrow also.