Question about AudioSyncR

MikeLV wrote on 9/16/2023, 5:32 PM

@wwaag, I just read about the AudioSyncR feature in HOS. Do I understand correctly that if I shoot video with two cameras, I can use this feature to sync up the footage from those cameras on the timeline? I watched some of the videos, and it looked like you were showing that it doesn't always get the sync dead on. Do you have any tips to increase the accuracy?

Also, and this question is for everyone: do you have any tips or suggestions on how to organize or log the footage if there's a lot of stopping and starting during the shoot? Scene-wise, everything will look the same, so it could be easy to get the videos mixed up. Do you just go by the timestamps in the Device Explorer? Thank you.

Comments

3d87c4 wrote on 9/16/2023, 9:08 PM

The accuracy is plus or minus half the duration of a single frame.
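For a rough sense of how big that half-frame window is, here is a minimal sketch that works out the worst-case offset at a few common frame rates (the rates listed are just illustrative examples, not anything specific to ASR):

```python
# Worst-case sync error of +/- half a frame at a few common frame rates.
frame_rates = [23.976, 25.0, 29.97, 50.0, 59.94]

for fps in frame_rates:
    frame_ms = 1000.0 / fps        # duration of one frame in milliseconds
    tolerance_ms = frame_ms / 2.0  # half-frame tolerance
    print(f"{fps:>6.3f} fps: frame = {frame_ms:5.2f} ms, worst case = +/- {tolerance_ms:5.2f} ms")
```

At 29.97 fps that works out to roughly 17 ms either way, which is why it only shows up on fast motion like the bow in the fiddler example below.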

This can be visible in 3D videos when there are fast-moving objects. A friend called me on this with a 3D video I made of a fiddler; he could see the mismatch in the bow motion.

If you are simply cross-fading between two camera angles of a musician, it isn't noticeable.

What I do to aid automatic or manual audio sync is to add "spikes" to the audio tracks. If the cameras share a mount I tap on the mount with a coin or some other metal object. Alternatively, a drum beat, block of wood, etc. can be used. The key is to create unique spike patterns that are easy to spot.
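To see why a sharp, unique spike helps any waveform-based alignment, here is a minimal sketch of offset estimation by cross-correlating two mono tracks with NumPy. To be clear, this is not ASR's internal method (that isn't documented in this thread), just the general idea; the sample rate, array names, and synthetic "tap" are assumptions made up for the demo.

```python
import numpy as np

def estimate_offset_samples(ref: np.ndarray, other: np.ndarray) -> int:
    """Return how many samples later the common sound appears in 'other' than in 'ref'.

    Both inputs are 1-D mono arrays at the same sample rate. A sharp, unique
    spike (coin tap, clapper, drum hit) present in both recordings gives the
    correlation a single dominant peak, which is what makes the estimate
    reliable. For recordings of realistic length, use an FFT-based correlation
    instead (e.g. scipy.signal.correlate(..., method="fft")); np.correlate
    is O(n^2).
    """
    ref = ref - ref.mean()        # remove DC so silence doesn't dominate
    other = other - other.mean()
    corr = np.correlate(other, ref, mode="full")
    return int(np.argmax(corr)) - (len(ref) - 1)

# Synthetic demo: the same "tap" lands at 0.50 s in camera A and 0.75 s in camera B.
sr = 8000                                        # assumed sample rate for the demo
cam_a = np.zeros(sr * 2); cam_a[sr // 2] = 1.0
cam_b = np.zeros(sr * 2); cam_b[int(sr * 0.75)] = 1.0

lag = estimate_offset_samples(cam_a, cam_b)
print(f"Tap appears {lag / sr:.3f} s later in camera B")   # ~0.250 s
```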

Things to watch out for are variable frame rate video sources (e.g., cheap cameras recording AVI) or intermediate renders with codecs that introduce audio/video offsets. There were threads about the HEVC codec introducing such offsets, as I recall.

As for organization, double-check that the cameras are all set to the same time before trusting the Device Explorer...

Last changed by 3d87c4 on 9/16/2023, 9:14 PM, changed a total of 1 times.

Dell XPS 17 laptop

Processor    13th Gen Intel(R) Core(TM) i9-13900H   2.60 GHz
Installed RAM    32.0 GB (31.7 GB usable)
System type    64-bit operating system, x64-based processor
Pen and touch    Touch support with 10 touch points

Edition    Windows 11 Pro
Version    22H2
Installed on    6/8/2023
OS build    22621.1848
Experience    Windows Feature Experience Pack 1000.22642.1000.0

NVIDIA GeForce RTX 4070 Laptop GPU
Driver Version: 31.0.15.2857
8GB memory
 

mark-y wrote on 9/16/2023, 9:21 PM

1. If used according to directions, it's almost always dead on!

2. (Your second question.) Name and import the segments in chronological order, get them "somewhere" in range, and use the sync-to-cursor method in ASR (see the sketch after this list for one way to keep clips in recording order).

3. Throw away all recollections of PluralEyes. This is an entirely different workflow. Think of the "Start Range" as your lookahead/lookbehind zone. Read the instructions carefully, and refer to them often. They are expertly written.
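As a rough illustration of item 2, here is a small sketch that lists the clips from two card dumps in recording order and proposes a sequence-number prefix for each. The folder names and extensions are placeholders, and using the file modification time as a stand-in for the recording time is an assumption; as 3d87c4 notes above, camera timestamps are only trustworthy if the clocks were set to the same time.

```python
from pathlib import Path

# Hypothetical card-dump folders, one per camera; adjust to your own layout.
CARD_DUMPS = [Path("card_dumps/cam_a"), Path("card_dumps/cam_b")]
EXTENSIONS = {".mts", ".mp4", ".mov"}

clips = [p for folder in CARD_DUMPS for p in folder.rglob("*")
         if p.suffix.lower() in EXTENSIONS]

# File modification time stands in for the recording time.
clips.sort(key=lambda p: p.stat().st_mtime)

for i, clip in enumerate(clips, start=1):
    new_name = f"{i:03d}_{clip.parent.name}_{clip.name}"
    print(f"{clip}  ->  {new_name}")
    # clip.rename(clip.with_name(new_name))  # uncomment once the dry run looks right
```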

This is video editing's best-kept secret. You won't regret getting it, or climbing the learning curve, once you finish your first successful project.

wwaag wrote on 9/16/2023, 9:27 PM

@MikeLV

"Do I understand correctly that if I shoot video with two cameras, I can use this feature to sync up the footage on the timeline from those cameras?"

Yes. In my testing, AudioSyncR (ASR) was more accurate than PluralEyes (PE), since PE only syncs to the nearest video frame and does not handle sub-frame offsets. Having said that, ASR is slower and has a much steeper learning curve. More importantly, PE is no longer available from Maxon (formerly Red Giant) unless you sign up for a fairly pricey monthly subscription.

"Do you just go by the timestamps in the Device Explorer?" No. Syncing is based on the actual waveform. However, there is an option to use such timestamps to place events more or less "in the ballpark" which can speed up syncing under certain circumstances in which you have lots of events over a fairly lengthy external audio recording--e.g 15 minutes of video events synced against a 6 hour audio recording.

The one feature that I really like is the option to add the "synced" audio (e.g., from an external audio recorder) as a Take to the audio track associated with the video. In this manner it becomes "locked," and the events can be split without fear of losing sync, unlike other approaches to syncing.

I would suggest taking a look at the demos on the HOS website and then trying it for yourself. For those with an HOS license, the upgrade cost is significantly less.

Last changed by wwaag on 9/16/2023, 9:30 PM, changed a total of 1 times.

AKA the HappyOtter at https://tools4vegas.com/. System 1: Intel i7-8700k with HD 630 graphics plus an Nvidia RTX4070 graphics card. System 2: Intel i7-3770k with HD 4000 graphics plus an AMD RX550 graphics card. System 3: Laptop. Dell Inspiron Plus 16. Intel i7-11800H, Intel Graphics. Current cameras include Panasonic FZ2500, GoPro Hero11 and Hero8 Black, plus a myriad of smartphone, pocket, video, and film cameras going back to the original Nikon S.

MikeLV wrote on 9/17/2023, 4:23 PM

I've never used PluralEyes; I don't even know what it is. In the past, when I needed to sync two cameras, I just did it manually on the timeline, matching up the waveforms, which was pretty easy.

When I asked "Do you just go by the timestamps," I didn't mean with regard to syncing. I was asking about how to keep the footage organized between the two cameras.

@3d87c4, speaking of two cameras, you said something I hadn't thought about when you mentioned variable frame rate. My main camera is a Canon XA10, and I need to buy a cheap second camera for picture-in-picture. So that I don't run into frame rate issues when syncing, what spec should I look for on the second camera? Thanks.

Once I get the second camera, I will certainly give ASR a try. I'm going to have a 4- to 8-hour video to edit with a lot of segments, so it will really come in handy!

3d87c4 wrote on 9/17/2023, 5:19 PM

@MikeLV: "what spec should I look for..." is a good question but I don't have a good answer. I took a look at the user manuals I have on my hard drives---Nikon D7000, Nikon S4300, GoPro3+B, and a LucidCam vr camera.

The D7000 outputs MP4 video in an MOV file. The S4300 point-and-shoot outputs an AVI file. MediaInfo shows both to have fixed frame rates.

The GoPro manual specifies a variety of frame sizes, frame rates, and encoding options such as Protune. They claim "cinema quality for professional productions," which I agree with, but that's not really a clear spec, eh?

The LucidCam user manual (a brochure, really) has diddly squat for specs. It was a fun VR180 camera with good synchronization between the left and right images in 2K mode, but their 4K firmware upgrade put the left and right images out of sync.

I also went through a couple of other cameras from a manufacturer who has since dropped out of the market; I got them in exchange for shooting dive videos with their prototype underwater housings. One of them encoded AVI files with variable frame rates that drifted badly. The second camera (actually made for them by another company) behaved well.

So... the best I can suggest is going with a reputable manufacturer: Canon, Nikon, GoPro, etc. Look at the data sheets on their websites (I just looked in the manuals here) and watch out for "variable frame rate" encoding.

Also, if you haven't already, install a copy of MediaInfo on your computer and use it to check out the files from any camera you buy.
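If you'd rather script that check than open each clip by hand, here is a minimal sketch using pymediainfo, a Python wrapper around the same MediaInfo library. The "test_clips" folder is a placeholder, and the frame-rate attributes are assumed to follow MediaInfo's usual field names (frame_rate, frame_rate_mode):

```python
from pathlib import Path
from pymediainfo import MediaInfo  # pip install pymediainfo (it drives the MediaInfo library)

# Hypothetical folder of test clips from a camera you're evaluating.
for clip in sorted(Path("test_clips").glob("*")):
    if not clip.is_file():
        continue
    for track in MediaInfo.parse(clip).tracks:
        if track.track_type == "Video":
            # MediaInfo typically reports "CFR" (constant) or "VFR" (variable);
            # the value may be None if the container doesn't carry the field.
            print(f"{clip.name}: frame rate = {track.frame_rate}, "
                  f"mode = {track.frame_rate_mode}")
```

Anything that comes back as VFR is the kind of source warned about earlier in this thread, and worth testing carefully before relying on audio sync.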

 

Last changed by 3d87c4 on 9/17/2023, 5:21 PM, changed a total of 1 times.
