Correct capture settings for PAL

eddydde wrote on 7/28/2003, 12:31 AM
I would appreciate some guidance on the correct capture settings using Vegas for PAL.

This request was prompted because I can see what appear to be a few 'data/digital' lines/bits at the top of the captured video, as well as a small black border on all sides of the video.

(The data/digital 'bits' I refer to are not unlike the 'teletext' info seen at the top of a TV picture when the set is out of vertical hold.)

Capture is via FireWire; the problem is not on the video viewed via the camera (VX2000), only on the captured video.

The FILE/PROPERTIES settings I am using are:

Width: 720
Height: 540
Frame rate: 25.000 (PAL)
Field order: NONE (Progressive scan)
Pixel aspect ratio: 1.000 (Square)

Full resolution rendering quality: Best
Motion blur type: Gaussian
De-interlace method: None

In OPTIONS/PREFERENCES/VIDEO DEVICE the only choices I have are : PAL DV or PAL DV Widescreen, so I am using PAL DV.

Final rendered product to be viewed on PC.

Would someone like to put an Aussie on the right track?

Comments

Grazie wrote on 7/28/2003, 1:22 AM
Okay, here are my "FILE/PROPERTIES settings":

I've noted the differences from yours below. I'm a little concerned your "Template" doesn't pick up the settings I use. Okay, London UK here, can't see why you've got a "540" height setting and no Lower field first . . hmmm . . . interesting? Do you pick up the template for PAL DV? It's in the "Template" drop-down menu. Is there such a thing as an Aussie PAL DV setting? Wouldn't have thought so . . . . .?

Template: PAL DV (720x576, 25.000fps)

Width: 720
Height: 576
Frame rate: 25.000 (PAL)
Field order: Lower Field First
Pixel aspect ratio: 1.0926 (PAL DV)

Full resolution rendering quality: Best
Motion blur type: Gaussian
De-interlace method: None

I use this on all my PAL projects with no problems.

Hope this helps,

Grazie
eddydde wrote on 7/28/2003, 1:52 AM
Hi Grazie and thanks for this info.

I can explain some but not all of the differences.

I am of the understanding that when you want to view PAL videos on a PC monitor, as opposed to a TV set, the field order should be 'None', as a PC monitor does not do interlaced scanning.
The pixels on a computer monitor are 'square', hence the pixel aspect ratio of 1.000.
You are correct, there is no Aussie PAL DV system.
As for the 'height', I believe I saw this documented somewhere and will do a search for it tonight.

I stand corrected on all of this, hence my initial post.
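For what it's worth, the arithmetic behind the two frame sizes in this thread is easy to check. A minimal sketch (plain Python, purely illustrative, using only the figures quoted in the posts: PAL DV stores 720x576 non-square pixels, Vegas quotes a PAR of 1.0926, and a PC monitor's square pixels imply a PAR of 1.000):

```python
# Rough arithmetic behind the two PAL frame sizes mentioned in the thread.
# Assumes the figures quoted in the posts; this is not Vegas internals.

def square_pixel_height(width, display_aspect=4 / 3):
    """Height of a square-pixel 4:3 frame at the given width."""
    return round(width / display_aspect)

def display_width(storage_width, par):
    """Square-pixel width a PAR-corrected frame occupies on a PC monitor."""
    return round(storage_width * par)

print(square_pixel_height(720))    # 540 -> where a 720x540 square-pixel setting comes from
print(display_width(720, 1.0926))  # 787 -> 720x576 PAL DV stretched for square-pixel display
```

So a 720x540 project at PAR 1.000 is exactly 4:3 on a PC screen, which would explain where the 540 figure was documented; the PAL DV storage frame itself stays 720x576.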
farss wrote on 7/28/2003, 6:46 AM
Eddydde,
Nice to find another Aussie here.

I've never changed the default PAL DV template and it works with absolutely no dramas or weird bits happening. A lot of what you had to say is correct, but those damn Yankee guys who wrote this thing were way too smart and took care of it all for us. In other words, when VV displays on a PC monitor it does the necessary conversions to make it look as right as possible. Actually, I think I'm telling a fib here: it's the video drivers, or Windows itself, that remaps the pixels.

The one thing I do play with from time to time is the de-interlace method. This is only significant if VV has to de-interlace; normally, as far as I know, it doesn't. But at times I have to convert PAL to NTSC to MPEG, and then I try fiddling with it just in case it makes a difference.
RBartlett wrote on 7/28/2003, 8:36 AM
Eddydde,

If your target is a computer and its desktop monitor, then the generally accepted settings are a 4:3 aspect ratio, square pixels and non-interlaced. However, if you are using the PC as an NLE to go back out to TV, you should set up your environment so that you don't alter the parameters you want passed through untouched.

A programmer could pick up an interlaced frame and draw one field at a time, even simulating the persistence of the previous field with a fade, given the fast update and low persistence of a computer screen. However, just to make us go slightly mad, the video in all NLEs and players is a combination of two fields striped together into the same window. Your TV would look the same if it had high enough persistence. You don't get scan-line trouble, but you do get combing and other directly comparable artifacts. Microsoft and their DirectShow/VfW architecture are also to blame; you get the same thing in WMP, QuickTime Player, etc. Bob/weave de-interlacing reduces resolution, and overall a PC display lends itself to scaling video up, so the benefits of doing an interlaced line draw with black or darkened scan lines would have their troubles too.
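The field handling described above can be sketched in a few lines. This is a toy illustration only (plain Python lists standing in for rows of pixels, not anything Vegas or DirectShow actually does internally): "weave" stripes the two fields into one frame, which is what produces combing when there is motion between fields, while "bob" keeps one field and line-doubles it, avoiding combing at the cost of half the vertical resolution.

```python
# Toy de-interlacing sketch: a frame is a list of rows (any row type works).
# For PAL DV, "lower field first" means the lower field (odd rows, 0-based)
# is the earlier one in time.

def weave(upper_field, lower_field):
    """Stripe two fields into one frame; combing appears where they differ."""
    frame = []
    for up, low in zip(upper_field, lower_field):
        frame.append(up)    # even row, from the upper field
        frame.append(low)   # odd row, from the lower field
    return frame

def bob(field):
    """Line-double a single field: no combing, half vertical resolution."""
    frame = []
    for row in field:
        frame.extend([row, row])
    return frame

upper = ["U0", "U1", "U2"]
lower = ["L0", "L1", "L2"]
print(weave(upper, lower))  # ['U0', 'L0', 'U1', 'L1', 'U2', 'L2']
print(bob(lower))           # ['L0', 'L0', 'L1', 'L1', 'L2', 'L2']
```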

Nobody ever seems to suggest that Vegas or Premiere users set their refresh rate to 60Hz, 90Hz or 120Hz, yet all of these would have their advantages for NTSC work. The equivalent ideal settings on the NLE for PAL would be 75Hz or 100Hz.
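The refresh-rate point comes down to whole multiples of the frame rate. A quick illustrative check (nominal 25fps and 30fps only; the actual NTSC rate of 59.94 fields/s is ignored here for simplicity):

```python
# Which common monitor refresh rates are whole multiples of a nominal frame rate?
def clean_multiples(frame_rate, refresh_rates):
    return [hz for hz in refresh_rates if hz % frame_rate == 0]

rates = [60, 75, 85, 90, 100, 120]
print(clean_multiples(25, rates))  # [75, 100]      -> PAL-friendly refreshes
print(clean_multiples(30, rates))  # [60, 90, 120]  -> NTSC-friendly (nominal 30fps)
```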

So ultimately you forgive the NLE for the motion troubles you see on the PC and look instead (in Vegas' case) to your external DV preview monitor.

If your target is a PC application, you should be transcoding the video into your PC player format. Again, this might mean you change very little, but it might also mean you head towards settings like those you originally stated.

I hope that adds the clarity you seek?
Grazie wrote on 7/28/2003, 9:05 AM
NOW I'm confused!!!
RBartlett wrote on 7/28/2003, 9:14 AM
Sorry Grazie,

I was trying to point out that you can write down a whole list of attributes about a PC and the software written for it, yet when you bring in footage from a camera and go out to DVD or perhaps VHS, the attributes associated with the PC are not that important.

The window that you look at your video through is just that - a window on something that is in fact a little bit alien. 50 interlaced fields per second, a 1.0926 pixel aspect ratio and lower field first don't need to be mated to the PC spec at all. You tell Vegas what it should be expecting, set your viewing preference (like adjusting the window for the target aspect ratio) and you get on with it. When you export, be it for DVD, VHS, WMV or Flash, you pass through as much information about the _VIDEO_ as possible, not your PC display technology.

I went on a bit too much about what a bunch of programmers ~COULD~ do to simulate TV attributes on your PC desktop/monitor. Vegas is a general-purpose NLE, so it gets on with the job, and any merging of fields you see as you scrub the timeline occurs because it is an optimum choice. Any pixel-shape adjustments you choose to make are for your target device, etc. That's exactly why an external preview monitor is required if your target is standard-def TV.

I think I've perhaps put my "geek pants" on today.
Grazie wrote on 7/28/2003, 10:34 AM
RB, thanks very much! I must have left my pants off today . . . I wondered why people were staring at me when I went to the shops . . "oOOOh look at him! - Shame, poor ole bloke!!" . . But yes, very nice description and very clear; even I understood, which says a lot!!!!

So.... the way I've got my ProjectPrefs set up is the way to go? Am I missing out on something? Do I need to alter anything? From what you've said, AND the success I've been having, my Canon XM2 + V4.0c is good to go - yeah?

Regards,

Grazie
farss wrote on 7/28/2003, 5:52 PM
Despite me thinking this was all much ado about nothing, one penny did drop.

I've often gone into panic mode over interlace combing when previewing on my PC monitor. From this I'm guessing I may only be seeing it because the monitor is displaying a non-interlaced rendition of the video; when I play the footage back through a TV it does disappear.

If I've interpreted all this correctly I need to make more use of my external monitor, at least before I start to panic about artifacts.
eddydde wrote on 7/28/2003, 6:13 PM
The keyword here seems to be 'target': what the final rendered video is going to be viewed on, PC or TV.

Thanks guys, this has resolved my confusion on this matter.