PAL to NTSC, Vegas is lousy! A LONG educational post with some questions... (farss and mbryant check this one!)

Comments

GTakacs wrote on 6/22/2004, 8:09 AM
I DID read all your responses even if I didn't make a reply to each of them. This board is the worst by far of its kind, I don't mind the threaded reply system (I actually prefer it), but it is hard to find which response one has and hasn't read. I visit http://forums.audiworld.com/a4gen2 regularly, and I am familiar with this setup, but over there I can at least get e-mail notifications and the unread threaded replies are bold, so it's easier to sift through. This thread is not about board bashing though, so let's get back to the point at hand.

I will try the "reduce interlace flicker" switch and see what happens. I will compare output with that feature enabled. I am glad that you actually did something similar to my testing, even took it to a further extreme! My footage is real life footage and I just wanted to get answers to my concern.

So far you've been the only one who actually gave a reasonable response, and I apologize for not acknowledging it earlier. I was just too fed up with the herd mentality of others and their "use DV not MPEG2" chant, which, as you and I both know, has nothing to do with the interlace artifacts. I must also mention "erratic", who also contributed a valuable reply.

It seems like most people still don't get the issue I thought was at hand. As I said in my last post, I will play around more with tweaking the settings in Vegas to see if it can give acceptable results. I will report back my findings.

Thanks again for all your responses, I just wish I could get my point through to others like it got through to you!
GTakacs wrote on 6/22/2004, 8:14 AM
I will try what farss recommended about reducing interlace flicker, I wonder if you have tried it yourself.
GTakacs wrote on 6/22/2004, 8:19 AM
I am not doing any frame resizing besides the 576-to-480 line downsampling. That is what causes the artifacts. I will try the interlace flicker removal feature and report back.
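The 576-to-480 downsampling described here is essentially a vertical resampling problem: if the 576 interlaced lines are resized to 480 as if they were one progressive image, lines from the two fields get mixed together, which is exactly what produces combing artifacts on motion. A minimal NumPy sketch of the difference (nearest-neighbour resampling and all function names are illustrative assumptions, not what Vegas actually does internally):

```python
import numpy as np

def resize_lines(img, new_h):
    # Nearest-neighbour vertical resample -- a stand-in for a real filter.
    idx = np.arange(new_h) * img.shape[0] // new_h
    return img[idx]

def naive_convert(frame):
    # Resize the whole 576-line interlaced frame to 480 in one pass.
    # Output lines no longer alternate cleanly between the two fields.
    return resize_lines(frame, 480)

def field_aware_convert(frame):
    # Separate the fields, resize each 288 -> 240, then re-interleave,
    # so every output line still comes from the correct field.
    out = np.empty((480,) + frame.shape[1:], frame.dtype)
    out[0::2] = resize_lines(frame[0::2], 240)
    out[1::2] = resize_lines(frame[1::2], 240)
    return out

# Simulated interlaced frame: top field dark, bottom field bright
# (as if the two fields captured different moments of fast motion).
frame = np.zeros((576, 8), dtype=np.uint8)
frame[1::2] = 255

good = field_aware_convert(frame)
bad = naive_convert(frame)
# Field-aware output keeps strict line parity; the naive resize does not.
print((good[0::2] == 0).all() and (good[1::2] == 255).all())  # True
print((bad[0::2] == 0).all() and (bad[1::2] == 255).all())    # False
```

If the encoder's resize mixes field lines like the naive path above, fast-motion areas end up with the jagged artifacts described in this thread, which is plausibly why a field-aware option such as "reduce interlace flicker" helps.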
farss wrote on 6/22/2004, 3:12 PM
That's cool!
Didn't mean to come across so hard, it just gets a tad frustrating at times and the thread was starting to look like it was going to go pear shaped, if you know what I mean!
If I can find the time I might get some real world footage and run it through a S&W box for comparison. The only problem I have doing that is the station only runs digi decks and SDI so I need to be careful that we're actually seeing the difference in the conversion and not the quality of everything else.
SonyEPM wrote on 6/23/2004, 10:57 AM
Rendering with "BEST" quality will also help.
vitalforces wrote on 6/23/2004, 11:29 AM
GTakacs: There's also some inexpensive software developed by a Texas engineer that is specifically designed to convert PAL to NTSC without sacrificing half the field resolution. Here is his link:

http://www.dvfilm.com/atlantis/index.htm
erratic wrote on 7/7/2004, 11:33 AM
I was finally able to run an interesting PAL to NTSC conversion test:

1. Vegas
2. Vegas with Reduce Interlace Flicker (as suggested by farss)
3. ProCoder 2.0
4. Avisynth (frameserving to CCE)

Click here for a fast motion field from all four mpeg-2 files.
Conclusion: Reduce Interlace Flicker does indeed improve the conversion a lot in Vegas.

And by the way: the bitrate was 6000 CBR for all encoders and CCE did the fastest but worst job as far as mpeg-2 quality goes.
alfredsvideo wrote on 7/7/2004, 2:43 PM
There's a really simple solution to all this!! Let's face it. In the early days of television, America chose the wrong path and went NTSC. Any talk of conversion should be directed towards America abandoning a second-rate standard and converting all their systems and equipment to PAL. Hope I haven't upset any of my American friends!!
farss wrote on 7/7/2004, 4:44 PM
Nah,
if you want a decent system SECAM is the way to go, even more res than PAL.
John_Cline wrote on 7/7/2004, 9:27 PM
NTSC was the first television standard and PAL was developed after NTSC. Even with the benefit of having the NTSC standard from which to work, PAL is not without its own technical shortcomings. SECAM was adapted from PAL and was an incremental improvement over the two earlier systems. Nevertheless, digital television is here and the whole PAL vs. NTSC vs. SECAM argument is rapidly becoming moot.

John
Grazie wrote on 7/7/2004, 10:19 PM
. . Well, I ain't no pro media technician, but whenever I get something looking horrid I slap on the reduce interlace flicker [ RIF ] and most/all of what I had been getting is "ironed-out", a bit of Gaussian kills the rest . . . I don't do conversions, PAL<>NTSC or NTSC<>PAL, but when I use Pan/Crop on fast-moving real life footage and I get nasty jaggies and artifacts, RIF does it for me . . and I ONLY capture in AVI . . . I don't use MPEG anything until I go out to presentation - yeah? I'm so much all over the place, that having at least the captured footage in something plain vanilla - meaning at least one variable fixed - keeps my workflow as simple as possible. I've got enough to contend with just making sure I haven't screwed something else up - yeah?

Thanks for your detailed reports. I have learnt a tremendous amount - truly! . . It just again underlines, for this simple soul at least, the value of K.I.S.S. . . . . works for me .. .

Best regards - and thanks once again for the reports,

Grazie
farss wrote on 7/8/2004, 7:14 AM
John,
I wish it were becoming moot. Correct me if I'm wrong, as I don't know this for certain, but I'd heard we were still being stuck with different frame rates for digital broadcasts in general and HiDef in particular.
Maybe I've missed something, but hasn't the reason for keeping the field rate the same as the mains frequency long since gone?
John_Cline wrote on 7/8/2004, 2:39 PM
Yes, there will be different frame rates for some time, probably due to the amount of existing footage that will be incorporated into DTV productions. However, HiDef TVs are in many ways like multiscan VGA computer monitors: they are capable of different resolutions and frame rates. HiDef TVs are able to display pretty much any of the standard frame rates you throw at them.

John