Fun with interlacing...

VanLazarus wrote on 3/5/2006, 3:36 AM
I'm currently rendering out a project to a DVD Architect NTSC video stream (MPG). My source clips are both NTSC DV and 30fps Progressive scan (clips from a still camera). I have "Disable Resample" selected for all clips.

Vegas seems to have problems interlacing some of the 30fps clips. For example, the first 30fps Progressive scan clip looks great in the preview window while Vegas is generating the MPG file, but a few clips later, horrible jaggies appear on another 30fps Progressive scan clip. Once generated and viewed, sure enough, the first clip looks great, while the jaggy one looks just as jaggy as it did in the preview window. Both clips have an equal amount of panning.

Why is Vegas generating different interlace quality with different material from the same device? Is this just a display issue from viewing the interlaced MPG on my progressive scan computer monitor, and everything will look good on a TV?

By the way, did I mention that I hate interlacing? I wish this technological dinosaur from 60 years ago would finally be eliminated!

Comments

farss wrote on 3/5/2006, 4:24 AM
I suspect it's a function of the content; the problem is, I think, only visible on diagonals. Turning on Reduce Interlace Flicker in the clip's properties might help.

I assume the 30p material is still images.

Bob.
johnmeyer wrote on 3/5/2006, 9:08 AM
Why did you disable resample? Leave it in its default position. Any project that requires changing frame rate or resolution, or going from progressive to interlaced or back, will want to resample. I suspect disabling it will make things look worse, possibly MUCH worse.

Lots of people sure seem to get upset about interlacing. I think they confuse the scan lines (which of course don't exist on addressable displays like LCDs) with how the technology actually works. Personally, while I love the "look" of film, the artifacts when the camera moves are pretty horrible, and the way motion looks is pretty jerky. Lots of progressive vs. interlaced threads on this forum if you want to get more information.

Interlaced is definitely NOT an old-fashioned technology, or else we wouldn't have 1080i as one of the main elements of HD technology. That wasn't some brain-damaged decision by a few people who didn't know what they were doing. Quite the opposite. It was a decision by dozens of the brightest engineers across the planet, trying to come up with the ultimate entertainment display system using the technology of the 1990s, not the 1940s.
Coursedesign wrote on 3/5/2006, 12:19 PM
John,

Back in the day of an engineer watching a field getting plowed and coming up with the line-by-line scanning concept, it was soon found that the bandwidth requirement could be cut in half by using interlaced scanning.

The choice to go with interlaced 1080 was made by television engineers who had worked with televisions their entire careers. They knew that televisions were supposed to be interlaced, and it seemed like a nice shortcut to be able to get 1080 lines for less with the 1995 MPEG-2 technology providing limited compression, and "the CRTs were perfect for interlaced anyway."

Today, we have more modern codecs that provide the same quality at a much higher compression, and with the faster chips we have today those are easier to handle.

Turns out that progressive footage is also more efficient to compress, and going forward, interlaced television sets are only bringing up the rear. Mainstream today for HDTV is LCD, DLP, plasma, etc., which have in common that they are all 100% progressive.

So in retrospect, it's easy to wish that the TV engineers of old had had enough foresight to go progressive, especially since 100% of the films enjoyably watched on television originate in progressive.

Perhaps you are thinking of all those complaints from movie viewers across the planet who are unhappy over the "pretty horrible camera moves" and "the way motion looks pretty jerky" in their favorite feature films. Shooting progressive at 24P or even 30P requires a different camera technique than 60i. Big deal.

The "dozens of the brightest engineers across the planet" suggesting 1080i were not "trying to come up with the ultimate entertainment display system." They were trying to come up with something cheap and reasonably cheerful.

I'm not upset about interlacing. I'm just unhappy when I'm doing postwork that requires deinterlacing and then reinterlacing, both lossy. Then knowing that my now reinterlaced footage will be deinterlaced yet again in the HDTV, in order to be shown on any of today's screen technologies, which are all progressive.

VanLazarus wrote on 3/13/2006, 12:12 PM
Thanks for the replies everyone.

Farss,
Perhaps it would be best if I could post small clips of the examples I'm talking about. Is there a way to do this in this forum (ie attach samples)? I haven't tried "Reduce Interlace Flicker".... Will do.

The clips were taken with a still camera but are not photos... they are Motion JPEG AVI files.

Johnmeyer,
Regarding the "disabling resample" setting: My general experience has been that I get much better results with this setting enabled. With this setting disabled, blending occurs when changing framerates. The blended versions of clips with motion look terrible to me. In the example I started this thread with I'm going from 30fps to 29.97fps, so blending will not have as negative an effect as it would when going from NTSC to PAL.... but it is my understanding (I may be wrong) that the "disable resample" setting has absolutely no effect on interlacing, only on whether the interlacing part of Vegas receives an original frame from the clip OR a blended frame generated from two frames in the source clip.

My short rant about interlacing was not intended to belittle the geniuses who pioneered video technology. I was just lamenting the inevitable nature of our technologies... that being that we sometimes hang on to old and proven technological methods because changing the infrastructure to support better but incompatible methods costs far more (and generates less profit)... not to mention the greater risk involved. I am also irritated by the inevitable variety of competing standards that companies and countries create. After dealing with NTSC/PAL, interlacing/deinterlacing, and 20 different kinds of burnable DVDs, I was really hoping that the new HD standards would be able to drop some of the old "backward compatible" methods... interlacing being the one I'm familiar with. I guess that's because I like new technologies and don't mind spending money to get them... I realize that not everyone is like this. Many people have the perfectly acceptable opinion that if it isn't broken, don't change it. But if everyone thought this way, then the more painful advances would never be made. It will be a happy day for me when I don't have to deal with interlacing.

Coursedesign,
I completely agree.
farss wrote on 3/13/2006, 12:24 PM
I certainly agree with your comments re interlacing, I'd add the use of YUV over RGB and low frame rates as well.
But look at how the world seems headed, mp3 compression, crappola speakers, and now video being watched on postage stamp sized screens.
Bob.
VanLazarus wrote on 3/13/2006, 12:45 PM
johnmeyer,

One thing I forgot to mention in my reply was regarding your comments on the artifacts with motion in film. I am not one to say film is better than interlaced video. I do not work in the film industry; rather, I come from a background making video games. In any 3D video game, framerate is king. Playing Half-Life 2 at 24fps is ok... but playing it at 60fps is like a dream. If film were 60fps, I don't think you would have many complaints about it and motion.

As a movie fan, I'm looking forward to the day when I can see a digital movie in every theatre at progressive 60fps.... why stop at 60fps?.... whatever speed the human eye can convey information from the retina to the brain. People can still record in 24fps (or any framerate) if they want to go for an artsy feel, or surreal feel.... but I'd like to see motion on the screen at a speed that fools my eye so well, I don't know that it was recorded.
VanLazarus wrote on 3/13/2006, 12:51 PM
Farss,

I hope that people watching more and more video on the multitude of devices with tiny screens will make the theatre experience that much more special. Long live the movie theatre!
VanLazarus wrote on 10/21/2006, 3:29 AM
Well, I've come back to this problem... and I'm pretty certain that there is a bug within the MainConcept MPG encoder for Vegas. It definitely has problems interlacing video. I have a video example that is panning at a consistent speed to the left... and after a short time, huge jaggies appear during playback... it's as if the interlace fields suddenly got reversed into the wrong order. Before the jaggies appear, the interlaced video looks nice and clean (like progressive).

Is there a place that I can post this example so that others can see?
Coursedesign wrote on 10/21/2006, 3:49 AM
Are you viewing the output on an interlaced display (CRT) or on a progressive ditto (LCD, plasma, DLP)?
VanLazarus wrote on 10/21/2006, 3:50 AM
Just found the source of the problem!!! It seems that MainConcept MPG encoder gets confused when interlacing video AND changing the framerate. If I encode this progressive 30fps into interlaced 30fps, no problem.... nice and clean. If I encode it into interlaced 29.97fps, after a few dozen frames, huge jaggies.... like it's grabbing field data from incorrect progressive frames as it reduces the number of frames.

All tests above were done with "Disable Resample" set on. If I set it to "Smart Resample" then I have jaggies in the output video whenever changing framerates (and this is unacceptable).

What I'm going to have to do (unless anyone has a better idea) is pre-process all the 30fps video into 29.97fps (without resampling!!). This means individually adding every 30fps clip to the timeline and outputting it as a 29.97fps clip. I'd like to do this in VirtualDub, but unfortunately these clips are all MOV files and Vegas is the only thing I have that can process them.

Then I can include these processed clips into my main project and the MainConcept MPG encoder will produce nice and clean interlaced video.
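The "grabbing field data from incorrect progressive frames" failure described above can be pictured with a toy sketch. Nothing here is from Vegas or MainConcept; the function names and the vertical-bar test pattern are invented purely to show why pairing fields from two different frames of a pan produces comb jaggies, while pairing fields from the same progressive frame looks clean.

```python
def weave(top_source, bottom_source):
    """Build one interlaced frame: even rows (top field) from one source
    frame, odd rows (bottom field) from another. Frames are lists of rows."""
    return [top_source[y] if y % 2 == 0 else bottom_source[y]
            for y in range(len(top_source))]

def bar_frame(pos, h=6, w=8):
    """Toy progressive frame: a one-pixel vertical bar at column `pos`."""
    return [[1 if x == pos else 0 for x in range(w)] for _ in range(h)]

# Both fields taken from the same progressive frame (clean case):
# the woven result is indistinguishable from the progressive original.
clean = weave(bar_frame(3), bar_frame(3))

# Fields taken from two different frames of a panning shot (the suspected
# encoder bug): the bar's position now alternates line by line -- the
# classic comb/jaggy artifact.
combed = weave(bar_frame(3), bar_frame(5))
```

The same sketch also shows why the artifact only appears on motion: if nothing moves between the two source frames, mismatched field pairing is invisible.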
VanLazarus wrote on 10/21/2006, 3:52 AM
I am viewing it on my LCD monitor... Yes, I know it's a progressive display. But the change in quality halfway through the MPG suggests that this problem is not related to viewing interlaced video on a progressive display. I believe that if that were the problem, then the entire video would have bad jaggies.
VanLazarus wrote on 10/21/2006, 10:12 AM
Hmmmmm.... now a new problem has come up. Vegas can't properly tell which source video clips are interlaced and which are not.

To solve the problem that started this whole thread I've converted all my 30fps MOVs to 29.97fps AVI (uncompressed and progressive).

My project is a mix of these newly created uncompressed progressive AVIs and original interlaced NTSC DV AVIs.

If, under the File -> Properties -> Video tab, I set "de-interlace method" to "interpolate fields", then my DV AVIs render nicely, but my uncompressed AVIs look like the resolution has been lowered and you can see exaggerated pixel lines. This must be happening because VEGAS THINKS THAT MY PROGRESSIVE UNCOMPRESSED AVIs ARE INTERLACED. When I highlight the progressive AVI file in the Vegas explorer, it shows that the video file is interlaced (which is wrong!). So Vegas is trying to de-interlace progressive video!!!

If, under the File -> Properties -> Video tab, I set "de-interlace method" to "none", then my uncompressed progressive AVIs render perfectly, but my DV AVIs (which are indeed interlaced) look bad because the two fields are just combined with no blending or interpolation.
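The trade-off between the two settings above can be sketched line by line. Vegas's real deinterlacer isn't public, so this is only a plausible reading of what the method names mean; the function, the row layout, and the tiny test frame are all my own assumptions. The key point it demonstrates: "interpolate fields" throws away every odd line and synthesizes it from its neighbours, which on genuinely progressive material loses real detail and looks exactly like lowered resolution.

```python
def deinterlace(frame, method):
    """frame: list of rows; even rows = top field, odd rows = bottom field."""
    h = len(frame)
    if method == "none":
        # just weave the fields together as stored; moving content combs
        return [row[:] for row in frame]
    if method == "blend":
        # average each line with the one below; softens combing,
        # but blurs vertical detail everywhere
        return [[(a + b) / 2
                 for a, b in zip(frame[y], frame[min(y + 1, h - 1)])]
                for y in range(h)]
    if method == "interpolate":
        # keep the top field; rebuild each bottom-field line from the
        # lines above and below it
        out = []
        for y in range(h):
            if y % 2 == 0:
                out.append(frame[y][:])
            else:
                below = frame[min(y + 1, h - 1)]
                out.append([(a + b) / 2
                            for a, b in zip(frame[y - 1], below)])
        return out
    raise ValueError(method)

# A one-pixel-wide, 4-line toy frame that is genuinely progressive:
# line 1 holds real detail (value 5.0) that interpolation will destroy.
prog = [[0.0], [5.0], [2.0], [3.0]]
```

Run on this progressive frame, "none" returns it untouched, while "interpolate" replaces line 1's real value with the average of its neighbours, i.e. the detail is gone.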

I'm nearly at the end of my rope here.... I've spent the whole day trying to solve the original problem and, uncharacteristically, Vegas has been giving me problem after problem.

Does anyone know how to get Vegas to properly determine if a video is interlaced or not?
VanLazarus wrote on 10/21/2006, 10:20 AM
Well, I found a manual workaround.... I can open up the Properties for each clip that Vegas thinks is interlaced (but isn't) and change the "Field Order" entry to "None (progressive scan)". Then Vegas understands that this clip is indeed progressive.

It would be nice if Vegas understood this on its own.
farss wrote on 10/21/2006, 3:29 PM
Vegas, or anything else for that matter, has no way to determine whether footage is interlaced as far as I know, unless perhaps there's some flag set to tell it.
From what I know, 24p material is the only thing that's flagged in the DV world. 25PsF and 50i read the same off tape, and I'd assume 30PsF and 60i are the same.
You could try setting the de-interlace method to Blend in project properties rather than Interpolate.

Bob.