To Deinterlace or not?

R0cky wrote on 3/13/2008, 2:25 PM
Starting with interlaced footage, either NTSC DV or HDV 60i.

Display devices are a projector and CRT TV both capable of 720p and 1080i. Screen is 110 inches diagonal.

For delivery to DVD which will give better image quality:

DV source: edit and deliver interlaced? Or deinterlace (DG's smart deinterlacer) and deliver progressive on DVD?

Same questions for HDV 60i source being converted to SD for DVD delivery. Deinterlace using DG's smart deinterlacer or Cineform Neo HDV? Image resizing will happen. Limited fast action.
---------------------------

HDV 60i source, delivery as m2t or divx to be played on a computer connected to the same display devices. Same questions.

thanks all,
Rocky

Comments

mdopp wrote on 3/13/2008, 2:52 PM
Rocky,
I did a lot of testing in this area in recent years and my recommendations are:

For DV source: Do not deinterlace.
If you did, you could only deliver 30 progressive frames per second instead of 60 interlaced frames.
The quality per image would not benefit from progressive scan as the source was already interlaced anyhow.

For HDV source the same principle applies.
However you need to make sure Vegas handles the different fields and resolutions correctly. It is best to use 60fps progressive HD as an intermediate format before you finally render out to 60fps interlaced SD:
1. Set up the project in its native HDV format (HDV 1080-60i)
2. Change the project settings to 60 frames Double-NTSC, progressive, deinterlace method 'Interpolate Fields'
3. Render the video using your favorite SD-template (in interlaced format)
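
The intermediate step in Martin's recipe amounts to a bob-style deinterlace: each 60i frame is split into its two fields, and each field is interpolated up to a full-height frame, giving 60 progressive frames per second. Here is a minimal sketch of that idea on a toy frame (plain Python lists standing in for image rows; Vegas' actual 'Interpolate Fields' resampling is more sophisticated than this simple line averaging):

```python
def split_fields(frame):
    """Split an interlaced frame (a list of rows) into upper and lower fields."""
    return frame[0::2], frame[1::2]

def interpolate_field(field, height):
    """Rebuild a full-height frame from one field by averaging adjacent
    field lines to fill in the missing lines (naive linear interpolation)."""
    frame = []
    for i in range(height):
        j = min(i // 2, len(field) - 1)
        if i % 2 == 0:
            frame.append(field[j])
        else:
            nxt = field[min(j + 1, len(field) - 1)]
            frame.append([(a + b) / 2 for a, b in zip(field[j], nxt)])
    return frame

def bob_deinterlace(frames):
    """60i -> 60p: every field becomes its own full frame, so one
    interlaced frame yields two progressive frames."""
    out = []
    for frame in frames:
        upper, lower = split_fields(frame)
        out.append(interpolate_field(upper, len(frame)))
        out.append(interpolate_field(lower, len(frame)))
    return out

# One toy 4-line "frame" whose two fields were captured 1/60 s apart.
frame = [[10, 10], [20, 20], [10, 10], [20, 20]]
progressive = bob_deinterlace([frame])
print(len(progressive))  # 2 output frames per input frame: 60i -> 60p
```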

Martin

P.s.: Of course you can argue that for HDV source material, progressive SD is an improvement because each HDV-field already has 540 lines. I prefer the smoother motion of interlaced material over the last bit of resolution coming from progressive rendering. But that's personal taste, I guess (and also depends on the type of material you are working with).

P.p.s.: If you render not for DVD but for playback on computers I would always prefer to deinterlace the material in Vegas because most computer playback software does a very poor job when deinterlacing. In this case you could also render 60fps progressive and get the best of both worlds.
johnmeyer wrote on 3/13/2008, 4:44 PM
Never deinterlace unless you have to. Since the display device can display interlaced material, don't deinterlace.
johnmeyer wrote on 3/13/2008, 4:50 PM
Duplicate post because the Sony server returned a "server not found" message so I used the "back" button on my browser and re-posted. Sorry, but IT'S NOT MY FAULT!
Kennymusicman wrote on 3/13/2008, 4:57 PM
hehe. John's message was worth saying twice. Or perhaps one is the upper field, and the second is the lower field, so we can see the complete answer?

(meh - sorry!)
John_Cline wrote on 3/13/2008, 5:04 PM
When resizing from HD to SD, Vegas already deals with individual fields, so the extra step of going to Double-NTSC is unnecessary.
CorTed wrote on 3/13/2008, 5:09 PM
So what is up with the MPG files from a Sony HDD camera? Each time I throw one on the timeline, I have to check the deinterlace check box. If I don't, the output is extremely jittery.
Is this due to the compression of the video?
johnmeyer wrote on 3/13/2008, 5:15 PM
"I have to check the deinterlace check box. If I don't the output is extremely jittery."

No, this is due to upper vs. lower field, and you are doing the wrong thing by deinterlacing. The proper thing to do is to get the project properties and Render As field settings to all match. Interlacing does not make video "jittery."
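
John's point about matching field settings can be demonstrated with a toy model: fields are sampled 1/60 s apart, so an object in motion advances steadily from field to field. Play them in capture order and motion is smooth; swap the field order and the display shows each frame's two fields in reverse, so the object steps back and forth - exactly the "jitter" described above. A minimal sketch (hypothetical numbers, nothing Vegas-specific):

```python
# Position of a moving object sampled once per field (1/60 s apart).
positions = list(range(8))  # fields 0..7, moving steadily to the right

def playback_order(positions, field_order_correct=True):
    """Return positions in the order a display would show them.
    With the wrong field order, each frame's two fields are swapped."""
    out = []
    for i in range(0, len(positions), 2):
        pair = positions[i:i + 2]
        if not field_order_correct:
            pair = pair[::-1]
        out.extend(pair)
    return out

smooth = playback_order(positions, True)
jittery = playback_order(positions, False)
print(smooth)   # [0, 1, 2, 3, 4, 5, 6, 7]  - steady motion
print(jittery)  # [1, 0, 3, 2, 5, 4, 7, 6]  - steps backward every frame
```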
Spot|DSE wrote on 3/13/2008, 5:36 PM
Some folks describe interlaced displays on a progressive monitor as "jittery," although it's a misapplication of the term.
It's expected, when viewing interlaced content on a progressive scan monitor, to see interlace artifacts. This is one reason it's a good idea to have a broadcast monitor handy, or if working with HD, you'll use the DeInterlace setting for secondary monitor. None of this has any bearing on output, however.
We generally deinterlace everything, and rarely at double NTSC/60p
CClub wrote on 3/13/2008, 7:32 PM
This is where buffoons like me stick pins in our eyes. I wait with bated breath for an answer from one of the forum elders. I see johnmeyer saying "Don't de-interlace." I write it on my master "Editing Protocol" list. But wait... Spot responds: "We de-interlace everything." Given that I use Cineform Neo for nearly everything, I had written down David Newman's (one of the company heads) recommendation to de-interlace when using Cineform intermediates. I shall now run for the pins and begin poking.
DJPadre wrote on 3/13/2008, 7:38 PM
I'm with Spot on this one..

SD is shot progressive anyway. Vegas can't really handle progressive, so it treats the frames as segmented frames and "behaves" like interlaced when it comes to processing.

I've got native progressive footage which Vegas has ruined, and now this footage looks like home video even though it's nowhere near it.

The motion cadence is interlaced, and this is Vegas' doing.
This is one of my biggest gripes with this program. In the past we used to deinterlace interlaced footage, which gave us good results; now, though, as the source is progressive, Vegas' weaknesses are coming to the fore.

As for HD, we start with interlaced if we're going down the SD DVD delivery route.
The reason being that we can reframe and run slow motion without a problem with this footage. The reason we interpolate to progressive is that it offers the highest actual resolution possible (not temporal) and we retain our motion cadence.

If we were to use frame mode (Canon A1's, for one), we wouldn't be able to use slow motion, as Vegas pukes with slow motion on progressive scan: frame doubling instead of temporal frame generation.

The predominant reason we use interlace... slow motion in Vegas.

There are ways around it with other apps etc., but who can really be bothered when the clients won't notice the difference anyway... not most clients, anyway.

Aside from slow motion, the motion cadence of progressive is VERY noticeable, even on plasma and LCD panels. This differentiates our work and, to be honest, it doesn't look as amateurish (bearing in mind that I am specifically referring to the wedding market).
That, and the fact that visible combing artefacts are prevalent on higher-end displays, makes me stick with progressive.

There is also the issue of rendering.
It's at least 20% faster to render progressive scan than it is interlaced, irrespective of the source.

Field order issues:
Now, when downscaling from 1080i to 576p, we go straight to progressive, because as soon as you start messing with field orders, you're screwed.
Vegas already has issues with field order dominance, so we play it safe and avoid that issue altogether.
We use interlace for all sources, then scale down to progressive so there are no field order problems later.
johnmeyer wrote on 3/13/2008, 8:27 PM
Well, I generally defer to Spot on this sort of thing, but I'd sure like to hear his reasoning as to why he deinterlaces everything. I'd also like to know what technique he uses for deinterlacing.

Here's my thinking on the subject.

If you are going to display on a monitor which is not capable of displaying interlaced material, then I understand why one would deinterlace. However, if your monitor can display interlaced video, here's my argument for why NOT to deinterlace:

1. Once you deinterlace, you have 30p instead of 60i. 30p has a completely different feel. It is not as fluid and "video-like." If that is what you want, then fine, but if you prefer video to film for the look of a particular project, then you're going the wrong way by deinterlacing.

2. Deinterlacing always involves degrading the image in some way. It can never be entirely "right."

For instance, if a ball is thrown in front of a locked-down 60i camera, the odd lines that show the ball will display the ball in one position, and the even lines in a later position. The deinterlacing software has to be damn smart to move everything in the correct way to get the alternate fields to line up when shown at the same instant in time (which they will be when shown as 30p).

Now, imagine two balls thrown towards each other, from each edge of the frame. How do you get that right? Answer: It is almost impossible to get it right.
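
The mismatch described here is easy to see in a toy model: weave together two fields of a moving object captured 1/60 s apart, and the object lands in different columns on alternating lines - the classic "combing" that a deinterlacer has to guess its way out of. A minimal sketch (1-pixel "ball" on toy scanlines, purely illustrative):

```python
def make_line(width, ball_pos):
    """One scanline: a 1-pixel 'ball' at ball_pos on an otherwise blank line."""
    line = [0] * width
    line[ball_pos] = 1
    return line

def weave(upper_line, lower_line, lines=4):
    """Weave upper/lower field lines into one frame (alternating rows)."""
    return [upper_line if i % 2 == 0 else lower_line for i in range(lines)]

# The ball moves 2 pixels between the two field captures (1/60 s apart).
upper = make_line(8, 3)  # ball at x=3 when the upper field is scanned
lower = make_line(8, 5)  # ball at x=5 when the lower field is scanned
frame = weave(upper, lower)
for row in frame:
    print(row)
# The ball sits at x=3 on even lines and x=5 on odd lines: combing.
# No single 30p frame can be "right" for both capture times at once.
```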

Thus, unless Spot knows otherwise (and if anyone would, he would), I think that deinterlacing ALWAYS results in degraded footage. If you are going to watch on a progressive-only monitor, then that degradation may be a good tradeoff to avoid the interlacing artifacts. However, since there are, by definition, no interlacing artifacts when watching on an interlaced display, there is no reason to do the operation in the first place if your monitor can display interlaced.

3. Deinterlacing takes extra processing time, and therefore lengthens your workflow.

I'm sure I can come up with others, but I'm not trying to write another thesis.

One last thought which definitely does not apply to Spot but may apply to others and that is the very mistaken belief that interlaced video is somehow "worse" or inferior to progressive. Over at doom9.org, there are lots of people that believe this. If this were really true, then when the new HD standards were designed and implemented, interlaced would have been thrown out. It was not, and that is because the "video look" is something many people want and enjoy for many types of entertainment, just as others desire and appreciate the "film look" that is, in part, the result of both progressive projection as well as the fewer number of events per second (30 or 24).
John_Cline wrote on 3/13/2008, 8:45 PM
I'm with John Meyer on this one. Interlacing is our friend and Vegas deals with it just fine. Sure, I'd like to have an 8000x6000 image at 60p, but that ain't happening right now, so I'll stick with 1920x1080 at 60i. I just can't stand the strobing effect of 24p or 30p. Unlike 24p video, a theater film projection system shows each of the 24 frames per second twice, which reduces "flicker." But a quick pan across a picket fence will still strobe.

For anything that's destined for the web, I make a 960x540 intermediate by throwing out a field (one field is 1920x540, and I then rescale it to 960x540). I end up with a perfectly deinterlaced 30p file. While I don't care for 30p, it is a practical compromise for the web. Then I can resize it to whatever I want.

John

http://en.wikipedia.org/wiki/Persistence_of_vision

http://en.wikipedia.org/wiki/Flicker_fusion_threshold
johnmeyer wrote on 3/13/2008, 9:35 PM
"For anything that's destined for the web, I make a 960x540 intermediate by throwing out a field (one field is 1920x540 and I then rescale it to 960x540.)"

Excellent point. If you can sacrifice resolution (which you actually NEED to do when posting to the web), then you use your technique to get perfect progressive.

I'll admit, however, that I'd have to scratch my head to come up with the Vegas settings to "throw out a field." Does Vegas do this all by itself if you render 1920x1080 interlaced footage using 1920x540 progressive settings in the Render As dialog, or do you have to change some other setting?

John_Cline wrote on 3/13/2008, 10:04 PM
To be perfectly honest, I've never tried it in Vegas. I've always used VirtualDub. I use the built-in deinterlace filter and select "Discard Field 1" (or 2, it doesn't matter), which results in a 1920x540 progressive image (or 1440x540 if I'm doing HDV). Then I put the resize filter in the chain, set it to 960x540, and generate my intermediate file. This is valid for both 1920x1080 and 1440x1080 files (although you need to disable the "maintain aspect ratio" checkbox in the resize filter if it's 1440x540).
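
The discard-and-resize trick reads naturally as code: keep every other line (one field, which was scanned in a single pass and is therefore genuinely progressive), then halve the width to restore square pixels. A toy sketch of the arithmetic (lists standing in for pixel rows; VirtualDub's resize filter does proper filtered scaling rather than this naive pair-averaging):

```python
def discard_field(frame):
    """Keep only field 1 (the even lines): 1080 lines -> 540 lines.
    What remains is progressive, since one field = one scan pass."""
    return frame[0::2]

def halve_width(frame):
    """Naive 2:1 horizontal shrink by averaging pixel pairs
    (1920 -> 960), restoring the aspect ratio after the field drop."""
    return [[(row[i] + row[i + 1]) / 2 for i in range(0, len(row), 2)]
            for row in frame]

# Toy "frame": 4 lines x 4 pixels standing in for 1080 x 1920.
frame = [[1, 3, 5, 7],
         [9, 9, 9, 9],   # field 2, discarded
         [2, 4, 6, 8],
         [9, 9, 9, 9]]   # field 2, discarded
half_height = discard_field(frame)   # 2 lines remain
result = halve_width(half_height)    # 2 lines x 2 pixels
print(result)  # [[2.0, 6.0], [3.0, 7.0]]
```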

Now that I think about it, I'm guessing that Vegas would be smart enough to do this by just setting the render properties to 960x540 progressive. I've just always used Virtual Dub and old habits are hard to break.

John
johnmeyer wrote on 3/13/2008, 10:19 PM
I generally use VD or AVISynth as well. A lot more control and you don't have to guess how something is being done.
John_Cline wrote on 3/13/2008, 10:58 PM
Yeah, you can't beat VD and AVISynth when "surgery" is required.
NickHope wrote on 3/13/2008, 11:59 PM
For anyone interested, my Vegas to Stage6 guide describes the process of frameserving to VirtualDub and discarding field 2 for the perfect deinterlace to 960x540.
PeterWright wrote on 3/14/2008, 1:23 AM
I must admit to still not being full bottle on this issue.

If, say, I shoot 1920 x 1080 25P and then Render to SD DVD 720 x 576 using the DVDA template...

Does Vegas RE-interlace, and if so, does that mean it has two identical fields per frame?
Wolfgang S. wrote on 3/14/2008, 2:47 AM
Keep in mind that the Vegas deinterlacer is far from perfect - that is also one reason why I never deinterlace my material. If you do deinterlace in Vegas, the VirtualDub-adapted Smart Deinterlace filter by Mike Crash is worthwhile:

http://www.mikecrash.com/modules.php?name=Content&pa=showpage&pid=6


PeterWright wrote on 3/14/2008, 3:01 AM
Yes, but by shooting progressive there's no interlacing involved initially - I'm wondering what happens after that ....
Grazie wrote on 3/14/2008, 4:38 AM
Pete, thanks for asking - I'm "wondering" exactly the same as you. Interesting . .
farss wrote on 3/14/2008, 4:53 AM
Shooting at, let's say, 25 frames per second, one can record 25 discrete frames per second, aka 25p. Not much in video land works this way.

You can split each frame into two fields and get 25PsF, and that can be recorded exactly the same as 50i. Everything in the video chain "thinks" it's interlaced, so it gets handled with no issues.

Except the two fields were taken at the same time - there's the trick. So the two fields can be combined back into the one frame with zero loss and no need for any fancy smart de-interlacing. You get back exactly the same as if you'd recorded 25p.
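
The lossless round trip Bob describes can be checked directly: split a progressive frame into two fields, weave them back together, and you recover the original bit for bit - no smart deinterlacer needed, because both fields came from the same instant. A minimal sketch:

```python
def to_psf(frame):
    """Split one progressive frame into upper/lower fields (PsF):
    both fields come from the same moment in time."""
    return frame[0::2], frame[1::2]

def weave(upper, lower):
    """Recombine two fields into one full frame by interleaving lines."""
    frame = []
    for u, l in zip(upper, lower):
        frame.append(u)
        frame.append(l)
    return frame

original = [[1, 1], [2, 2], [3, 3], [4, 4]]  # a toy 4-line frame
upper, lower = to_psf(original)
restored = weave(upper, lower)
print(restored == original)  # True - zero loss, no deinterlacing needed
```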

The only caveat is making certain that some progressive displays such as HDTVs don't see it as interlaced and try to do fancy tricks like "Bob" and make a complete hash of it. Using VLC you can simulate this to see how, depending on how dumb the display device is, things could go wrong.

Now, if you've shot 25PsF, you need to make certain Vegas sees it as progressive and you edit on a progressive T/L, otherwise some FX can end up rendered interlaced.
Bob.
Grazie wrote on 3/14/2008, 6:01 AM
Now, my one and only remaining brain cell is deinterlaced.
NickHope wrote on 3/14/2008, 6:02 AM
Mine's got the jitters.