Creating DVDs progressive instead of interlaced?

Hulk wrote on 12/4/2008, 2:56 PM
A short while ago, when creating DVDs from "old" NTSC interlaced video, I began converting the video to progressive for the DVD. My reasoning: just about everyone these days has a flat screen TV, which is of course a progressive device by nature and will convert the interlaced video to progressive anyway.

I'm thinking it's probably better if I control this process in the NLE, so I can pick the best deinterlacing method for the video in question. I've noticed that V8 does a very nice job of deinterlacing using interpolation on most video I throw at it. In addition, going progressive eliminates any field order errors!

Just wondering if anyone else around here has thought about going this route.

- Mark

Comments

reberclark wrote on 12/4/2008, 3:39 PM
I would like to hear input on this as well.
johnmeyer wrote on 12/4/2008, 3:57 PM
I asked this last summer:

Educate me: Why/when is deinterlacing needed?

Based on what I learned in that thread, as well as what I knew before, and what I have observed since, my advice is:

Don't

Reason?

EVERY TV, including all LCD, Plasma, DLP, etc. (i.e., modern sets) MUST be able to display interlaced material. What do you think "1080i" is? The "i" is for interlaced.

Some people say that progressive "looks" better. That simply is not true. It looks different, not better.

Whether you agree or disagree with my statement above, the following is, I believe, 100% correct:

ANY conversion from interlaced to progressive will ALWAYS degrade the video.

This is because the software doing the conversion must "guess" how to shift the alternating fields in space and/or time so that they line up correctly when displayed at the same moment on a progressive display. This is impossible to do correctly for all situations, so artifacts are always introduced.

More importantly, even if the deinterlacing software were perfect, when you go from 60i to 30p you lose half of your temporal (i.e., time-related) information. If I told you that some operation was going to cost you half the vertical information, so that your HD video would end up 1920x540 instead of 1920x1080, you would not like that, would you? Well, when you convert from interlaced to progressive, you are doing exactly that (throwing out half the information), except in the time domain instead of the spatial domain.
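[Editor's note] To make the "guessing" half of this argument concrete, here is a hypothetical sketch of a simple "bob" deinterlacer (invented for illustration; this is not how Vegas or any particular player actually works). A field carries only every other scan line, so the missing lines have to be interpolated, and those interpolated lines are exactly where artifacts come from:

```python
# Hypothetical sketch of "bob" deinterlacing (invented names, not Vegas code).
# A field holds only every other scan line of the frame; the missing lines
# must be guessed, here by averaging the neighbouring real lines.

def bob_deinterlace(field):
    """Rebuild a full-height frame from a top field (a list of pixel rows).
    Real lines are kept; the lines in between are interpolated guesses."""
    frame = []
    for i, row in enumerate(field):
        frame.append(row[:])                                  # real scan line
        below = field[i + 1] if i + 1 < len(field) else row   # clamp at edge
        frame.append([(a + b) // 2 for a, b in zip(row, below)])  # guessed line
    return frame

field = [[0, 0], [10, 10], [20, 20]]   # three real scan lines from one field
frame = bob_deinterlace(field)
print(len(frame))    # 6 -- full height restored
print(frame[1])      # [5, 5] -- an interpolated (guessed) line
```

Even this idealized version invents pixel values that were never captured; real footage with motion gives the interpolator far less to work with.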

[Edit] P.S. Since I am rather strong in my statements above, let me be clear that I am answering the original question as asked, which was about making a DVD. There are plenty of other reasons for deinterlacing (converting to progressive), and these are very well explained by several people who did a great job answering my questions in the thread I linked to above. Most of them have to do with creating video for the Web, or for viewing on a PC which, for whatever reason, cannot do a good job of displaying interlaced material.

Laurence wrote on 12/4/2008, 5:21 PM
I agree with the previous post 100%. There are really just two reasons to deinterlace:

1: The video is going to be displayed on a computer. In that case, just deinterlace the computer playback version.

2: You are fitting a small amount of interlaced material into a progressive scan project.

Making a progressive scan DVD out of progressive scan material can look wonderful, but deinterlacing interlaced footage is nowhere near the same thing. The overall look always suffers.
Hulk wrote on 12/4/2008, 6:35 PM
Um, before things get blown out of hand here, let's keep in mind that LCDs, plasmas, rear projection, and front projectors are inherently progressive devices. They "paint" the entire picture on the screen in one pass; they DO NOT display ANY interlaced content in an interlaced fashion the way a CRT display does.

Period.

1080i does not display at 60 fields per second on a flat screen LCD.

The video will be deinterlaced by the display device's logic board. Some TVs do it well, but most are pretty bad at it, and that is why SD TV usually looks horrid on a flat screen. As I said, some of the new ones do it better.

The real question here is how much picture quality is lost when the DVD is displayed on a traditional cathode ray tube, and, more importantly, how many of the people you deliver the DVD to will be viewing it on one of those?

So, as Laurence says in point #1, a good reason to deinterlace is if the content will be displayed on a computer monitor. Well, computer monitor = LCD TV = plasma TV = front projection = rear projection.

Yikes more confusion over this issue.

- Mark
Laurence wrote on 12/4/2008, 6:46 PM
Don't get me wrong, I like progressive scan DVDs. I just don't think that deinterlacing interlaced footage is a good way to create them.
GlennChan wrote on 12/4/2008, 7:33 PM
1- Vegas' deinterlacing is not the greatest.

2- I would just keep it simple and deliver it interlaced. You'll not gain much by making things more complicated (and running it through mediocre deinterlacing, when future TVs may do a much better job).
johnmeyer wrote on 12/4/2008, 8:17 PM
One thing I forgot to mention: if you or your client will ever watch this DVD on an older (CRT) TV set, then the progressive DVD will definitely look worse.
Matt Houser wrote on 12/4/2008, 9:03 PM
If my footage is 1080i upper field first, are there any problems when rendering to a DVDA NTSC DVD stream at lower field first?

Or is it best to maintain the "first field" all the way through for best results?
johnmeyer wrote on 12/4/2008, 9:26 PM
DVDs can be rendered either way (upper or lower field first). I always set the project AND the render so that both match the source footage. So, in your case, that would be upper field first for the project, and upper field first for the render to MPEG-2, which will then be authored and burned to DVD.
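[Editor's note] A hypothetical way to picture why field order matters (illustration only, not from the post): each field is a temporal sample, and playing a frame's two fields with the wrong dominance reverses their temporal order, which is what makes a field-order mismatch look juddery:

```python
# Hypothetical illustration: treat fields as temporal samples 0, 1, 2, 3...
# With the wrong field dominance, each frame's pair of fields plays in
# reversed temporal order, so motion steps backwards within every frame.

def playback_order(field_times, correct_dominance=True):
    """Return the order in which field capture times reach the screen."""
    shown = []
    for i in range(0, len(field_times), 2):
        pair = field_times[i:i + 2]
        shown.extend(pair if correct_dominance else pair[::-1])
    return shown

fields = [0, 1, 2, 3]
print(playback_order(fields, True))    # [0, 1, 2, 3] -- smooth motion
print(playback_order(fields, False))   # [1, 0, 3, 2] -- back-and-forth judder
```

This is why keeping the field order consistent from source through project settings to the final render avoids the problem entirely.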
Hulk wrote on 12/6/2008, 8:05 PM
I would advise people considering this to take a good look at a DVD produced from interlaced video versus deinterlaced video (done right in the NLE) on the LCD setups found in most homes. You will see all of the ugly combing artifacts on the interlaced version. Not pretty.

This is just my point of view. But I'm always the kind of guy looking forward, and I see tube TVs as having pretty much had their KT extinction event. And the deinterlacing on even the best players and displays is still piss poor.
John_Cline wrote on 12/6/2008, 8:21 PM
By deinterlacing you are cutting the temporal resolution of your video in half. That means 29.97 images per second instead of 59.94. It's even worse in PAL: 25 per second instead of 50. Personally, I don't deinterlace, in order to keep the temporal resolution as high as possible, and there are still lots of people watching standard-definition DVDs on CRT televisions. My LCD monitor and DVD playback situation is pretty good, and I don't see "ugly combing artifacts."

Like you, this is just my point of view. In my fast-paced world, temporal resolution is king. Ultimately, in the near-term, I'd like to be producing 1920x1080p at 60 frames per second.
Coursedesign wrote on 12/6/2008, 10:45 PM
By deinterlacing you are cutting the temporal resolution of your video in half.

So, John, what is the difference then between you deinterlacing somewhere before final rendering vs. letting the viewer's DVD player or LCD/plasma monitor do the deinterlacing to be able to show your video on these inherently progressive-only displays?

For practical reasons, I suggest we set aside the very very small number of HD viewers (0.1% of the public?) who are watching HD on CRTs they bought back when those were still manufactured.

johnmeyer wrote on 12/6/2008, 11:12 PM
Are ALL non-CRT displays incapable of actually displaying temporally-correct interlaced video?
farss wrote on 12/6/2008, 11:37 PM
As far as I can see, most of them display interlaced video correctly. Of course, LCDs have inherently slower refresh (persistence?) than CRTs. I notice Sony's latest Bravias refresh at 200Hz.

The worst 'interlace' artifacts I've seen came from progressive material anyway. It's very likely that frames get split into fields along the way, and then at times some real horrors can happen.

Bob.
John_Cline wrote on 12/6/2008, 11:50 PM
I have a Sony V1u HDV camcorder which will record HD at 24p, 30p and 59.94i. I have a PS3 hooked up to a 42" LCD monitor via HDMI. The PS3 will play .M2T files which have been captured directly from the V1u. When viewing unmodified, high-motion recordings from the V1u in native 24p, 30p and 59.94i, the 24p and 30p recordings have a noticeable and undeniable lack of temporal resolution compared to the 59.94i footage. The 59.94i recording has absolutely no combing interlace artifacts. It is as smooth as silk; the 30p recording is not, and the 24p recording is even more of a juddery mess. Apparently, my PS3 and LCD TV are not merely deinterlacing 59.94i material and displaying it at 30p.

Like I said, full HD at 60p (at least) is my ideal; until then, 59.94i is it. Of course, broadcast 720p is already at 60 fps. Among other things, I videotape NASCAR, IndyCar and ALMS automobile racing, and high temporal resolution is just as important to me as spatial resolution.

That said, my earlier post was about SD DVDs, which still have a pretty large audience viewing on CRT televisions, and there is no reason whatsoever to deinterlace when creating SD DVDs. If your DVD player and LCD TV are displaying interlace artifacts, then let me put this as kindly as possible: your equipment sucks.
megabit wrote on 12/7/2008, 1:59 AM
I must say that while I agree with most of the arguments against de-interlacing for DVD, two things don't seem to have been mentioned in this discussion:

- there is no such thing as a progressive DVD (or BD, for that matter), other than 24p. Everything else is interlaced (be it 50i or 60i), and it's up to DVD players / TV sets to display it properly

- therefore, I don't see a problem when someone shoots interlaced (50i or 60i): just keep it that way all the way to your final media (DVD/BD). The real dilemma is when you shoot progressive (as I do 100% of the time) and want to deliver on DVD or BD. What I actually need to do is "interlace", so that I can prepare DVD-compliant clips.

Luckily, being in a PAL area, my 25p material when rendered to 50i ends up as 25PsF (or does it, Bob?), so I don't seem to be losing anything. I guess it's the same with 30p->60i. And of course, 24p can be left as is.
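[Editor's note] For what it's worth, the 25p-in-50i case really is lossless, because PsF just splits each progressive frame into two fields taken at the same instant. A hypothetical round-trip sketch (invented for illustration):

```python
# Hypothetical sketch: PsF (progressive segmented frame) splits one
# progressive frame into two fields captured at the SAME instant, so
# weaving them back together is a lossless round trip.

def to_psf_fields(frame):
    """Split a progressive frame (a list of rows) into top and bottom fields."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Interleave two fields back into one full frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

frame = [[1], [2], [3], [4]]
top, bottom = to_psf_fields(frame)
print(weave(top, bottom) == frame)   # True -- nothing lost
```

Contrast this with true 50i, where the two fields come from different instants in time, so no weave can reconstruct a clean progressive frame when there is motion.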

De-interlacing 50(60)i for DVD/BD delivery is not a good idea at all, and it simply can't work with fully compliant authoring systems like DVDA, since 25(30)p isn't even in the DVD/BD specs.

I did burn a couple of 25p-encoded BDs using Ulead (whose MF6plus HD system just doesn't care), but frankly I've never tried them on a standalone BD player/TV set. On the PC they work perfectly.


farss wrote on 12/7/2008, 2:15 AM
"ends up as 25PsF (or does it, Bob?), "

Yes.

I agree with John: 50/60p is the way of the future. Interlace is a dog of a thing. I didn't realise how bad it was until I tried working with it in AE. Now I know why the VFX guys hate it.
Bob.
GlennChan wrote on 12/7/2008, 3:36 AM
Of course LCDs have inherently slower refresh (persistance?) than CRTs. I notice Sony's latest Bravias refresh at 200Hz.
In an LCD, light is emitted all the time, as opposed to film (each frame is flashed three times) or a CRT (a quickly moving spot scans the image). Human vision is weird, and motion displayed this way doesn't look right. (Then again, neither does interlacing, or 24fps film being projected.)

The elements in an LCD also can't transition instantly between states, which can lead to ghosting.

- Some LCDs do black frame insertion (rather than leaving the backlight on all the time). This helps with the motion issues but doesn't get rid of them, and it roughly halves the brightness. You need a higher refresh rate in the LCD to do this.
Coursedesign wrote on 12/7/2008, 12:36 PM
Are ALL non-CRT displays incapable of actually displaying temporally-correct interlaced video?

I know of only one small professional LCD, a $4,000 Panasonic, that can actually display interlaced video (as opposed to receiving an interlaced signal and deinterlacing it for display).

All consumer LCDs (and most professional ones) are progressive-only.