Interlace on progressive displays

Serena wrote on 3/26/2006, 11:03 PM
Unfortunately I'm fairly ignorant when it gets down to the detail of electronic display technology, so this is probably a no-brainer. Coming from film, my eye is always distracted by jaggies and I go to quite a bit of trouble to remove them from my material. Yet jaggies either are invisible to many who've grown up with interlaced displays, or are simply accepted as an artifact (much as I have no problem with 24fps cadence, slow pans, etc.), or maybe there's something else going on.
Interlaced displays show fields separated by 20ms in time (or 1/60 sec in NTSC land). How are fields handled on a progressive display? Are both fields shown simultaneously? If the frame is a simple combination of fields then jaggies are a natural result for any material shot as interlaced.
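(For what it's worth, that "simple combination of fields" case can be sketched in a few lines of numpy; the shapes and names below are purely illustrative, not anything a real display runs. Weaving two fields captured about 20 ms apart leaves alternate lines disagreeing wherever anything moved, which is exactly the combing/jaggies.)

```python
# Illustrative sketch only: weave two fields from different instants into one frame.
import numpy as np

def weave(field_even, field_odd):
    """Interleave two half-height fields into one full-height frame."""
    frame = np.empty((field_even.shape[0] * 2, field_even.shape[1]), dtype=field_even.dtype)
    frame[0::2, :] = field_even   # lines 0, 2, 4, ... from the earlier field
    frame[1::2, :] = field_odd    # lines 1, 3, 5, ... captured ~20 ms later
    return frame

# Toy scene: a white bar that moves 8 pixels between the two field times.
h, w = 32, 64
scene_t0 = np.zeros((h, w)); scene_t0[:, 10:20] = 1.0   # bar when the first field is captured
scene_t1 = np.zeros((h, w)); scene_t1[:, 18:28] = 1.0   # bar ~20 ms later

field_even = scene_t0[0::2, :]   # even lines sampled at t0
field_odd  = scene_t1[1::2, :]   # odd lines sampled at t1
woven = weave(field_even, field_odd)

# Wherever the bar moved, adjacent lines of the woven frame disagree: combing/jaggies.
print((woven[0::2] != woven[1::2]).any())   # True
```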

Comments

farss wrote on 3/26/2006, 11:40 PM
Really depends on the display / what's driving it.
Some do a pretty horrible job of de-interlacing, others do an excellent job. Keep in mind that the display device may also be rescaling the vision.
The video that I've shown through big LCD projectors in cinemas looked about as good as SD PAL video can under the circumstances.
The telecined film on the same tape, through the same projector, probably looked worse, but that wasn't the fault of the projector or the telecine transfer; the letterboxing just threw away too much of the precious resolution. Neither had a jaggies problem.
From memory some of the edges of motion in the video-sourced content looked a little blurred or soft, but then I'm being very critical, I was looking for issues; certainly no worse than you'd have gotten from the normal shutter speed in a film camera.

My advice is to leave 50i footage as it is and let the display device handle the de-interlacing; they're getting better all the time. If you're regularly screening through the same display device, run a test.

Bob.
Serena wrote on 3/27/2006, 12:44 AM
Staying with 1080i doesn't produce satisfactory images, at least not for me; SD is OK, I agree. The thing that raised the query this time is a DVD I've just received, which I was having a preliminary look at on my laptop. Interlacing artifacts were quite obvious, although I suspect many people wouldn't be conscious of them. The images were great and the content engaging, so the artifacts didn't detract from the video, but they did make me wonder about the question I posted here.
farss wrote on 3/27/2006, 3:44 AM
Well, if I look at 50i footage on an LCD in the Vegas preview window at Best (Full), the artifacts are really bad and obvious.
Make a 50i DVD and play that back on the same PC and it doesn't look half as bad. So from that, and what I've read, the player software is doing some form of de-interlacing, and better players (and perhaps more overall CPU/GPU grunt) are doing a better job as time goes on.
So I guess where I'm coming from is that if I de-interlace the footage myself now, that's as good as it'll ever be. If I leave it as 50i, then as the technology improves, the results on progressive displays will improve too.
I've only tried a small amount of 50i HDV on the Vegas second-monitor preview to a 1080 LCD, and the de-interlacing looks pretty good when enabled in Vegas; just what's doing the job I don't know, it could be Vegas code or the GPU.

BTW a few ABC programs are shot at 50i and then run through a very expensive de-interlacer / 'film look' box, probably from S&W. I think M.D. was done that way.

Bob.
jkrepner wrote on 3/27/2006, 6:31 AM
I can confirm Bob's approach: rendering out of Vegas as interlaced is your best bet. I just did a DVD for a client (using an LCD screen at a trade show) and burned two DVDs for them, one interlaced and one progressive. The interlaced DVD ended up looking better on the progressive display. I was surprised, because the preview of interlaced footage from Vegas (on a borrowed LCD TV) looked terrible. After I burned the footage to an interlaced DVD and played it back on that same LCD set from a DVD player, the picture looked a lot better. The DVD player also had progressive-scan outputs, and that really helped.

Jeff
Serena wrote on 3/28/2006, 1:33 AM
Interesting. Bob, you've talked around the reason for my question -- how is the information being processed? My concern is exactly the point you've made -- when I process the images so they satisfy me on LCD displays and projectors, am I setting up future problems on different displays? Presently all my general output goes out rendered to SD on DVD (from 1080i originals), and that plays well on CRTs as well as LCDs. More recently I've been processing with DVFilmMaker (progressive) because that produces excellent HD images displayed on LCD projectors and computers -- much better quality than any other approach I've tried. Obviously there's quite an overhead in this approach, but batch processing works nicely overnight so I'm not sitting around waiting. The projector I use accepts both interlaced and progressive inputs, but how these are processed isn't clear to me. Seeing which looks best is helpful but not technically informative.
Perhaps there is some overlap in this thread with Laurence's: "HDV and deinterlace method", since Vegas is common to all.
farss wrote on 3/28/2006, 3:57 AM
Well, specifically, no, I cannot see any way you're going to have problems down the road. Once you've converted to progressive, that's it. Although what you've created is in fact 25PsF: 25 progressive frames per second, each split over two fields.
This will work on both 50i and 25p display devices, and it damn well better, or else Hollywood is in trouble in PAL land.
So no matter what the display device does, it'll see no need to apply any fancy de-interlacing, so what you send to it will be what you see.
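As a rough sketch of that PsF idea (my own illustration, nothing to do with Vegas internals; the names and shapes are just placeholders): because both fields come from the same instant, even a dumb weave rebuilds the original frame exactly, so the display has nothing to fix.

```python
# Minimal PsF sketch: a progressive frame split into two fields and woven back
# again comes out bit-identical, so there is no motion between fields to hide.
import numpy as np

def split_to_fields(frame):
    """Split one progressive frame into even-line and odd-line fields (PsF)."""
    return frame[0::2, :], frame[1::2, :]

def weave(field_even, field_odd):
    """Interleave the two fields back into a full-height frame."""
    frame = np.empty((field_even.shape[0] * 2, field_even.shape[1]), dtype=field_even.dtype)
    frame[0::2, :] = field_even
    frame[1::2, :] = field_odd
    return frame

progressive = np.random.rand(576, 720)        # one 25p frame (PAL SD size)
f_even, f_odd = split_to_fields(progressive)  # what a 50i stream would carry
rebuilt = weave(f_even, f_odd)
print(np.array_equal(progressive, rebuilt))   # True: nothing is lost
```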

In general (and I'm no expert) there seem to be a number of levels of de-interlacing technology. The most basic is to ditch one field and interpolate; simple, but you lose about 50% of the vertical res. I think that's roughly what you're achieving with CF25, but with the downconvert to SD you can easily afford the loss.
The next step up looks at how different the two fields are: if there's very little difference the two fields are merged; if there's a lot of motion, one field is ditched and the other interpolated.
The next level up gets a bit smarter still: those parts of the two fields with no motion are merged, those with motion are interpolated.
The very best seem to look for motion vectors, move parts of the fields back into alignment and then merge. As far as I know only hardware boxes use this, probably to protect the IP, although I wouldn't be surprised if those smarts are now in some of the software-only solutions; I think Twixtor might be able to do this trick.
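To make those first three levels a bit more concrete, here is a very rough numpy sketch of my own (not taken from any real product; the threshold value and function names are just placeholders): ditch-one-field-and-interpolate, and weave-where-static / interpolate-where-moving.

```python
# Rough sketch of de-interlacing levels 1-3: line-doubling a single field,
# and a per-pixel motion-adaptive mix of weave (static areas) and interpolate
# (moving areas). Motion-compensated de-interlacing (level 4) is not shown.
import numpy as np

def bob_deinterlace(field):
    """Level 1: keep one field and line-double it; ~half the vertical res is gone."""
    return np.repeat(field, 2, axis=0)

def motion_adaptive_deinterlace(field_even, field_odd, threshold=0.1):
    """Levels 2/3: weave static areas, fall back to interpolation where pixels moved."""
    woven = np.empty((field_even.shape[0] * 2, field_even.shape[1]), dtype=field_even.dtype)
    woven[0::2, :] = field_even
    woven[1::2, :] = field_odd

    interpolated = bob_deinterlace(field_even)          # fallback for moving areas

    # Crude per-pixel motion estimate: how much the two fields disagree
    # at the same spot (a real de-interlacer compares neighbouring lines).
    motion = np.abs(field_even - field_odd)
    motion_full = np.repeat(motion, 2, axis=0) > threshold

    return np.where(motion_full, interpolated, woven)

# Toy usage: a static background with a bar that moved between fields.
field_even = np.zeros((288, 720)); field_even[:, 100:150] = 1.0
field_odd  = np.zeros((288, 720)); field_odd[:, 120:170] = 1.0
out = motion_adaptive_deinterlace(field_even, field_odd)
print(out.shape)   # (576, 720): a full-height progressive frame
```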

So where does that leave you? Well, the better the method you use now, the better the results will look. Your images are not at the mercy of the smarts in the display device; how it looks now is how it'll look, forever. I can't see any way it'll look any worse or any better.
I guess the latter was the thrust of where I was coming from.
All that aside, if the de-interlacers in the display devices improve dramatically you can always re-render to 50i. If you want to release in HD, that door is still open to you if you've shot 50i and a very good HD de-interlacer becomes affordable.

Bob
Serena wrote on 3/28/2006, 3:46 PM
Bob, good summary. I presume DVFilmMaker interpolates action between fields (rather than doing time correction), but the result is satisfying. Just to review, I paused on some fast action (drummer in a trio) and the image is nicely photographic; in motion I see no problems with smoothness. I guess this is all a matter of acceptable compromises, and what is acceptable has a personal bias. 8 bits is probably the biggest compromise, then MPEG compression, and so on.

Serena