Shooting interlaced - what's the point?

ingvarai wrote on 12/14/2008, 3:49 AM
Taking the risk of beating a dead horse (I know this topic is frequently discussed), I would like to know this:

Is there any point at all to shooting interlaced video? I have an amateur camera, a Canon HF 10, and it can record in i50 or p25 (this is PAL land). Having followed some threads in this forum, I no longer see any point in using the i50 setting at all.
My Vegas project is aimed at flat wide-screen TVs only, with some additional multimedia purposes like YouTube.

Using the p25 setting should avoid the deinterlace dilemma completely, as I see it. I made some test recordings using both methods, with the same object moving at the same speed, and honestly I saw no difference at all when playing them back using the Canon bundled software, Image Mixer 3.

Comments

farss wrote on 12/14/2008, 5:00 AM
Shooting 25p is quite a chore; 30p is a bit easier, it seems.
Worth a mention that 50i gains you around a stop of light, which can come in handy. I shot HD 50i with a 270-degree shutter and de-interlaced to 720p; it looks pretty good, but there were no huge amounts of motion, no fast pans or zooms.
If you see no difference between 25p and 50i, something strange is going on. Of course, if the object is moving slowly enough, you probably wouldn't. Try a pan instead.

Bob.
ingvarai wrote on 12/14/2008, 5:10 AM
If you see no difference between 25p and 50i something strange is going on

And what am I supposed to see?
John_Cline wrote on 12/14/2008, 5:12 AM
The point is that shooting interlaced gains you twice the temporal resolution of shooting 25 fps progressive. 50i gets you 50 images per second and 25p gets you 25 images per second. We go round and round about this on the forum, I happen to be in the camp that can't stand the "judder" of 24p, 25p or 30p. Of course, I shoot a lot of really fast motion material.
ingvarai wrote on 12/14/2008, 6:03 AM
I know the initial purpose of interlaced video.
The question now is: what is the point of shooting interlaced, since the prevalent TV screens do not honour the interlaced video benefits, but instead live their own lives and deinterlace the nice interlaced video before it reaches our eyes?

Or to rephrase it:
Will I notice the difference? And how?
Myself, I know the difference between a cinema look and a video look when watched on a CRT TV, but not on an LCD TV.
The question is interesting because interlaced video faces some challenges when processed in, for example, Vegas, and if one could avoid this, it would be welcome.

Grazie wrote on 12/14/2008, 6:21 AM
I happen to be in the camp that can't stand the "judder" of 24p, 25p or 30p.

Thanks John - I thought it was just me, and me being a "Luddite".

I see this on TV transmissions on my CRT TV and hate it. I see it on LCDs and loathe it. It is just plain jarring and I don't know how to get over myself with this. I just can't do it.

Grazie
Laurence wrote on 12/14/2008, 7:20 AM
I'm working on a 24p thing right now (my footage needs to be worked into an existing ongoing 24p project). It's a documentary with mostly just talking heads and slow moving B-Roll. I must say that I will be pretty happy to go back to 60i once this is done. Even when it's just a talking head, the judder that you see in the moving jawline and head movement really bugs me. Yeah the image is a tiny bit sharper in progressive, but at HD resolutions the 60i image is plenty sharp.

Now, this little pocket-sized Kodak Zi6 camera I just got, shooting 1280x720 at 60p, looks really nice, but my laptop (a pretty fast Intel Core2Duo) can't seem to play back the footage at full resolution and frame rate. I believe that the faster interlaced HD modes will be with us until 50p and 60p become practical, which is going to be the next couple of years at least.
fordie wrote on 12/14/2008, 7:44 AM
Well, I shoot both 50i and 25p, then convert 25p to 24p for Blu-ray.
Ice skating... 50i.
Other stuff... 25p.
50i handles the motion much better, and it still looks fine on a big plasma.
Shutter speed is important when shooting 25p; my Canon XH-A1s default to 1/25 (drives me mad!) but 1/50 is correct.
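For reference, shutter angle and shutter speed are tied together by a simple formula: exposure time = (shutter angle / 360) / capture rate, and the usual 180-degree rule of thumb is why 1/50 s is "correct" for 25p. A minimal Python sketch of that arithmetic, using the settings mentioned in this thread (the function name is illustrative, not from any camera API):

    def exposure_time(shutter_angle_deg, frame_rate):
        # Exposure time in seconds for a rotary-shutter equivalent:
        # t = (angle / 360) / capture rate (frames/s, or fields/s for interlaced).
        return (shutter_angle_deg / 360.0) / frame_rate

    print(exposure_time(180, 25))  # 0.02  -> 1/50 s, the classic 25p setting
    print(exposure_time(360, 25))  # 0.04  -> 1/25 s, the XH-A1 default above
    print(exposure_time(270, 50))  # 0.015 -> ~1/67 s per field, Bob's 50i example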
John_Cline wrote on 12/14/2008, 7:57 AM
"What is the point shooring interlaced, since the prevalent TV screens do not honour the interlaced video benefits, but instead live their own lives and deinterlace the nice interlaced video before it reaches our eyes?"

I think this is a commonly held misconception; a LOT of people assume that an LCD panel is only capable of displaying 30p. I have a Sony V1u that can shoot 1080-30p and 1080-60i. (Of course, it's actually 29.97 and 59.94, but for the sake of discussion, I'll call it 30 and 60.) When viewing both types of footage on a variety of progressive TVs, from inexpensive Vizio LCD TVs to fairly pricey Sony SXRD models, I can EASILY tell the difference between 30p material and 60i. The televisions are not simply deinterlacing the 60i footage and displaying it at 30p; there is something much more sophisticated going on: they seem to be deinterlacing while maintaining full temporal resolution. This would be fairly simple to do by taking each field and turning it into a full frame. In the case of 1920x1080i, there would be sixty 1920x540 frames which could each be rescaled to 1920x1080 and displayed at 60p. This seems to be what's going on. An LCD panel would only need a minimum of a 16ms response time to pull this off, well within the capability of most modern LCD panels. (For PAL, it only needs to be 20ms.) In fact, except for upscaling back to 1920x1080, this is how I make 960x540-60p Flash and .WMV files from 60i HDV. It looks great and has the full 60 fps temporal resolution.
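What John describes is essentially a "bob" deinterlace: treat each field as its own frame and rescale it back to full height. A minimal sketch of the idea in Python with numpy, assuming upper-field-first material; real TVs use much smarter interpolation than plain line doubling, but the temporal result is the same:

    import numpy as np

    def bob_deinterlace(frame):
        # Split one 1080i frame (1080 x 1920 x 3) into its two fields and
        # line-double each back to full height, so 60i becomes 60p.
        first  = frame[0::2]  # upper field (assumed first in time)
        second = frame[1::2]  # lower field
        # np.repeat doubles every field line; a real TV would interpolate.
        return np.repeat(first, 2, axis=0), np.repeat(second, 2, axis=0)

    interlaced = np.zeros((1080, 1920, 3), dtype=np.uint8)
    f1, f2 = bob_deinterlace(interlaced)
    assert f1.shape == (1080, 1920, 3)  # two full frames out per frame in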

I don't know exactly how the TVs are converting from interlaced to progressive, but I do know that 60i footage displayed on any of my LCD TVs is completely free from judder and 30p footage is not. Also there does not seem to be much, if any, loss of spatial resolution with the 60i footage, so some sort of magic is happening.
johnmeyer wrote on 12/14/2008, 8:01 AM
If you can't see the difference, well, to say the least that is surprising.

Shoot test footage and pan the camera back and forth. Not violently, but reasonably fast. Do it as if you are trying to capture the entire panorama of an outdoor scene, or show the entire room. Do this at both 25p and 50i, if those are your two settings.

If you don't see a MAJOR difference, even on a modern display, then either there is something wrong with your equipment or your technique. It is not subtle.

Slow-frame progressive (like 24p or 25p) is going to give your video a slightly "removed" feeling that can be sensed even when there isn't much motion. Interlaced video -- not because of interlacing per se, but because of the fact that this technique gives you twice the temporal resolution, as John already stated -- makes motion seem very lifelike.

I cannot imagine shooting sports or any other fast action using any other technology.

Interlacing has gotten a really undeserved negative connotation in both this forum and elsewhere. Every single technique for displaying moving images involves showing one image after another and they all have certain artifacts. Good old film has a major artifact, namely the flicker introduced by having a shutter. The "improvement" made to cover this is to open/close the shutter several additional times while each frame of film is shown. Hardly a great thing, but it keeps people from getting migraines while watching a long movie.

Video (50i, 60i and 1080i) has interlacing. It actually doesn't have that many issues (compared to projected film) except when trying to do fancy post production editing (cuts-only works fine). If it was such a lousy idea, I can guarantee you it would have never been carried forward into the modern age in the form of 1080i, but it was carried forward because it delivers a "live" look that 24p and 25p cannot, and will never, be able to deliver.

I spent quite a bit of time and effort to develop (I can't say "invent" because two other people have done it before me) a series of techniques that take kinescopes (24p representations of 60i material) and convert them back to 60i video. Thus, I have spent a LOT of time looking at all the things which separate not only film from video, but also 24p (which is what a kinescope is) from 60i. The two are completely, totally, different.
Laurence wrote on 12/14/2008, 8:07 AM
I believe that 60i is converted to 60p with odd and even fields being interpolated up to full 1080p or 720p resolution (depending upon your HD TV).
John_Cline wrote on 12/14/2008, 8:15 AM
Laurence, that's what I was trying to say, just not as succinctly as you just did. Thanks.
ingvarai wrote on 12/14/2008, 9:03 AM
Ok ok ok, guys and gals, I am convinced. And this is the answer I was after. Why did I not see any difference? I said:
I saw no difference at all, when playing it back using the Canon bundled software, Image Mixer 3

I did not say I tested it on TV.

JohnMeyer:
If it was such a lousy idea, I can guarantee you it would have never been carried forward
So - when do you want to use progressive scan then?

Laurence wrote on 12/14/2008, 9:32 AM
Progressive scan is useful the following times:

1: You are going to transfer to film.
2: You are doing a movie and you are really controlling the motion well, using high-budget techniques like dollies and tracks for the moving footage.
3: It is a web delivery or computer viewing only project.
4: You want to use it mostly to grab stills (example: you want stills of different points of a golf swing).
5: You are doing a low-motion DVD (talking heads and slow-moving b-roll) and you want the uprezzing algorithms on your HDTV to look as good as possible.

If you are shooting progressive, make sure that you are really controlling your motion. Use handheld shots as sparingly as possible. Tripod everything. Don't zoom, and pan slowly enough that you don't see the judder. Look at dollies and tracks for motion. It's a whole different world than using regular video, where you can just grab a camera and start shooting.
Coursedesign wrote on 12/14/2008, 10:51 AM
I hate interlaced for several practical reasons, #1 being that it is near impossible to do a lot of post work in interlaced, so you end up having to deinterlace, do the work, and reinterlace.

Not exactly ideal for quality.

Still, I appreciate that there are times when interlaced is necessary to reduce video bandwidth, just as this was the main reason for its invention. As much as we'd like 1080p60, that is not practical for today's content distribution with MPEG-2, which is how most content is distributed in the U.S., whether OTA, cable, or satellite.

Europe still has a chance to standardize on 1080p60 if they go with MPEG-4. They could get 1080p60 in the same or less bandwidth than we're using for 1080i60.
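The raw arithmetic behind that bandwidth comparison is easy to check: 1080i60 carries sixty 1920x540 fields per second, the same uncompressed pixel rate as 1080p30, while 1080p60 doubles it. A quick back-of-the-envelope sketch (raw rates only, ignoring chroma subsampling and codec efficiency):

    def pixel_rate(width, height, images_per_sec):
        # Raw pixels per second, before any compression.
        return width * height * images_per_sec

    i60 = pixel_rate(1920, 540, 60)    # sixty half-height fields per second
    p60 = pixel_rate(1920, 1080, 60)   # sixty full frames per second
    print(f"1080i60: {i60 / 1e6:.1f} Mpixel/s")  # 62.2
    print(f"1080p60: {p60 / 1e6:.1f} Mpixel/s")  # 124.4, exactly double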

The U.S. has too much gear already in households to switch to 1080p60 on a broad basis.

(And there is of course also the issue of producing 1080p60, as the camera assortment is limited today.)

NEW, RELATED SUBJECT:
Why can't a TV system (from production source to display) be Variable Frame Rate?

Metadata could determine the time each frame should be displayed, so you could have say a 1080p30 news announcer switching to a 1080p60 sports clip, followed by a 1080p24 movie, followed by an hour of 720p15 talking heads lending 15 Mbps of bandwidth to two subchannels from the same station.
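Container formats already store timing more or less this way: an MP4-style file carries a presentation time and duration per sample, so a mixed-rate timeline is largely a scheduling problem. A hypothetical Python sketch of the per-frame metadata the idea calls for (the names are illustrative, not taken from any real standard):

    from dataclasses import dataclass

    @dataclass
    class FrameSample:
        pts_ms: float       # when to display the frame
        duration_ms: float  # how long to hold it on screen

    def schedule(segments):
        # Build a timeline from (frame_rate, frame_count) segments, e.g.
        # a 30p announcer, then a 60p sports clip, then a 24p movie.
        samples, t = [], 0.0
        for fps, count in segments:
            dur = 1000.0 / fps
            for _ in range(count):
                samples.append(FrameSample(pts_ms=t, duration_ms=dur))
                t += dur
        return samples

    for s in schedule([(30, 2), (60, 2), (24, 2)]):
        print(f"show at {s.pts_ms:7.2f} ms for {s.duration_ms:5.2f} ms")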

By skipping all the pulldown nonsense we gain picture quality and reduce our bandwidth needs.

(Interestingly, MPEG-4 has good support for interlaced, while MPEG-2 and interlacing were never intended to go together, although a shotgun marriage of these two was later arranged out of desperation, with some hacks providing support for interlaced, but with less efficient encoding of course.)

Former user wrote on 12/14/2008, 11:15 AM
"NEW, RELATED SUBJECT:
Why can't a TV system (from production source to display) be Variable Frame Rate?"

Once all production, broadcast, transmission and TVs are digital, then this will be possible. But until then, the NTSC TV standard is 29.97 fps interlaced. Any production created for broadcast (other than hidef) has to be converted to this standard at some point along the line. So why not shoot 29.97 interlaced until it is possible to support, without conversion, other speeds and frame types?

Dave T2
Coursedesign wrote on 12/14/2008, 12:29 PM
What is this "NTSC" standard of which you speak?

Oh, I see now, you mean the analog TV broadcast standard that is scheduled to disappear forever 8 weeks from now?

It is not necessary for all production to be digital for Variable Frame Rate to work, any more than it is necessary to avoid shooting film to produce for digital television.

Any production created for broadcast (other than hidef)

There is very little created for broadcast today other than hidef.

Used high-level professional 29.97 interlaced cameras are selling for peanuts today; I recommend you go shopping now before they all end up in the Museum of Interlaced Television :O).

As for what to shoot, it is obviously OK to produce in 480i29.97 if you're shooting for DVDs, especially as BD has been getting little traction so far. And TV stations will still accept 480i29.97 for some purposes for years to come, because they can't be that picky.

Look, this is not religion, where one is always right and the others are therefore false.

Interlaced and progressive will co-exist for years to come, but the eventual outcome is not in question. Whatever the computers want, they eventually get.

GlennChan wrote on 12/14/2008, 12:30 PM
NEW, RELATED SUBJECT:
It would be difficult for the broadcaster, since they would have to deal with timecode and timing issues. For starters, they likely have to do an SD broadcast version too (e.g. cable TV), so that has to be 29.97i.

A lot of the equipment will also only deal with a single frame rate.
farss wrote on 12/14/2008, 12:43 PM
"Why can't a TV system (from production source to display) be Variable Frame Rate?"

Good grief. TV systems have two fixed frame rates and that causes no end of extra work. Remember TV is a GLOBAL system.

There was some talk of having variable frame rates for digital cinema systems but that went nowhere. To the best of my knowledge there isn't a digital video / film camera that can even shoot variable frame rates. Even the Varicam doesn't truly shoot at variable frame rates like a film camera can.

Bob.
kairosmatt wrote on 12/14/2008, 12:58 PM
When we first started getting into video production and reading this forum for advice, I didn't think that I could ever tell the difference between 24p and 30i. But like John says, even on shots without much motion, it is very obvious!

I'll go out on a limb here, as opposed to most others in the forum, and say that I MUCH prefer the 24p look over 60i. Even for fast motion stuff and wildlife.

Also, I had some major technical headaches with upper-field HDV and lower-field DV on the same timeline when rendering to QT MotionJPEG. I did a bunch of tests; one source was always backwards. Rendering to MPEG for DVD automatically sorted out the field order.

Ever since then, it's been all progressive!

kairosmatt
johnmeyer wrote on 12/14/2008, 1:42 PM
I keep hearing that MPEG-2 wasn't designed with interlace in mind, and yet I am pretty sure that this was one of the big deals when it was introduced to supplant (not necessarily replace) MPEG-1. VCD (MPEG-1) was progressive, but MPEG-2 allowed for interlaced, which was a big deal.

Obviously if you prefer progressive to interlaced, that's the prerogative of every videographer. However, your clients may or may not prefer that look. I am pretty sure I would have some very unhappy clients if I delivered a football game in 24p because it does not look like the football games on TV.

Also, if 24p (or 25p) was really preferred, I would think all live video events would be broadcast in this format (even with pulldown, you can approximate the progressive "look"). Since this is very rarely done, I would suggest that very few people -- both those on the broadcast side, and the consumers who receive those broadcasts -- are clamoring or demanding 24p for live television.

I think that most of the preference for 24p and 25p expressed in these forums is for the reasons given by several of the previous posters, namely that interlaced can be tougher (but not impossible !! ) to work with in some (but certainly not all) post production situations. Also, many modern displays do not handle interlaced in the traditional way, since they do not paint the display with a scanning electron beam. However, there is absolutely nothing inherent in a pixel-based display that totally prohibits driving rows of pixels at alternate points in time.

farss wrote on 12/14/2008, 1:53 PM
Interlaced is certainly going to fade away. It has many technical problems, such as line twitter, and imposes limitations on vertical resolution. High frame rate progressive imaging is the way of the future, possibly even beyond 60p; I see that the latest LCDs interpolate to 200Hz. Some are complaining that the cadence of film is destroyed by this. I guess it'll be left up to the consumer to decide what they prefer watching, and they seem happy to be rid of the judder.

Bob.
GlennChan wrote on 12/14/2008, 2:05 PM
However, there is absolutely nothing inherent in a pixel-based display that totally prohibits driving rows of pixels at alternate points in time.
I believe some LCDs will have difficulty doing this, since the liquid crystals are not that good at switching from grey/white <--> black, depending on their response time. I believe that there are some color issues to work out.
Having the LCD display black lines every other row (black line insertion) also cuts the contrast ratio in half.

If we were to redesign all our television systems now, I think there's no question that we would go with progressive. Compression is much more efficient with progressive images (in other words, we'd get better picture quality practically for free), and it would be easier for most display technologies to handle; they are better at displaying progressive images.
For practical reasons however, interlace will likely be around... for a long time. Unless 1080p60 takes over. But that would have to be a few decades out, since many people get their TV from analog cable (I do).
kairosmatt wrote on 12/14/2008, 2:09 PM
John-
I would've thought you were right about MPEG-2, because it has certainly been much easier for me when it comes to interlacing.

No matter what I have on the timeline (upper field, lower field, 30p, 24p, I even had 25i there once), it never mixes up the fields (saving headaches for the viewers!). I have just been struggling with getting the QuickTime MotionJPEG right (which is my delivery format for one main client).

I've been pretty fortunate that all of our clients have liked what we do-but we don't do any sports. I think the 24p actually 'tricks' them a little, into thinking they are watching something more cinematic.

kairosmatt
johnmeyer wrote on 12/14/2008, 3:00 PM
I believe some LCDs will have difficulty doing this since the liquid crystals are not that good at switching from grey/white <--> black, depending on their response time.

I guess this is a workaround that some LCD displays use. Others attempt to deinterlace, or they halve the resolution. None of these are what we want.

When I said I didn't think there was any reason LCDs (or any of the other pixel-based progressive display technologies) couldn't be made to display interlaced, I was thinking primarily of the addressing scheme used to turn the pixels on and off. I need to have a chat with my old friend and business partner, who is an expert in this, to see if there is some reason of physics that prevents building a display that permits addressing odd rows and even rows at separate points in time. Even if the fields were not painted top to bottom, I'm not sure that would matter, since I think that most modern cameras, other than those with "rolling shutters", grab all the even lines at the same moment in time, and then the odd lines, rather than actually scanning each line at its own discrete instant in time. Thus, if this is possible, you'd strobe the odd-line pixel information into the display and keep it up there for the full 1/30 (or 1/25) second, and then 1/60 (or 1/50) of a second later strobe the even-line pixel information into the display. Obviously this isn't how most displays work, and I need to find out whether that is for physics, manufacturing, or cost reasons.