HD Again....Progressive vs. Interlaced...

PDB wrote on 2/23/2005, 6:43 AM
Sorry to bring this up again...I guess the concept hasn't sunk in yet....

Ok, so from what I gather, the "HD ready" displays going into homes are primarily (?) plasmas and LCD displays..all of which are progressive....so what is the point of interlaced HD broadcasting/capturing? Does de-interlacing happen pre-display? Wouldn't it make more sense just to stick with progressive?

Why would interlaced be adopted as a standard in broadcast and then most/all displays get shoved into homes as progressive displays...(scratching my head....)..and while we are at it, why differentiate between 25p and 30p in this scenario?

I guess I am finding it hard to understand what the advantages of interlaced are in this day and age...

Comments

JJKizak wrote on 2/23/2005, 7:00 AM
They will all be 1080i, as the tube models are now. They use 1080i because it doesn't blink and jerk. 720p doesn't cut it, period.

JJK
PDB wrote on 2/23/2005, 7:06 AM
Hang on...you're saying that they are not progressive displays??....I don't see why 720p doesn't "cut it"....aren't all CRTs progressive displays? And overhead projectors...??

Ok, now I'm completely lost...I could have sworn it was the other way round....

John_Cline wrote on 2/23/2005, 7:28 AM
HiDef CRT televisions can actually be either interlaced or progressive depending on what type of signal it's being fed. Plasma and LCD are progressive by nature.

720p is 60 individual 1280x720 progressive frames per second. 1080i also delivers 60 images per second, but each one is a 1920x540 field; pairs of fields interlace into a 1920x1080 frame. Since the temporal rates are the same, I prefer 1920x1080i because studies have shown that the human eye (and brain) is more sensitive to horizontal resolution and movement than to vertical resolution. In terms of perceived image quality without regard to display size, my preference is currently for CRT-based HiDef displays, but the technology is moving so fast that my preference may change.
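To put rough numbers on the "same temporal rate" point, here's a minimal back-of-the-envelope sketch (Python, purely illustrative; the function name is just for this example):

# Both formats deliver 60 images per second; each 720p image is a full
# 1280x720 frame, while each 1080i image is a 1920x540 field.
def pixels_per_second(width, height, images_per_sec):
    return width * height * images_per_sec

p720 = pixels_per_second(1280, 720, 60)   # 55,296,000 pixels/s
i1080 = pixels_per_second(1920, 540, 60)  # 62,208,000 pixels/s
print(f"720p60 : {p720:,} pixels/s")
print(f"1080i60: {i1080:,} pixels/s")

The raw pixel rates come out in the same ballpark; the difference is whether the resolution is spent horizontally or vertically.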

John
PDB wrote on 2/23/2005, 7:45 AM
Ok John, I see the point about horizontal res...

What completely escapes me is why the screen/TV market is pushing progressive displays while the US, for example, has adopted a preferred interlaced broadcast...

Isn't that a bit like selling people diesel cars and then stocking all gas stations with regular unleaded pumps (so you need something to convert, etc...)???

Beats me I guess...

It does sound as though Europe may go for 720p, though....
John_Cline wrote on 2/23/2005, 8:11 AM
Other than image size considerations, I don't see much point in going for either LCD or plasma displays. But, that's just my opinion.

For what it's worth, both ABC and FOX have gone 720p, while CBS, NBC and PBS have all chosen 1080i. As long as you have a display capable of showing both formats in their native resolutions, it's kind of one of those "six of one, half a dozen of the other" deals. Both formats look really good.

John
PDB wrote on 2/23/2005, 8:30 AM
Right...So I guess the next question is....are plasmas/LCDs (which are selling like hot cakes) capable of displaying interlaced streams natively? (I take it they aren't....) Which means the stream is treated prior to display....(Does it make a difference?)

And if we look further into the future (not so very far into it, as far as I'm concerned), with the convergence of media/PC/internet/TV and what have you in homes, progressive seems to make even more sense, doesn't it? I mean, someday we will be delivering films in realtime on a live stream from a website etc. to be seen in your sitting room...and that will be progressive (won't it?)...

If so, doesn't that make a case in favour of progressive HD broadcast/capture/delivery?

Or am I just extremely naïve?

riredale wrote on 2/23/2005, 8:38 AM
Either 720p or 1080i is a significant improvement over NTSC, just as HDV is (let's not get another flamefest started over whether HDV is truly HDTV).

First, the issue of 25 vs. 30: the Americans commercialized TV first, and chose 30 because the first sets were synced to the 60Hz 110v power mains. Europe settled on 25 in part because of the same sync capability with their 50Hz power, but also to avoid being beholden to the almighty RCA and US industry back in the late 1940's. Given that NTSC is 30 and PAL is 25, it was reasonable (to Europe, at least) in the late 1980's to standardize in Europe on a 25-frame system for HDTV. Plus, conversion from that universal HDTV source called film was much easier at 25 than at 30.

As for interlace versus progressive: IBM once made an interlaced computer monitor, but it's been obvious to all for many years that interlace introduces irritating display artifacts if you're close enough to the screen to see them. So conceptually, yeah, progressive is better than interlace. But not so many years ago (and especially back in the 1930's, when some of these basic issues were first investigated) it was impossible to implement a data rate high enough to deliver progressive images with a decent number of pixels at a frame rate high enough to ensure fusion (where the brain is fooled into thinking that a sequence of still images is actually fluid motion) and to minimize flicker. It could have been done if framestores had existed back then, but they didn't (i.e. deliver a progressive image at 30f/sec, but display each frame twice, at a 60Hz rate, mimicking what a film projector does). So they settled on delivering just half the lines 60 times per second, killing two birds with one stone, as they say. That's interlace, and from typical viewing distances back then of, say, 10 picture-heights, it worked great.
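If you want to picture the "half the lines, 60 times per second" mechanics, here's a minimal sketch (Python with NumPy; the tiny 4-pixel-wide frame is just a toy for illustration):

import numpy as np

def split_into_fields(frame):
    # Even-numbered lines form one field, odd-numbered lines the other;
    # only half the lines travel in each 1/60-second interval.
    return frame[0::2], frame[1::2]

frame = np.arange(1080 * 4).reshape(1080, 4)  # toy 1080-line frame
top, bottom = split_into_fields(frame)
print(top.shape, bottom.shape)  # (540, 4) (540, 4)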

Our "compatible HDTV" proposal back in the late 1980's was to deliver progressive images at 30f/sec, in a fashion similar to how movies are delivered on DVD--scanned progressively, then de-interlaced for transmission compatibility. A smart set or DVD player can re-combine the interlaced sets and display the progressive original scan. Our proposal, called HD-NTSC, had a bunch of other improvements (widescreen, digital audio, doubled vertical and horizontal resolution) but, alas, the feds shot it down and decreed instead that they would adopt an entirely new standard and shut down NTSC in 2001. Hah!

My own gut feeling these days is that 60p will eventually die out, since 30p is adequate for most things (after all, film is still shot at 24p). In time I think chips will be developed that can fill in intermediate frames to give the illusion of 60p or perhaps even 90p.

BTW, 720 was proposed by the Zenith engineers in 1988 as an alternative to the 1125-line interlaced HDTV system pushed by Japan. Many engineers on the committees thought it was significantly inferior; it was made part of the final HDTV standard due to politics, which is often how these things get settled.
PDB wrote on 2/23/2005, 8:48 AM
Thanks riredale...! That was extremely informative, I must say...

But I still don't understand the point of the continued development of interlaced technology going forward....(that statement in itself reads as contradictory in my mind...) I really want to understand the benefits...It will probably be an important aspect in my next camera purchase decision...I mean, we are looking at HD today with an eye on the immediate future...better to try and get it right, then!
riredale wrote on 2/23/2005, 9:01 AM
Maybe the reason the Sony HDV camera is 1080i is that it was developed in Japan, where the accepted definition of "HDTV" is 1125 lines interlaced (1920x1080 pixels from a digital point of view).

In the end it probably doesn't matter that much, since chips have been developed that are able to de-interlace a purely interlaced image pretty cleanly. And, of course, if HDV cameras are introduced that shoot 30p and then deliver 60i (to maintain compatibility), it's trivial for a display to show a pure 30p image.
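For that 30p-carried-in-60i case, the display's job really is trivial: since both fields come from the same instant, a simple "weave" restores the original frame exactly. A minimal sketch (Python with NumPy, illustrative only):

import numpy as np

def weave(top_field, bottom_field):
    # Re-interleave two fields. If both fields came from the SAME
    # progressive frame (30p delivered as 60i), this reconstruction
    # is lossless and no estimation is needed.
    height = top_field.shape[0] + bottom_field.shape[0]
    frame = np.empty((height, top_field.shape[1]), dtype=top_field.dtype)
    frame[0::2] = top_field
    frame[1::2] = bottom_field
    return frame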

As for basing your camera purchase on trends, I guess it's a safe bet that no matter what you buy, it'll be obsolete eventually, though hopefully not as quickly as PCs. I have to smile when I look at a PC Magazine from two years ago and note that "state of the art" back then is today's Walmart special.
JJKizak wrote on 2/23/2005, 9:09 AM
In my geographic area (Northeast Ohio), over-the-air broadcast quality on CBS is the best, judging by the most distant objects: tiny people in a grandstand, or people walking on a Vegas street way in the background on CSI. PBS is next, then UPN, then ABC and NBC, and finally FOX, which is the worst. FOX also seems to have its local station broadcasting with slightly distorted resolution, causing people to look slightly fat. FOX also has very poor resolution for objects in the background; all the tiny people are out of focus. The local stations broadcasting high-def news remotes on location are sometimes really bad, with 7 or 8 double images contained within a reporter's face. The low-light performance is pretty hokey as well, with lots of pixelation noise.
NBC, CBS, and UPN tend to broadcast some programs with lower brightness (maybe gamma) than the simulcasting analog transmitter.
The FOX Dolby 5.1 sound is absolutely terrible. They seem to deliberately put a hole in the center and constantly vary the volume like a four-year-old playing with the TV set. I haven't seen a grain-free broadcast on NBC yet. ABC seems to add a slight blur to everything. CBS and PBS are the best right now.

JJK
John_Cline wrote on 2/23/2005, 9:47 AM
"since 30p is adequate for most things (after all, film is still shot at 24p)."

Film is shot at 24p primarily because film stock was/is expensive and 24 fps was about as slow as they could go and still (mostly) maintain the illusion of smooth motion. Although, film in theaters is actually projected at 48fps, with every frame being shown twice. However, if you look at a pan of a wide shot, 24fps film stutters quite noticeably and it drives me nuts. Ideally, I'd like to see 1920x1080 progressive at 60 frames per second. Nevertheless, I think that keeping the temporal resolution at either 50 fps PAL or 60 fps NTSC is a minimum, whether it's interlaced or progressive. My opinion is that the temporal resolution of 30 fps progressive isn't quite high enough.

John
B_JM wrote on 2/23/2005, 9:51 AM
I sure agree with you, John_Cline....

Some projectors use three-blade shutters, though they still suffer the same issues...

BarryGreen wrote on 2/23/2005, 11:47 AM
<<Although, film in theaters is actually projected at 48fps>>

Totally not accurate. Film in theaters is projected at 24fps. The film runs through the projector at the rate of 24 frames per second, and you see 24 distinct images per second, not 48.

With a two- or three-bladed shutter, what they do is even out the flickering. A film projector holds the film in place while the light shines through it, then a shutter closes and the film advances to the next frame in darkness. Then the shutter opens again, projecting the next frame.

This causes quite a bit of light/dark flicker. So they developed two-bladed shutters, and eventually three-bladed shutters, which also blank the image while the frame is held still, evening out the pattern of light/dark. However, the film still moves through the projector at the same rate: 24 distinct individual frames per second, not 48.
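The arithmetic behind this, as a minimal sketch (Python, purely illustrative): the frame rate never changes, only the flash rate does.

FRAME_RATE = 24  # film always advances 24 distinct frames per second
for blades in (1, 2, 3):
    # Extra shutter blades raise the light/dark flicker rate
    # without adding any new images.
    print(f"{blades}-blade shutter: 24 frames/s, {FRAME_RATE * blades} flashes/s")
# 1-blade: 24 flashes/s (objectionable flicker)
# 2-blade: 48 flashes/s
# 3-blade: 72 flashes/s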

As far as the original poster and interlaced vs. progressive: you may find it interesting to read the EBU's papers on this. They totally agree with you, saying that progressive offers much more going forward. From all indications, they want nothing to do with interlace as they set the specifications for European HD broadcasts. However, as with all decisions like this, you have to follow the money... Sony is heavily invested in interlace, HDCAM is an interlaced format, their implementation of HDV is interlaced, so they are lobbying hard to get the EBU to adopt 1080/50i.

The ultimate, from where we stand now, would be 1080/50p and 1080/60p. You'd get all the same motion rendition as the current interlaced formats, but with about 30% higher perceived resolution and no interlaced artifacts. But those formats don't exist yet.
John_Cline wrote on 2/23/2005, 12:08 PM
Both Barry and B_JM are correct about shutters being used in film projection for flicker reduction; that's what I meant but, obviously, I didn't say it very well. Film in theaters is indeed 24 frames per second, not 48 as may have been inferred from my earlier post.

John
musman wrote on 2/23/2005, 1:46 PM
No, I think you're right. Interlacing is a solution to a 70-year-old problem and we don't need it anymore. So long as the refresh rate is high, progressive is better in every way than interlaced.
There's a really good article in DV Magazine about all this and HDV. By now it may be available free on their site, DV.com. Apparently 720p has a perceived resolution equal to 1035i, so 1080i is not much better in terms of perceived resolution. In other words, don't worry so much about the higher number; it doesn't give you that much more here.
Spot|DSE wrote on 2/23/2005, 2:04 PM
Interesting thread. Peter Gloeggler has a very nice paper published in the NAB journals on this very subject, which I just learned of after spending a couple of hours interviewing him today. Maybe I'm just way behind on the learning curve, but I learned a lot from him today on the intricacies of prisms, CCDs, vertical vs. horizontal distortions, and how all that relates to p and i engineering processes and broadcast in general. I thought I had a good grip on the subject 'til today. I'll ask him if I can PDF his portion of the Broadcast Journals from NAB; if not, I'll have an article/interview with him on the DMN in a week or so.
VOGuy wrote on 2/23/2005, 2:16 PM
Actually, 24fps was chosen as the standard "Sound" speed due to the amount of film needed to produce adequate frequency response, using the "variable density" sound recording method, the only system available at the time. Before sound, film was shot and projected at variable rates from 16 to approximately 28 fps. 18 fps was considered "good enough" for most silent films. However, that rate was varied, depending on the subject matter.

-Travis
Former user wrote on 2/23/2005, 2:21 PM
And depending upon how good the cameraman was at cranking a consistent speed.

Dave T2
Spot|DSE wrote on 2/23/2005, 2:33 PM
Travis, do you have substantiating information on your post? Bell Labs and Edison have different information, so I'd appreciate seeing anything different. Not arguing, I'm just interested in hearing what someone else has had to say on the subject. Edison himself wrote as to why 24fps was selected, but maybe he wasn't the only one of merit with something to say on the subject. The first Vitaphones were only 20 fps, according to Bell. I wasn't around back then. :-)
John_Cline wrote on 2/23/2005, 2:46 PM
While there may be differing opinions on 1080 vs. 720, interlaced or not, we still have the matter of 1920 being quite a bit higher horizontal resolution than 1280.

John
Laurence wrote on 2/23/2005, 3:12 PM
To my eyes, interlaced footage looks a lot better than progressive, but then again that's probably my eyes. I'm quite far-sighted these days, and all those dots just blur together anyway. On the other hand, I can see the flickering of fluorescent lighting, older televisions, and computer monitors not set above 60 hertz. The nature of the work I do (outdoor 4-wheel vehicle events, hang gliding events, and boating stuff, mostly done handheld) just looks a lot better at a higher frame rate. A lot of the handheld stuff I see in movie theaters and DVD releases makes me feel motion sick, whereas handheld video does not. Still, I can see where, for many people doing film-type projects with tripods, dollies, and lots of spatial resolution on beautiful sets, 24 or 30 progressive looks a whole lot better.
Spot|DSE wrote on 2/23/2005, 5:44 PM
This is part of what Gloeggler and other high-end camera and film-to-video delivery folks maintain: that interlacing is designed for the human eye, and is smoother/sharper/more comfortable to watch.
What I saw from him was a basketball game shot in 30p and in 60i at the same time. Watching the 30p gave me a headache after a while, due to the juddering effect of 30 complete images vs. 60 fields, each captured at a different moment in time. Canon has some interesting studies on this subject in their Optics book as well. What is more interesting/curious is that while video captured at 30p causes eyestrain, video captured at 60i and converted to 30p does not. In the interview, Peter went into a number of areas regarding eye refresh rate vs. screen refresh rate, and gave me a couple of curious but worthwhile experiments. Until 50p and 60p become possible in both cost and bandwidth, I suspect the public will be very happy keeping 60i.
Hulk wrote on 2/23/2005, 7:14 PM
This is an interesting thread. I have to admit I have a predisposition against interlaced formats, just because progressive does away with field-order problems and you can do cool things like crank up the shutter speed, light permitting, and get crystal-clear slow-motion effects and such.

With interlaced, ALL motion, regardless of the shutter speed, is blurred due to the weaving of two temporally different fields. Deinterlacing doesn't fix anything; it's really a bunch of clever methods of figuring out what the progressive frame would have looked like.
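To picture what that "figuring out" looks like, the crudest method (a "bob") just interpolates the missing lines from a single field. A minimal sketch (Python with NumPy, illustrative only; real deinterlacers are far more sophisticated):

import numpy as np

def bob_deinterlace(field):
    # Estimate a full frame from one field by averaging adjacent field
    # lines into the missing rows. This is a guess, not a recovery of
    # the true progressive frame.
    h, w = field.shape
    frame = np.empty((h * 2, w), dtype=float)
    frame[0::2] = field
    frame[1:-1:2] = (field[:-1] + field[1:]) / 2.0
    frame[-1] = field[-1]  # bottom line: repeat the last field line
    return frame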

And I agree with John that CRT technology still provides the best image quality for the price, if not the largest screen. But I also see plasmas and LCDs "mainstreaming" in the near future as prices continue to fall. The "coolness" factor of these displays will be too much for the public to resist. Much like MP3 was too cool, even though the quality is far below 44.1kHz PCM CD specs.

So if we are producing content for progressive displays using an interlaced format, then deinterlacing, even smart deinterlacing that only processes the portions of the frame that differ between fields, will have an adverse effect on the signal.

Now in reality the shutter speed won't be fast enough to freeze motion anyway, in either a progressive or an interlaced format, so it won't matter. And static images will benefit from the higher overall resolution of HDV 1080i over 720p. So when it all comes down to it, the progressive/interlaced choice probably won't matter if the display is progressive. I suspect, as DSE states, that interlaced will be the better option for interlaced displays.

I know my computer would rather deal with 720p than 1080i!!

- Mark

PDB wrote on 2/24/2005, 1:17 AM
I must say it has provided me with some food for thought...In some ways it is a little bit reminiscent of the Beta vs. VHS bout from way back...

But from what seems to be happening in the consumer market, as in home entertainment and displays, the future of media convergence etc. would suggest that progressive will predominate...Does that mean it's best to invest in a "progressive media library" of footage? Probably not particularly relevant...

I do, however, find it surprising that there seems to be very little going on in terms of interlaced displays for homes...Is that a design problem, by any chance (i.e., you can't make the really, really flat interlaced panels which look so good on your wall and take up much less space...)?

Another point I have taken from this discussion is that we are using our TV sets/displays for other things (e.g. gaming), or to watch holiday snaps from that wonderful latest-gen digicam...and all of these benefit from progressive display too....

So my personal conclusion is that the continued investment in and development of interlaced technology responds to one side of the business (broadcast), most probably driven by invested $ rather than by the natural flow of demand in the consumer market....

Thanks everyone for your thoughts and knowledge!

Paul.