Comments

SonyDennis wrote on 4/23/2003, 11:05 AM
1080p exists, but there is no hardware for it yet. 1080/60p uses double the bandwidth of 1080/60i, so I don't think you'll see it for a while. You might see 1080/24p pretty soon because it takes no more bandwidth than 1080/60i, and it exactly matches what is needed for theatrical film releases.
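A rough back-of-the-envelope comparison makes the gap concrete (Python, counting raw luma samples only; real broadcast bandwidth also depends on chroma subsampling and compression, so treat these as ballpark figures):

    # Uncompressed luma-sample rates for the formats discussed above.
    # 1080/60i delivers 60 fields of 540 lines each, i.e. 30 full frames/s.
    formats = {
        "1080/60p": 1920 * 1080 * 60,   # ~124.4 M samples/s
        "1080/60i": 1920 * 540 * 60,    # ~62.2 M samples/s (half of 60p)
        "1080/24p": 1920 * 1080 * 24,   # ~49.8 M samples/s (a bit under 60i)
        "720/60p":  1280 * 720 * 60,    # ~55.3 M samples/s
    }
    for name, rate in formats.items():
        print(f"{name}: {rate / 1e6:.1f} M samples/s")

So 1080/60p really is exactly twice 1080/60i, and 1080/24p actually comes in slightly below it.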

There is no 720i; it doesn't make sense from a resolution and bandwidth point of view, since it wouldn't be significantly higher resolution than standard definition.

We're still burdened with interlaced formats because interlacing is still a good tradeoff between spatial and temporal resolution. Sports look better in 60i than in 30p because of the fast motion.

Likewise, we're still burdened by 29.97-based frame rates because of the need for simulcast, SD up- and downconversions, and other NTSC compatibility. Oh well.

///d@
Nat wrote on 4/23/2003, 11:18 AM
Thanks for the reply.
I think the thing I don't get is why 1080p would take up more bandwidth than 1080i, since it's the same resolution?

Thanks,

Nat
Nat wrote on 4/23/2003, 11:41 AM
Also, when talking about 720p/60, we're talking about 60 frames per second, not 60 fields, right?

Meaning 1080p/60 would be 60 frames per second and 1080p/30 would be the standard 30 frames per second?
riredale wrote on 4/23/2003, 12:21 PM
Nat:
We need to use the correct terminology here. 1080 refers to the number of scan lines (duh), and 30p means 30 frames per second, progressively scanned. In addition, you can have 60i, meaning 60 fields per second, interlaced (i.e. 30 frames per second), and finally you can have 60p, meaning 60 frames per second, progressively scanned.

The ideal is a fast frame rate with each frame progressively scanned, but that takes a lot of bandwidth. Researchers found out about 70 years ago that you could get the ILLUSION of a high frame rate by painting every other line the first time, then painting the remaining lines the next time. In this manner you were only actually delivering half the number of lines per pass, so the data rate was halved. The catches are that (1) the illusion breaks down when the display gets large, and (2) it's harder to do pixel manipulations, since vertically adjacent pixels come from different moments in time.
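A toy sketch of the idea (the weave function here is purely illustrative, not any real API): two half-height fields, captured 1/60 s apart, interleave into one full-height frame, which is also why vertically adjacent lines can disagree when something moves.

    def weave(top_field, bottom_field):
        # Interleave a top field (lines 0, 2, 4, ...) with a bottom field
        # (lines 1, 3, 5, ...) into one full frame.
        frame = []
        for top_line, bottom_line in zip(top_field, bottom_field):
            frame.append(top_line)
            frame.append(bottom_line)
        return frame

    # Toy 6-line "frame" built from two 3-line fields shot 1/60 s apart.
    top = ["line0 @ t=0", "line2 @ t=0", "line4 @ t=0"]
    bottom = ["line1 @ t=1/60", "line3 @ t=1/60", "line5 @ t=1/60"]
    print(weave(top, bottom))

Each field carries only half the lines, which is where the halved data rate (and the answer to Nat's 1080p-vs-1080i question) comes from.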

1080/60i came first, about 1988. Then, the Zenith engineers said that 720/60p looked about as sharp, even though there were far fewer scan lines, because it was progressive. The Zenith guys were wrong, but they were good at "spinning" the data to match their objectives (derailing the Japanese 1080 standard).

BTW, the 1080/60i format wasn't the original one proposed by NHK in Japan. The original standard was 1035/60i. Back in those days Japan was threatening to overwhelm the American video engineering interests, and people were throwing anything down on the tracks to try to stop (or at least slow down) the Japanese train.

American interests came up with the notion that "pixels had to be square," which today seems pretty amusing, given that all of us in DV-land are comfortable with non-square pixels. But the hysteria about square pixels won the day, and since it had already been determined that there had to be 1920 active samples in each line and that the aspect ratio had to be 16:9, forcing pixels to be square meant there had to be 1080 vertical pixels, or 1080 active lines. But those pesky Japanese quickly adapted their equipment to accommodate 1080. What really screwed the Japanese proposal was the later adoption of Digital TV, and we can all see how successful DTV has been in the USA...
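The square-pixel constraint above is just arithmetic; a tiny check:

    # With 1920 active samples per line, a 16:9 picture, and square pixels,
    # the number of active lines is forced.
    active_samples_per_line = 1920
    aspect_ratio = 16 / 9
    print(active_samples_per_line / aspect_ratio)  # 1080.0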

In other words, politics created the DTV disaster we live with today.
Nat wrote on 4/23/2003, 2:36 PM
Wow, thanks a lot for that info :)