Progressive scan really better?

Videot wrote on 3/4/2004, 10:58 PM
Since I have a new widescreen TV that supports progressive scan, and I saw in Spot's book that progressive was supposed to be better, I took a 3-minute AVI and burned an MPEG2 both the regular way and with progressive scan.

Even though it's supposed to be a good TV, I can't see any difference. Is the picture really supposed to be better? Should one create all of their projects using progressive scan? Maybe my eyes aren't good enough.

Comments

farss wrote on 3/4/2004, 11:17 PM
Doing it that way I doubt you would see a difference; in fact it'd probably look worse.
If you start with material shot progressive, in say a DVX100, then yes, you will see a difference. For starters, a camera shooting progressive gives about 30% more vertical resolution than if it were shooting interlaced.
Displaying on a progressive device probably means less eye strain as well.
But that doesn't mean it's all good news: progressive scan means less temporal resolution and, at the camera end of the chain, more noise. Certainly having a progressive-scan-capable display device is a plus IF, say, you're playing out a DVD that was taken from film. Is it going to be a huge difference? I think not.
stutch wrote on 3/8/2004, 6:46 PM
Call me crazy but

I can absolutely see a difference for the better on my plasma. I render progressive only, at the best quality setting, and of course with reduce interlace flicker, a little unsharpen mask and Gaussian blur. Mainly miniDV to DVD with FireWire I/O.

Commercial DVDs are a HUGE step up from an S-Video or composite input.
Spot|DSE wrote on 3/8/2004, 10:48 PM
Progressive IS better IF:
1. It was shot progressive, or converted to progressive for the editing stage and has been kept that way,
AND
2. it is being displayed on a progressive scan television or a computer monitor.
If it ever goes to a regular television, projector, etc., then you lose all the benefits, and potentially get lesser picture quality, depending on what device is doing the conversion from progressive to interlaced.
farss wrote on 3/8/2004, 11:48 PM
I fail to see why converting progressive scan material to interlaced would produce any image quality degradation. Going the other way around (assuming it was shot interlaced), then absolutely you've got to take some quality hit. After all, material shot progressive is still recorded as two interlaced fields; well, OK, if you're in NTSC then it's also got pulldown applied.

Maybe going from 24 fps to 60i gives some quality issues due to pulldown; however, by all reports, even shooting 30p gives improved image quality despite the loss of temporal resolution. After all, most Hollywood DVDs are encoded as 24p and converted to 60i in the DVD player.
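(The 24p-to-60i conversion farss mentions is the classic 2:3 "pulldown" cadence. Below is a minimal sketch of it, just to show where the extra fields come from; the frame names and field labels are illustrative, not any real API.)

```python
# Toy sketch of 2:3 ("3:2") pulldown: how a DVD player turns 24
# progressive film frames per second into 60 interlaced fields per
# second. Every 4 film frames become 10 video fields.

def pulldown_2_3(frames):
    """Expand groups of 4 progressive frames into 10 interlaced fields
    using the 2:3 cadence (A gets 2 fields, B gets 3, C 2, D 3)."""
    cadence = [2, 3, 2, 3]  # fields emitted per source frame
    fields = []
    for frame, count in zip(frames, cadence * (len(frames) // 4)):
        for _ in range(count):
            # field parity simply alternates top/bottom
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

fields = pulldown_2_3(["A", "B", "C", "D"])
print(len(fields))  # 10 -> 24 frames/s becomes 60 fields/s
```

Note that because frames B and D each contribute three fields, some displayed frames mix fields from two different film frames, which is one reason pulldown can introduce the motion artifacts discussed in this thread.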

I'd also mention that all Vegas generated media, including the Vision Series of loops, is progressive scan; we were told that converting this to 50/60i would yield perfect results (which it certainly has!).

What's not widely mentioned is the improvement in vertical resolution when shooting progressive scan. During interlacing the camera uses a trick called line interpolation; this gives improved noise/gain performance but at a loss of vertical resolution. When shooting progressive you gain vertical resolution (about 29%) but lose low-light performance.
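(farss's point that progressive-to-interlaced conversion needn't degrade the image can be sketched in a few lines: for static content, the two fields are just the even and odd scan lines of the frame, and weaving them back together recovers the original exactly. This is a toy illustration, not any real codec.)

```python
# Toy sketch: splitting a progressive frame into two fields and
# weaving them back is lossless for a static image. True interlaced
# capture differs because the two fields are sampled at different
# moments in time.

def split_fields(frame):
    """Split a list of scan lines into (top, bottom) fields."""
    top = frame[0::2]     # even-numbered lines -> top field
    bottom = frame[1::2]  # odd-numbered lines -> bottom field
    return top, bottom

def weave(top, bottom):
    """Interleave two fields back into a full progressive frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

progressive = [f"line{i}" for i in range(8)]
top, bottom = split_fields(progressive)
assert weave(top, bottom) == progressive  # no information lost
```

Going the other way, from two time-offset interlaced fields to one progressive frame, is where interpolation (and the quality hit farss describes) comes in.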
roger_74 wrote on 3/9/2004, 2:19 AM
"I'd also mention that all Vegas generated media, including the Vision Series of loops, is progressive scan"

Vision Series may be progressive, but Vegas generated media certainly is not (not for interlaced DV). The Vegas generated media is always whatever you set it to be. If the Video Preview is set to "Preview" you can't see the interlacing, but it will be there when you render.
roger_74 wrote on 3/9/2004, 7:56 AM
Yes, I need to correct myself. The event properties for generated media are always set to progressive. But if there is any movement in the generated media it will be interlaced when you render to an interlaced format.
Spot|DSE wrote on 3/9/2004, 11:18 AM
Farss,
I've seen images that simply don't output the signal well into an interlaced format when the DVD is progressive. It may well be something other than the decoder, but that's the first place I'm gonna suspect, because it looks terrific coming off an XGA projector as pscan, coming off of a pscan DVD player. Same disc, same output, but going to the television it contains artifacts. In theory, it shouldn't. But concept and reality are sometimes different.
jaegersing wrote on 3/11/2004, 2:41 AM
If you convert a progressive scan image that contains fine, moving, horizontal lines into interlaced format, some of the lines will flicker. This effect is especially bad on images derived from moving graphics, but it can affect video too.

Richard Hunter
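(Richard's flicker effect, often called interline twitter, is easy to see in a toy model: a detail only one scan line high lands entirely in one field, so an interlaced display draws it only every other field and it blinks at field rate. No real video API here, just lists of lines.)

```python
# Toy illustration of interline twitter: a 1-line-high horizontal
# stripe in a progressive frame ends up in exactly one of the two
# interlaced fields, so it is displayed only half the time.

def split_fields(frame):
    """Split scan lines into (top, bottom) fields."""
    return frame[0::2], frame[1::2]

# 6-line frame, blank except for a thin stripe on line index 3
frame = [0, 0, 0, 1, 0, 0]
top, bottom = split_fields(frame)
print(1 in top)     # False - the stripe is absent from the top field
print(1 in bottom)  # True  - it appears only in the bottom field
```

This is why "reduce interlace flicker" filters (mentioned earlier in the thread) blur thin horizontal detail slightly before rendering to an interlaced format.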
farss wrote on 3/11/2004, 5:10 AM
SPOT,
you're right on, the devil is always in the detail. Years ago I used to dabble in still photography, something as simple as how a film still camera records a moving image can be dramatically affected by how the shutter works. I keep going back to that with video.
Not only will the image be affected by how it's recorded, but also by how it's projected, and that's without throwing "p" versus "i" into the mix.
There's this very long thread over at dvxuser.com with guys thinking there's something seriously wrong with the PAL version of the camera because they get motion artifacts if they pan too fast. I was amazed at the number of guys who were convinced there was something truly wrong with the camera; no one seemed to twig to the idea that maybe it had something to do with how things happen when you record images that way.
And then there's that little voice in the back of my head that keeps saying "the STORY, what about the story!"