25p mpeg2 render looks better than 30p?

MH_Stevens wrote on 5/2/2007, 11:07 AM
I'm test-rendering a 1080x60i clip of moving cars and pans across close objects to mpeg2 for display on a computer HD LCD, using three settings: 1) 1080x60i, 2) 720x30p and 3) 720x25p. The 30p and 25p renders used the same quality settings.

As I expected, the 60i shows extensive jaggies and is a bit jumpy on my P4. The 30p was pretty good, with only minor jaggies, but what surprised me was that the 25p was perfect! Coming from a 60i source, why was the 25p so much better than the 30p?

Michael

Comments

johnmeyer wrote on 5/2/2007, 12:29 PM
... for display on a computer HD LCD ...

That's the thing: You're displaying on a computer display, which is progressive (I assume).

I think perhaps you are paying too much attention to the interlacing artifacts. On a display that can properly display interlaced material, the 1080x60i should look by far the best because nothing has been done to it (you haven't deleted any frames). As for 25p being "perfect," I guess it all depends on what you are looking for. However, remember that you have definitely lost information by going to 25p, so, at least by the definition I would use, it is far from perfect.

I leave it to others who have more experience rendering for computer displays to give you a better alternative, but I can say, pretty much for certain, that 25p is NOT the best way to go.
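
To put rough numbers on the "lost information" point, here is a quick back-of-the-envelope sketch in Python. It uses a nominal 60 fields/s (NTSC is really 59.94, but the ratios are the same) and assumes each progressive output frame is built from the nearest single source field; Vegas may blend fields instead, but the counting works out the same.

```python
# Back-of-the-envelope: what 30p and 25p keep from a 60i source.
# 60i delivers 60 fields per second, i.e. 60 distinct moments in time.
SOURCE_FIELD_RATE = 60.0   # nominal; NTSC is really 59.94, but the ratios are identical

def source_fields_used(output_fps, seconds=1.0):
    """Index of the source field nearest in time to each output frame."""
    n_out = int(round(output_fps * seconds))
    return [round(i * SOURCE_FIELD_RATE / output_fps) for i in range(n_out)]

for fps in (30, 25):
    picks = source_fields_used(fps)
    kept = len(set(picks)) / SOURCE_FIELD_RATE
    steps = sorted({picks[i + 1] - picks[i] for i in range(len(picks) - 1)})
    print(f"{fps}p keeps ~{kept:.0%} of the 60 samples/s, stepping through the source by {steps}")

# 30p keeps ~50% of the 60 samples/s, stepping through the source by [2]
# 25p keeps ~42% of the 60 samples/s, stepping through the source by [2, 3]
```

So 30p samples the source motion evenly at half the original rate, while 25p keeps even less and has to step through it unevenly (alternating jumps of 2 and 3 fields), which is one more reason I wouldn't call it "perfect."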
MH_Stevens wrote on 5/2/2007, 3:33 PM
John: Yes, I know 25p throws away information, and that is exactly why I ask (forget the 60i, because the monitor is set to progressive): why does the 25p look BETTER than the 30p? Coming from 60i and just deinterlacing, I would have expected the 30p to look better than the 25p. That's the real question I had. Is it possible that Vegas did NOT deinterlace my 60i when it made the 30p, and is leaving the monitor to do it? The 30p shows those horizontal lines along all moving vertical edges.
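
Those horizontal lines sound like classic combing from weaved (un-deinterlaced) fields. Here is a tiny synthetic sketch of what I mean, in Python/NumPy with made-up numbers (nothing Vegas-specific): two fields of a moving vertical edge, first weaved together as-is, then blended into a single image.

```python
# A tiny synthetic example of combing: two fields of a moving vertical edge,
# captured 1/60 s apart, weaved together as-is versus blended into one image.
import numpy as np

H, W = 8, 16
EDGE_T0, EDGE_T1 = 6, 9            # the edge moved 3 pixels between the two fields

def field(edge_x):
    """One field: half the scan lines of a frame containing a white/black vertical edge."""
    f = np.zeros((H // 2, W), dtype=np.uint8)
    f[:, :edge_x] = 255
    return f

top, bottom = field(EDGE_T0), field(EDGE_T1)

# "Weave": interleave the two fields unchanged -- what an un-deinterlaced frame looks like.
weave = np.zeros((H, W), dtype=np.uint8)
weave[0::2], weave[1::2] = top, bottom

# "Blend" deinterlace: average the fields, so every line shows the same (softened) edge.
blend = np.repeat(((top.astype(np.uint16) + bottom) // 2).astype(np.uint8), 2, axis=0)

print("weave: adjacent lines disagree ->", not np.array_equal(weave[0], weave[1]))  # True = combing
print("blend: adjacent lines agree    ->", np.array_equal(blend[0], blend[1]))      # True = no combing
```

If the 30p render shows that weave-style mismatch on moving edges, it would suggest the fields were not deinterlaced before the render and the monitor is being left to sort them out.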