Oh, no !!! Another stupid question about Field Order !!!

sqblz wrote on 1/28/2003, 12:03 PM
Folks, I confess my ignorance, but please bear with me, I need your advice.

My doubts concern the decision between interlaced vs. progressive. There was a very recent thread and a pointer to a webpage, and both were *excellent*, but I was still not able to answer my simple question.

I read here a while ago that the best settings for still grabbing would be: Best Quality + Field Order: none (progressive) + Deinterlace Method: blend fields. And it truly gives me very good quality in the grabs.

Then I thought, "well, why not use the same settings for rendering?". For starters, my DV camera (a Sony TRV-6 PAL) doesn't let me choose the field order, it just shoots !!! (I guess it's lower field first). And my TV is just a TV (a Sony 36" 16:9 100 Hz thing) with no such frills either.

My impression is that (1) I prefer Best Quality (why wouldn't I?); (2) Deinterlace = blend seems much sharper to me than Deinterlace = none; (3) progressive seems to give nicer results than either field-first option.

With LFF I get distinct fine jagging on moving lines, and that effect seems to disappear with progressive (if I use Dynapel's SteadyHand, I *must* convert to progressive first, or the result will be unusable).

Now, the big question: I am rendering to AVI and will later transfer to DV tape. My findings so far relate to what I see on the PC display. But how will it look on TV ???

If my final movie were to be seen on a PC I would say "spot on", but on TV I have my doubts, and right now I just can't burn it to tape to be sure (I don't have the DV tape recorder available).
So far I have connected my PC to the TV via S-Video and played the AVI on the PC while watching on the TV. The result was quite satisfying, but I am not sure what defects I should look for. Flickering? Got none. Jagging? Nope. Scintillation? Zilch.

And then again: if playing on the PC and watching on the TV is satisfactory, will it still be once I transfer the AVI to DV tape and play the tape back on the TV?

Somehow I understand that some gizmo is interlacing the movie before the TV shows it, but *to me* it seems that when the AVI is progressive I get a better TV image than when it is LFF.
Is this rational ?

Your esteemed advice much appreciated, as always. Thanks in advance.

Comments

Erk wrote on 1/28/2003, 12:34 PM
For stills, I've had more success with Interpolate rather than Blending fields. Perhaps it depends on the actual frame.

Re: progressive video on a TV, I've found that interlaced looks much better than progressive. That's what TVs are made for, no?

G
mikkie wrote on 1/28/2003, 1:11 PM
>>if playing on PC and watching on TV is satisfactory, will it be as well later on, when I transfer the AVI to DV Tape, and then I will play the tape and watch on TV ?

That certainly is the heart of the matter. How a video appears on your PC is affected by the player, the codec, the graphics card, the monitor and so on. If you go out via FireWire you're not using the same chain, so to speak, as going out through (I assume) your graphics card's S-Video out, which is likely re-interlacing the progressive picture anyway. You might try going out through your camera, and from the camera to your TV.

mike
sqblz wrote on 1/29/2003, 11:39 AM
OK, if I understand it correctly, it goes like this:

- everything that is to be reproduced on a TV must arrive there interlaced. If it is progressive material it will be re-interlaced; if it is already interlaced, it goes through as-is.

- what I see on the monitor does not relate to what I see on the TV. I may see a progressive image nicely on the PC and badly on the TV (after it is re-interlaced). And the opposite is most certainly true (interlaced images don't look so good on the monitor).
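[As a toy sketch of that re-interlacing step: "weaving" takes alternate scan lines from two successive progressive images to build one interlaced frame. This is an illustration in NumPy, not what any actual TV-out chip literally runs.]

```python
import numpy as np

def weave_interlace(frame_a, frame_b):
    """Build one interlaced frame from two progressive frames:
    upper field (even rows) from frame_a, lower field (odd rows)
    from frame_b. Roughly what happens when progressive material
    is re-interlaced for TV display."""
    out = frame_a.copy()
    out[1::2] = frame_b[1::2]  # replace odd rows with the other frame's lines
    return out
```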

If I still don't get it, somebody knock my head, please.
Former user wrote on 1/29/2003, 12:05 PM
A normal TV signal is interlaced. That means each frame is made up of two fields drawn alternately.

If you convert your TV signal to progressive scan (footage that was not shot progressively) you are throwing out a field, so you are only getting half of the frame.

Some progressive options are to blend the fields, or use one field or the other. But you are still only getting half of the information.
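[A rough sketch of those two options, assuming NumPy and greyscale frames; function names are mine, not Vegas's. Both start from the same two half-resolution fields, which is Dave's point about only having half the information.]

```python
import numpy as np

def deinterlace_single_field(frame):
    """'Use one field': keep only the lower field (odd rows) and
    line-double it -- half the vertical detail survives."""
    return np.repeat(frame[1::2], 2, axis=0)

def deinterlace_blend(frame):
    """'Blend fields': line-double each field separately, then average
    the two. Motion combing is softened, but the result is still
    built from two half-frames."""
    upper = np.repeat(frame[0::2], 2, axis=0).astype(np.float32)
    lower = np.repeat(frame[1::2], 2, axis=0).astype(np.float32)
    return ((upper + lower) / 2).astype(frame.dtype)
```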

If it is shot progressive, then it is a whole different story.

Dave T2
Summersond wrote on 1/29/2003, 12:31 PM
DaveT2, thanks for the clarification. That helped me to understand what NOT to do on my next production to TV.

dave
TheHappyFriar wrote on 1/29/2003, 1:09 PM
TVs (I believe) are lower/even/2nd field first. I work at a TV station, and when I started making commercials to play on an MPEG-2 player, I learned this REAL fast (play a file that's progressive or has the wrong field order on your TV, and that's what went over the air!)!! That's how I render our MPEG-2 files.

Everything looks OK on my monitor though. I've rendered interlaced and non-interlaced video, and it only matters on the TV. Just remember this: your computer will only de-interlace video automatically if you use the S-Video out option (i.e. on a Radeon card). If you use your capture card's video out, it will most likely be whatever your project is. (Monitors are also a little darker, so I make my videos a little darker than what looks good on my monitor.)
mikkie wrote on 1/29/2003, 5:53 PM
>>monitors are also little darker too

If you're interested, check out Adam Wilt's site, & dv.com, etc.

TVs really are a different animal. To me the thing that helps is remembering that the TV has no idea what frames or fields are, the way we talk about them here -- it just receives frequencies. Levels are voltages and such (the controls you'll see, as in the VV4 beta, make more sense if you remember the original stuff was often voltage levels). So what we see as a little darker is a combination of different circuitry and different colorspaces. A lot of software (and hardware) deals with this automatically; with some you have to set it, with some you have to guess. But you actually do want to change the colorspace a bit *IF* you can -- otherwise you might wind up washing out the bright scenes.

In VV3 you'll see this adjustment under FX as Clamp. In the VV4 beta it's moved to legacy, and a lot more fine control is added in the filters that replace it. The traditional TV color space is something like 16-235, while the space your PC is more familiar with is 0-255. Making it worse, while 16 may be black, there's super-black, which is lower on the scale. Super-white works the same way, capturing a little extra detail. Some cameras do make use of this.

When you clamp the PC color range down to the smaller figures, the danger is that you'll lose everything above 235 or so and everything below around 16. Done properly it's close to working with gamma and curves in P/Shop. The opposite is also true if you're expanding a captured stream for viewing on PCs, though just how, and how much, depends on all the video hardware and software you're working with. Often the DirectShow stuff in Windows will access unfiltered video, while the bundled software (the stuff that came with your capture card, for example) can convert everything and crop the signal (ATI) at the same time.
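[A minimal numeric sketch of that clamp/expand step, assuming the nominal 16-235 TV range mikkie quotes; function names are mine, not from Vegas or any capture software. Note how the expand direction clips off any super-black/super-white headroom, which is exactly the detail loss being warned about.]

```python
import numpy as np

def clamp_pc_to_tv(pixels):
    """Squeeze full-range PC values (0-255) into the nominal TV range (16-235)."""
    p = pixels.astype(np.float32)
    return np.round(16 + p * (235 - 16) / 255).astype(np.uint8)

def expand_tv_to_pc(pixels):
    """Expand TV-range values back to full range; anything in the
    super-black (<16) or super-white (>235) headroom is clipped away."""
    p = pixels.astype(np.float32)
    return np.round(np.clip((p - 16) * 255 / (235 - 16), 0, 255)).astype(np.uint8)
```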

When the target audience uses a PC, if nothing else politely remind your viewers that the RealOne and WinMedia players both include the same HSB controls they have on their TV, and that you went to great lengths to preserve the original colorspace for when they play it back on their big-screen or HD TVs.

mike
sqblz wrote on 1/30/2003, 4:26 AM
Spot on DaveT2, TheHappyFriar and Mikkie !!! Thanks.

Just to finish my agony: is my amateurish DV camera (Sony TRV-6) shooting progressive, or lower field first, or whatever? If I shoot something and then directly view the result on the TV, the signal will be interlaced, but what about the stock which is on the tape?

It's really a pain !!! The progressive rendering definitely looks much better *on the PC*, but the final product will be for TV, so LFF seems to be unavoidable.
De-standardization of the standards is a drag !!!
Former user wrote on 1/30/2003, 8:20 AM
You would have to pay extra for a progressive camera, so yours probably shoots standard TV interlace. I believe the DV standard is lower field first.

I am not sure by what you mean about the stock on the tape. Tape is just a recording medium. It doesn't care if the signal is interlaced or not. If you shoot interlace, then that is what it records.

A PC is progressive scan, that is why it looks better when you render to progressive.

Dave T2
vonhosen wrote on 1/30/2003, 10:43 AM
If you render your MPEG-2 files as progressive or interlaced, your standalone DVD player shouldn't find either any more difficult to play back than the other. You will, however, find the progressive-encoded disc has a softer image than the interlaced one.