Deform FX produces JAGGIES - correct?

Grazie wrote on 1/9/2007, 9:54 AM
On motion I'm getting interlace jaggies. Could somebody else confirm this for me? As a remedy I've done a Progressive Render, and that's sorted it out. Is this the ONLY solution?

I've distorted the FRAME to fit inside a TV screen that is at about a 45-degree angle to me. So the frame has a perspective distortion to it.

TIA - g

Comments

TheHappyFriar wrote on 1/9/2007, 10:33 AM
Jaggies on the preview monitor or the preview window? I nearly always get jaggies in the preview window but seem to lose them when I burn to DVD.
Coursedesign wrote on 1/9/2007, 5:05 PM
The laws of physics would seem to require the presence of jaggies when showing interlaced footage of objects in motion on a progressive display.

Corollary: Panny makes an on-set LCD HD monitor (BT-LH1700W) that fixes this problem with special circuitry. At about $3,000 for a 17" monitor they can afford it... :O)
DJPadre wrote on 1/9/2007, 6:02 PM
You could try the 3D plugin - it does everything the Deform filter does... actually quite a bit more too...
farss wrote on 1/9/2007, 6:10 PM
Sorry if this is the obvious:

1) You are rendering at Best
2) You have a de-interlace method defined in the Project Properties

Given that you're probably downscaling the frame you shouldn't really get any jaggies. I'd agree that this sort of task is way easier with corner pinning. I've tried doing this kind of thing without it in Vegas but it's just too tedious. It's probably quicker to play the video you want in the TV and shoot the whole thing. One trap though: you'll probably need a camera with Clear Scan unless you want lots of flicker / scrolling bars in the TV image.

Bob.
Grazie wrote on 1/9/2007, 10:08 PM
Just to recap . . .

I'm overlaying a Video ONTO another Video.

The overlay video is that of talking, moving, nodding heads. I have NO JAGGIES on this straight/non-deformed piece of video. It is working well.

I have overlayed these Talking Heads ONTO a video of a TV set - which is placed in the corner of a room.

The TV set is angled away from the viewer.

Now, to make the TALKING heads fit the TV set better, because it IS at an angle, I've:

#1 - Used Pan/Crop to reduce the SIZE of the FRAME of the Talking Heads

#2 - Used Vegas DEFORM Fx to do this. I have DEFORMED the talking heads.

Now, it is on these TALKING HEADS that I'm NOW seeing jaggies. To compensate/get rid of/remove the JAGGIES, I have subsequently rendered the Deformed Fx Talking Heads as Progressive. And this HAS got rid of the JAGGIES.

Is this the approved/correct/professional way I should be expecting to do this procedure? Or is Deform and Pan/Crop messing with Vegas?

OR

Should Vegas Deform Fx be able to handle this?

PLUS

I want somebody to try this and tell me they are getting the same or not.

"Sorry if this is the obvious:" - With me?

"1) You are rendering at Best" - No. Render to New Track is set to GOOD.


"2) You have a de-interlace method defined in the Project Properties" - Yes.

"Given that you're probably downscaling the frame you shouldn't really get any jaggies. " What do you mean?


"I've tried doing this kind of thing without that in Vegas but it's just too tedious. WHAT?!? All I'M doing is applying a Vegas Deform FX?? Not tedious at all.

"Probably quicker to play the video you want in the TV and shoot the whole thing. " - I'M adding a Vegas DEFORM Fx. Hallo?



Again, onto an EXISTING video of a TV screen, which is AT an angle, I have OVERLAID another piece of video. Because this OVERLAY doesn't quite fit the TV screen I have applied a DEFORM Fx to make it FIT.

"One trap though, you'll probably need a camera with Clear Scan unless you want lots of flicker / scrolling bars in the TV image. " Again . . I'm adding a Deform FX. I am NOT filming a TV screen.


AND I still would like somebody, ANYBODY, to test this. I want somebody to reproduce the jaggies on motion with a piece of media to which you have applied the Vegas DEFORM Fx.

If somebody wishes to SKYPE me I can show them.

TIA - g

Grazie wrote on 1/10/2007, 1:05 AM
RIGHT! First off - thank you Bob!

We just spoke for about 45 mins on SKYPE and I have learnt a most valuable lesson: sometimes, just sometimes, GOOD (render) just ain't GOOD enough!

Yes I have my project settings at GOOD. And yes Render to New track is set at GOOD.

HOWEVER! When rescaling - here Pan/Crop <> Deform Fx - I needed to have BEST on Preview and BEST for rendering to New Track. So, under these circumstances, I have to consciously get OUT of my comfort zone of previewing at "Preview" (at times I do move up to GOOD) and go even further UP to BEST.

There you have it. When all else fails, try BEST! Jaggies gone.

Thanks again Bob!

BTW, under these circumstances of re-scaling, Bob can tell you why BEST IS better than GOOD.
farss wrote on 1/10/2007, 4:58 AM
To be honest it's hard to give a really good answer because, from my brief search, the Vegas documentation doesn't give much away.
It'd be a reasonable question to ask just what the difference is between Preview, Good and Best.

From my intrepid attempts at reverse-engineering an answer, the difference between Preview and Good seems to relate to how precisely Vegas decodes a frame. I note that with no FXs applied there's no difference between Preview and Good. Apply any FX, though, and there is a difference. I can only assume that, since Vegas decodes every frame to uncompressed in order to apply the FX, the quality of that decode is better at Good than at Preview.

The difference between Good and Best, as we've noted, relates to interpolation. If all pixels stay put, i.e. there's a one-to-one mapping between all the pixels (everything is, say, 720x576 at the same PAR), then Good is as good as it gets; Best will just waste CPU time doing needless calcs.

However, resize / rescale and the pixels are no longer in the same position. So when you want to find the correct value for a given output pixel, what do you do? There's no source pixel in that position. The simplest way is a nearest-neighbour algorithm: find the closest source pixel and use its value. Pretty fast and simple. Except it's quite possible that one pixel of the source will be the nearest neighbour to more than one of the output pixels.

Simple example: one red pixel, the rest black. You move that image so the red pixel lies in the middle of four of the output pixels. Apply the simple nearest-neighbour algorithm and our one red pixel has become four red pixels. Other weird things will happen as you shift the transformation slightly: our red pixel will wink between being one and four pixels.
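
To make that concrete, here's a tiny sketch of the same effect - plain Python, nothing to do with Vegas internals, purely illustrative. A single bright pixel in an otherwise black image, upscaled 2x with nearest-neighbour sampling, comes out as a 2x2 block of bright pixels.

# Purely illustrative - plain Python, nothing Vegas-specific.
# One bright pixel (value 1) in an otherwise black 4x4 image,
# upscaled 2x with nearest-neighbour sampling: the single pixel
# comes out as a 2x2 block, i.e. "one red pixel becomes four".

def nearest_neighbour_resize(src, out_w, out_h):
    src_h, src_w = len(src), len(src[0])
    out = []
    for y in range(out_h):
        # Map the output row back into source space and round to
        # the closest source row - no blending of any kind.
        sy = min(src_h - 1, int(y * src_h / out_h + 0.5))
        row = []
        for x in range(out_w):
            sx = min(src_w - 1, int(x * src_w / out_w + 0.5))
            row.append(src[sy][sx])
        out.append(row)
    return out

image = [[0, 0, 0, 0],
         [0, 1, 0, 0],   # the lone "red" pixel
         [0, 0, 0, 0],
         [0, 0, 0, 0]]

for row in nearest_neighbour_resize(image, 8, 8):
    print(row)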

Now add to that simple example the complexities of interlaced video and things get really nasty and my brain hurts.

Now Vegas, when switched to Best, uses Precise Bicubic with Interpolation. That means it samples 16 pixels around every one of the output pixels; a simple explanation is here.
Note the statement from that page:
"Bicubic interpolation is often used in raster displays that enable zooming to an an arbitrary scale."

Now what I really cannot adequately explain is why not using Best produces jaggies; loss of resolution I can understand. I can sure understand why we need to set a de-interlace method, but exactly how that gets tied in with the resampling algorithm used escapes my limited ability to build a mental picture of what's going on.

What escapes my addled grey cells even more is why something that has such a dramatic impact on the final quality of our work commands so little attention in the documentation. The V6 release, with much fanfare, added even more tools that involve rescaling video (3D track motion). That the results can end up looking pretty awful unless you happen to know some of the secret handshakes is a pretty poor show.

Now here's another question.

How many realise that there's a Good / Best setting for audio resampling?

Now my ears aren't that discerning but I tell you, I can hear the difference. And yes, it takes longer to render audio at Best than it does at Good. And no, I have even less of a clue as to just what that switch does; I just have Best set by default now.

Bob.