Bug: "Gaussian Blur" alters light level

riredale wrote on 6/8/2003, 1:20 PM
I just finished a 3-minute montage of some still shots, set to music. Very nice. The only problem is that the very first shot (the interior of a cathedral, fading in on the ceiling and tilting down slowly) was full of interlace artifacts on some of the hard lines in the walls. I ran that segment through the Gaussian Blur filter and found that at the ".001" setting (the first step up from no blur at all) it looks much better.

But I also noticed an anomaly: if you use the FX timeline to set the blur to begin heavy and gradually taper off to nothing (.000), there is a subtle "flash" in brightness when the blur steps from .001 to .000. I can't think of any mathematical reason why a Gaussian blur should do this, so it must be some sort of code error.
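
For what it's worth, the math does back this up: a properly normalized Gaussian kernel has weights that sum to 1, so the blur should leave the average brightness alone. Here is a quick sketch of that check, using NumPy/SciPy purely for illustration (Vegas doesn't expose its filter internals, so this only demonstrates the underlying math):

import numpy as np
from scipy.ndimage import gaussian_filter

# Stand-in for one luma plane of a DV frame (720x480).
rng = np.random.default_rng(0)
frame = rng.uniform(0.0, 255.0, size=(480, 720))

# 0.0 is "filter effectively off"; the rest mimic small blur amounts.
for sigma in (0.0, 0.001, 0.5, 2.0):
    blurred = gaussian_filter(frame, sigma=sigma, mode="wrap")
    print(f"sigma={sigma:5.3f}  mean={blurred.mean():.4f}")

# With a periodic ("wrap") boundary the kernel weights sum to 1 at every pixel,
# so each printed mean matches frame.mean() up to floating-point error.
# A visible brightness step between .001 and .000 therefore points at the
# plug-in (or the codec path), not at the Gaussian math itself.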

Comments

josaver wrote on 6/9/2003, 2:31 AM
Possibly you are using the Microsoft DV codec instead of the Sonic Foundry codec.

Go to Options/Preferences/General tab, check "Ignore third party DV codecs", and uncheck "Use Microsoft DV codec".

I've had this problem in the past with transitions.

Josaver

BD wrote on 6/9/2003, 8:12 AM
The .000 setting on the Sharpen filter seems to have a slight sharpening effect; .000 is not the same as turning the filter off. So, to fade a sharpening effect in or out, I copy the event onto an upper track, apply the Sharpen (or Convolution Kernel) filter there, and then fade the copied event.

(Perhaps this method would also be helpful with the Gaussian Blur filter.)

Brandon's Dad
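
BD's trick amounts to a crossfade between a filtered copy on an upper track and the untouched original underneath, so the filter amount never has to animate down to the suspect .000 value. A rough sketch of the compositing idea, again with NumPy/SciPy only as an illustration (the 30-frame opacity ramp is a made-up example, not a Vegas setting):

import numpy as np
from scipy.ndimage import gaussian_filter

def blend_blur(frame, sigma, opacity):
    # Composite a blurred copy over the original, like fading an upper track:
    # opacity 1.0 shows the fully blurred copy, opacity 0.0 shows the original.
    blurred = gaussian_filter(frame, sigma=sigma, mode="wrap")
    return opacity * blurred + (1.0 - opacity) * frame

frame = np.random.default_rng(0).uniform(0.0, 255.0, size=(480, 720))

# Fade the "upper track" out over 30 frames instead of keyframing the blur amount.
for opacity in np.linspace(1.0, 0.0, 30):
    out = blend_blur(frame, sigma=2.0, opacity=opacity)
    # out.mean() stays constant across the fade, so this route cannot flash.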