I just finished a 3-minute montage of some still shots, set to music. Very nice. The only problem is that the very first shot (the interior of a cathedral, fading in on the ceiling and tilting down slowly) was full of interlace artifacts on some of the hard lines in the walls. I ran that segment through the Gaussian filter and found that at the ".001" setting (the first step up from no correction at all) it looks much better.
But I also noticed an anomaly: if you use the FX timeline to set the blur to begin heavy and gradually taper off to nothing (.000), there is a subtle "flash" in brightness level at the moment the blur goes from .001 to .000. I can't think of any mathematical reason why a Gaussian blur should change overall brightness, so it looks like some sort of code error.
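
For what it's worth, here's a minimal sketch (Python with NumPy, nothing to do with the app's actual filter code) of why the math itself shouldn't cause a brightness shift: a Gaussian kernel normalized to unit sum has a DC gain of 1, so the mean level of a flat field is unchanged at any blur amount, even a vanishingly small one. The sigma values below are arbitrary stand-ins, not the app's ".001" units.

import numpy as np

def gaussian_kernel(sigma, radius=None):
    # Build a 1-D Gaussian kernel normalized to unit sum (DC gain = 1).
    if radius is None:
        radius = max(1, int(3 * sigma))  # cover roughly +/- 3 sigma
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

# A flat gray row: after blurring, its mean brightness should be unchanged.
row = np.full(256, 128.0)
for sigma in (0.001, 0.5, 2.0):  # including a near-zero blur amount
    k = gaussian_kernel(sigma)
    blurred = np.convolve(row, k, mode="same")
    # Ignore the edges, where zero-padding (not the kernel) lowers the mean.
    print(f"sigma={sigma}: mean before={row.mean():.3f}, after={blurred[8:-8].mean():.3f}")

Since the mean stays put for any sigma, my guess is the flash comes from how the filter disengages at .000 (e.g., switching to an unfiltered bypass path that isn't quite identical) rather than from the blur itself, but that's speculation on my part.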