Moiré Patterns in Rendered Video

Quoman wrote on 4/3/2004, 11:50 PM
I'm trying to create a slide show using 120 JPEG images. While previewing the timeline the images look good, but after rendering the project as an NTSC DV Video for Windows (.avi) file, most of the images show noticeable moiré or ringing patterns. Am I missing something in the custom settings, is there an inherent problem with the JPEG images, or both?

Comments

jetdv wrote on 4/4/2004, 5:06 AM
A couple of things to try when working with images (especially ones with fine lines):

1) Render using BEST instead of GOOD (one of the few times you need to do this)

2) Add a very slight blur to the images (a pre-processing sketch follows below).
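If you'd rather bake that slight blur into the files before importing them, here's a minimal sketch using Python with the Pillow library (my own illustration, not a Vegas feature; the folder names and the 0.8 radius are arbitrary choices):

    # Pre-blur a folder of JPEGs slightly to tame moire before import.
    # Assumes Pillow is installed (pip install Pillow); paths are examples.
    from pathlib import Path
    from PIL import Image, ImageFilter

    SRC = Path("slides")          # original JPEGs
    DST = Path("slides_blurred")  # pre-blurred copies land here
    DST.mkdir(exist_ok=True)

    for jpg in sorted(SRC.glob("*.jpg")):
        img = Image.open(jpg).convert("RGB")
        # A very slight blur; a radius around 0.5-1.0 is usually enough to
        # soften the fine detail that aliases into moire at DV resolution.
        soft = img.filter(ImageFilter.GaussianBlur(radius=0.8))
        soft.save(DST / jpg.name, quality=95)

Keep the originals and import the copies; if the result looks too soft, re-run with a smaller radius.
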
Cheesehole wrote on 4/4/2004, 5:31 AM
Yeah "quick blur" usually does the trick. When I'm rendering those types of slide shows for DVD I usually get obsessive and use the Median Filter, but it is VERY CPU intensive so your renders will go slowly. See if quick blur works first.

I use that trick on text too.
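If you'd rather median-filter the stills themselves instead of the whole timeline, here's a rough sketch with Pillow (my own illustration; not necessarily the same kernel the Vegas Median Filter uses internally):

    # Median-filter each still once, instead of filtering every rendered frame.
    # A median kernel suppresses speckle and fine line detail while keeping
    # edges harder than a Gaussian blur does. Paths are illustrative.
    from pathlib import Path
    from PIL import Image, ImageFilter

    SRC = Path("slides")
    DST = Path("slides_median")
    DST.mkdir(exist_ok=True)

    for jpg in sorted(SRC.glob("*.jpg")):
        img = Image.open(jpg).convert("RGB")
        # size=3 means a 3x3 neighborhood; it must be odd, and larger sizes get slow.
        filtered = img.filter(ImageFilter.MedianFilter(size=3))
        filtered.save(DST / jpg.name, quality=95)

Filtering 120 stills once is far cheaper than median-filtering every frame of the render.
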
farss wrote on 4/4/2004, 7:18 AM
Also, you're fighting a battle on two fronts: the compression on the DVD, and how the signal gets from the DVD player to the TV. Unless you're going component from the player to the TV, horrid things can happen.
Also, encode straight from the timeline (T/L) to MPEG-2 so you skip the intermediate DV compression and its artifacts.

johnmeyer wrote on 4/4/2004, 11:35 AM
jetdv's suggestion of using Best instead of Good is the most important. The main reason this setting exists is to let you invoke the different rendering algorithm used by Best. It does pretty much the same job as Good on motion video, but a much better job on still pictures. It is, however, slower, which is why it is not the default.

If you have time to try an experiment, you might want to select about 25 seconds of your project and render it three ways: using Good, then using Best, then using Best with the blur added.
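If you want a quick feel for why the resampler matters before committing to a long render, here's a sketch that downscales one still two ways so you can eyeball the aliasing side by side. Pairing Good with bilinear and Best with bicubic is my assumption for illustration, and the path is made up:

    # Downscale the same still with bilinear vs. bicubic resampling and save
    # both, so the moire each interpolation produces can be compared.
    # 720x480 is the NTSC DV frame size; Pillow is assumed installed.
    from PIL import Image

    src = Image.open("slides/test_slide.jpg").convert("RGB")
    for name, method in [("bilinear", Image.Resampling.BILINEAR),
                         ("bicubic", Image.Resampling.BICUBIC)]:
        small = src.resize((720, 480), resample=method)
        small.save(f"compare_{name}.jpg", quality=95)

Note this crude resize ignores aspect ratio and DV's non-square pixels; it's only meant to make the interpolation difference visible.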