interlace nightmare - only with SD though

Sherif wrote on 2/18/2008, 6:29 PM
I swear I try to search on topics before I post, and I know there have been tons of threads on this one, which I have read all weekend. So any help is appreciated.

When I take my HDV footage and render it directly to WMV - NO INTERLACING is visible.

When I take my SD footage and render it directly to WMV - horrible interlacing is visible. (When I render to DVD there are no visible interlace artefacts.)

What gets me is that the custom WMV render parameters do NOT ask whether you want to de-interlace the footage or what technique you want to use. Does anyone know why this is?

(In the project properties I have set Vegas to multimedia, no fields, and the de-interlace method to blend. The Vegas preview window plays fine.)

Should I be de-interlacing during CAPTURE?

I use Vegas 8a on a platform which has taken all winter to get half right (XP, quad-core, 2GB, etc.).

My new motto: "Bring back some form of standardisation, for god's sake, before it's too late (in the morning)." 2:21am UK time.

Comments

fldave wrote on 2/18/2008, 6:37 PM
For SD to WMV I would change the project properties de-interlace method to Interpolate, and always, always use "Best" rendering quality. Don't skimp on the WMV data rate either.
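If it helps to picture the difference, here is a rough numpy sketch of what a blend-style and an interpolate-style deinterlace do to a single frame. This is purely my own illustration of the general techniques (it assumes the even lines are the first field), not Vegas's actual code:

import numpy as np

def deinterlace_blend(frame):
    # Blend: average each scan line with its neighbour from the other field.
    # Vertical detail survives, but anything that moved between the two
    # fields shows up as ghosting.
    f = frame.astype(np.float32)
    out = f.copy()
    out[:-1] = (f[:-1] + f[1:]) / 2.0
    return out.astype(frame.dtype)

def deinterlace_interpolate(frame):
    # Interpolate: discard the odd field and rebuild its lines from the even
    # lines above and below. No ghosting, at the cost of a little vertical
    # resolution - usually the better trade for progressive output like WMV.
    f = frame.astype(np.float32)
    out = f.copy()
    out[1:-1:2] = (f[0:-2:2] + f[2::2]) / 2.0  # odd lines from even neighbours
    return out.astype(frame.dtype)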
farss wrote on 2/18/2008, 7:14 PM
Yes,
when downscaling interlaced HD to SD you REALLY want to specify a deinterlace method in the project properties and use Best quality. Without that, Vegas will really mangle things. The interlace-artifact-looking things you get are not the traditional interlace artifacts at all; they're MUCH bigger - think dog's teeth instead of mouse teeth.
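To picture why (a rough numpy sketch of the general problem, my own illustration rather than Vegas internals): a resize that ignores fields blends lines from the two time-offset fields into every output line, so on motion the comb smears into much coarser jaggies than the usual one-line artifacts.

import numpy as np

def field_blind_vertical_resize(frame, new_height):
    # Linear resample down the vertical axis that treats the frame as
    # progressive. Every output line mixes two adjacent source lines, i.e.
    # lines from BOTH fields (two different moments in time) - which is
    # where the coarse "dog's teeth" come from on moving footage.
    h = frame.shape[0]
    src = np.linspace(0.0, h - 1.0, new_height)  # source position of each output line
    lo = np.floor(src).astype(int)
    hi = np.minimum(lo + 1, h - 1)
    t = (src - lo).reshape([-1] + [1] * (frame.ndim - 1))  # fractional weight
    f = frame.astype(np.float32)
    return ((1.0 - t) * f[lo] + t * f[hi]).astype(frame.dtype)

# Deinterlacing first (or handling the fields separately) keeps the two
# moments apart, which is why setting a deinterlace method in the project
# properties matters before Vegas rescales.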

Bob.