I don't think this is in any way VV's fault; I suspect it's an issue with the nature of DV, but I'm damned if I can explain it.
Today I transferred quite a bit of 8mm film using my new ADVC-300. It worked beautifully. I felt so inspired that, just for kicks, I went and did some color correction following BB's tutorial on removing color casts. Wow, what a difference: the bride is now wearing white, not yellow, and the roses are yellow, not green. So I had a go at another clip for the same client.
The film was in much worse condition; some of it just cannot be corrected. It seems that sometimes, due to bad processing or storage, one or more of the dye layers simply disappears. It was also very dark and grainy. After boosting the gain and applying whatever CC would help, it mostly looks OK apart from the grain and noise.
Now comes the weird part. I was monitoring externally via the ADVC-300 and started to notice some horrendous artifacts, much like serious venetian blinds, shifted about 5% of the image width to the left on any vertical edge during motion. Damn, I thought, something weird in the D/A; it wasn't on the PC monitor, so I just ignored it. I did try prerendering it just in case, but it was still there on the external monitor and not on the internal one.
Anyway, the job is late, so I pressed on, making a note to check out the ADVC-300 more thoroughly. I encoded to MPEG for DVDA and then lost a few hours to DVDA dramas. I finally burnt the DVD and had a look at how my CC turned out. The first clip of the wedding looks truly great; then I went to the really dodgy clip and, horror of horrors, there are the venetian blinds.
So, back to VV. If I disable the CC, the problem is still there but not as bad; if I enable Reduce Interlace Flicker in the clip properties, the problem goes away; if I apply a slight gaussian blur, the problem goes away. So I can find lots of ways to fix it, but none of them makes any sense. These artifacts are way too big to have anything to do with interlace. The only common factor seems to be the amount of noise, i.e. the total number of pixels changing from frame to frame: the CC has brought the noise from the CCDs and the film grain up a lot, and the blur would reduce it.
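In case it helps to pin down what I mean by "amount of noise", here's a rough Python sketch of how you could measure frame-to-frame pixel change before and after a slight blur. It assumes OpenCV is installed and uses a made-up filename, not my actual project file, so treat it as an illustration rather than anything definitive:

```python
# Rough measure of "noise": average per-pixel difference between consecutive frames,
# with and without a slight gaussian blur (roughly what I applied in VV).
import cv2
import numpy as np

def mean_frame_difference(path, blur=False, max_frames=100):
    cap = cv2.VideoCapture(path)
    prev = None
    diffs = []
    while len(diffs) < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if blur:
            # slight blur knocks down grain and CCD noise
            gray = cv2.GaussianBlur(gray, (3, 3), 0)
        if prev is not None:
            diffs.append(np.mean(cv2.absdiff(gray, prev)))
        prev = gray
    cap.release()
    return float(np.mean(diffs)) if diffs else 0.0

# "dodgy_clip.avi" is a hypothetical filename for the grainy reel
print("no blur:", mean_frame_difference("dodgy_clip.avi"))
print("blurred:", mean_frame_difference("dodgy_clip.avi", blur=True))
```

If the blurred number comes out much lower, that would fit my hunch that it's the sheer amount of frame-to-frame change that's upsetting something downstream.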
As I don't know just what Reduce Interlace Flicker actually does, I can't comment on that. It almost looks as if there is just too much happening for the DV codec to cope with, BUT DV encoding isn't temporal.
Obviously I've fixed the problem, but I'd sure like to know why what I've done fixed it. I might add that the actual source is at 18 fps with some very dodgy pulldown to PAL at 25 fps, but that has never caused a problem before; all the other reels on the same timeline are fine.
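For what it's worth, here's a quick sketch of roughly what an 18 fps to 25 fps pulldown does over one second of output. This is just the simple nearest-frame arithmetic, not necessarily the exact pattern VV or the ADVC uses, but it shows that about 7 of every 25 PAL frames end up as repeats:

```python
# Which film frame lands on each PAL frame for a simple 18 -> 25 fps pulldown.
FILM_FPS = 18
PAL_FPS = 25

for pal_frame in range(PAL_FPS):  # one second of output
    film_frame = pal_frame * FILM_FPS // PAL_FPS
    print(f"PAL frame {pal_frame:2d} <- film frame {film_frame:2d}")
```

The repeated frames mean any frame-to-frame noise shows up in bursts rather than smoothly, which may or may not have anything to do with the venetian blinds.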