Degradation of AVI files

richjo100 wrote on 4/22/2005, 1:26 AM
Hi all
In a previous post I asked about rendering to HDV and the file size when rendering to the intermediate codec. If I render the HD frames with the MJPEG codec instead, the result is a lot smaller. However, I would like to know about the degradation of the image. Does anyone know of any way of telling how degraded a file is? I know you can just look at it, but I was wondering if there is a utility available that will give a more scientific answer.
Any help or advice on this matter would be greatly appreciated.
Thanks
Richard

Comments

B_JM wrote on 4/22/2005, 7:27 AM
you can subtract one frame from another and see what's left ..

there should be nothing different ...

Blackmagic covers the method of doing this on their website (or at least they used to)

there are other methods also - but it boils down to what has changed ..

even a split-screen A/B test is good - as the eye (on a good large monitor) can pick up very small changes when the two images are close to one another
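
For what it's worth, a rough Python/OpenCV sketch of that subtraction idea might look like this (the function name and paths are made up, and it assumes both clips decode to the same dimensions and frame count):

```python
import cv2  # OpenCV

def frame_differences(path_a, path_b):
    """Yield the mean absolute pixel difference for each pair of frames."""
    cap_a, cap_b = cv2.VideoCapture(path_a), cv2.VideoCapture(path_b)
    while True:
        ok_a, frame_a = cap_a.read()
        ok_b, frame_b = cap_b.read()
        if not (ok_a and ok_b):
            break
        # Per-pixel absolute difference; a mean of 0.0 means the frames match.
        yield cv2.absdiff(frame_a, frame_b).mean()
    cap_a.release()
    cap_b.release()
```

A mean near zero on every frame means the encode is essentially lossless; anything consistently above zero is degradation.
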
thankins wrote on 4/22/2005, 10:07 AM
In Vegas this can be done by ...

1. Load the two files that you want to compare onto the timeline. They should be on separate tracks, but aligned in time. If they aren't aligned in time, you won't get a valid result.
2. On the upper track, change the Compositing Mode (located on the track header) to "Difference Squared".
3. Play back the project. The Preview Window will now display the difference between the two files.
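
Conceptually, a difference-squared composite computes something like the following numpy sketch (the general idea only; this is not necessarily Vegas's exact math):

```python
import numpy as np

def difference_squared(top, bottom):
    """Squared per-channel difference of two frames, scaled back to 0-255."""
    a = top.astype(np.float32) / 255.0
    b = bottom.astype(np.float32) / 255.0
    return ((a - b) ** 2 * 255.0).astype(np.uint8)
```

Note that squaring a difference that is already small on the 0-1 scale makes it even smaller, so subtle degradation can come out looking almost pure black in the preview.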

Thanks. Tim.

richjo100 wrote on 4/25/2005, 2:10 AM
Excellent! Thanks for your help, guys. I will look at the Blackmagic website and the Vegas method later.

......later.....

I can't find anything on the Blackmagic website, but I only had a quick look. I'll look again later.

I have tried the Vegas method. I'm getting confusing results.
The setup:
Image1 rendered using MJPEG at datarate 1024 (looks shocking in Media Player but OK in QT???)
Image2 rendered using MJPEG at datarate 4096 (looks OK in Media Player and QT)

When I do the Vegas difference, it shows no differences. Am I being misled by Media Player when judging the quality? The datarate is definitely different, yet the two files look the same in QT but very different in Media Player.

Can anyone shed any light on this?

Thanks

Richard

Chienworks wrote on 4/25/2005, 4:37 AM
The differences are probably small. Color values can range from 0 to 255, but the monitor doesn't necessarily display them as a smooth, even range, nor do your eyes always perceive them that way. Let's say you have an 8x8 block of pixels that varies smoothly from one color at one corner to another color at the opposite corner, and this range is from 120 to 140. At a low bit rate, the MJPEG algorithm may decide that making the entire block 130 is close enough, and that this is all it can do with the limited number of bits you're allowing it. When you see this in Media Player you'll see a solid block where there used to be a nice gradient. It also stands out more because it will have sharp edges against neighboring blocks. In the middle range of color values this is fairly easy to see.
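
To put numbers on that example, here is a small numpy sketch (the 8x8 block and the 120-140 range are just the illustrative values from the paragraph above):

```python
import numpy as np

# An 8x8 block that fades smoothly from 120 to 140 along the diagonal.
y, x = np.mgrid[0:8, 0:8]
block = 120 + (x + y) * (140 - 120) / 14.0

# A bit-starved encoder may flatten the whole block to a single value.
flattened = np.full_like(block, block.mean())  # everything becomes 130

# The largest per-pixel error is only 10 on a 0-255 scale.
print(np.abs(block - flattened).max())  # -> 10.0
```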

Now, when you create a difference image, this same block will fade from 10 at one corner to 0 in the middle and back to 10 at the opposite corner. The difference between 0 (black) and 10 (very nearly the same as black) is very hard to see. It may look entirely black, either on your monitor or to your eyes. What you could do to amplify the difference is add the Color Curves effect to the output and drag the center of the curve up towards the top left corner. This could increase that 0-to-10 range up to 0-to-150 or so. Now you'll be able to see the areas that are most different as lighter greys. True, this no longer accurately represents the difference. However, if you carefully apply the same curve each time, you'll be able to compare different codecs and bitrates. Those which show the smallest differences (darker overall in the difference image) have done the most faithful encoding job.
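
In code terms, the curves trick amounts to applying a gain to the difference image before viewing it. Here is a linear approximation as a numpy sketch (a real Color Curves adjustment is not a straight multiply, and the gain value is illustrative):

```python
import numpy as np

def amplify_difference(a, b, gain=15):
    """Absolute difference of two frames, scaled so small errors become visible.

    gain=15 maps a 0-10 difference up to roughly 0-150, much like dragging
    the Color Curves midpoint toward the top-left in Vegas.
    """
    diff = np.abs(a.astype(np.int16) - b.astype(np.int16))
    return np.clip(diff * gain, 0, 255).astype(np.uint8)
```

As long as the same gain is used for every comparison, the darker result still identifies the more faithful encode.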