Rendering Deterioration

aspenv wrote on 12/16/2004, 5:34 PM
I was reading the Mac vs PC thread, and Laurence was pointing out that rendering deteriorates the video. I assume it is not the source (media file) that gets deteriorated, but the output.
So, does this mean that the more I render, the worse the quality I'm getting? If so, how can this happen? Isn't it rendering from a source media file that is intact?
I'm confused about this. Thanks for your help.

Comments

Jay Gladwell wrote on 12/16/2004, 6:36 PM
Not with Vegas, no. Kelly did some tests a year or two ago, re-rendering various clips in Vegas 99 times, and there was no appreciable difference from the 1st to the 99th render.

Jay
aspenv wrote on 12/16/2004, 8:21 PM
Thanks Jay,
What about 200 times? 1,000 times? Does that count for prerenders as well?
You said no appreciable difference... so that means there IS a difference. How can that be? Isn't Vegas a non-destructive NLE? But I'm degrading my media files...!
Sorry for not understanding this concept
Chienworks wrote on 12/16/2004, 8:31 PM
Buildup of quality degradation from multiple renders occurs when you render the source (A) to an output file (B), and then remove (A) from the timeline and use (B) as a new source and render it to (C), and then remove (B) from the timeline and use (C) as a new source and render that to (D), and so on. Each new rendering is a further generation away from the original.

If I read your original post correctly, you are asking about rendering (A) to (B), then rendering (A) again (still the original source file) to (C), and then rendering (A) again to (D), each time still using the same original source. This would be the normal situation when you do many test renders and pre-renders to see how the project is coming along. In this case, you can render an infinite number of times (well, not really, the heat-death of the universe will put an end to your rendering fun eventually) and the last render will still be exactly the same quality as the first. You haven't ever changed the original (A) source file, so you'll still get the same quality from it every time.
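This point can be sketched with a toy model. The quantizing "render" function below is a hypothetical stand-in for a lossy codec, not a real DV implementation; it just shows that a deterministic render of the same unchanged source produces identical output every time:

```python
# Toy illustration: rendering the same untouched source (A) to (B),
# (C), and (D) always yields identical results, because the source
# is never modified. The quantizer is a stand-in for a lossy codec.

def render(source, step=8):
    # Lossy encode: snap each sample to the nearest multiple of `step`.
    return [round(s / step) * step for s in source]

# Stand-in for the original, never-modified media file (A).
source_a = [(i * 3.7) % 256 for i in range(1000)]

out_b = render(source_a)
out_c = render(source_a)
out_d = render(source_a)

print(out_b == out_c == out_d)  # True: same source in, same output out
```

Each render is lossy relative to the source, but the loss never accumulates because every render starts from the intact original.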

This differs from analog sources, in which each pass across the playback heads slowly erodes away the oxide layer and slightly demagnetizes the tape. In that case, you really would start noticing major quality loss from the original tape after a hundred plays. On the other hand, even in the NLE world, each time you play an original file you are wearing down the hard drive a tiny bit, and eventually you'll lose data. However, the hard drive can probably sustain millions of replays before serious damage occurs.

----------------------------

In the multi-generation re-render test I did with SONY & Microsoft's DV codecs, by far the most degradation occurred with the initial DV encoding. There was much more quality lost going from uncompressed to the first DV generation than in all 99 DV->DV generations combined. Microsoft's codec didn't hold up anywhere near as well, but even so, each generation of DV->DV suffered a lot less loss than the first uncompressed->DV generation.
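A toy simulation of the chained case (re-encoding the output each generation, A -> B -> C -> ...) shows the same shape as this test: the first lossy encode takes the big hit. The quantizing "codec" here is a hypothetical stand-in, not real DV; in this simplified model later generations happen to add nothing at all, whereas real codecs lose a little more each pass:

```python
# Toy model of generational loss: each generation re-encodes the
# PREVIOUS OUTPUT, not the original. The "codec" is a simple
# quantizer (a stand-in for DV's lossy compression).

def encode_decode(samples, step=8):
    # One lossy generation: snap samples to multiples of `step`.
    return [round(s / step) * step for s in samples]

def max_error(a, b):
    return max(abs(x - y) for x, y in zip(a, b))

# Stand-in for uncompressed source pixels.
original = [(i * 3.7) % 256 for i in range(1000)]

gen = encode_decode(original)                 # generation 1: uncompressed -> "DV"
first_gen_loss = max_error(original, gen)

for _ in range(98):                           # generations 2..99: "DV" -> "DV"
    gen = encode_decode(gen)

print(first_gen_loss > 0)                         # True: the initial encode is lossy
print(max_error(encode_decode(original), gen))    # 0: later generations add nothing here
```

Because quantizing an already-quantized value changes nothing, this toy codec is idempotent after generation 1; real DV->DV generations aren't perfectly idempotent, but as the test above found, their combined loss is still small next to the initial uncompressed->DV encode.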
PeterWright wrote on 12/16/2004, 8:36 PM
If you have untouched DV footage, do nothing to it and render it out as DV it will simply be a copy of the original. No degradation.

If you add effects, such as colour correction, superimposed titles etc, then Vegas will re-render using its own DV codec, and although there may be minimal degradation, you won't see it for many generations. You should never have to go down more than maybe three or four generations, so tests involving 99 are just to see how good the codec is. The Vegas one stacks up well against all comers.

farss wrote on 12/16/2004, 8:40 PM
This doesn't apply to all systems, by the way. It seems Avid applies chroma smoothing to all rendered output, which does make the footage look better than it does coming out of Vegas. However, keep repeating the process and things start to fall apart.

Bob.
aspenv wrote on 12/16/2004, 9:20 PM
Thanks so much for the replies.
Thanks, Chienworks, for your detailed explanation. Since I've never had the need to render A to B and then use B, I could not understand why a rendered output from a digital source could lose quality.
Thanks again!