Which codec, and what is 16-235 and 0-255?

sjursjursjur wrote on 10/3/2004, 7:14 AM
Hi.

I've been using Vegas and After Effects for a little while (educational). But I just recently realized that rendering clips (captured with Vegas) in After Effects and importing them back into Vegas does funny stuff to the colour, saturation, contrast, etc. I've tried uncompressed, Microsoft DV, and MainConcept 2.0.4 AVI renders, but they're all wrong. Turning off "Ignore third party codecs" in Vegas does seem to help reduce the differences, but results in a more contrasty picture with flatter highlights (or a lower luminance range). I've looked for something like Photoshop's color-space settings in After Effects, but I can't find anything.

What I want to ask, really, is: is there any way to get clips rendered in After Effects to look similar to their source when imported back into Vegas (in terms of colour/saturation/contrast/luminance, that is)? Is there any codec that is better than others (and are there any downloadable ones)? And finally: could someone please explain the whole 16-235 and 0-255 bit?

Go ahead, educate me! Regards SjurSjur, Norway

Comments

Spot|DSE wrote on 10/3/2004, 7:58 AM
16-235 is the color span of the NTSC signal in the RGB world, and is predominantly a concern for those taking video to broadcast. The spec sets a level for the maximum extremities of the color range. 16 is the darkest black (really a dark, dark grey) that may be broadcast, and 235 is the brightest white (really a bright, bright grey) that can be broadcast.
This relates to the IRE standards (Institute of Radio Engineers) of broadcasting blacks at 7.5 IRE and whites at 100 IRE. In the digital realm, 7.5 IRE is equal to 16 RGB and 100 IRE equates to 235 RGB.
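If it helps to see the arithmetic behind those numbers, here is a minimal sketch (plain Python of my own, not anything Vegas or After Effects actually runs) of how an 8-bit value is remapped between the full 0-255 "computer RGB" range and the 16-235 "studio RGB" range:

```python
# Sketch of the 0-255 <-> 16-235 remapping. Illustration only; the exact math
# inside Vegas' "Computer RGB to Studio RGB" preset may differ in rounding.

BLACK = 16            # studio-range black
WHITE = 235           # studio-range white
SPAN = WHITE - BLACK  # 219 usable steps instead of 255

def computer_to_studio(v):
    """Squeeze a full-range 0-255 value into the 16-235 broadcast range."""
    return BLACK + v * SPAN / 255.0

def studio_to_computer(v):
    """Stretch a 16-235 value back out to the full 0-255 range."""
    return (v - BLACK) * 255.0 / SPAN

if __name__ == "__main__":
    for v in (0, 128, 255):
        print(v, "->", round(computer_to_studio(v)))  # 0 -> 16, 128 -> 126, 255 -> 235
    # Round-tripping is lossless apart from rounding:
    print(round(studio_to_computer(computer_to_studio(200))))  # 200
```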
When video information exceeds these limits in the broadcast signal, several things can occur, not the least of which can be distorted audio or jittering displays of colors.
Vegas has an NTSC color filter to assure nothing goes beyond these limits. In the Scopes views, you can see these values pretty easily.
Steve Hullfish has an excellent book on color correction that delves into the depths of IRE and why those standards were put into place.

Regarding transfers from AE to Vegas and vice versa, most of us use either sequential TGA or uncompressed media to go back and forth. The only way you should be seeing the color shifts you indicate is if you're using intermediary codecs that don't have the best compression, which is never a good thing.
sjursjursjur wrote on 10/3/2004, 10:14 AM
Thanks for the answer, though I forgot to mention that I am working in a PAL environment, so I guess the 16-235 does not apply then... Although I think PAL has an upper limit of 100 IRE, I believe it goes down to zero. Anyway, I also forgot to mention that I exported as TGA, with the same result: much higher contrast.

Anyway, if I compare the original AVI file and the AE TGA render with "Ignore third party codecs" unchecked, the colours are similar but the quality is bogus, and the highlights are clamped.
With "Ignore..." checked, I can apply the "Computer RGB to Studio RGB" color correction preset to the TGA render and the "Broadcast Colors" legacy plugin to the original AVI, which makes the clips nearly identical, although the highlights are clamped. Clamped highlights are OK in a way, since I guess they get clamped when the finished project reaches a TV anyway, and I can live with that. On the other hand, it reduces my possibilities while color-correcting/adjusting curves etc., as well as increasing the use of FX/plugins, which in turn results in considerable rendering time and just plain hassle.

And thus to the core of my ponderings: is there a way of not having AE clamp the highlights? That way all I have to do is apply "Computer RGB to Studio RGB" to the AE clips and everything will be so much easier... Or, altogether, just not having AE either clamp the highlights OR increase the contrast level?

I realise this post should probably have been posted in an AE forum, but hey, it's all because I want to use Vegas!

Sjur
johnmeyer wrote on 10/3/2004, 11:05 AM
If the color doesn't look right (washed out, or too dark), bring a sample of the rendered video back into the Vegas timeline, below the original, and do an A/B compare. If it looks different, change the RGB 16-235 setting (or whatever it is called in your codec) and try again.
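If you prefer numbers to eyeballing, here is a rough sketch of that A/B compare (assuming Pillow and NumPy are installed; "original.png" and "roundtrip.png" are just placeholders for matching still frames you have exported yourself):

```python
# Rough numeric A/B compare of two still frames grabbed from the same point in
# the original clip and the round-tripped clip.
import numpy as np
from PIL import Image

def stats(path):
    gray = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
    return gray.min(), gray.max(), gray.mean()

print("original  min/max/mean:", stats("original.png"))
print("roundtrip min/max/mean:", stats("roundtrip.png"))

# A minimum that rose from ~0 to ~16 and a maximum that dropped from ~255 to
# ~235 point to a studio-range squeeze; a maximum pinned at exactly 235 while
# the original went higher points to clipped highlights.
```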
taliesin wrote on 10/3/2004, 5:08 PM
For PAL, and speaking about RGB luminance levels, ITU-601 says: levels 0 and 255 are reserved for technical information. Levels 1 up to 15 are an underexposure buffer. Level 16 is black. Level 235 is white. Levels 236 up to 254 are an overexposure buffer.
In addition, broadcast stations usually don't accept levels below RGB 16 because this might cause technical problems, some kinds of interference with the sound, etc., when broadcasting.
Levels over 235 aren't that harmful, but broadcast stations usually limit transmission to RGB 235 automatically. (I use RGB values here though that's not really what's used out there.)
If you work for broadcast purposes, I would suggest keeping your full range until the end of your production and, as a last step, adding the Broadcast Colors filter to fit your levels, using the knee function (smoothness) there so as not to lose too much information in the blacks and whites.
If it's for home purposes or for beaming (projection): test your equipment. If it allows levels below RGB 16 and/or over RGB 235, keep the super-black and super-white. It's a very valuable contrast range!
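To illustrate what that knee does (this is only a toy curve of my own, not the one the Vegas Broadcast Colors plugin actually uses): instead of hard-clipping everything outside 16-235, values near the limits are rolled off smoothly so a little highlight and shadow detail survives.

```python
# Toy soft-knee legalizer: values inside [16 + knee, 235 - knee] pass through,
# values outside are rolled off smoothly toward 16/235 instead of being
# hard-clipped.
import numpy as np

BLACK, WHITE = 16.0, 235.0

def soft_legalize(values, knee=10.0):
    v = np.asarray(values, dtype=np.float64)
    lo, hi = BLACK + knee, WHITE - knee
    out = v.copy()
    high = v > hi                                   # roll off highlights
    out[high] = hi + knee * (1.0 - np.exp(-(v[high] - hi) / knee))
    low = v < lo                                    # roll off shadows
    out[low] = lo - knee * (1.0 - np.exp(-(lo - v[low]) / knee))
    return out

print(soft_legalize([0, 10, 128, 240, 255]))
# Hard clipping would give 16, 16, 128, 235, 235; with the knee, 240 and 255
# stay distinct instead of both flattening to 235.
```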

Talking about After Effects: take care which codec it uses when decompressing the files (if they are DV AVIs) while you work on them!
For example, using MS DV for decompression will cause your signal to be clipped in the whites and blacks.
In Vegas, using the Sony Pictures DV codec ("Ignore third party codecs" checked and "Use Microsoft DV codec" unchecked) to decompress DV files will handle the luminance range just the way it is. No changes are made.
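A tiny made-up example of why clipping at decode time is worse than legalizing at the very end: once a decoder has flattened the super-whites to 235, no amount of later grading brings the detail back, whereas a pass-through decode still lets you pull them down.

```python
# Made-up numbers: three neighbouring pixels in a super-white highlight
# (headroom above 235 that DV material can carry).
import numpy as np

highlight = np.array([230.0, 240.0, 250.0])

clipped = np.clip(highlight, 16, 235)   # decoder that clips on decode -> [230, 235, 235]
passthrough = highlight.copy()          # decoder that leaves the levels alone

gain = 235.0 / 250.0                    # later, whites are pulled down to legal in grading
print(np.round(clipped * gain))         # [216. 221. 221.]  highlight detail is gone
print(np.round(passthrough * gain))     # [216. 226. 235.]  highlight detail survives
```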

Marco
sjursjursjur wrote on 10/3/2004, 5:49 PM
Ok. Again, thanks for replies. It's educational! Whoo!

Apparently there are lots of traps to get caught in in this scenario. Changing the RGB to or from 16-235 isn't an option, as none of my current codecs seem to offer it. As for the A/B compare... been there, done that. A lot.

Anyway, it seems I've come up with a solution I can live with, and I think it might be worth mentioning in case anyone else ever searches these forums for a solution to this particular problem. (Keep in mind that all this is PAL DV stuff; I've done no testing with NTSC.) Also, I should mention that I'm using After Effects 6 (not 6.5) and Vegas 5.

Now, the problem is in essence After Effects' way of handling captured AVIs (I use the Sony PD-150 PAL and capture with Vegas). It (After Effects) clips the highlights to 100 IRE as you import the file into a project. (I've found no way to alter the decompression clipping in either After Effects or in my current codecs' settings.) Thus you should not use the original AVI (that is, the captured file). Go to Vegas and render the desired file as a QuickTime .mov using TGA at highest quality as the video format. Now import this into After Effects. Right-click the imported file and select "Interpret Footage". Under Fields and Pulldown, set Separate Fields to "Off", then click "OK".

Now you can do whatever was the reason you imported the file into After Effects in the first place. Then render as a QuickTime .mov with TGA at highest quality. That's it... NOT!

When you import back into Vegas, the program interprets the pixel aspect ratio as 4:3 (1.0926 for PAL DV) regardless of what aspect ratio you set the render to in AE. This is, of course, easily spotted as you drag the footage onto the Vegas timeline, and equally easily dealt with. What's not as obvious, though, is that Vegas (again, regardless of the render settings) interprets the footage's field order as "upper field first". Set this to "lower field first" and, ta-da: virtually no quality loss or change in color. Last thing: one might want to render this to a new track as DV AVI, to make for smoother handling in the Vegas environment.

All right, I'll go pat my back now, but if anyone has a simpler (read: fewer renders) solution I'm all ears.


SjurSjurSjur

EDIT: Of course, the simpler solution is getting an AVI DV decompression codec for After Effects that has the option of not clipping the whites and blacks. If anyone knows of a good one, hoot. By the way, does anyone know how to tell which decompression codec AE is using?
taliesin wrote on 10/4/2004, 3:42 AM
>> Of course, the simpler solution is getting an AVI DV decompression codec for After Effects that has the option of not clipping the whites and blacks. If anyone knows of a good one, hoot.

Matrox DV. But take care to adjust the codec preferences first. It ships with the preferences set to 16-235, but it can also be adjusted to fit 16-255 or 0-255.

Marco
sjursjursjur wrote on 10/4/2004, 5:53 AM
Where can I get the Matrox codec? Also, I figured out how to see which codec is being used by AE: Alt-click on the footage. Now, does anybody know how to make AE use a decoding codec other than the MS one?

Sjur
rs170a wrote on 10/4/2004, 6:36 AM
Where can I get the Matrox codec?

http://www.matrox.com/video/support/ds/software/codec/home.cfm
There's a form to fill out but it's free.

Mike
Bill Ravens wrote on 10/4/2004, 6:40 AM
The MainConcept 2.4.4 codec has the same feature to select 16-235 RGB or 7.5 IRE. To access the radio buttons, hit "Configure" after you select this codec to encode (or play back) with.
sjursjursjur wrote on 10/4/2004, 12:50 PM
The Matrox codec made my day. The thing was, I never found a way to make AE use a different decoding codec than the MS one (while trying out different codecs, among them the MainConcept 2.4.4 demo), but the Matrox one did it all by itself. How convenient! And free of charge! Thanks a lot for all the advice.

Sjur
taliesin wrote on 10/4/2004, 2:51 PM
Cool, glad it helped you.

Marco