DVDs and Color Correction

matt24671 wrote on 9/12/2004, 6:45 PM
Folks-
I did a wedding video where most of the footage looked too dark (on my NTSC monitor, fed via FireWire from Vegas). But after burning a DVD in DVDA, the same footage played from the DVD through the same monitor looks much lighter - not too dark at all (though a little less color-saturated).

My question: if the MPEG-2 conversion changes the video's color/brightness, how on earth am I supposed to correct it in Vegas?

Thanks,

Matt

Comments

johnmeyer wrote on 9/12/2004, 7:17 PM
What format was your original footage, and how did you capture it? Did you use Vegas, DVDA, or some other program to render the MPEG-2 files? Depending on the answers to these questions, you can run into the RGB 16-235 mapping issue, which will indeed make your DVD look lighter and somewhat washed out. If you captured DV AVI and did the rendering in Vegas or DVDA, then you will not have this problem, and someone else will have to come up with something else to try.

If you ARE using another program to render the MPEG-2, then look for a checkbox relating to RGB 16-235 and change it.
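
For anyone who wants to see what that remapping does to the numbers, here's a rough sketch of the 16-235 levels math, assuming plain linear scaling (an illustration of the idea only, not any particular encoder's code):

# Studio-swing video keeps black at 16 and white at 235 (8-bit).
# If an encoder treats footage that is already studio swing as if it
# were full range (0-255) and scales it again, blacks get lifted and
# whites pulled down, and the picture looks lighter and washed out.

def full_to_studio(v):
    # Map a full-range value (0-255) into studio range (16-235).
    return round(16 + v * (235 - 16) / 255)

def studio_to_full(v):
    # Expand a studio-range value (16-235) back to full range (0-255).
    return round((v - 16) * 255 / (235 - 16))

# Correct round trip: black stays black, white stays white.
print(studio_to_full(full_to_studio(0)), studio_to_full(full_to_studio(255)))  # 0 255

# The mistake: already-legal footage gets squeezed a second time.
# Black (16) ends up around 30 and white (235) around 218.
print(full_to_studio(16), full_to_studio(235))  # 30 218

That lifted-black, pulled-down-white result is exactly the lighter, slightly washed-out look described above.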
NickHope wrote on 9/13/2004, 1:33 AM
Ah, thanks John, so that must be why my MPEG2 from CCE is lighter than the one made with Vegas' MainConcept.

Just noticed that the luminance in CCE Basic defaults to "16 to 235", so I'll change it to "0 to 235", burn a DVD and see if I've got the contrast back.

By the way, I'm capturing with Sclive and using Debugmode Frameserver.
NickHope wrote on 9/13/2004, 3:30 AM
Did it, and I have more contrast with "0 to 235" as long as I frameserve in RGB, not YUY2. But possibly still not as much contrast as the original DV or the MainConcept MPEG2. Very difficult to judge though.
farss wrote on 9/13/2004, 4:28 AM
I've done a bit of testing on this. Render various test patterns out to MPEG-2 using MC PAL, drop back onto the Vegas TL, and all looks well.
Burn to DVD and check the composite output back into Vegas's scopes.
The result: out of the DVD player, levels are now compressed to between 16 and 235.
Not much you can do about it from what I can see. Some have suggested upping the gamma to compensate, but you'd need to be careful you didn't start crushing the ends. I'd say the settings someone was talking about are there to indicate to the encoder which points it should consider as 0% and 100%.
I'd further add that this is verified by the results from my very dodgy monitor. Feed it straight from the Vegas T/L and the out-of-spec levels can cause it to lose sync big time. Burn the same material to DVD and never a twinkle. So for copy to VHS I legalise; for DVD I don't.
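
To picture what that player-side squeeze does, and why compensating for it is risky, here's a simplified sketch using a plain linear expansion rather than a gamma tweak (made-up math for illustration, not measured from any particular player):

def player_squeeze(v):
    # Roughly what the measurements above show: full-range 0-255
    # pushed into 16-235 on the player's composite output.
    return 16 + v * (235 - 16) / 255

def pre_expand(v):
    # A naive pre-compensation before encoding: stretch 16-235 out to
    # 0-255. Anything already below 16 or above 235 gets clipped -
    # the "crushing the ends" risk mentioned above.
    return min(255, max(0, (v - 16) * 255 / (235 - 16)))

for v in (0, 16, 128, 235, 255):
    print(v, round(player_squeeze(pre_expand(v))))
# 0 and 16 both come back as 16, 235 and 255 both as 235: mid-range
# contrast is preserved, but detail outside 16-235 is gone.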

Bob.
johnmeyer wrote on 9/13/2004, 7:45 AM
Dropping the rendered MPEG-2 back into the Vegas timeline, and then using the mute or solo buttons to do an A/B comparison should tell you whether the color is identical or not.

If the MPEG-2 looks identical, then it will be identical when it is authored onto a DVD, UNLESS DVDA (or whatever authoring program you use) is recompressing it. Make sure to check, in the authoring program, prior to preparing the DVD, whether the MPEG needs to be recompressed (it's in the Optimize setting in the File menu in DVDA). If no recompression is happening in your DVD authoring software, then nothing is done to the MPEG-2 when it is converted to a DVD, other than interleaving it with the audio and subtitles, and inserting navigation pointers.
farss wrote on 9/13/2004, 8:45 AM
John,
the issue isn't what's done to the MPEG when it's converted to a DVD. The problem is that the DVD player fiddles with it. I haven't tried the component outputs on a DVD player, but it sure fiddles with the composite output.

Bob.
Spot|DSE wrote on 9/13/2004, 8:54 AM
Semi-OT, but use the Black Restore filter prior to encoding. You'll see a slight increase in the quality of the encode. Check it out!
johnmeyer wrote on 9/13/2004, 10:30 AM
use the Black restore prior to encoding. you'll see a slight increase in the quality of the encode

Spot, what setting to use? The Streaming setting of 0.020? Anything above 0.100 seems to show artifacts.
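
To make that threshold concrete: the idea behind a black-restore style filter is simply to force values below the threshold down to pure black, so the encoder doesn't waste bits on near-black noise; set it too high and real shadow detail gets crushed, which would show up as artifacts. A rough sketch of the idea (not the actual Vegas plug-in), with the threshold on a 0.0-1.0 scale to match the 0.020 / 0.100 numbers above:

def black_restore(value, threshold=0.02):
    # value is luma normalized to 0.0-1.0; anything at or below the
    # threshold is forced to pure black. Illustrative sketch only.
    return 0.0 if value <= threshold else value

# At 0.020 only near-black noise is affected; at 0.100 anything darker
# than 10% grey goes solid black, which is where shadow detail starts
# to disappear.
print([black_restore(v, 0.02) for v in (0.005, 0.05, 0.12)])  # [0.0, 0.05, 0.12]
print([black_restore(v, 0.10) for v in (0.005, 0.05, 0.12)])  # [0.0, 0.0, 0.12]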
NickHope wrote on 9/13/2004, 11:35 AM
Somewhere in my meanderings today I read that some but not all DVD players mess with luminance. I'll be comparing MPEG-2s in Vegas from now on, and not with my dodgy eyes through my dodgy DVD player on my dodgy telly.
SonyEPM wrote on 9/13/2004, 2:55 PM
"some but not all DVD players mess with luminance"

This is correct - DVD players may and often do decode somewhat inaccurately, sometimes a little, sometimes by a large amount (player dependent). We have done many tests with color bars and other test files, and when you render DVD MPEG-2 in Vegas you should see:

a) the rendered DVD MPEG-2 file (using default templates, no filters), when loaded back into Vegas, will be a very close match to the original on the internal Vegas scopes. This is good, as designed, but Vegas's own MPEG decoder is part of that comparison, so this isn't a 100% guarantee that it'll look the same on a DVD player (which uses a different decoder than Vegas does)

and

b) the same test clips, when burned to DVD and played back on better DVD players (I use a Sony DVP NS400D; we've tested this with quite a few others), will show a very close match to our DV output (looped through a Sony PD170 or a DSR1500a for D/A) when the signal is compared on a hardware WFM (I use an older but accurate Videotek TVM 710, S-Video feeding the scope in all cases).

Anyway, as a general rule you should trust Vegas to produce accurately matching DV and DVD MPEG output where hue/sat/lum is concerned. If you find differently, we'd like to know about it.
farss wrote on 9/13/2004, 3:00 PM
Good to see someone else getting the same answers as I did. Does kind of make a joke of the idea of putting test patterns onto DVDs to calibrate monitors.

Bob.
StormMarc wrote on 9/13/2004, 3:45 PM
My guess is that your problem is that DV has a black pedestal of 0 IRE and your analog TV expects a 7.5 IRE signal (set-up).

What I do is:

1. Send my DV (from a FireWire card) through a Canopus ADVC-100 converter and set the black level of the analog output to 7.5 IRE. This feeds my monitors, which are calibrated to a 7.5 IRE signal. So basically I am adding the proper black level to the footage for analog viewing on a US TV set, and my color corrections will be correct.

2. When I make DVDs I use Cinemacraft (set at 16-235), and the Panasonic DVD player appears to add its own black level boost, because the results look good. Whenever I have tried to add set-up to the original DV footage through color correction (then mastered to DVD), they look washed out.

3. Another note: when I record to VHS from DV I make sure I add the 7.5 IRE set-up to the 0 IRE DV footage (there's a rough sketch of the level math after this list). This is because the original DV should stay at the 0 IRE level standard, but analog tapes should have 7.5 set-up.

4. Keep in mind that if you're doing your analog conversion through a DV deck or camera, you have to find out if your unit adds 7.5 IRE set-up to its analog outputs. I have a Sony WV-DR7, which does not. I have heard that the JVC units do add set-up.

Using the system above has kept my videos fairly standard.
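
To put rough numbers on the set-up business, here's a simplified sketch of the level math in IRE, assuming a plain linear re-mapping (illustration only, not what any particular deck, converter or filter actually does):

def add_setup(ire, pedestal=7.5):
    # Re-map an analog luma level so black sits at the pedestal
    # (7.5 IRE for US analog) instead of 0 IRE, keeping white at 100 IRE.
    return pedestal + ire * (100 - pedestal) / 100

# DV black (0 IRE) and white (100 IRE) after adding US set-up:
print(add_setup(0), add_setup(100))        # 7.5 100.0
print(add_setup(50))                       # 53.75 - midtones lift a little too

# Adding set-up twice (say, in color correction AND again in a player
# or converter that already boosts black) lifts black to about 14 IRE,
# which matches the washed-out look described in point 2 above.
print(round(add_setup(add_setup(0)), 1))   # 14.4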

Here's a good article in regards to the black level issues with DV.

http://www.adamwilt.com/DV-FAQ-tech.html#Setup

good luck,

Marc
RexA wrote on 9/14/2004, 2:15 AM
Hmmm. Interesting issues raised here.

Well, I guess if I use my AVIA calibration DVD to adjust my home theater TV, then things should be right, at least for all DVDs, though maybe not for other video sources if the DVD player itself is not "playing by the rules".

So many variables. Who do you trust?

I just tried to play an old VCD on my newest DVD player today and it was horrible -- digital blotches everywhere. I tested the player with a different VCD before I bought it. This "problem" VCD played fine on my old DVD player and the "test" VCD still plays on this one.

I don't think I have ever seen a DVD player that does everything right. In my experience the sites that rate DVD players for compatibility are largely inaccurate too.