Render color issue

amunioz wrote on 8/13/2004, 8:36 AM
Hi there,
I'm working on some home footage, doing some editing, then rendering to files to store on CDs.
The problem appears when I render the project with its color corrections applied.
What I see in the Vegas preview window (color, brightness, saturation, etc.) looks right according to the adjustments I've made.
But when I render to an output file (no matter the format) and view it in any player, it appears very dark and saturated. Completely different from what I see in the Vegas preview window.
Now, if I add that file to the timeline to compare it with the original, they look identical.

So, can any of you tell me what is going on here?
Is there any way to set up Vegas so the preview matches the way the video will be played on the same machine?

Thanks in advance
Andres

Comments

johnmeyer wrote on 8/13/2004, 8:58 AM
Use the Firewire output capability of Vegas and DVDA to preview to a TV set. What you see on the monitor -- especially with colors -- is going to be VERY different from what you see on a television set. Don't ever do color correction on the computer monitor or you will end up with exactly the problems you describe.
stormstereo wrote on 8/13/2004, 9:51 AM
I've experienced the same here. The first thing you should do is look in your graphics card settings. I use an nVidia GeForce4 440 Go Mobile, and there is one thing in the settings called "Overlay Controls". This overlay affects the video when played in Windows Media Player. For some reason the default color saturation is set to 130%. I make sure that is turned down to 100%.
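That 130% default matters because the overlay's saturation scaling is applied on top of whatever the file actually contains. As a rough illustration (not the driver's actual code), scaling chroma around the YUV neutral point looks like this:

```python
def scale_saturation(y, u, v, k):
    """Scale chroma by factor k around the YUV neutral point (128),
    roughly what an overlay's saturation control does to video."""
    def clamp(x):
        return max(0, min(255, round(x)))
    return y, clamp((u - 128) * k + 128), clamp((v - 128) * k + 128)

# At the reported 130% default, a mildly colored pixel is pushed
# noticeably further from neutral than the file says it should be:
print(scale_saturation(128, 90, 170, 1.3))  # chroma moves away from 128
print(scale_saturation(128, 90, 170, 1.0))  # 100%: left unchanged
```

So even a correctly rendered file will look oversaturated in any player that goes through that overlay path.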

Secondly, just the other day I rendered out two small clips, one with the Vegas internal DV codec and one with MainConcept's latest DV codec. To my surprise, the one encoded with the Vegas internal codec was darker and more saturated when viewed in WMP, and MainConcept's was more true to the original. However, keep in mind that there's a difference between ENcoding a clip with Vegas and DEcoding the same clip with WMP. The same codec must be used for encoding and decoding to obtain optimal results.

If you compare your encoded clip to the original on the Vegas timeline and don't see a difference, then you're safe. Don't worry 'bout it, just check the Overlay Controls.

Best/Tommy
amunioz wrote on 8/13/2004, 10:53 AM
Hi, thanks for responding.
I'm OK with that, and we're using almost the same card. I have a GeForce 440 MX 128MB AGP 8x.
I've already set the Overlay control.
But the problem persists when I distribute the file.
Other users get the same problem.

I was thinking about using the same codec as Vegas, but how do I know which one that is?
I'm working with uncompressed AVI source and rendering to WMV at max quality.
In that case Vegas will use one codec (or none) to display the AVI and another to render.
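One way to find out which codec an AVI file actually carries is to read the FourCC from its video stream header. Here is a quick sketch in Python; the chunk-scanning shortcut is an assumption that works for typical AVIs, not a full RIFF parser:

```python
def avi_video_fourcc(path):
    """Return the video codec FourCC from an AVI's stream header.

    Inside the AVI's header list, each stream has a 'strh' chunk whose
    data starts with fccType (b'vids' for video) followed by fccHandler,
    the codec FourCC. We scan the first 64 KB for that pattern instead
    of walking the whole RIFF tree -- good enough for a quick check.
    """
    with open(path, "rb") as f:
        head = f.read(65536)
    if head[:4] != b"RIFF" or head[8:12] != b"AVI ":
        raise ValueError("not an AVI file")
    i = head.find(b"strh")
    while i != -1:
        # chunk layout: 'strh' + 4-byte size + fccType + fccHandler
        if head[i + 8 : i + 12] == b"vids":
            return head[i + 12 : i + 16].decode("ascii", "replace")
        i = head.find(b"strh", i + 4)
    return None
```

DV-format files typically report a handler like 'dvsd', while uncompressed AVIs often leave the handler blank or zeroed.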

By the way, I'm testing the same thing at work, where I have a generic card instead of a GeForce, and the symptoms are the same.

Is there a way to adjust the Vegas video preview so it matches the rest of the Windows video players?

The idea is to be able to play my videos without brightness adjustments, in most cases at least.

Thanks.
Former user wrote on 8/13/2004, 10:56 AM
If you are going to be color correcting on your computer monitor, you need a calibration setup so the monitor matches your output more closely. Or monitor on an external monitor that is set up correctly.

Dave T2
amunioz wrote on 8/13/2004, 11:05 AM
I'm sorry, but I only want to watch the videos on a computer monitor.
Watching them on a TV or an external monitor is not my intention.

I open Vegas, adjust the colors, render an output file, watch it in any player, and the video is very, very dark and a little more saturated.

This is with all the color controls at default (no extra processing at all), and without touching anything between the editing and viewing the final output.

Thanks
stormstereo wrote on 8/15/2004, 10:22 AM
Okay, I misunderstood you the first time. I thought you were rendering out to AVI. When doing so, Vegas uses its internal codec, which has really good quality.

Anyway, when you are rendering to WMV you are using the WMV codec as provided by Microsoft. It should look good on any computer screen. I still believe the graphics card does something to the picture when played in the Windows Media Player 9. I know you've tried this already but can you maybe look at the WMV-file on another computer with a different media player and graphics card? Just to gather some more information. As of now I can't come up with any new ideas.

Best/Tommy
amunioz wrote on 8/17/2004, 4:47 AM
Thanks to all of you.
farss wrote on 8/17/2004, 5:12 AM
Perhaps a simple approach could solve this one.

Calibrate your PCs monitor using the same procedure as you would for a TV monitor. Now make a set of test clips. SMPTE bars, gradient and perhaps blue only bars. Encode these to whatever your target format is.

Play these out in the target player. Take screen shots of each. Bring these stills back into the Vegas T/L and then calibrate the PC monitor using these.

If I've got my thinking around the right way, you've now calibrated out whatever is causing your problem!
This will not be that accurate, but it should be better than taking a lot of stabs in the dark. Let us know how the idea works out. I suspect this is going to become more and more of an issue as more video is targeted at display systems other than traditional video devices.
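The screenshot comparison described above can even be put into numbers. A minimal sketch, assuming you can dump matching frames from the original and the playback screenshot as lists of RGB pixels, that estimates how much darker and how much less (or more) saturated the played-back file is:

```python
def luma(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b  # Rec.601 luma weights

def saturation(rgb):
    mx, mn = max(rgb), min(rgb)
    return 0.0 if mx == 0 else (mx - mn) / mx  # HSV-style saturation

def playback_drift(original, screenshot):
    """Ratio of screenshot to original for average luma and saturation.

    Values below 1.0 mean the played-back file is darker / less
    saturated than the source; above 1.0, brighter / more saturated.
    Assumes the reference frame is not pure black.
    """
    l = sum(luma(p) for p in screenshot) / sum(luma(p) for p in original)
    s = (sum(saturation(p) for p in screenshot)
         / sum(saturation(p) for p in original))
    return l, s
```

A luma ratio around 0.5, for example, would confirm the playback path is crushing brightness roughly by half, and gives you a concrete target for the compensating correction.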

Bob.