Video Quality

mterrien wrote on 7/6/2009, 7:54 AM
I am using the Video Capture program that comes with Vegas. With Vegas Pro 9.0 that is Video Capture 6.0c, and I notice the same result with earlier versions. I am noticing some degradation when the video is pulled from my camera as an AVI. Is there a better way to do this? When I play the video from my Canon GL1 directly to a TV it looks great. Captured from the camera to an AVI, the video looks washed out (no contrast) and is not as clear. I am also guessing that when I create a DVD from this AVI it degrades even more.

Comments

John_Cline wrote on 7/6/2009, 8:08 AM
If you are using Vegas to capture video from your GL1 using IEEE-1394 (FireWire), then there is absolutely no loss whatsoever. The capture is a straight data copy from your camera to the hard drive. The Vegas capture process does not modify your video or audio data in any way, and there is no way that it can. If your video looks washed out, that would suggest that your computer monitor is not properly calibrated.
mterrien wrote on 7/6/2009, 6:29 PM
I am using a FireWire connection to my computer. The Canon GL1 has a FireWire output jack. My NVIDIA motherboard has an on-board Texas Instruments OHCI Compliant IEEE 1394 Host Controller and it's using a Microsoft driver. The AVI played in Microsoft Media Player looks a little lower quality wise. And creating a DVD through Vegas and DVD Architect it looks even worse. So if I am getting a lossless transfer from my camera to Vegas, then there has to be something else wrong, I guess. I render at the highest quality in Vegas. I don't care about the file sizes; I want the best picture. You also mentioned I should calibrate my computer monitor. I have never done that and wouldn't know how. It's a brand-new Samsung 22" LCD.
rs170a wrote on 7/6/2009, 6:53 PM
The AVI played in Microsoft Media Player looks a little lower quality wise.

As expected. Media Player is NOT to be used as a tool to judge overall image quality.

And creating a DVD through Vegas and DVD Architect it looks even worse.

It all depends on the template that was used to do the encoding from Vegas.
If you used the default, then yes, I would expect some quality degradation.
Download and read Vol. 1 Issue #7 of Edward Troxel's free newsletters, as that issue deals with DVD encoding.
While written for an earlier version of Vegas & DVD Architect, the principles remain the same.

Mike
Steve Mann wrote on 7/6/2009, 9:45 PM
"The AVI played in Microsoft Media Player looks a little lower quality wise."

Lower quality than what? Are you comparing apples and coconuts? No consumer LCD monitor or television can be used for comparing or evaluating video color, contrast and brightness. Only a calibrated monitor can do that and studio LCD monitors cost about $5,000 or more.

Some of the $200-$300 monitors can be somewhat calibrated, but even then, no LCD can go completely black (look at a "black" screen in a dark room), "white" depends on the color of the backlight, and contrast depends on the viewing angle.

craftech wrote on 7/7/2009, 5:53 AM
Connect your GL1 to the FireWire port on the computer and the S-Video output from the GL1 to your television, then play back the computer AVI file through that setup and tell us how it looks on the TV.

John
mterrien wrote on 7/7/2009, 1:16 PM
To Steve Mann:
The bottom line is this: If I play my video tapes using the camera connected directly to my TV, the video looks perfect. It's clear, sharp and colorful. When the same video is captured and written to a DVD in Vegas/DVD Architect, then played on the same TV, the picture is blurry and the colors are dull, like they're washed out. I want a DVD that has the picture quality I see when I connect my camera to the TV. I'll do whatever I need to accomplish what I want. But I have to know what that is.

To John:
"Connect your GL1 to the firewire port on the computer and the s-video from the GL1 to your television and playback the computer avi file through that setup and tell us how it looks on the TV."
Maybe I need to plead dumb here, but I don't know how an AVI played on my computer is going to show up on my TV when connected this way.

To Mike:
"Download and read Vol. 1 Issue #7 of Edward Troxel's free newsletters."
I downloaded it and I'll give it a read.

Thanks to all for the suggestions. I'll keep working. Hopefully I'll get the answer I'm looking for.

Mike

musicvid10 wrote on 7/7/2009, 1:23 PM
"I want a DVD that has the picture quality I see when I connect my camera to the TV. I'll do whatever I need to accomplish what I want."

Then render and burn to Blu-ray, and you'll get what you want.

You sound like you want highly compressed, lossy ~6 Mbps MPEG-2 on a DVD to look the same as your 25 Mbps lightly compressed AVI from your camera. Tain't gonna happen!
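
For a rough sense of the gap, using ballpark figures rather than exact specs: DV video runs at about 25 Mb/s, so a ~6 Mb/s DVD encode carries roughly a quarter of that data rate, on top of the compression the DV format has already applied. Some loss of detail is unavoidable at that ratio.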
John_Cline wrote on 7/7/2009, 1:34 PM
"I want a DVD that has the picture quality I see when I connect my camera to the TV."

There is absolutely no reason that Vegas and DVD Architect can't give you what you want. It's not rocket science; thousands of people do it every day. There's something you're not doing correctly. Perhaps it's the method you're using to capture the video, or maybe it's a setting of some sort that you're missing in the project settings or the rendering.
Former user wrote on 7/7/2009, 2:12 PM
What are your MPEG rendering settings? (bitrate, etc)

Dave T2
johnmeyer wrote on 7/7/2009, 3:11 PM
"I am noticing some degradation when the video is pulled from my camera as an AVI. Is there a better way to do this?"

If you connect your DV camera to an NTSC television set and play the video, then compare that to the same video captured into Vegas and played back out through the camera (by clicking the monitor button in the Vegas preview display and selecting "Best" as the preview quality), you should get identical video. Not close, but identical. Another way to do this is to use the Vegas "Print to Tape" feature and send the captured video back to another videotape. If you play that tape in the camera, directly to an NTSC television set, it will look exactly the same as the original.

Finally, if you put that captured video on the Vegas timeline, and use the DVD Architect template (do NOT use the "default" template) to render an MPEG-2 file, and set the average bitrate to between 7,000,000 and 8,000,000 bits per second, and then burn a DVD from this video, you will have to look VERY closely to see any differences from the original. If you know where to look, you will be able to see some subtle changes, but they are very, very small.
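
As a rough back-of-the-envelope check (assuming a nominal 4.7 GB single-layer disc and AC-3 audio around 192 kbps), 7,500,000 bits per second of video plus audio works out to roughly 7.7 Mb/s, or about 58 MB per minute, so somewhere around 75-80 minutes of program fits at that average before you have to start lowering the bitrate.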

The main problem is that DV video looks totally different on a computer monitor, and there is nothing that can be done to change that. You MUST do all of your comparisons on the TV set.

mterrien wrote on 7/10/2009, 7:38 AM
Well, it turns out my render settings were the problem. The Vegas manual doesn't give much information on how to use some of those settings, so I have always left them at their defaults. The problem, as a lot of you mentioned, was the bitrates. Through some suggested reading and the bitrate numbers from you guys, I bumped up the average bitrate to 7,500,000 and the minimum bitrate to 2,000,000. I am finally able to achieve what I want. Thanks to everyone for the help.

Mike
John_Cline wrote on 7/10/2009, 2:17 PM
If your project is under about 74 minutes and you're using a single-layer DVD, just set the MPEG-2 render to CBR at 8,000,000.

You didn't say that you did, but setting the average bitrate very close to the maximum bitrate really doesn't gain much extra quality, particularly if the average bitrate is set as high as 7,500,000. It may be better just to go with a CBR of 7,500,000.

For programs longer than 74 minutes, I use this bitrate calculator:

http://www.johncline.com/bitcalc110.zip
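
For anyone who just wants the arithmetic behind a calculator like that, here is a minimal sketch in Python. It is not the linked tool, and the disc capacity, audio bitrate, and overhead figures are assumptions you may want to adjust: given a program length, it solves for an average video bitrate that should fit on a single-layer disc, capped at a safe maximum.

# Rough DVD average-bitrate calculator (a sketch, not the linked tool).
# All constants are assumptions; adjust to match your discs and audio format.
DISC_BYTES = 4_700_000_000   # nominal single-layer DVD capacity
AUDIO_BPS = 192_000          # assumed AC-3 stereo audio bitrate
MUX_OVERHEAD = 0.04          # assumed ~4% muxing/filesystem overhead
MAX_VIDEO_BPS = 8_000_000    # stay at or under the usual DVD video ceiling

def video_bitrate(minutes):
    """Suggested average video bitrate (bits/sec) for a program of this length."""
    usable_bits = DISC_BYTES * 8 * (1 - MUX_OVERHEAD)
    bps = usable_bits / (minutes * 60) - AUDIO_BPS
    return min(int(bps), MAX_VIDEO_BPS)

for length in (60, 74, 90, 120):
    print(f"{length} min -> about {video_bitrate(length):,} bits/sec average video")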