I shot my first HD footage, 1920x1080i, edited it and re-rendered it to SD 720x480i, then I rendered that to mpeg2. The footage looks choppy in mpeg2 when burned to DVD. What is the best setting for this, or should I just render the HD straight to mpeg2 and bypass the avi file? Thanks :)
When you downconvert from HD to SD mpeg2, don't forget to crop the HD or check "Stretch video to fill output frame" in Render As, otherwise you will end up with black bars in your SDWS because of the difference in aspect ratios between HD and SDWS.
The clip was about 15 minutes in length. Some shots look great, others seem to have fine pixelation going on, lines on the edges of things, etc. I used the preset.
I didn't know that the trial would let you encode to mpeg format.
Anyway, are you watching the video on disc or on your computer? If on a computer, are you watching it in Windows Media Player? I have had this issue before. It was not the file, but the player (namely Windows Media Player).
It shouldn't help in rendering quality as it would be an intermediary. You will have to render back to mpg if you want to put it on a playable disc. I am still curious as to what you are seeing.
>When you downconvert from HD to SD mpeg2, don't forget to crop the HD or check "Stretch video to fill output frame" in Render As, otherwise you will end up with black bars in your SDWS because of the difference in aspect ratios between HD and SDWS.
No don't do that. It will work but it's not the best way. Instead render to 704 x 480 mpeg2. That will match your aspect ratios perfectly between the HD and the SD and DVD Architect has no problem with those dimensions. Neither do DVD players.
The test page has 3 photos on it. The video was of a fire in town.
The first photo/video looks good; in the second and third shots, you can see jagged edges on the car's hood and roof.
Do you have a deinterlace method selected? I know you aren't deinterlacing, but if you don't select a deinterlace method, Vegas won't resize the interlaced footage correctly.
You will get better results if you interpolate from HD to SD using a deinterlaced frame rather than two interlaced fields (with half the vertical resolution of a frame) in isolation. For standard DVDs, the interpolated frame is then converted back to two interlaced fields again. All this manipulation is hidden from the Vegas user, but you need to know what is going on if you are to select the right options.
That's one way to do it, but I prefer to resize from interlaced to interlaced. What I would do is resize straight from the original 1920x1080 60i to 704x480 60i. To do this, just select the widescreen NTSC DVD Architect template and change the 720 horizontal value to 704. Select "Best" for render quality because that will affect the algorithm used for resizing. In the project properties, make sure that the deinterlace method is set to either "blend fields" or "interpolate" (it doesn't matter which one because it won't actually deinterlace).
It is important that the deinterlace method is not set to "none" because if it is, Vegas will resize the footage all at once and you'll get weird wavy horizontal and diagonal lines any time there is motion. If you have a deinterlace method selected, Vegas will separate the even and odd line fields, resize each field separately, then fold them back into a new resized but properly interlaced image. It does this in the background with no further input from you, but you have to select a deinterlace method in order for this to happen.
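The field-by-field resize described above can be sketched roughly in numpy. This is a toy model using nearest-neighbour sampling, not Vegas's actual resampler, and the function names (`resize_nearest`, `field_aware_downrez`) are mine for illustration:

```python
import numpy as np

def resize_nearest(img, out_h, out_w):
    # nearest-neighbour resize; a stand-in for whatever resampler
    # Vegas uses with "Best" render quality
    ys = np.arange(out_h) * img.shape[0] // out_h
    xs = np.arange(out_w) * img.shape[1] // out_w
    return img[ys][:, xs]

def field_aware_downrez(frame_hd, out_h=480, out_w=704):
    # split the interlaced frame into its two fields (even/odd lines),
    # each effectively 1920x540 for 1080i source
    top = frame_hd[0::2]
    bottom = frame_hd[1::2]
    # resize each field separately to half the target height
    top_sd = resize_nearest(top, out_h // 2, out_w)
    bottom_sd = resize_nearest(bottom, out_h // 2, out_w)
    # weave the resized fields back into one interlaced SD frame
    frame_sd = np.empty((out_h, out_w), dtype=frame_hd.dtype)
    frame_sd[0::2] = top_sd
    frame_sd[1::2] = bottom_sd
    return frame_sd

frame = np.arange(1080 * 1920, dtype=np.uint8).reshape(1080, 1920)
sd = field_aware_downrez(frame)
print(sd.shape)  # (480, 704)
```

The point is that the two fields never get mixed: each is resized in isolation, so the interlace comb can't alias into the wavy edges you get from a whole-frame resize.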
The horizontal size of 704 pixels is important because otherwise the SD and HD aspect ratios don't match exactly, and Vegas will either leave narrow black bars on the sides or stretch the video slightly if you check the "stretch video to fill output frame (do not letterbox)" box. Setting your output resolution to 704x480 pixels avoids this, and the two aspect ratios match exactly.
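The arithmetic behind the 704-pixel figure is easy to check, assuming the standard NTSC widescreen pixel aspect ratio of 40/33 (the Rec. 601 derived value Vegas uses for its NTSC DV widescreen templates):

```python
# NTSC widescreen pixels are not square; 40/33 is the Rec. 601 derived
# pixel aspect ratio used for NTSC DV widescreen
PAR = 40 / 33

def display_aspect(width, height, par=PAR):
    # displayed width/height ratio once the non-square pixels are shown
    return width * par / height

print(display_aspect(704, 480))  # 1.777... = exactly 16:9, same as HD
print(display_aspect(720, 480))  # 1.818... slightly wider than 16:9
```

So a 704x480 anamorphic frame displays at exactly 16:9, while 720x480 overshoots it slightly, which is where the thin black bars (or the slight stretch) come from.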
"it doesn't matter which one because it won't actually deinterlace"
How do you know that Vegas doesn't deinterlace, interpolate and reinterlace?
The Dan Isaacs method (see for example http://invertedhorn.axspace.com/hdv2dvd_basic.html) turns each field into a frame (i.e. double rate progressive) as part of the processing. I had assumed Vegas did something similar.
>How do you know that Vegas doesn't deinterlace, interpolate and reinterlace?
I know because I've done extensive tests. Believe me, this was driving me nuts for a while. If you search back a few years on this forum, you'll find all sorts of posts I made while I was banging my head against the wall trying to figure it out.
Vegas does two types of resize: one optimized for progressive frames and one optimized for interlaced frames. On a progressive resize, each frame is resized exactly as you would expect. The problem is that if you resize an interlaced frame this way, what you get is a resizing of the interlace comb. This resized comb will alias into wavy edges when you downrez. It looks crystal clear on static images, but on motion you'll see the wavy vertical edges. They look terrible.
This leaves you two options.
You can deinterlace and resize. The deinterlace loses resolution, but you have plenty of resolution in an HD frame, so you'll still get full SD resolution on the smaller SD frame. To do this, all you have to do is select the deinterlace method you like better and render to the widescreen 29.97 template. I would recommend changing from 720 to 704 horizontal pixels, though, to preserve the aspect ratio. In this case, Vegas will use the selected deinterlace method to deinterlace. Interpolate will look a little sharper. Blend fields will have smoother-looking motion. It is up to your preference.
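For what it's worth, the two deinterlace methods can be sketched conceptually in numpy. These are rough models of what "blend fields" and "interpolate" do, not Vegas's actual (unpublished) kernels, and the function names are mine:

```python
import numpy as np

def deinterlace_blend(frame):
    # "blend fields": average adjacent lines, mixing the two fields
    # together; motion looks smoother but fast movement gets ghosted
    f = frame.astype(np.float32)
    out = f.copy()
    out[1:] = (f[:-1] + f[1:]) / 2
    return out.astype(frame.dtype)

def deinterlace_interpolate(frame):
    # "interpolate": keep the even-line field and rebuild the odd lines
    # from it; a little sharper, but half the temporal samples are gone
    out = frame.copy()
    f = frame.astype(np.float32)
    out[1:-1:2] = ((f[0:-2:2] + f[2::2]) / 2).astype(frame.dtype)
    out[-1] = frame[-2]  # bottom line: repeat the nearest even line
    return out
```

Blend keeps information from both fields (hence the smoother motion), while interpolate throws one field away and fills in the gaps, which is why it tends to look a touch sharper.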
The other option is to render to SD interlaced. To do this, start with a 1080i project, make sure that a deinterlace method is chosen, then render to widescreen SD 704x480i. Render directly to mpeg2 and you'll preserve the color space that you lose if you go to the DV codec first.
Proving that I'm right isn't too hard, but you need to use a CRT TV. If you have a widescreen CRT (which is rare these days), just play your DVD and stop it on some motion part. What you'll see is the motion flickering back and forth on the pause between two points of action 1/60th of a second apart. Voila: perfect interlace SD. Some old CRT TVs have a "squeeze mode" that compresses the 4:3 frame down to 16:9. Check it the same way on one of these sets.
If you don't have a widescreen TV, you can still check it on a 4:3 CRT. Just go into the settings of your DVD player and tell it that you have a widescreen TV. Now play your resized interlaced DVD and pause it on a motion frame. You'll see the flickering between two fields 1/60th of a second apart. The image will be vertically stretched into 4:3, but it will display correctly.
All of this is complicated because there are so many instances where the TV automatically deinterlaces. When a 4:3 CRT TV plays an SD DVD, it drops every fourth line and then deinterlaces (otherwise the field order would be messed up). Any kind of LCD or plasma HD TV always displays a progressive image, so it is hard to tell when you get it wrong.
I just happen to be an early adopter of HD and I have an old dinosaur CRT HD TV sitting in my living room (all 300-plus pounds of it). It plays back true 1080i and true 16:9 480i, so these interlaced images were driving me nuts before I figured it all out. Most people couldn't see what I was seeing, not because I was wrong or because they weren't perceptive; it's just that not too many people have widescreen 16:9 CRT TVs, so they weren't seeing the resized footage in its true form.
>How do you know that Vegas doesn't deinterlace, interpolate and reinterlace?
There isn't a deinterlace button. There is a three-way choice of deinterlace method: "none", "blend fields", or "interpolate". Just make sure you don't select "none". If you do, Vegas will assume your project is progressive and resize the interlaced footage incorrectly. If you are going from interlaced at one resolution to interlaced at another, it won't actually deinterlace, but it needs a method selected or it won't resize correctly.
The only time you should select "none" is when you are using a third party deinterlace plugin like the Mike Crash smart deinterlacer. Otherwise, just leave it on all the time regardless of whether you are doing an interlaced or progressive project.
Laurence, suppose you have mixed media in your project, e.g. interlaced HD footage, progressive HD footage, and photos. How would you approach downrezzing to SD?
Vegas is smart enough to know when footage needs to be deinterlaced or not. Just put all the stuff on the timeline and go.
In the case of rescaling HD interlaced to SD interlaced, let's assume we have some 1920x1080 interlaced footage. Vegas breaks it into its separate fields at the field rate of 59.94 fields per second (NTSC), giving 1920x540 images at 59.94 fps, resizes those, then reinterlaces them perfectly at the new frame size. The "blend" vs "interpolate" choice doesn't come into play, but you must have one or the other selected anyway.
As John said, Vegas has no trouble mixing interlaced and progressive footage. How does it know which is which? If you right-click on any clip, it is specified in the clip's properties. You can even change it if Vegas gets it wrong. As long as everything is flagged correctly (which it usually is), Vegas will do everything automatically in the best possible way, that is, as long as you didn't set the deinterlace method in the project properties to "none".