MPEG2 rendering problem

dth122 wrote on 1/20/2014, 12:50 AM
I'm having a problem rendering HD footage to MPEG2 for a DVD. There are a lot of other posts and questions I've seen that hint around a similar problem, but I haven't seen exactly what I'm talking about solved.

I'm running Vegas Pro 12 (build 770). My footage is in Apple ProRes 422 1920x1080x24, 29.97fps. I started with my project at HD 1080-60i. The footage looks great in the 480x270 preview window while I'm editing. I'm not applying any effects to the video.

When I render to NTSC DVD Widescreen, I get tons of motion blur and straight lines become jagged. Relatively still shots look great. It's only when the camera or subject is moving that there is a problem. I can even see the blur in the preview window while the video is rendering.

As a test, I set up an NTSC DV Widescreen project and imported the footage. When I do this, I see the motion blur and overall quality problems right away in the preview window while editing, before rendering.

I just noticed that I can see the problem appear in the preview window as soon as I hit Apply after changing the project resolution in the properties box.

I tried all three deinterlacing methods, but they didn't seem to make any difference.

You can see a screen capture of what I'm talking about here:

http://i217.photobucket.com/albums/cc309/dth122/blur_zps4d0c15bf.jpg

What am I missing? Do I need to convert the footage to something else before bringing it into Vegas?
- Dave

Comments

Chienworks wrote on 1/20/2014, 2:27 AM
Quite typical and normal interlace artifacts.

Do you see the problem when you view your DVD on a television?
musicvid10 wrote on 1/20/2014, 8:19 AM
Those are not typical; they're the result of resizing with your rendering quality set to Good.
Set it to Best in the project properties, and start all new projects that way.

johnmeyer wrote on 1/20/2014, 8:47 AM
These are most definitely not interlacing artifacts; they are caused by resizing (from HD to SD) with the project deinterlace setting set to "none." Musicvid mentioned using "Best" for the render quality, and you should definitely do that, but that isn't the cause of the problem. Musicvid did actually show, with the second of his three red arrows in the picture, the setting that must be changed to fix the problem, even though he didn't mention it in his text.

So, change Deinterlace Method from "none" to "interpolate fields" and your problem will be solved. If you use Best for Quality in the MPEG-2 Render As dialog, you will get slightly better quality, so you should do that as well.
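For anyone curious about the mechanics, here is a rough sketch in Python (NumPy + Pillow) of the principle at work; it is not Vegas's actual code, and the function name and sizes are mine. The point is that an interlaced frame must be split into its two fields, each field resized on its own, and the results re-woven; handing the woven frame straight to the scaler mixes lines from two different moments in time, which is exactly what produces the squiggles.

    import numpy as np
    from PIL import Image

    def resize_interlaced(frame, new_w, new_h):
        """Scale an interlaced frame field-by-field (new_h assumed even)."""
        a = np.asarray(frame)
        f1, f2 = a[0::2], a[1::2]        # two fields = two moments in time
        half = new_h // 2
        # Resize each field separately so the scaler never blends lines
        # captured at different times.
        f1r = np.asarray(Image.fromarray(np.ascontiguousarray(f1))
                         .resize((new_w, half), Image.BICUBIC))
        f2r = np.asarray(Image.fromarray(np.ascontiguousarray(f2))
                         .resize((new_w, half), Image.BICUBIC))
        out = np.empty((new_h, new_w) + a.shape[2:], dtype=a.dtype)
        out[0::2], out[1::2] = f1r, f2r  # re-weave into an interlaced frame
        return Image.fromarray(out)

    # What deinterlace "none" amounts to: scaling the woven frame whole,
    # which smears the two fields together on anything that moves:
    # bad = frame.resize((new_w, new_h), Image.BICUBIC)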

musicvid10 wrote on 1/20/2014, 9:21 AM
Actually, I thought both were necessary to prevent the squiggles.
dth122 wrote on 1/20/2014, 9:54 AM
Thanks for the ideas, but they didn't seem to solve the problem. I think I found the actual cause, though.

I started a new project, set the rendering quality to Best and the deinterlace method to Interpolate... but still had the same problem. Then I noticed an option that wasn't in musicvid's screenshot: "Adjust source media to better match project or render settings." This was checked by default and seemed like something I would want. However, when I unchecked it, my problem disappeared.

I believe johnmeyer is correct about the cause... it was doing some sort of scaling HD to SD that was creating the artifacts.

It's unfortunate that the default settings would cause such poor render quality, but I guess now I know for next time.
- Dave
Chienworks wrote on 1/20/2014, 9:55 AM
Well, actually they are interlacing artifacts. If it weren't for the interlaced source there wouldn't be any problem at all. The resizing from HD to SD changes their character and makes them wide squiggles rather than fine lines, but it's still because of the interlacing.

Otherwise, the solutions wouldn't involve dealing with interlacing.
johnmeyer wrote on 1/20/2014, 9:58 AM
"Actually, I thought both were necessary to prevent the squiggles."

No, the squiggles are 100% the result of resizing without deinterlacing (or at least without separating the fields first). I got "schooled" on this over in the doom9.org forum years ago when I wrongly suggested that the OP was chasing a different problem. I've since had to deal with it many times when people send me video that looks like this and ask me to fix it.

Setting the rendering quality to "Best" makes only very subtle changes to the quality by using a slightly different resizing algorithm. The difference is perceptible, but only after very close inspection. This problem, by contrast, is grotesque and overwhelming.

I often wonder why Sony exposes this setting in the dialog. I would prefer that it be buried in an "advanced" sub-dialog and that, if any interlaced footage is found in the project media pool, a warning be put on the screen when a user attempts to change it to "none." Many editors -- both newbies and pros -- have the horribly mistaken idea that progressive is "better" than interlaced and that they should therefore do everything possible to avoid interlacing. As a result, a lot of people set this to none and then wonder why things look so bad. The actual damage depends on the amount of scaling done; in some cases the video looks bad, but not devastatingly bad like this example. Either way, people degrade their video without knowing why it looks sub-par.
johnmeyer wrote on 1/20/2014, 10:06 AM
"Well, actually they are interlacing artifacts. If it weren't for the interlaced source there wouldn't be any problem at all. The resizing from HD to SD changes their character and makes them wide squiggles rather than fine lines, but it's still because of the interlacing."

Well ... I guess in a very general sense that is correct, but when most people talk about interlacing artifacts they mean the "teeth" or herringbone patterns seen in high-motion areas of interlaced video when it is displayed on an interlaced display, or when a still of a single, complete frame is taken from that high-motion video.

It is pretty clear that this is what you meant in your original post.

What you said in your original post was: "Quite typical and normal interlace artifacts." You then went on to ask if the OP saw them when the video was played on his TV set. Therefore, you were not talking about re-sizing artifacts because those most definitely (and unfortunately) show up on an interlaced TV display, whereas normal interlaced video looks perfectly fine on a TV set.

The only reason I am posting this reply is that, just like my previous post, I really want to make certain that people stop thinking that interlaced video is inferior to progressive or needs to be changed to progressive because it is exactly that thinking that causes problems. Interlaced video is absolutely fine as long as it is handled correctly. Unfortunately, as I posted in my last reply, Sony Vegas allows -- perhaps even encourages -- people to choose settings that will create a problem. A little more intelligence from the Sony design team could reduce or eliminate these user errors.
musicvid10 wrote on 1/20/2014, 10:23 AM
Although I'm pleased the OP found his answer, I thought I'd share this. I can reproduce the actual squiggles only at Good / None, although the image is degraded at Best / None and Good / Interpolate. John's assessment is the most correct, since my assumption was not "truly" biconditional. Test images are HDV->DVD.

So the revised solution becomes:
"Adjust Source" off.
--There, all the bases are covered.

johnmeyer wrote on 1/20/2014, 10:40 AM
Those are some really interesting test results. I wonder if you've uncovered a long-standing bug, or some sort of anomaly. Perhaps, when rendering using "Good," Vegas doesn't even separate fields before doing the re-sizing?

The three images that don't show the squiggles do show the normal interlacing effect you get when you display a single frame containing both fields: because the fields come from different moments in time, they create the herringbone artifact in areas of the frame with motion. These artifacts will not show up when the video is displayed on a regular TV set, so they are not a flaw or defect; they are simply something that happens when you take a single-frame snapshot, and it doesn't happen when watching the video.
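Here is a toy illustration of that weave effect, with made-up data (Python/NumPy only): interleave two fields of a moving edge captured a field-period apart, and the resulting still frame shows the edge at two different positions on alternating lines.

    import numpy as np

    h, w = 8, 16
    field_1 = np.zeros((h // 2, w), dtype=int)
    field_2 = np.zeros((h // 2, w), dtype=int)
    field_1[:, 4:8] = 1    # object position when field 1 was captured
    field_2[:, 6:10] = 1   # the object has moved 1/59.94 s later

    frame = np.empty((h, w), dtype=int)
    frame[0::2], frame[1::2] = field_1, field_2  # weave into one frame
    print(frame)  # alternating rows disagree where the object moved: "teeth"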

Thanks for taking the time to do those tests. Nicely done.

dth122 wrote on 1/20/2014, 10:45 AM
FWIW... I'm seeing the same thing on my end. It seems that as long as "Adjust" is off and the render quality is set to Best, the deinterlacing mode doesn't matter much. That's not to say it makes no difference, but it's hardly perceptible relative to the problems that appear in all deinterlacing modes when the render quality is set to Good.
- Dave
musicvid10 wrote on 1/20/2014, 10:55 AM
"the deinterlacing mode doesn't matter much. "

For purposes of resizing alone, a deinterlace method must be chosen if the source is interlaced. "None" won't work.
I think most people agree they can't see a difference between Blend and Interpolate in this usage.

For Interlaced->Progressive render of motion source, there is a pronounced difference between Blend and Interpolate, and most people end up choosing Interpolate.
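For what it's worth, here is roughly what the two methods amount to, sketched in Python/NumPy on a single woven frame; these are my toy approximations, not Vegas internals.

    import numpy as np

    def deinterlace_blend(frame):
        # Line-double each field to full height, then average them:
        # smooth, but moving objects leave a ghost image.
        f1 = np.repeat(frame[0::2].astype(float), 2, axis=0)
        f2 = np.repeat(frame[1::2].astype(float), 2, axis=0)
        return (f1 + f2) / 2.0

    def deinterlace_interpolate(frame):
        # Keep one field and rebuild the missing lines from its
        # neighbours: no ghosting, at some cost in vertical detail.
        f1 = frame[0::2].astype(float)
        out = np.repeat(f1, 2, axis=0)           # line-double
        out[1:-1:2] = (f1[:-1] + f1[1:]) / 2.0   # average the neighbours
        return out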
musicvid10 wrote on 1/20/2014, 11:00 AM

"Good uses bilinear scaling without integration, while Best uses bicubic scaling with integration. If you're using high-resolution stills (or video) that will be scaled down to the final output size, choosing Best can prevent artifacts." -- Vegas Help

Once I've had to re-render a scaled video that was rendered at Good, any time savings from that default setting are more than lost, so using Best as the default makes the most sense to me; unless, of course, one has a perfect memory ;?)
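If anyone wants to eyeball the bilinear/bicubic difference outside of Vegas, Pillow gives a rough analogue; the filenames are placeholders, and Vegas's own scalers won't match these exactly.

    from PIL import Image

    src = Image.open("frame_grab.png")  # e.g. a 1920x1080 still from the timeline
    good_like = src.resize((720, 480), Image.BILINEAR)  # 'Good'-style scaling
    best_like = src.resize((720, 480), Image.BICUBIC)   # 'Best'-style scaling
    good_like.save("good_like.png")
    best_like.save("best_like.png")  # compare the two up close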

johnmeyer wrote on 1/20/2014, 12:20 PM
"Once I've had to re-render a scaled video that was rendered at Good, any time savings from that default setting are more than lost, so using Best as the default makes the most sense to me."

Many, many years ago, I did some render tests at Good and then at Best, and the rendering time difference was substantial. However, as computers have gotten faster and the Vegas code has changed, I don't think there is much render-time difference any longer. Therefore, I totally agree: make "Best" your default for all rendering.

I decided to do a quick test. I used thirty seconds of 1440x1080 HDV video and then created SD MPEG-2 video using the "DVD Architect Widescreen" template as a starting point. I first did the test using Vegas 7, and then Vegas 10, both under Windows XP 32-bit. I don't use any GPU acceleration for anything.

Here are the results:

V7 Good: 13 seconds
V7 Best: 16 seconds

V10 Good (Quality Over Speed checked): 14 seconds
V10 Good (Quality Over Speed NOT checked): 14 seconds

V10 Best (Quality Over Speed checked): 19 seconds
V10 Best (Quality Over Speed NOT checked): 18 seconds

It would be interesting to have Sony (SCS) comment on this, because I don't know whether other settings might change the results. For instance, would the percentage difference between Good and Best be larger if I did pan/crop or compositing, used specific fX, etc.? I don't have the time or inclination to do the kind of testing required to answer that. However, what my simple test shows is that with V10 (which uses the newer code base found in V11 and V12) there is roughly a 36% increase in render time when using Best (19 vs. 14 seconds). That would turn a ten-hour render into one of about thirteen and a half hours. So the time difference can be significant, depending on deadlines, which means the old rule of thumb is still worth knowing:

If the resolution of everything on your timeline (including pan/crop and track motion) is the same as your output resolution, then you can use Good; otherwise, you should use Best.
OldSmoke wrote on 1/20/2014, 12:20 PM
I recently finished an HD (720@60p) project that ended up on DVD, and it turned out pretty well. I always apply a filter package that helps a bit with the jagged edges: https://drive.google.com/file/d/0Bz4okF1D_ux5c0Y2VnZIZ05xckU/edit?usp=sharing. Give it a try; you will need Preset Manager to install it.
I also found that disabling "Smart Resample" on all 720@60p footage helps with "ghosting" on fast-moving objects when rendering for DVD. These are my project settings: http://drive.google.com/file/d/0Bz4okF1D_ux5c3BIczZnRzdzRE0/edit?usp=sharing
Depending on the footage, changing to "32-bit video levels only" can also help.
Here is a sample of 720@60p footage rendered for DVDA: http://drive.google.com/file/d/0Bz4okF1D_ux5V1BXTWlHVjdGU2c/edit?usp=sharing (it looks better if you download it and watch it rather than playing it from Google Drive)

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

musicvid10 wrote on 1/20/2014, 5:02 PM
Lately I've been doing the scaling and deinterlacing in Handbrake, and exporting a lossless intermediate for Vegas to render to progressive DVD. Results are worth it, but a bit time-consuming.

Also, Handbrake will now do a true 60i->60p bob, which opens the door for superb slomo.
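Conceptually, a bob promotes each field to a full frame, doubling the frame rate. A crude sketch in Python/NumPy (not Handbrake's actual filter, which also compensates for field parity):

    import numpy as np

    def bob(frame):
        """Turn one 60i frame into two 60p frames (frame height even)."""
        frames = []
        for offset in (0, 1):                 # field 1, then field 2
            field = frame[offset::2].astype(float)
            full = np.repeat(field, 2, axis=0)            # line-double
            full[1:-1:2] = (field[:-1] + field[1:]) / 2   # smooth the gaps
            frames.append(full)
        return frames                         # 2 frames out per frame in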

PeterDuke wrote on 1/20/2014, 7:50 PM
"For purposes of resizing alone, a deinterlace method must be chosen if the source is interlaced. "None" won't work.
I think most people agree they can't see a difference between Blend and Interpolate in this usage."

People won't see a difference between "blend" and "interpolate" because the results will be identical! Use the command FC /B to confirm.
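(FC is the stock Windows file compare; the file names below are just placeholders for two renders that differ only in the deinterlace setting.)

    fc /b render_blend.mpg render_interpolate.mpg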

I presume what is happening is that "none" tells Vegas that the source is really progressive and therefore it does not separate the fields when resizing. If the fields are processed separately, it makes no sense to blend, so Vegas must be interpolating, even when set to blend.
OldSmoke wrote on 1/21/2014, 8:57 AM
@musicvid

Actually, you can do a 50i/60i to 50p/60p conversion in Vegas too, and it does a great job. Put the footage on the timeline, set the project properties to progressive at double the PAL/NTSC rate, set deinterlace to interpolate, and render it out to 50p/60p. I tried the HB route, but I must be doing something wrong: my test clip is HDV 50i, 5:14 long, and after HB it ended up 5:23 long when imported back into Vegas. The Vegas-converted clip stayed the same length.

PeterDuke wrote on 1/21/2014, 4:35 PM
The deinterlace method in project properties is unimportant when converting i to p.

The main things are:

In project properties, set the frame rate to double (i.e. 50 or 59.94), and field order to None (Progressive).

The Preview should be set to Best/Full, so that you can view each new frame properly as you step along frame by frame.

The clip properties should be set either to the default "Smart Resample" or to "Force Resample", but not "Disable Resample".
OldSmoke wrote on 1/21/2014, 6:00 PM
@PeterDuke

The de-interlace method does make a difference when converting interlaced to progressive in Vegas, at least on my system with VP12 B770. What does not make a difference is the preview window setting.

None:
http://dl.dropboxusercontent.com/u/39278380/test-none.jpg
Blend:
http://dl.dropboxusercontent.com/u/39278380/test-blend.jpg
Interpolate:
http://dl.dropboxusercontent.com/u/39278380/test-inter.jpg

John_Cline wrote on 1/21/2014, 6:55 PM
Which deinterlace method one chooses when resizing interlaced footage absolutely and positively does make a difference. If there is any motion in the video, then "interpolate" is the preferred method; in fact, it is really the only correct choice.

I do interlaced HD to SD conversions all the time and I have Vegas permanently set to "best" render quality and "interpolate." No exceptions.
musicvid10 wrote on 1/21/2014, 9:00 PM
"I do interlaced HD to SD conversions all the time and I have Vegas permanently set to "best" render quality and "interpolate." No exceptions."

+1

OldSmoke wrote on 1/21/2014, 9:16 PM
+1

Always set to "interpolate"

PeterDuke wrote on 1/22/2014, 4:13 AM
@OldSmoke

Yes, the deinterlace method does make a difference with version 12. I tested with version 9c, which shows no visible difference. (A file compare using FC /B shows that only a few bytes differ at the start of each file.)

Yes, so far as I know, the preview does not affect rendering. What I said was, "The Preview should be set to Best/Full, so that you can view each new frame properly as you step along frame by frame".