Good vs Best render quality for rendering to NTSC DV AVI?

will-3 wrote on 7/8/2008, 11:38 AM
Will we see any... or much improvement if we change render Quality from Good to Best?

We are rendering complex composites... including chroma key, motion backgrounds, lower thirds, Media Generated text, etc...

We render to an NTSC DV AVI file... then print that to mini-DV tape. The video is then aired on local TV.

We have been rendering at the default Good but now are noticing some jagged lines along one side of the talent's head... (it may... or may not be more noticeable when they slightly turn their heads while talking... hard to tell)

We haven't changed our chroma-key method... and didn't notice this before... maybe our eye is getting better... or maybe it wasn't there... or maybe the new (blue) motion background makes it more noticeable?

It's not that noticeable on TV but we can clearly see it on the computer and on the studio monitors before we print to tape.

If we look at the video just before we render it... frame by frame... it looks fine... it only looks jagged on one side of their head... on the fill light side.

We are using Vegas 5.

I would really like to learn what is going on here so thanks for any help.


Comments

johnmeyer wrote on 7/8/2008, 11:52 AM
It is my understanding, based on some very old posts by Sony, that "Best" is only useful when the render is at a different resolution than your source material. Thus, if you are rendering from HD to SD (or vice versa), use Best. If you are using high-res still photos, use "Best." And if, for some reason, you generate media within Vegas at a resolution higher than your final render resolution, use "Best." For all other renders, you can save a lot of time, with zero impact on quality, by using the default "Good."

I have written to Sony many times about this and suggested that they change the terms used because the obvious conclusion most people reach is that Best is always going to produce better quality than Good. They very quickly find out that Best is definitely much slower, but scratch their heads when they don't see any improvement. Thus, even with all the complex composites, unless different resolution material is involved, I think you can stick with Good.

As for jaggies on Chroma Key, you need to search these forums about how to pull a good key. I don't do this very often, so I can't help you, but there is a ton of information on this board on how to get a clean key and reduce the jaggies at the edge. I don't think "Best" was part of any of the hints.
rs170a wrote on 7/8/2008, 1:17 PM
Here's what shows up in the Sony Knowledgebase about this topic.

""Best" uses bicubic scaling with integration, while "Good" uses bilinear without integration. You will have a hard time telling the difference, but must use "Best" if you are using high resolution stills or video that are getting scaled down to the final output size. "Good" may have bad artifacts on the near-horizontal edges, while "Best" will look great. "

For chromakey help, check out Keith Kolbo's tutorial called Using Chroma Key and Chroma Blur in Sony Vegas.

Mike
musicvid10 wrote on 7/8/2008, 8:42 PM
If you're using stills, any pan/crop, effects or anything that requires re-rendering, especially with interlaced material, you need to use Best.

The ONLY disadvantage I know of using Best is longer render times. If you can live with that, then you can feel comfortable setting Best as your Default, as I have done for many years.

Sony's published advice that you use Good for most work is just PR, IMO.
johnmeyer wrote on 7/8/2008, 9:10 PM
If you have lots of time, then sure, use Best. But, I would strongly advise that you do a few tests first, just to see what the time penalty might be.

I just put a ten second NTSC DV clip on the timeline. I dropped the opacity slightly to cause it to re-render. I then rendered it using the standard, default NTSC DV template, which uses "Good."

It took 13 seconds on my single-core, but reasonably fast laptop.

I rendered a second time, just to eliminate any "speed up" caused by re-reading the file from disk cache. It took the same amount of time.

I then switched to "Best" and rendered again.

This time it took 51 seconds to do the same thing.

That is almost exactly 4:1. So if you don't mind that kind of performance hit, then go ahead and use Best all the time. However, based not only on my own tests, but also what Sony themselves have posted here, and also what they have put into their documentation, unless your project has the elements I described in my last post, I really do think you are unnecessarily wasting a LOT of time and getting 0.000% improved quality in return.

musicvid10 wrote on 7/8/2008, 9:19 PM
**I really do think you are unnecessarily wasting a LOT of time and getting 0.000% improved quality in return.**

I really do think that applies, except for the users like me that use interlaced material and have something in their project that looks like crap if rendered at Good, and then have to render it again at Best. To me, it's worth a good night's sleep to wake up to a good result, even if I could have stayed up a little while longer to see whether it was good or bad when rendered at "Good." Yep, you need a good CPU fan, and another computer to run games, internet, and Word in the meantime if you want any kind of speed.

I guess the issue is that the preselected Project Properties are oft forgotten when one clicks "Render," even though those same settings critically affect the render output in some situations, explicitly the ones you mentioned. If the application was intuitive enough to "remind" us when it is preferable to change those Properties before undertaking a render that will take many hours anyway, and possibly with unusable results, I don't think there would be any discussion about this.
fldave wrote on 7/8/2008, 9:38 PM
Yes, I've seen Best clear up jaggies.

Jagged lines are mostly a function of the Deinterlace method in your project settings. If there is a lot of movement between frames, Interpolate works better than Blend; otherwise the default should be Blend. NEVER use "None".

Try the Best setting on the footage where you are seeing the problem. If it doesn't clear up, you might have to go with Interpolate on that footage.
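
If it helps to picture what Blend and Interpolate are doing, here's a minimal numpy sketch of the general idea. This is not Vegas's actual code; "frame" is just assumed to be one interlaced frame as a height x width x 3 array.

```python
import numpy as np

def deinterlace_blend(frame):
    # Blend: average each line with the line below it, mixing the two fields
    # into every output line - smooth, but ghosts on fast motion.
    f = frame.astype(np.float32)
    out = f.copy()
    out[:-1] = (f[:-1] + f[1:]) / 2.0    # bottom line is left as-is
    return out.astype(np.uint8)

def deinterlace_interpolate(frame):
    # Interpolate: keep one field (the even lines here) and rebuild the other
    # field from the lines above and below it - holds up better on motion.
    out = frame.astype(np.float32)
    out[1:-1:2] = (out[0:-2:2] + out[2::2]) / 2.0
    out[-1] = out[-2]                    # bottom line has no line below it
    return out.astype(np.uint8)
```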

I use Best for everything, just in case. Yeah, I have 40 hour renders, but I know it can't get any better than that.
John_Cline wrote on 7/8/2008, 11:02 PM
I think I set it to "best" back around the time I got Vegas v2.0 and have never rendered using anything else. It's all about quality and "Best" sure sounds like it would be better than "good." :)
DGates wrote on 7/8/2008, 11:46 PM
I use 'Best' all the time as well. With today's faster processors, there's really no reason not to.
Grazie wrote on 7/9/2008, 12:29 AM
"I use 'Best' all the time as well. With today's faster processors, there's really no reason not too."

Sold!

Grazie
Tinle wrote on 7/9/2008, 7:26 AM
“Musicvid states:
I guess the issue is that the preselected Project Properties are oft forgotten when one clicks "Render," even though those same settings critically affect the render output in some situations, explicitly the ones you mentioned. If the application was intuitive enough to "remind" us when it is preferable to change those Properties before undertaking a render that will take many hours anyway, and possibly with unusable results, I don't think there would be any discussion about this.”

Sony Vegas 7e Help is quite clear on this issue. It states:
“Project full-resolution rendering quality
Choose a setting from the drop-down list to set the quality of the rendered video.

Unless you have specific performance problems, choose Good. Choosing Best can dramatically increase rendering times.
Good uses bilinear scaling without integration, while Best uses bicubic scaling with integration. If you're using high-resolution stills (or video) that will be scaled down to the final output size, choosing Best can prevent artifacts.

Some file formats allow you to associate a video rendering quality setting with a custom rendering template. Final rendering template settings override the Full-resolution rendering quality setting in the Project Properties dialog.”

Sounds like the rendering template setting determines the video rendering quality, not the project properties.
Former user wrote on 7/9/2008, 7:28 AM
I use BEST all of the time. I don't have to worry about whether I have resized or repositioned something and the difference in rendering times does not seem significant.

Dave T2
musicvid10 wrote on 7/9/2008, 9:04 AM
**Sounds like the rendering template setting determines the video rendering quality, not the project properties.**

No, it is as I said.
The Project Properties set the default rendering quality.

If I want to change from the default quality at the time of rendering, I would need to go into the Custom Properties for the encoder and change it -- each time I render.

If I go to render a project as "DVD Architect NTSC video stream," I don't want to have to remember if there is something I put there three months ago that is going to cause the jaggies if I forget to change the Custom properties.

No, what I want to do is select the template, click "Render" and have it come out right, without having to ask myself each and every time if I need to go in and change the Custom settings. If the logic of this is incomprehensible or you've never had to render a big project twice because there was 5 sec of re-rendered material that you'd forgotten about, by all means leave it at Good.

By leaving it at Best, I'm quite happy knowing I won't wake up to a crappy print and have to start that all-night render all over again . . .
rs170a wrote on 7/9/2008, 9:33 AM
**No, what I want to do is select the template, click "Render" and have it come out right, without having to ask myself each and every time if I need to change the Custom render settings.**

Apologies if this is redundant, but do what I do and create a custom template.
I changed the render quality to "Best", added the word Best to the end of "DVD Architect NTSC video stream", and saved it under the new name.
That way, when it's render time, I recall that template and render without having to think about changing anything.

BTW, here's another reason for choosing "Best" mode for DVD renders, especially if, like some users, you render to AVI first and then to DVD.
Any stills, graphics and generated media are all 4:4:4 colour space.
The render to AVI changes the images from 4:4:4 to 4:1:1 (AVI) and then to 4:2:0 for your DVD.
Using Best mode and going direct to MPEG-2 from the Vegas timeline avoids this extra colour compression step.
Adam Wilt's site has some good explanations here and here on these differences.
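
To put some rough numbers on that chroma loss, here's a small numpy sketch. It uses naive nearest-neighbour subsampling rather than a real encoder's filtering, and the random array is just a stand-in for one full-resolution chroma plane, so treat it as an illustration of the idea only.

```python
import numpy as np

chroma = np.random.rand(480, 720)        # stand-in for a full-res (4:4:4) U or V plane

# Direct 4:4:4 -> 4:2:0 (straight timeline-to-MPEG-2 render):
direct_420 = chroma[::2, ::2]            # half resolution both ways

# Via a DV AVI intermediate: 4:4:4 -> 4:1:1 -> (upsample) -> 4:2:0
dv_411 = chroma[:, ::4]                  # DV keeps one chroma sample in four horizontally
restored = np.repeat(dv_411, 4, axis=1)  # naive upsample back to full width
via_dv_420 = restored[::2, ::2]          # then down to 4:2:0 for the DVD

# The detail thrown away at the 4:1:1 step can never be recovered:
print("mean abs chroma error via DV:", np.abs(direct_420 - via_dv_420).mean())
```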

Mike
musicvid10 wrote on 7/9/2008, 9:41 AM
**I changed the render quality to "Best", added the word Best to the end of "DVD Architect NTSC video stream", and saved it under the new name.**

Hehe, Mike,
As little as I use "Good" for anything, I think I would do the opposite; that is, create a custom "DVD Architect NTSC video stream - Good" template, and use it once in a while for those straight renders that have no titles, graphics, pan/crops, track motion, etc.
It might be good for re-encoding TV shows, etc.
johnmeyer wrote on 7/9/2008, 9:56 AM
For all of you who are setting all renders to "Best":

Have you actually ever tested the quality of the result, when no resolution or framerate change is being done? Or, are you just falling prey to the semantics of the words used to describe the setting?

If instead of "Good" and "Best" the settings were called "Bilinear" and "Bicubic," would you feel so strongly that you're doing the right thing? Did it occur to you that, since you are really changing the scaling algorithm when you change from Good to Best, and since every algorithm for scaling has tradeoffs, that you actually might be making your footage look worse under some circumstances?

I guess this is beating a dead horse, but I post in these forums because I want to help people, and I hate to see people do the wrong thing, especially when it is for the wrong reasons. So, as I said before, I really recommend that you do short tests using both Best and Good, and see first if you can actually see a difference when no resolution or frame rate change is being made (i.e., no still photos and no HD video in an SD project, or vice versa), and also check on the render times. I always have a deadline, and no matter how fast a computer I may someday own, if I can take a four hour render and turn it into a one hour render with no hit in quality (which is what I am claiming is the case), I'll do that in a New York minute.
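
For anyone who wants to put numbers on such a test rather than eyeballing it: if you export the same frame from a Good render and a Best render as stills (the filenames below are only placeholders), a few lines of Python with Pillow and numpy will tell you how far apart they really are.

```python
import numpy as np
from PIL import Image

good = np.asarray(Image.open("frame_good.png"), dtype=np.float64)
best = np.asarray(Image.open("frame_best.png"), dtype=np.float64)

mse = np.mean((good - best) ** 2)
if mse == 0:
    print("The two frames are bit-identical.")
else:
    psnr = 10 * np.log10(255.0 ** 2 / mse)
    print(f"PSNR between Good and Best frames: {psnr:.1f} dB")
    # Rule of thumb: differences above roughly 45-50 dB are
    # very hard to see on a monitor.
```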
Former user wrote on 7/9/2008, 10:29 AM
Johnmeyer,

I have rendered video with good and best. On the one example I remember, I had some PNP type effects. The render in good created a lot of artifacts. Not usable. When I rendered best, the video looked clean.

And from what I can tell, the BEST vs. GOOD choice only becomes an issue when rendering effects. When I have added no effects to the video, it renders as a "file copy".


I can't say that I have compared a normal, no-effects video at Good and Best, but since most of my videos involve some effect (slides zooming, repositioning of video or something), Best has always looked the "best".

Dave T2
musicvid10 wrote on 7/9/2008, 8:23 PM
**I really recommend that you do short tests using both Best and Good, and see first if you can actually see a difference when no resolution or frame rate change is being made (i.e., no still photos and no HD video in an SD project, or vice versa), and also check on the render times.**

So, I took a section from an actual project (NTSC DV Widescreen) and rendered it to the DVD Architect NTSC Widescreen video stream template.

Mind you, the chunk was not just ten seconds of captured video, but several minutes of a typical DVD project that contained mostly DV 16:9 video, a title fade with graphic, four stills (same size as project), a couple of basic effects, but no pan/crop or anything else that I knew would create jaggies or visible artifacts as a result of the different rendering options, just to be fair to the quote above.

I then rendered the project section at Best and Good, and got 13m 56s and 11m 56s rendering times, respectively. That's a 16% advantage in my real world (quite a difference from the fourfold gain reported above). I also verified that there were no apparent differences in the video quality between the two.

Weighed (again!) against the prospect of rendering the whole project twice (just in case Good didn't work out), I think I'll take the peace of mind and get a couple of hours of extra sleep. To each his own.
johnmeyer wrote on 7/9/2008, 9:47 PM
**got 13m 56s and 11m 56s rendering times, respectively.**

Wow, if that's all the difference most people get, then by all means render Best all the time! Must be my creaky old computer not having some instruction set that's used by Best.
MikeFulton wrote on 10/6/2008, 2:09 PM
Johnmeyer, just a heads-up here... there seems to be an additional difference between "best" and "good" that goes beyond what filtering algorithm is used.

People have quoted a 4x increase in rendering times from using "best" mode, but using bicubic filtering for scaling each frame instead of bilinear would not even remotely explain such a difference. The scaling of each frame is simply not that big a percentage of the overall operation. In fact, I'd expect the difference in rendering times to be hardly noticeable, if that's all you were changing.
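
For what it's worth, you can get a feel for that outside Vegas by timing the two kernels yourself. The Pillow sketch below uses a synthetic HD frame as a stand-in for real decoding and rendering, so it says nothing about Vegas's own pipeline; it just shows that on typical hardware bicubic scaling is somewhat slower than bilinear per frame, nowhere near a 4x gap.

```python
import time
import numpy as np
from PIL import Image

# A synthetic 1920x1080 frame as a stand-in for real source material.
frame = Image.fromarray((np.random.rand(1080, 1920, 3) * 255).astype(np.uint8))

def time_resize(method, n=100):
    start = time.perf_counter()
    for _ in range(n):
        frame.resize((720, 480), method)   # HD down to NTSC DV size
    return time.perf_counter() - start

print("bilinear:", time_resize(Image.BILINEAR), "s for 100 frames")
print("bicubic: ", time_resize(Image.BICUBIC), "s for 100 frames")
```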

However, something else is done differently between "good" and "best" with regards to how the source material is decoded. It seems that "good" mode takes shortcuts with decoding the source material. These shortcuts speed things up, but they can result in some odd glitches with motion when rendering to a lower resolution than the source material. These glitches superficially resemble an interlacing problem, but they have nothing to do with interlacing and occur even when neither source nor destination is interlaced.

The glitches are EXTREMELY noticeable on any sort of motion and are quite annoying.

By the way, these glitches only appear when the source material is MPEG-based. If your source material uses a motion-JPEG codec or uncompressed video the glitches go away. It's definitely something to do with how the source material is decoded.

So... I'd say, use "good" if you're rendering out to the same resolution as your source material, but use "best" if you need to scale things down.
Avanti wrote on 10/6/2008, 5:14 PM
As DaveT2 said.
**I use BEST all of the time. I don't have to worry about whether I have resized or repositioned something and the difference in rendering times does not seem significant.**
DGates wrote on 10/6/2008, 5:17 PM
I render to best at ALL times.
quoka wrote on 10/6/2008, 6:05 PM
I agree totally with JohnMeyer.
I have done test after test looking at the difference between Preview, Good and Best, and done quite exhaustive picture comparisons on a broadcast monitor. (I'm looking at this monitor all day.)
I am forever changing the settings based upon what I want to achieve for the final output. A couple of days ago I output a TVC at Preview quality to Digibeta because it gave me the sharpest image, without any defects in motion or jaggies. I had prepared all the graphics at exactly the correct resolution and position in Photoshop before taking them into Vegas, so I didn't have to resize etc.
What does bewilder me is that, with some source footage exactly matching the project settings and no effects or moves applied, Best softens the image - I'm still trying to work out whether Vegas treats some types of source footage differently - like having an embedded colourspace that forces the output to be recomputed, not just passed through.