Good vs Best...What's the Real Difference?

Sab wrote on 2/11/2003, 1:30 AM
The manual says something to the effect of using the good setting unless "only the best will do." It also says that render times will increase "dramatically" at the best setting. What is really sacrificed at the good setting? This question is in regard to video render quality settings.

Thanks, Mike

Comments

Grazie wrote on 2/11/2003, 1:42 AM
Good question! I've wanted to ask the same.
dcrandall wrote on 2/11/2003, 2:05 AM
I copied the following definitions from this forum some time ago.

"Good" uses bi-linear scaling. (Best compromise between speed and quality. This method will produce good results in most cases).

"Best" uses bi-cubic/Integration scaling. (Best image resizing algorithm available in Vegas. Quality differences will be most noticeable when using very large stills or stretching small sources.)

If you are using high-resolution stills that are being scaled down to video, use Best and not Good to prevent flicker causing aliasing.
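To make the "bi-linear vs. bi-cubic" distinction concrete, here is a minimal sketch (not Vegas code, just a 1-D analogue) of the two kernels: linear interpolation looks at 2 neighboring samples, while a Catmull-Rom cubic looks at 4, which lets it keep edges crisper at the cost of more math per pixel:

```python
# 1-D analogue of the two resampling kernels (illustration only, not Vegas internals).
# "Good" ~ linear interpolation (2 neighbors); "Best" ~ cubic (4 neighbors).

def linear_sample(src, x):
    """Linear interpolation between the two nearest samples."""
    i = int(x)
    t = x - i
    a = src[min(i, len(src) - 1)]
    b = src[min(i + 1, len(src) - 1)]
    return a * (1 - t) + b * t

def cubic_sample(src, x):
    """Catmull-Rom cubic interpolation using four neighbors."""
    i = int(x)
    t = x - i
    def at(j):                      # clamp index to the valid range
        return src[max(0, min(j, len(src) - 1))]
    p0, p1, p2, p3 = at(i - 1), at(i), at(i + 1), at(i + 2)
    return 0.5 * (2 * p1
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t * t
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3)

def resize(src, new_len, sample):
    """Resample src to new_len samples with the given kernel."""
    scale = (len(src) - 1) / (new_len - 1)
    return [sample(src, k * scale) for k in range(new_len)]

row = [0, 0, 0, 255, 255, 255]            # a hard edge
print(resize(row, 11, linear_sample))     # smooth ramp across the edge
print(resize(row, 11, cubic_sample))      # slight over/undershoot: crisper edge
```

The cubic result over/undershoots around the edge (values dip below 0 and above 255 before clamping), which is exactly the mild sharpening that makes "Best" look cleaner on detailed material, and also why it costs roughly twice the samples per output pixel.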
Sab wrote on 2/11/2003, 8:11 AM
Thanks for the reply. What effect does the setting have in regard to transitions, filters, etc. on DV video? What about stills captured at the recommended 655x480 with motion?

Sorry to ask so many questions but I really would like to know without spending endless hours experimenting.

Thank you again.

Mike
BillyBoy wrote on 2/11/2003, 10:07 AM
I'm the one who posted that explanation, and a more detailed one... many times.

There is very little noticeable difference, if any, between best and good that you can detect with the naked eye. However, there can be a HUGE difference in the time it takes to render. Rendering at best versus good could double or triple your render time, or take even longer than that, so it's rarely worth the effort. The best advice is to try a small sample using each method, then view it on whatever final medium your project is intended for. Note: if you intend to view it as a DVD on your TV, or just play off a DV tape from your camera, be aware that TV resolution is ALWAYS worse than the resolution the typical computer monitor can produce. So the small gain you may get in slightly richer shades of color, which is the primary reason to use best over good, is going to be lost anyway, and all that extra time you spend rendering is pointless... for most projects.
Former user wrote on 2/11/2003, 10:12 AM
I found a BIG difference between good and best when resizing video. I created a video with a textured background, a zoomed back moving video and rolling credits. When rendered at good, the video had many artifacts from being resized that resemble interlacing errors. When rendered at best, the artifacts disappeared. If you are doing normal full screen video, the difference might not be noticeable, but for effects, stay with best.

Dave T2
Sab wrote on 2/11/2003, 4:13 PM
Thanks BillyBoy, that's just the info I was looking (and hoping) for.

Mike
BillyBoy wrote on 2/11/2003, 4:24 PM
Dave has a good point too. I haven't seen it myself because I don't do that kind of project.
SonyDennis wrote on 2/11/2003, 4:50 PM
Dave:
The reason "Best" looked better for your zoom out was because of the "Integration" item listed next to its description above. I recommend "Good" unless you're scaling down video or using high-res stills; then you must use "Best" or risk artifacts due to aliasing in the scaling.
///d@
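The aliasing Dennis describes is easy to show in miniature. Here is a hedged 1-D sketch (again, not Vegas internals) of why plain interpolation fails when shrinking by a large factor: it only samples a couple of nearby source pixels and skips everything in between, whereas integrating (averaging) over the whole source footprint keeps the fine detail from turning into false patterns:

```python
# Illustration only: downscaling a 1-D signal by 4x two ways.

def point_downscale(src, new_len):
    """Pick the nearest source sample -- misses everything in between."""
    scale = len(src) / new_len
    return [src[int(k * scale)] for k in range(new_len)]

def area_downscale(src, new_len):
    """Integrate: average each destination pixel's whole source footprint."""
    scale = len(src) // new_len
    return [sum(src[k * scale:(k + 1) * scale]) / scale for k in range(new_len)]

# Fine alternating detail, like thin lines in a high-resolution still.
src = [255 if i % 2 == 0 else 0 for i in range(64)]

print(point_downscale(src, 16))   # every sample lands on a white pixel: aliased
print(area_downscale(src, 16))    # detail correctly averages to mid-grey
```

With point sampling the alternating lines collapse into a solid tone (and on interlaced, moving video the sampling position shifts every field, which is why the artifacts shimmer like interlacing errors); the integration approach averages the detail away smoothly, which is what "Best" buys you when scaling down.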
stusy wrote on 2/11/2003, 5:04 PM
Also in the "audio" tab under "project properties" see: "resample quality"...
BillyBoy wrote on 2/11/2003, 5:17 PM
What do you consider a 'high resolution' still, Dennis? Is there some formula, like anything two or three times the destination frame size, or does it also depend on the detail in the image, whether or not you pan/zoom, that kind of stuff?