Is "Best" really "Best"?

amendegw wrote on 8/14/2010, 1:29 PM
There are numerous recommendations in this forum to always render to "Best" quality in the "Video Rendering Quality" settings of the "Project" tab.

If the Project Settings have a "Full Resolution Rendering quality" setting of "Best", can one assume that setting "Video Rendering Quality" to "Use Project Settings" is tantamount to setting it to "Best"?

I've always done this, and haven't noticed a quality problem, but just wanted to get the expert opinion.

...Jerry

System Model:     Alienware M18 R1
System:           Windows 11 Pro
Processor:        13th Gen Intel(R) Core(TM) i9-13980HX, 2200 MHz, 24 Core(s), 32 Logical Processor(s)

Installed Memory: 64.0 GB
Display Adapter:  NVIDIA GeForce RTX 4090 Laptop GPU (16GB), Nvidia Studio Driver 566.14 Nov 2024
Overclock Off

Display:          1920x1200, 240 Hz
Storage (8TB Total):
    OS Drive:       NVMe KIOXIA 4096GB
    Data Drive:     NVMe Samsung SSD 990 PRO 4TB
    Data Drive:     Glyph Blackbox Pro 14TB

Vegas Pro 22 Build 239

Cameras:
Canon R5 Mark II
Canon R3
Sony A9

Comments

reberclark wrote on 8/14/2010, 1:48 PM
Good question...I would like to know as well.
kkolbo wrote on 8/14/2010, 2:21 PM
Yes
cbrillow wrote on 8/14/2010, 2:36 PM
For a long time, the advice on this forum was to stick with Sony's recommendation in the Vegas help -- use "Good" for most situations. It was said that switching to "Best" resulted in little discernible improvement to the output, but significantly increased render times. Another situation where "Best" was often recommended was when the project included still images on the timeline.

Functionally, "Best" uses a different scaling technique than "Good". I could recite the different names, but, honestly, the technical details are beyond my level of understanding.
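
For what it's worth, the names usually cited are bilinear scaling for "Good" and bicubic for "Best". Here is a rough sketch of what that difference amounts to outside of Vegas, in Python with Pillow; the Good=bilinear / Best=bicubic pairing is the commonly repeated one rather than anything verified against Vegas internals, and the file name is just a placeholder.

    # Compare bilinear vs. bicubic downscaling, the two filters commonly
    # said to correspond to Vegas' "Good" and "Best" render quality.
    # (That pairing is an assumption here, not confirmed from Vegas itself.)
    from PIL import Image
    import numpy as np

    src = Image.open("frame_1080p.png")   # placeholder source frame
    target = (720, 480)                   # e.g. downconverting HD for DVD

    bilinear = src.resize(target, Image.BILINEAR)
    bicubic = src.resize(target, Image.BICUBIC)

    # Where the two filters disagree is where "Good" and "Best" would
    # differ: mostly fine detail and edges, which is why the gap only
    # really shows up when frames are being resized.
    a = np.asarray(bilinear, dtype=np.int16)
    b = np.asarray(bicubic, dtype=np.int16)
    print("mean abs difference:", np.abs(a - b).mean())
    print("max abs difference:", np.abs(a - b).max())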

It could be argued that the aforementioned strategies were the prevailing best practice when most video was DV. Now that more Vegas users are dealing with HD video that needs to be downconverted for burning to DVD, it seems like we're seeing a lot more recommendations here in the forum to render at the "Best" setting.
musicvid10 wrote on 8/14/2010, 3:28 PM
If there is no resize/pan/zoom/crop going on in the timeline, I doubt anyone could tell a difference.

However, leaving it set to "Best" all the time doesn't result in very much longer rendering times on my machine, like 15% in actual tests.

http://www.sonycreativesoftware.com/forums/ShowMessage.asp?ForumID=4&MessageID=603288

And leaving it at "Best" is certainly less time-consuming than a second render if I've forgotten about something in the project that needs it.

So, to quote Keith, "Yes," Best is really Best.
kkolbo wrote on 8/14/2010, 3:36 PM
The original post was asking whether setting the project setting to Best and leaving the render setting at "Use Project Settings" results in it rendering at Best. The answer is yes.

As for the other point, there is little benefit to rendering at Best over Good if no resizing is being done. Unless I am shooting, editing, and delivering all in the same format, I almost always have resizing.

Like musicvid, I leave it at Best all the time. The delay is not that much in situations where it really wasn't needed. Being brain dead most of the time, this way I do not forget to set it when I do need it. A second render sux.

KK
cbrillow wrote on 8/14/2010, 3:37 PM
"However, leaving it set to "Best" all the time doesn't result in very much longer rendering times on my machine, like 15% in actual tests."

Yeah, I'd suspect that Sony's comment about significantly longer renders was written at a time when slower, single-core CPUs were the norm...
jabloomf1230 wrote on 8/14/2010, 5:25 PM
It's "buried" in the Vegas help file, but here's a summary from another website:

http://forums.creativecow.net/archivepost/24/16974
musicvid10 wrote on 8/14/2010, 8:08 PM
Ah, I didn't get the original question until now, but I'm glad we had a chance to revisit this briefly for new users.
K-Decisive wrote on 8/16/2010, 1:29 PM
OK... here comes the flak...

I always render in 'preview', never in 'good' or 'best', because it actually looked worse when I did some initial tests back in the Vegas 5/6 days. I still do tests out to DNxHD to verify that what I'm rendering is lossless, and IT IS!

My understanding has always been that 'good' and 'best' enable motion blur, which I never use. I also never do any kind of track motion, I save all that stuff for A.E.

The Creative Cow article is eye-opening; my main cameras are all progressive, although I do use video from interlaced cameras every once in a while. I've never seen the interlace issue during rendering.

Just wondering how much crack I might be smoking here.
John_Cline wrote on 8/16/2010, 1:43 PM
"Just wondering how much crack I might be smoking here."

I'm wondering the same thing. Rendering with it set to "preview" will generally produce worse-looking video than good or best.

Technically, DNxHD is not lossless.
K-Decisive wrote on 8/17/2010, 7:14 AM
I know... but it, Cineform, and ProRes are really close.
This is the test I do:
With a 1080p or 720p clip, render out to DNxHD (or whatever you are testing) at the same resolution and the best setting for the codec. I set the rendering quality in Vegas to 'preview'.

Then bring the rendered clip back into Vegas on another track above or below it and set one of them to difference. Except for a few stray highlights here and there, it stays pretty much black the whole way.

So, the question is: is this a good test for losslessness? I've even cranked up the contrast in the preview screen by adding the levels/contrast plug-in. (A scripted version of this check is sketched at the end of this post.)

It may just be that because all my sources are progressive, most of the 'good' and 'best' processing is not needed. It seems to scale down to DV fine enough; that would be the only thing... scratching head...

Example output, DNxHD to QuickTime Pro to H.264: <http://vimeo.com/13898191>
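
For anyone who wants to script that check outside of Vegas, here is a minimal sketch in Python with NumPy. It decodes both clips to raw frames with ffmpeg (assumed to be on the PATH), assumes both are 1080p, and uses placeholder file names.

    # Scripted version of the "difference track" test: decode the original
    # and the render to raw RGB frames and measure how far apart they are.
    import subprocess
    import numpy as np

    W, H = 1920, 1080                     # assumed 1080p clips

    def frames(path):
        """Yield decoded frames as H x W x 3 uint8 arrays via ffmpeg."""
        cmd = ["ffmpeg", "-v", "error", "-i", path,
               "-f", "rawvideo", "-pix_fmt", "rgb24", "pipe:1"]
        proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
        size = W * H * 3
        while True:
            buf = proc.stdout.read(size)
            if len(buf) < size:
                break
            yield np.frombuffer(buf, np.uint8).reshape(H, W, 3)

    worst = 0
    for a, b in zip(frames("original.mov"), frames("dnxhd_render.mov")):
        # Absolute difference catches deviations in both directions,
        # which a difference composite clipped at black could hide.
        d = np.abs(a.astype(np.int16) - b.astype(np.int16))
        worst = max(worst, int(d.max()))
    print("worst per-channel deviation:", worst)   # 0 would mean truly lossless

A few stray low values here match the "pretty much black" result in the preview; a genuinely lossless chain would report exactly zero.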
kkolbo wrote on 8/17/2010, 10:05 PM
"However, leaving it set to "Best" all the time doesn't result in very much longer rendering times on my machine, like 15% in actual tests."

I just ran a test on my new machine. Rendering HD H.264 1080p to AVCHD 720p, "Good" and "Best" made zero difference in render time. It did make a difference in CPU use: on "Good" it ran at 46%; on "Best" it ran at 78%.

I think we have finally reached the point where processing power has outrun the time cost of the two settings.

This is on a 12-thread (six-core) Core i7 980 @ 4.0 GHz with 12 GB RAM @ 1333 MHz.

Oh, the render included transitions, text, and some minor color correction. 1:12 to render a 1:32-long video.
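
If anyone wants to reproduce this kind of Good-versus-Best timing comparison outside of Vegas, here is a rough harness in Python. Since a Vegas render can't be driven this way, it uses ffmpeg's bilinear and bicubic scalers as stand-ins for the two settings (my assumption, following the usual filter pairing); it needs ffmpeg on the PATH, the resource module makes it Unix-only, and the input file name is a placeholder.

    # Time a downscale twice, once per scaling filter, and report wall-clock
    # time and CPU use, mirroring the Good-vs-Best comparison above.
    import resource
    import subprocess
    import time

    def bench(filt, label):
        wall0 = time.time()
        cpu0 = resource.getrusage(resource.RUSAGE_CHILDREN)
        subprocess.run(
            ["ffmpeg", "-y", "-v", "error", "-i", "clip_1080p.mp4",
             "-vf", f"scale=1280:720:flags={filt}",
             "-c:v", "libx264", f"out_{label}.mp4"],
            check=True)
        cpu1 = resource.getrusage(resource.RUSAGE_CHILDREN)
        wall = time.time() - wall0
        cpu = (cpu1.ru_utime - cpu0.ru_utime) + (cpu1.ru_stime - cpu0.ru_stime)
        # CPU seconds divided by wall seconds: above 1.0 means more than
        # one core was kept busy, the kind of jump Keith saw at "Best".
        print(f"{label}: {wall:.1f}s wall, {cpu:.1f}s CPU, {cpu / wall:.1f} cores avg")

    bench("bilinear", "good_standin")   # stand-in for "Good"
    bench("bicubic", "best_standin")    # stand-in for "Best"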
Fredouillelafripouille wrote on 8/18/2010, 3:38 AM
"Then bring the rendered clip back into Vegas on another track above or below it and set one of them to difference"


Setting the compositing mode to "difference" is not the right method: it ignores negative results (i.e., pixels with a negative difference appear black as well). You must use "squared difference" for that!
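
To put numbers on that (a toy sketch in Python with NumPy; that Vegas' difference mode clips negative results at black is as described above, not something verified here):

    # Toy pixels: one where the render came back brighter, one darker.
    import numpy as np

    original = np.array([100, 100, 100], dtype=np.int16)
    rendered = np.array([100, 110, 90], dtype=np.int16)

    # A subtract clipped at zero (the behavior described above) hides the
    # pixel where the render got brighter: its negative result reads black.
    clipped = np.clip(original - rendered, 0, 255)
    print(clipped)    # [ 0  0 10]

    # Squared difference cannot go negative, so both deviations show up.
    squared = (original - rendered) ** 2
    print(squared)    # [  0 100 100]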