I just rendered my first video on my new quad-core Q6600 server with 4 GB of RAM. I noticed that it's only using 30%-40% of the processor. Is there a setting (or settings) I need to turn on for it to use around 80%-90% of the processor?
CPU usage fluctuates and depends on what is actually being rendered. On my machine, straight DV without any FX applied will use 100% of all cores; when FX come into play it drops to approximately 50%.
When rendering HDV, I have yet to see it go past 60%.
<< There is at least one setting that affects this; how many rendering threads have you specified?>>
I'm just using the default, which is set to 4, in Preferences -> Video -> Rendering Threads, and my Dynamic RAM Preview is set to 1024 MB. Other than that I'm using the defaults.
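For what it's worth, that default of 4 matches the number of logical cores on a Q6600 (four physical cores, no Hyper-Threading). A quick way to check the count on any machine (a minimal sketch in Python, nothing Vegas-specific):

    import os
    # Number of logical CPUs the OS reports; 4 on a Q6600
    print(os.cpu_count())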
The video was captured using WinDV; I believe the codec is dvsd.
Consider what rendering actually involves. First Vegas decodes the video using an external decoder, then passes the decoded frames through a set of filters (which may also be written by someone else, if you use third-party filters), and then through transition code, which again may not be Vegas code. Any of these stages may not utilize all the CPUs, and when one doesn't, you will not see very high overall CPU usage.
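A rough back-of-the-envelope way to see why one single-threaded stage caps utilization (the 50/50 time split below is an assumption for illustration, not a measured Vegas number):

    # Hypothetical per-frame time split: work that runs on one core
    # (e.g., a third-party filter) vs. work that scales across cores.
    serial = 0.5      # assumed: filter stage is single-threaded
    parallel = 0.5    # assumed: decode/encode use all cores
    cores = 4

    # Wall time per frame relative to running everything on one core
    wall = serial + parallel / cores

    # Core-time actually used vs. core-time available
    utilization = (serial + parallel) / (wall * cores)
    print(f"{utilization:.0%}")  # -> 40%

So even with only half the work stuck on one core, a quad-core tops out around 40% utilization, which is in the same ballpark as the numbers reported above.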
If you render from DV-AVI or MPEG-2 (HDV) with only Vegas's own filters and transitions, you will generally get full CPU utilization; once you put other stuff in the mix, you may or may not.