Why is VP10 only using about 15% CPU on a quad-core i7 when rendering to file? Is it waiting on RAM? Is there a technical reason it can't use more cycles and thus speed up render times?
You should get more than 15%, but perhaps you are using a codec that doesn't support multiple cores. Which codec are you using for rendering: MPEG-2, DV-AVI, AVC (MP4), AVCHD, or something else? Be as specific as you can, giving both the "type" and the "template." I have a quad-core i7 and will be glad to see if I can duplicate your problem.
"Why on a quad i7 is VP10 only using about 15% CPU usage when rendering to file?"
Wow, this gets asked a lot. A hundred other variables are in the rendering chain, everything from the codecs to filters to sizing to render settings to thread priorities; even CPU temperatures play a role. If you list everything in your rendering workflow, others may be able to offer some assistance.
Vegas Preferences offer some control over your thread preferences, RAM reservation, etc.
You will have to track down the variables unique to your "rendering to file" workflow one by one, and you may or may not be able to address the real bottlenecks in your workflow and system; your CPU is decidedly not one of them.
I'll throw a dart blindfolded and ask you to look at the CPU temperature rise during a render. Does rendering start out fast and then slow down over the first few minutes?
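If you want something more concrete than a blindfolded dart, a minimal sketch along these lines (assuming the third-party psutil package, which has nothing to do with Vegas itself) will log CPU load and clock speed while the render runs; if both sag together after the first few minutes, thermal throttling is a likely suspect. Temperature readout is platform-dependent in psutil, so this sticks to load and frequency:

import psutil

def watch_for_throttling(minutes=5, interval=5):
    """Print CPU load and clock speed every `interval` seconds."""
    for _ in range(int(minutes * 60 / interval)):
        load = psutil.cpu_percent(interval=interval)  # averaged over the interval (this call blocks)
        freq = psutil.cpu_freq()                      # may be None on some systems
        mhz = f"{freq.current:.0f} MHz" if freq else "n/a"
        print(f"CPU load: {load:5.1f}%   clock: {mhz}")

if __name__ == "__main__":
    watch_for_throttling()

Start it just before you hit Render and compare the first minute against the fifth.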
I see johnmeyer checked in, and he can run some comparisons for you, although he may not have any native 1440x1080 AVCHD to test.
You should be getting more than 15%, but all this means is that something other than the CPU is your bottleneck.
Watch Resource Monitor in Task Manager to see whether it's your disk or your RAM that's maxing out.
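If you'd rather have a log to read after the render than watch the graphs live, here is a rough sketch that samples CPU, RAM, and system-wide disk throughput every couple of seconds, again assuming the third-party psutil package:

import psutil

def sample_system(minutes=10, interval=2):
    """Log CPU, RAM, and disk throughput every `interval` seconds."""
    last = psutil.disk_io_counters()
    for _ in range(int(minutes * 60 / interval)):
        cpu = psutil.cpu_percent(interval=interval)   # blocks for `interval` seconds
        ram = psutil.virtual_memory().percent
        now = psutil.disk_io_counters()
        read_mb = (now.read_bytes - last.read_bytes) / 1e6 / interval
        write_mb = (now.write_bytes - last.write_bytes) / 1e6 / interval
        last = now
        print(f"CPU {cpu:5.1f}%  RAM {ram:5.1f}%  "
              f"disk read {read_mb:6.1f} MB/s  write {write_mb:6.1f} MB/s")

if __name__ == "__main__":
    sample_system()

Whichever number sits near its ceiling while the CPU idles is your bottleneck.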
Questions:
1) Are you doing all this on one hard-disk?
2) Is your computer a laptop? (They tend to throttle the clock when they get warm).
3) Is there any network disk involved?
In my (limited) experience, I get a LOT more out of my cores if I have the temp dirs, source files and render destination on separate hard drives. Not just partitions, but physical drives, preferably on separate controllers as well, though that is less of an issue with SATA.
And of course RAM.
With that I get my cores to run at 90-100%. My computer is sounding like my vacuum cleaner and I can turn off the heater in the room while rendering...but the CPUs run at 90-100% :)
You should definitely be getting close to 100% CPU usage when rendering to AVC.
Let's start with the usual suspects.
1. Open the Vegas Preferences dialog (Options -> Preferences) and go to the Video tab. Make sure you have at least 4 rendering threads (for 32-bit Vegas) or 16 rendering threads (for 64-bit Vegas).
2. While in the same dialog, make sure you have something other than zero set for Dynamic RAM preview. You don't want it set to something really large either (avoid more than 1 GB for purposes of this troubleshooting).
3. Bring up the Windows Task Manager (Ctrl-Alt-Del) and click on the Processes tab. Find Vegas10.exe and right-click on it. In Windows XP there is a "Set Affinity" option. Select that, and make sure that all CPUs that are not grayed out have checks in their respective checkboxes. (If you'd rather check this from a script, see the sketch after this list.)
4. While you have the Processes tab selected, click on the CPU column. This will sort the processes by CPU usage. Scroll to the bottom to see which processes are taking the most CPU usage. During a render, you shouldn't see anything else using the CPU, other than Vegas.
5. In the Render As dialog, when you select Sony AVC, click on the Custom tab and look for the setting that determines whether you use the GPU or not. I am not at my main PC now, so I can't tell you exactly what the settings are, but from memory, there are three settings: one which forces the video card GPU to be used; another which stops the GPU from being used; and the third which makes the decision for you. I strongly recommend that you stop the GPU from being used. I actually recommend this no matter what, even if you aren't having any problems. Once you have a multi-core CPU, using the GPU doesn't buy you anything. You can do your own test to confirm this. I have the same recommendation about defrag: don't bother because it never improves performance and just wastes your time. You can do your own tests on this as well to confirm what I say.
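As for the affinity check in step 3, here is a minimal sketch of doing the same thing from a script instead of Task Manager. It assumes the third-party psutil package, that the process shows up as Vegas10.exe (as in the step above; adjust the name if yours differs), and that you run it with sufficient privileges:

import psutil

VEGAS_EXE = "vegas10.exe"  # process name as it appears in Task Manager

for proc in psutil.process_iter(["name"]):
    name = (proc.info["name"] or "").lower()
    if name == VEGAS_EXE:
        print("Current affinity:", proc.cpu_affinity())
        # Allow Vegas on every logical CPU the OS exposes:
        proc.cpu_affinity(list(range(psutil.cpu_count())))
        print("New affinity:", proc.cpu_affinity())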
So, those are my basic recommendations. If you want to post the VEG file, I'll run a test on that and see what I get. You don't have to upload any media, just the VEG file. As others have pointed out, it is possible to use certain fX that don't multi-thread, and that might be causing the problem. I know you said that you didn't do this, but perhaps there is something "lurking" that is causing the problem.
Finally, have you ever gotten 100% CPU usage using any other program?
I would also suggest that you Google "windows 7 enabling multi-core"
Since Chudson has a qualifying NVidia graphics card -- an NVidia 310 -- and Chudson is rendering to Sony AVC, some of the work would be handled by the GPU, wouldn't it? I just don't know how much that would lower the CPU usage.
"Since Chudson has a qualifying NVidia graphics card -- an NVidia 310 -- and Chudson is rendering to Sony AVC, some of the work would be handled by the GPU, wouldn't it? I just don't know how much that would lower the CPU usage."

I just did about ten minutes of testing, both on Win XP Pro 32-bit and on Vista 64-bit, using Vegas 10.0a. I did this to answer both this question and the original question about Vegas not using all cores fully.
I first rendered with 32-bit Vegas 10.0a under Win XP Pro 32-bit. My input video was a combination of HDV and AVCHD, with sharpening and color correction added to each (it was a multi-cam edit). I rendered using the Sony AVC codec with a compatible (it has an "=" sign next to it) 8 Mbps Blu-ray template. I first rendered using the "CPU Only" setting and then the "GPU, if available" setting. I did this test twice. In both cases, the CPU usage never got above 40% during the HDV portion and never got above 65% during the AVCHD portion.
The GPU-assisted render took two seconds longer (1:10 vs. 1:08). I repeated each test to confirm.
I then tried various other tests, eliminating the fX, changing RAM preview, and other things. Nothing affected the CPU usage. Setting the RAM preview to extremes, as always, can really screw up render times (DON'T set it to zero!!!), but there wasn't anything worth reporting.
I then rebooted to Vista 64 and ran Vegas 10.0a 64-bit. I got slightly faster times (1:02), but nothing to write home about. CPU usage was still about the same.
I then tried a bunch of other codecs, including MPEG-2, MXF, and MainConcept AVC. In each case, I got 100% CPU usage.
Conclusion. These tests confirmed what I found several years ago: the Sony AVC codec stinks. You can do a search in these forums on "Sony AVC CPU" and you'll find various tests that support what I've posted here.
Just to be clear, I make this statement about the Sony AVC codec not just because of its poor ability to fully use multiple cores, but also because the quality -- especially at low bitrates -- is pathetic. In particular, if you have read any of the tortuously long, but nonetheless excellent, posts by Nick Hope and Musicvid, about how to get really good MP4 AVC encodes, you'll find that MeGUI and Handbrake -- two free tools -- create substantially better encodes than the Sony AVC codec.
So, if you are using the Sony AVC codec, don't expect much.
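If you do go the external-encoder route, one common approach is to render a high-bitrate intermediate out of Vegas and hand it to HandBrake's command-line encoder. A minimal sketch, with hypothetical file names and a quality (RF) value you would tune yourself:

import subprocess

INTERMEDIATE = "timeline_intermediate.avi"   # placeholder: high-bitrate render out of Vegas
FINAL = "timeline_final.mp4"                 # placeholder output name

subprocess.run(
    ["HandBrakeCLI",
     "-i", INTERMEDIATE,
     "-o", FINAL,
     "-e", "x264",     # x264 encoder
     "-q", "20"],      # constant-quality RF value; lower = better quality, bigger file
    check=True,
)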
I should probably upgrade to the latest Vegas Pro version, because I think Sony did fix some problems related to this codec.
You will not get any GPU advantage with your nVidia 9800GT. The GPU will not be used. You need at least an NVidia 2xx. That is according to the ReadMe for Vegas 10d.
"You will not get any GPU advantage with your nVidia 9800GT. The GPU will not be used. You need at least an NVidia 2xx. That is according to the ReadMe for Vegas 10d."

I find that statement in the Vegas readme very confusing. I think it refers to the GPU designation (there is no "NVidia 2xx" graphics board). I tried to find what GPU is on a 9800GT, but gave up after a few minutes of frustration on the NVidia site. Perhaps a developer has access to this information, but I sure couldn't find it.
The earlier parts of the same readme file (for 10.0a) stated the GPU requirements based on the model of the graphics card, and my 9800GT clearly qualified. Perhaps when they upgraded they deleted support for my card. Bottom line: I can't tell, and since I don't think it is going to make any difference at all in rendering time (remember, my tests were using 10.0a, and according to that same readme, my card IS supported on that release), I'm not going to worry about it.
Very interesting thread. I repeated John's experiment and took it just slightly further.
I am running a lowly Q6600 clocked at 2.4 GHz. I am in the process of upgrading and got a new graphics card, which I installed in the Q6600 machine. It is a GTX460 with 1 GB of GDDR5.
I created a 40-second timeline consisting of 4 clips: 2 AVCHD (GoPro) 1920x1080p clips intermixed with 2 MXF (EX3) 1920x1080p clips (10 sec each). I added a sharpening and a Sony Color Corrector fx to each and rendered using the Sony AVC codec with a compatible 10 Mbps Blu-ray template, much as John did, with and without the GPU. See A below.
Next I removed all fx and rendered as above, see B below. I then added the AAV Colorlab fx, which uses the GPU, to each and repeated both renders, see C below.
I monitored the CPU usage with Windows Task Manager and the GPU usage with GPU-Z 0.5.3. Results were as follows:
A
CPU/GPU    CPU Usage    GPU Usage    File size    Render Time
CPU        55% - 65%    ~7%          40.06Gb      4:35
GPU        70% - 75%    ~9%          48.92Gb      3:11
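If watching GPU-Z by hand gets tedious, here is a rough sketch of polling the GPU from a script instead. It assumes an NVIDIA driver new enough to support nvidia-smi's --query-gpu flags, which drivers for older cards may not offer:

import subprocess
import time

def log_gpu_usage(samples=30, interval=2):
    """Print GPU utilization and temperature every `interval` seconds."""
    for _ in range(samples):
        result = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu,temperature.gpu",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
        print(result.stdout.strip())
        time.sleep(interval)

if __name__ == "__main__":
    log_gpu_usage()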
That's good to know that the GPU makes a difference. My Nvidia 9400 GT is a much earlier model than your GTX460 and even predates the GeForce 200 series, the cutoff, so I don't get any benefits. When I click the "Check GPU" button on the System tab of the Sony AVC custom settings dialog, the message comes back "GPU unavailable." That button is new in 10d; then again, I didn't need a button in earlier Vegas versions to tell me that the GPU was unavailable.