Hello all. I was testing the number of render threads on a small test project. I have a 2.8 GHz i7 system with 8 GB of memory, a GTX 470 video card, and a 1.3 TB RAID 0 hard drive array.
I was rendering 1280 x 720 Canon SX20 IS MOV files to Sony AVC for 1920 x 1080 Blu-ray, generating an AVC elementary stream.
Tests:
Threads     CPU/GPU     Time
1 thread    CPU only    2:40
8 threads   CPU only    2:40
1 thread    with GPU    2:13
8 threads   with GPU    2:14
Watching Task Manager's Resource Monitor shows essentially the same CPU utilization no matter what the thread setting is or whether the GPU is used. On portions of the timeline with straight video, CPU utilization may only run around 21 - 25%. On portions of the timeline with effects or crossfades, it goes as high as 51%. Resource Monitor does show additional threads appearing for the Vegas100.exe process as I go from 1 to 8 render threads, but that makes no difference in render time. So, what is the point?
Do any of you see a difference on your systems?
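If anyone wants to log this instead of eyeballing Resource Monitor, here is a rough Python sketch using the psutil package that samples CPU use and thread count for the render process during a render. The process name and the sampling numbers are just assumptions from my setup, so adjust them as needed; this is only a sketch, not anything official.

```python
# Rough sketch: periodically sample CPU usage and thread count for the
# Vegas render process using psutil, as an alternative to watching
# Task Manager / Resource Monitor by hand.
import time
import psutil

PROCESS_NAME = "Vegas100.exe"  # assumed executable name; adjust for your install


def find_process(name):
    """Return the first running process whose name matches, or None."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == name.lower():
            return proc
    return None


def sample(duration_s=120, interval_s=1.0):
    """Print CPU % and thread count once per interval for duration_s seconds."""
    proc = find_process(PROCESS_NAME)
    if proc is None:
        print(f"{PROCESS_NAME} is not running")
        return
    end = time.time() + duration_s
    while time.time() < end:
        try:
            # cpu_percent() can exceed 100% on multi-core systems
            # (it is relative to a single core).
            cpu = proc.cpu_percent(interval=interval_s)
            threads = proc.num_threads()
            print(f"CPU: {cpu:6.1f}%   threads: {threads}")
        except psutil.NoSuchProcess:
            print("render process exited")
            break


if __name__ == "__main__":
    sample()
```

Running that during a 1-thread render and again during an 8-thread render should show whether the extra threads are actually doing any work, which is basically what I saw in Resource Monitor.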