Q: Realtime Multi-Threaded/Processor Support

Jordan T. wrote on 6/24/2005, 8:38 PM
I recently purchased a new PC that has dual 2.8GHz Xeon processors (800MHz FSB). I tried using the Magic Bullet HD plugin and noticed a CPU utilization drop to about 30-50%; this has been noted in other posts in this forum.

I dug a little deeper and I believe the following to be true:

1. When playing your project in the realtime preview, Vegas only uses one thread. This essentially negates the benefit of having two processors, because one CPU is idle.

2. When rendering to a file, however, Vegas uses the number of threads the user defined in the Video preferences page.

3. While using the Magic Bullet plugin and rendering to a file, it appears that at least one of the rendering threads is waiting for another rendering thread to finish its work (pushing the video through the Magic Bullet plugin), and because of this, Vegas is not using the full processing bandwidth of my dual-CPU system (see the rough sketch at the end of this post).

My question is: can I configure Vegas in a way that better orchestrates its rendering threads, both for realtime preview and for rendering to a file?
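
To make point 3 concrete, here is a rough sketch in plain Python. It is not Vegas or Magic Bullet code, just a model of my guess: two render threads that each have to funnel every frame through one effect stage that admits only a single thread at a time. The frame counts and timings are made up purely for illustration.

# Hypothetical model only -- NOT Vegas/Magic Bullet code.
# Simulates render threads sharing one serialized (single-threaded) plugin stage.
import threading
import time

FRAMES = 16
PLUGIN_TIME = 0.08   # pretend the plugin effect is the expensive stage
OTHER_TIME = 0.02    # decode/composite work that parallelizes fine

def render(workers, plugin_is_serialized):
    lock = threading.Lock() if plugin_is_serialized else None

    def worker(frames):
        for _ in frames:
            time.sleep(OTHER_TIME)          # parallel-friendly work
            if lock is not None:
                with lock:                  # only one thread inside the plugin
                    time.sleep(PLUGIN_TIME)
            else:
                time.sleep(PLUGIN_TIME)     # plugin usable from both threads

    chunks = [range(i, FRAMES, workers) for i in range(workers)]
    threads = [threading.Thread(target=worker, args=(c,)) for c in chunks]
    start = time.time()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.time() - start

print("1 thread:                 %.2fs" % render(1, True))
print("2 threads, plugin locked: %.2fs" % render(2, True))
print("2 threads, plugin free:   %.2fs" % render(2, False))

With these made-up numbers, the locked two-thread run finishes only slightly faster than the single thread, while the unlocked run is roughly twice as fast -- which is the kind of behavior that would show up as 30-50% total CPU utilization on a dual-CPU box.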

Comments

TheRhino wrote on 6/25/2005, 1:56 PM
I can't answer your question, but thanks for the info on turning off realtime preview to take advantage of the 2nd processor. You just made my day! I'm setting up an Athlon X2 system and didn't notice remarkable performance [running one instance of Vegas] until I turned off the preview. I won't be using Magic Bullet, but if I find a way to use realtime preview with dual processors, I'll post my findings.

Workstation C with $600 USD of upgrades in April, 2021
--$360 11700K @ 5.0GHz
--$200 ASRock W480 Creator (onboard 10G net, TB3, etc.)
Borrowed from my 9900K until prices drop:
--32GB of G.Skill DDR4 3200 ($100 on Black Friday...)
Reused from same Tower Case that housed the Xeon:
--Used VEGA 56 GPU ($200 on eBay before mining craze...)
--Noctua Cooler, 750W PSU, OS SSD, LSI RAID Controller, SATAs, etc.

Performs VERY close to my overclocked 9900K (below), but at stock settings with no tweaking...

Workstation D with $1,350 USD of upgrades in April, 2019
--$500 9900K @ 5.0GHz
--$140 Corsair H150i liquid cooling with 360mm radiator (3 fans)
--$200 open box Asus Z390 WS (PLX chip manages 4/5 PCIe slots)
--$160 32GB of G.Skill DDR4 3000 (added another 32GB later...)
--$350 refurbished, but like-new Radeon Vega 64 LQ (liquid cooled)

Renders Vegas 11 "Red Car Test" (AMD VCE) in 13s when clocked at 4.9GHz
(note: BOTH onboard Intel & Vega64 show utilization during QSV & VCE renders...)

Source Video1 = 4TB RAID0--(2) 2TB M.2 on motherboard in RAID0
Source Video2 = 4TB RAID0--(2) 2TB M.2 (1) via U.2 adapter & (1) on separate PCIe card
Target Video1 = 32TB RAID0--(4) 8TB SATA hot-swap drives on PCIe RAID card with backups elsewhere

10G Network using used $30 Mellanox2 Adapters & Qnap QSW-M408-2C 10G Switch
Copy of Work Files, Source & Output Video, OS Images on QNAP 653b NAS with (6) 14TB WD RED
Blackmagic Decklink PCIe card for capturing from tape, etc.
(2) internal BR Burners connected via USB 3.0 to SATA adapters
Old Cooler Master CM Stacker ATX case with (13) 5.25" front drive-bays holds & cools everything.

Workstations A & B are the 2 remaining 6-core 4.0GHz Xeon 5660 or i7-980X systems on Asus P6T6 motherboards.

$999 Walmart Evoo 17 Laptop with i7-9750H 6-core CPU, RTX 2060, (2) M.2 bays & (1) SSD bay...