Ultimate Vegas Pro PC Build

Comments

weinerschizel wrote on 12/27/2022, 10:59 PM

Interesting. I'll have to keep my eyes open for a cheap Intel GPU. I also need to make sure my power supply can handle two GPUs. The Intel QSV GPUs seemed to need a fair amount of power. Was there a particular reason for an Intel & Nvidia GPU instead of two Nvidia GPUs?

RogerS wrote on 12/28/2022, 2:22 AM

This was covered above: first, NVDEC is more limited than QSV in Vegas. Second, you can't choose between NVIDIA cards in Vegas, so do you really want a low-end card used instead of the better one?

For recent-gen GPUs the Intel power requirements are low: 225 W for the A750, for example, or 75 W for the A380. Here's one review of Arc: https://techgage.com/article/intel-arc-a750-a770-workstation-review/

Howard suggested you test your current cards and see if there are performance gains to be had with say AMD doing decoding and NVIDIA the rest.

Custom PC (2022) Intel i5-13600K with UHD 770 iGPU with 31.0.101.4091 driver, MSI z690 Tomahawk motherboard, 64GB Corsair DDR5 5200 ram, NVIDIA 2080 Super (8GB) with latest studio driver, 2TB Hynix P41 SSD, Windows 11 Pro 64 bit

Dell XPS 15 laptop (2017) 32GB ram, NVIDIA 1050 (4GB) with latest studio driver, Intel i7-7700HQ with Intel 630 iGPU (driver 31.0.101.2115), dual internal SSD (256GB; 1TB), Windows 10 64 bit

Vegas 19.648
Vegas 20.270

VEGAS 4K "sample project" benchmark: https://forms.gle/ypyrrbUghEiaf2aC7
VEGAS Pro 20 "Ad" benchmark: https://forms.gle/eErJTR87K2bbJc4Q7

weinerschizel wrote on 1/12/2023, 10:55 AM

I've managed to get through to Eric at Magix and I'm working with him to troubleshoot this render issue. He had me export a system info file from msinfo32. I made a little screen capture showing how to do that, linked below:

https://drive.google.com/file/d/1ym2wjPfGp8P7W7TzDaDvVIWiIFuo-O6p/view?usp=sharing
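For anyone who prefers to skip the GUI, here's a rough sketch of the same export done from a script, using msinfo32's /nfo switch; it assumes Python on Windows, and the output path is just an example.

```python
# Minimal sketch (output path is an example): export the same System Information
# snapshot the msinfo32 GUI produces, using its /nfo command-line switch.
import subprocess
from pathlib import Path

report = Path.home() / "Desktop" / "system_report.nfo"
subprocess.run(["msinfo32", "/nfo", str(report)], check=True)  # waits until the report is written
print(f"System info report saved to {report}")
```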

If anybody else wants me to submit their problem along with mine, let me know and I'll forward your msinfo file along with a description of your instance of the render issue.

Howard-Vigorita wrote on 1/12/2023, 4:22 PM

Interesting. I'll have to keep my eyes open for a cheap Intel GPU. I also need to make sure my power supply can handle two GPUs. The Intel QSV GPUs seemed to need a fair amount of power. Was there a particular reason for an Intel & Nvidia GPU instead of two Nvidia GPUs?

@weinerschizel Vegas doesn't yet have a way to select between two GPUs of the same brand for rendering. It seems to always choose an Intel iGPU over an Arc when I enable both, and it doesn't team them like Resolve and ffmpeg do. Two AMD or NVIDIA cards would probably be the same deal, but maybe it would be slot-dependent, so you could pick that way.

Here in the US the Arc A380 is still available and inexpensive. My 1st install is documented here. The main advantage to me is the ability to decode 4:2:2 formats. Just be careful that your motherboard will operate with two GPUs plus the other hardware you have installed, like multiple M.2 and/or SATA drives, which might share or steal PCIe lanes from slot #2.

weinerschizel wrote on 1/12/2023, 4:29 PM

Hey @Howard-Vigorita, thanks for the heads up on the motherboard. I'll double-check that. I had only thought of the power supply; I need to make sure it has enough power and the connectors to power two cards.

I have noticed so far that my GeForce card LOVES to decode clips without efx. It decodes them better than the old Radeon 480x. However, it falls flat on its face once I color grade or add other efx.

Do you process the efx with the Intel card? I assume it's better for that?

The Nvidia card renders like a BEAST. Unfortunately, there are issues with the Magix GPU render profiles that prevent me from using them with differing source codecs. @RogerS has been contributing to another thread with me, and we're working on getting resolution with Magix on that issue. I may snag one of those Intel GPUs once we have it all sorted out with Magix :)

Reyfox wrote on 1/13/2023, 12:23 PM

The "ultimate" computer build that has run the Vegas Benchmark that @RogerS has been keeping up, tend to be AMD computers.

weinerschizel wrote on 1/24/2023, 10:59 PM

Yeah, I saw Roger's spreadsheet. He's done a great job with it. My goal at this point isn't to start a fresh build; it's to maximize the i7 build I put together back in 2016, which at that time was about as good as I could get.

RogerS wrote on 1/24/2023, 11:02 PM

There's a new sample project and spreadsheet to play with. I'd love to see some more AMD systems on it.

https://forms.gle/ZVqESgyoej3eLzJ78


weinerschizel wrote on 1/26/2023, 12:39 PM

@Howard-Vigorita I almost bought the Intel A380 yesterday! But then I sat down and did more research on how much performance it offers. It seems it may be slower than the 2080 Ti in my system, which may be a moot point if I use it only for decode or encode and the 2080 Ti for everything else.

When I check in Task Manager / Performance, the GPU seems to have ample headroom and is well underutilized (unless Task Manager doesn't tell the whole story). That said, the 2080 Ti doesn't really like to decode efx, LUTs, fades, etc.

What do you think I can still squeeze out for even better timeline playback?

Screen capture of fiddling on timeline with task manager open:

https://drive.google.com/file/d/1-DZ3FGiCoCdZvgB_Aa4t0JbLXks1RaRu/view?usp=sharing

Howard-Vigorita wrote on 1/26/2023, 3:23 PM

@weinerschizel I would not use any of the Arc boards as a primary GPU if you already have an AMD 580, Nvidia 1660, or better. What Arc really excels at is decoding and rendering hardware formats the others cannot. Vegas supports it for decoding AVC and HEVC in more color depths than the others, but it does not yet support any more rendering formats than it previously did with UHD 630 iGPUs... 8-bit AVC and up to 10-bit HEVC, but no hyper. Ffmpeg can render hyper, qsv-av1 (somewhat), but not qsv-vp9 yet. It makes a great 2nd GPU if your system can handle one. I don't think it would help Vegas playback much unless you were dealing with a hard-to-decode format like HEVC.
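To see what the Quick Sync path looks like outside Vegas, here's a minimal sketch of a QSV HEVC render via ffmpeg; it assumes an ffmpeg build with QSV support and an Intel Arc/iGPU present, and the file names are placeholders.

```python
# Sketch of a Quick Sync (QSV) encode outside Vegas. Assumes an ffmpeg build
# with QSV enabled and an Intel Arc/iGPU installed; file names are placeholders.
import subprocess

cmd = [
    "ffmpeg", "-y",
    "-hwaccel", "qsv",          # hardware decode on the Intel GPU where supported
    "-i", "input.mov",
    "-c:v", "hevc_qsv",         # Quick Sync HEVC encoder
    "-global_quality", "23",    # quality target (lower = better quality, bigger file)
    "-c:a", "copy",
    "output_qsv.mp4",
]
subprocess.run(cmd, check=True)
```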

weinerschizel wrote on 1/26/2023, 4:39 PM

@Howard-Vigorita thanks for that tip. I think the most I do is 10-bit HEVC w/ my Mavic 2 Pro footage. Everything else is 8-bit. Where do you find the specs on what can be decoded / encoded w/ the hardware chosen? I keep getting the marketing fluff on all these cards when I research them.

At this point I'm mainly looking to get my timeline to play back better. It's pretty good, but for whatever reason it doesn't like efx (despite the GPU having plenty of headroom when working with them). I cannot find the bottleneck. Here's a screen capture of me looking at the Task Manager / Performance tab:

https://drive.google.com/file/d/1-DZ3FGiCoCdZvgB_Aa4t0JbLXks1RaRu/view?usp=sharing

Todd-A0 wrote on 1/26/2023, 6:07 PM

The decoding for playback still appears to be single-core; however, it does use all your CPU cores when rendering, whereas with iPhone 10-bit video it will now use all your CPU cores for both playback and encoding. If your problem is only with playback, then this explains it and is most likely your bottleneck. It's not a lack of GPU power or CPU power; it's that Vegas can only use a single CPU core for GPU decode of this file on playback.

This is true for a simple transcode, but with an actual project you could be introducing other single-threaded effects or procedures that can also cause the lack of resource use you're seeing.
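One way to confirm that kind of single-thread bottleneck is to watch per-core load instead of the overall CPU percentage. A rough sketch follows, using the third-party psutil package (an assumption; install it with pip).

```python
# Rough sketch: sample per-core CPU load while scrubbing the timeline.
# One core pinned near 100% while the others idle points to a single-threaded
# stage (such as the decoder feed) rather than a lack of total CPU or GPU power.
# Assumes the third-party psutil package is installed (pip install psutil).
import psutil

for _ in range(30):                                        # ~30 one-second samples
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(f"max core: {max(per_core):5.1f}%   all cores: {per_core}")
```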

weinerschizel wrote on 1/26/2023, 6:31 PM

Interesting. Where do you find all this in-depth information? It sounds like Vegas doesn't fully support multi-threading... in certain aspects.

Todd-A0 wrote on 1/26/2023, 6:56 PM

This is a common problem that's been asked about many times. I should also say I missed that you're using VP19; I am using the latest build of VP20, where it uses only a single core for playback BUT uses all your CPU cores for encoding. This may not be true for VP19; it's possible that when rendering it's also only using a single CPU core to drive the GPU decoder.

Look at your CPU in Task Manager like you've been doing. It can be difficult to see, especially if you have other software running in the background. It's most obvious to me when I play an iPhone HEVC 10-bit 4K60 clip and look at the CPU graph, then do the same with a DJI HEVC 10-bit 4K60 clip. It seems to be the case, but I'm always happy to be corrected. It's also a bit difficult to make the comparison, as the DJI files only use I and P frames that are easy to process, while the iPhone uses about 50% B frames that are computationally more complex and would be expected to use more CPU.
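If you'd rather check the frame-type mix of your own clips than eyeball the CPU graph, here's a small sketch using ffprobe's per-frame pict_type output; it assumes ffprobe is on the PATH, and the clip names are placeholders.

```python
# Sketch: count I/P/B frames in a clip with ffprobe to compare how much
# B-frame work different cameras produce. Assumes ffprobe is on the PATH;
# clip names are placeholders. Scanning every frame can take a while on long clips.
import subprocess
from collections import Counter

def frame_types(path: str) -> Counter:
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "frame=pict_type", "-of", "csv=p=0", path],
        capture_output=True, text=True, check=True,
    )
    return Counter(out.stdout.split())

print(frame_types("iphone_hevc_4k60.mov"))  # expect a large share of B frames
print(frame_types("dji_hevc_4k60.mp4"))     # expect mostly I and P frames
```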

fr0sty wrote on 1/26/2023, 7:43 PM

If you really want to get the most out of VEGAS, capture your video in intermediate codecs... The money you invest into new GPUs and CPUs could be invested into recorders that capture in ProRes 422 and some hard drives, and you'd probably get a better performance boost (not to mention a big quality boost) out of using ProRes across the board than you would buying this or that GPU, assuming your system already meets the recommended specs for editing 4K.

That would be an interesting test to try out... Shoot a gig using both SD cards with a compressed format like 10-bit AVC 4:2:2 and HDMI out of those cameras into Atomos recorders capturing ProRes 4:2:2. Then, just before upgrading a PC, do a speed test editing a multicam edit of the ProRes clips on the old build... then upgrade it with a new CPU, RAM, and GPU and do performance tests running the more compressed in-camera formats like 10-bit AVC 4:2:2 150 Mbps (like out of a Panasonic S1H) or HEVC... and see which performs better: spending the money on the Atomos recorders and some hard drives, or spending the money on new chips in the system.
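For reference, the intermediate-codec route doesn't have to wait for a recorder either; this is a hedged sketch that batch-transcodes camera originals to ProRes 422 HQ with ffmpeg's prores_ks encoder. Folder names and the profile choice are illustrative.

```python
# Sketch: batch-transcode camera originals to ProRes 422 HQ intermediates with
# ffmpeg's prores_ks encoder. Folder names and profile choice are illustrative.
import subprocess
from pathlib import Path

SRC = Path("camera_originals")
DST = Path("prores_masters")
DST.mkdir(exist_ok=True)

for clip in sorted(SRC.glob("*.mp4")):
    out = DST / (clip.stem + ".mov")
    subprocess.run([
        "ffmpeg", "-y", "-i", str(clip),
        "-c:v", "prores_ks", "-profile:v", "3",  # profile 3 = ProRes 422 HQ
        "-c:a", "pcm_s16le",                     # uncompressed PCM audio
        str(out),
    ], check=True)
```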

Last changed by fr0sty on 1/26/2023, 7:43 PM, changed a total of 1 times.

Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64GB 3000mhz DDR4

Geforce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

Howard-Vigorita wrote on 1/26/2023, 7:58 PM

I think the most I do is 10-bit HEVC w/ my Mavic 2 Pro footage. Everything else is 8-bit. Where do you find the specs on what can be decoded / encoded w/ the hardware chosen?

@weinerschizel The Nvidia support matrix is here:

https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new

I never found an official AMD version or an accurate Intel one, but Intel's HEVC decoding format support and performance is the most advanced. Your Mavic 10-bit HEVC will be the most difficult to play because it's HEVC, and it might be made more difficult if the GOP sizes are too large... GOP size refers to the frame types mentioned earlier and how far apart the special frames needed to start decoding are spaced. I have some Mavic 8-bit HEVC 60p samples and their maximum GOP length is 30, which is on the high side in my opinion... I like to see 15 or preferably N=1 (known as intra) for the quickest and easiest decoding. I think the Mavic only uses 4:2:0 chroma, which is easier to decode than 4:2:2, which only Intel can decode in hardware. MediaInfo usually reveals the color depth but often omits GOP specs. It would be helpful if you could make a short sample available on a cloud drive.
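Since MediaInfo often leaves GOP length out, here is a rough sketch that derives it from ffprobe's per-frame keyframe flags; ffprobe on the PATH and the clip name are assumptions.

```python
# Sketch: report the longest GOP (frames between keyframes) in a clip using
# ffprobe's per-frame key_frame flag, since MediaInfo often omits GOP specs.
# Assumes ffprobe is on the PATH; the clip name is a placeholder.
import subprocess

def max_gop(path: str) -> int:
    flags = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "frame=key_frame", "-of", "csv=p=0", path],
        capture_output=True, text=True, check=True,
    ).stdout.split()
    longest = run = 0
    for is_key in flags:                 # "1" on keyframes, "0" otherwise
        run = 1 if is_key == "1" else run + 1
        longest = max(longest, run)
    return longest

print(max_gop("mavic_hevc_sample.mp4"))  # e.g. 30 would match the samples mentioned above
```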

RogerS wrote on 1/26/2023, 8:13 PM

@weinerschizel For what can be decoded by which GPU, see this chart. It should mostly be applicable to Vegas (though some of these formats aren't shot by any camera I've seen in the real world). Use MediaInfo to match your media to what's here.

https://www.pugetsystems.com/labs/articles/what-h-264-and-h-265-hardware-decoding-is-supported-in-davinci-resolve-studio-2122/

I think your terminology suggests the source of continued confusion over the role of GPUs. Decoding only refers to the playback of compressed media. Nobody would suggest you swap out a decent GPU for an Intel Arc or iGPU as overall GPU performance is inferior. However for decoding itself (not LUTs, not Fx, not animating pan/crop, etc.) there are benefits to Intel GPUs as shown in the chart above.
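If it helps, the fields that chart keys on (codec, profile, pixel format / chroma and bit depth) can also be pulled with ffprobe; a small sketch follows, with the file name as a placeholder. MediaInfo shows the same details.

```python
# Sketch: pull the fields the decode-support chart cares about (codec, profile,
# pixel format / bit depth) from a clip with ffprobe. MediaInfo shows the same
# details; the file name here is a placeholder.
import subprocess

clip = "gopro_hevc_clip.mp4"
info = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=codec_name,profile,pix_fmt,width,height,r_frame_rate",
     "-of", "default=noprint_wrappers=1", clip],
    capture_output=True, text=True, check=True,
)
print(info.stdout)
# e.g. pix_fmt=yuv420p10le is 4:2:0 10-bit, which NVDEC can decode;
#      yuv422p10le (4:2:2 10-bit) is the case only Intel handles in hardware.
```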

Custom PC (2022) Intel i5-13600K with UHD 770 iGPU with 31.0.101.4091 driver, MSI z690 Tomahawk motherboard, 64GB Corsair DDR5 5200 ram, NVIDIA 2080 Super (8GB) with latest studio driver, 2TB Hynix P41 SSD, Windows 11 Pro 64 bit

Dell XPS 15 laptop (2017) 32GB ram, NVIDIA 1050 (4GB) with latest studio driver, Intel i7-7700HQ with Intel 630 iGPU (driver 31.0.101.2115), dual internal SSD (256GB; 1TB), Windows 10 64 bit

Vegas 19.648
Vegas 20.270

VEGAS 4K "sample project" benchmark: https://forms.gle/ypyrrbUghEiaf2aC7
VEGAS Pro 20 "Ad" benchmark: https://forms.gle/eErJTR87K2bbJc4Q7

fr0sty wrote on 1/26/2023, 9:22 PM

I didn't see that you were shooting on a Mavic 2... it's not capable of shooting ProRes, but the Mavic 3 (Cine) is! So that upgrade might do you better than buying new computer stuff.

Last changed by fr0sty on 1/26/2023, 9:22 PM, changed a total of 1 times.


weinerschizel wrote on 1/27/2023, 11:02 PM

@fr0sty great point about ProRes, hadn't thought of that. For a short while I was working in raw and ProRes (Magic Lantern stuff for the 5D). It cut like butter, but the files were so darn big. My plan is to run this Mavic till it's used up, then switch to a Mavic 3 when there are more used / depreciated ones out there.

@RogerS darn, that's helpful! Thanks. Yeah, I don't know how they process efx; your clarification helps. The decode module only decodes the codec? Makes sense...

RogerS wrote on 1/28/2023, 3:02 AM

No worries, it's confusing. I found this schematic from NVIDIA helpful: https://developer.nvidia.com/nvidia-video-codec-sdk
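One way to see that engine split in practice is nvidia-smi's dmon mode, which reports the NVDEC/NVENC engines separately from the 3D/compute load. A minimal sketch, assuming the NVIDIA driver's nvidia-smi tool is on the PATH, is below.

```python
# Minimal sketch: watch GPU engine utilization while playing the timeline.
# "nvidia-smi dmon -s u" prints per-second sm/mem/enc/dec columns, so you can
# see the NVDEC decode engine working independently of the 3D/compute (sm) load.
# Assumes the NVIDIA driver's nvidia-smi tool is on the PATH.
import subprocess

subprocess.run(["nvidia-smi", "dmon", "-s", "u", "-c", "20"], check=True)  # 20 samples, then exit
```

Task Manager can show something similar if you switch one of the GPU engine graphs to "Video Decode".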


weinerschizel wrote on 2/14/2023, 10:15 PM

Playing around with the HEVC encoder / rendering a video, I noticed something interesting.

Does the 2080Ti and or Vegas not support HEVC GPU accelerated decoding?

RogerS wrote on 2/14/2023, 10:48 PM

It does, except for 10-bit 4:2:2. Also see if GPU activity changes when you check and uncheck legacy HEVC in the File I/O preferences.


Todd-A0 wrote on 2/14/2023, 10:51 PM

@weinerschizel It's not your hardware; Vegas's GPU decoder doesn't work with certain 10-bit HEVC media when using a decoder other than an Intel iGPU. That's a Vegas problem to fix. As for the zigzag, that's the weird 60-frame chunks that Vegas encodes in when using the Magix AVC/HEVC GPU encoder; it needs to pause every 60 frames. You can use Voukoder instead.

It may work the same as HLS video streaming, which also uses 60-frame chunks (mostly). You can see that here, and it may be what Vegas is doing, not sure.

If it is doing that, it would then have to assemble the chunks at the end, which may sometimes cause the 99%/100% render failures, either due to running out of hard drive space or Vegas bugging out.
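Just to illustrate the pattern being described (not a claim about what Vegas actually does internally), here is a hedged sketch of a fixed-GOP, segmented encode in ffmpeg: a keyframe is forced every 60 frames and HLS segments are cut on those boundaries. File names are placeholders.

```python
# Purely illustrative sketch of a fixed-GOP, chunked encode like the one described
# above; it is NOT a claim about Vegas internals. Keyframes are forced every 60
# frames and HLS cuts segments on those boundaries. File names are placeholders.
import subprocess
from pathlib import Path

Path("segments").mkdir(exist_ok=True)
subprocess.run([
    "ffmpeg", "-y", "-i", "input.mp4",
    "-c:v", "libx264", "-g", "60", "-keyint_min", "60", "-sc_threshold", "0",
    "-c:a", "aac",
    "-f", "hls", "-hls_time", "1", "-hls_playlist_type", "vod",
    "segments/master.m3u8",
], check=True)
```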


weinerschizel wrote on 2/14/2023, 11:02 PM

@RogerS good to know. I'll try the legacy mode. It's 8-bit HEVC (GoPro footage). I think my Mavic 2 Pro is 10-bit HEVC, so I'll keep that in mind as well.

@Todd-A0 dang! That's high speed. Where'd you get the debug window? Yeah, my Vegas surges, both on timeline playback and when rendering. I also have Voukoder... I'm testing to see if I fixed the issue with the Magix render glitches.

Also curious if there's anything a person could do hardware-wise to improve that surging. Sounds like not... like Vegas just chunks away at that speed?

I think when the bigger Intel GPU boards get a bit cheaper, I'll throw one in as a secondary GPU. I also noticed my system can handle up to 128GB of memory; however, I don't ever seem to run out with 64GB.

Last changed by weinerschizel on 2/14/2023, 11:02 PM, changed a total of 1 times.

Windows 10 Ultimate Editing Machine 10 core i7-6950x CPU / 64gb ram / Nvidia 2080Ti GPU / M.2 main drive & 1tb SSD capture scratch drive

My work Real Estate Broker by day HERE / Camera Man for hire HERE / A mountain man otherwise HERE