Mixed GPU systems

SnarfConsortium wrote on 8/26/2025, 9:06 AM

Is anyone still using an Intel Arc card in their system with Vegas 22? The last time I ran a dual-GPU setup was back on Vegas 20-21, but I had to pull the extra GPU to make room for an internal capture device.

 

I no longer need that device and have been wanting to put my A380 back into my system, but looking over the forums I see reports of fairly poor performance from V22 with dual-GPU setups that use the Arc for hardware AV1. Those posts were from last year, though.

 

Has anyone had better performance with V22 and Arc cards, or would I be better off just sticking with my primary RTX 2080 Ti for rendering?

I primarily edit footage captured via OBS from video games, with some live-action bits mixed in.

Vegas Pro 23 (VP21 also installed for a previous project that uses Vegas Effects heavily)

Win 11 Pro 24H2 (Build 22631.5909)

AMD Ryzen 9 5900X 12 cores

32GB DDR4 2133 MT/s

Nvidia RTX 2080 Ti 11 GB

Gigabyte X570 Aorus Ultra

Sony a6600, a5100

OBS - 2560x1440, HDR, 60fps, HEVC, CQP @ 20, Main-10

TASCAM 16x08 US, Behringer ADA8200

Rode PodMic, Rode Procaster, Shure SM7B, AT-2020, AT-2035, AT-875R

Comments

johnny-s wrote on 8/26/2025, 11:16 AM

 

@SnarfConsortium

I use an Intel GPU together with the Nvidia to assist with 10-bit 4:2:2. See my signature, PC 2.


PC 1:

Intel i9-9900K

32 GB Ram

AMD Radeon XFX RX 7900 XT

Intel UHD 630

Win 10

PC 2:

AMD Ryzen 9 7950X3D 16-core CPU

64 GB Ram

Nvidia 4090 GPU

Intel A770 GPU

Win 11

 

Laptop:

Intel 11th Gen 8-core CPU, i9-11900K

64 GB Ram

Nvidia RTX 3080 GPU

Win 10

Howard-Vigorita wrote on 8/26/2025, 12:28 PM

@SnarfConsortium Yes. I still have an A380 in my 11900K system along with an Nvidia 4090 and an AMD 6900 XT, and another one in a 9900K system along with an AMD Radeon VII. I use the A380s for decoding with older Vegas versions, but mid-VP21 Vegas was redesigned to run faster with timeline, FX, and decoder processing all on the same GPU, so in newer Vegas releases I only use them for QSV rendering. They're still great for decoding in Resolve Studio and ffmpeg, though, and both of those also do high-performance QSV hyper-rendering, or whatever Intel calls it these days. I also have an A770 that at some point I'll be dropping into a new build alongside something else... probably a 5090. Vegas is fine with dual GPUs as long as they're not from the same vendor. I find I have to disable slower iGPUs, otherwise Vegas always picks them for QSV rendering.

The Arcs can decode AV1, so that might be a plus, but Vegas still renders AV1 with just the CPU. If you use Voukoder, however, that will render AV1 with an Arc.
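If you'd rather do the AV1 pass outside the NLE, a rough ffmpeg command along these lines should run both the decode and the AV1 encode on the Arc's QSV engines (just a sketch, assuming a recent ffmpeg build with QSV support and an HEVC source; the filenames and quality value are placeholders):

ffmpeg -hwaccel qsv -c:v hevc_qsv -i input.mov -c:v av1_qsv -preset slower -global_quality 24 -c:a copy output.mkv

Here -hwaccel qsv with hevc_qsv handles the hardware decode and av1_qsv does the hardware AV1 encode. On a multi-GPU box ffmpeg may grab the wrong adapter for QSV; adding something like -init_hw_device qsv=hw,child_device=1 -hwaccel_device hw before the input (the index depends on your system) should point it at the Arc.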

andyrpsmith wrote on 8/26/2025, 12:30 PM

Here is the Nvidia encode/decode matrix:

https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new

(Intel 3rd-gen i5 @ 4.1 GHz, 32 GB RAM, SSD, 1080 Ti GPU, Windows 10) No longer used with Vegas.

13th-gen i9-13900K (water cooled), 96 GB RAM, 4 TB M.2 drive, 4 TB games SSD, 2 TB video SSD, RTX 4080 Super GPU, Windows 11 Pro

RogerS wrote on 8/26/2025, 7:57 PM

You can see Arc performance with FX-heavy projects in my signature; it underperforms. I have a 2080 Super and at this point would rather use it for decoding, encoding, and FX than my Intel iGPU, since it's faster. The only exception is 10-bit 4:2:2 HEVC, where VEGAS automatically invokes the iGPU and it works well.

Howard-Vigorita wrote on 9/1/2025, 12:13 PM

Fwiw, I got the absolute best, smooth and creamy display performance doing 4K 4:2:0 HEVC multicam cuts plugged into an Intel A770 set as both the video and I/O decoder. Audio mixing was fine too, but not so much with video FX in the picture. At the moment, however, the A770 is on the shelf.

jimingo-1 wrote on 9/2/2025, 11:14 AM

I have an AMD Radeon RX 6900 XT and an Intel Arc A770 installed and use both with Vegas 22. The Arc completely destroys the AMD when editing 4:2:2 HEVC and is pretty much on par with it for everything else. I would get rid of the 6900 XT, but I can't because of the Intel memory-leak bug when rendering in 32-bit: Vegas still can't render to 32-bit while using Intel. For 8-bit it works fine.