ARC Intel

Adi-W wrote on 9/17/2023, 11:23 AM

Hi, has anyone successfully installed and used an Intel ARC video card on an old system like this one: i7-2600K CPU / Asus P8Z68-V PRO GEN3?

I know that ARC requires Resizable BAR for optimal performance, but I just want to know whether I would be able to install the driver and have Vegas recognise and use the video card. I will upgrade my whole system later, so the card will be fully utilised then.

Comments

john_dennis wrote on 9/17/2023, 12:19 PM

I'd do the 13th-generation Intel i7 or i9 first. Use the on-die Intel video adapter for decoding and worry about add-in cards later.

Not only is that what I would do, it's what I did in February.

wwaag wrote on 9/17/2023, 1:41 PM

Since I can't afford @john_dennis's solution, I have installed an A770 in my old i7-8700K system. Before the A770, I had successfully used an A380. My system doesn't support Resizable BAR, but the card works nonetheless. However, if feasible, go with @john_dennis's recommendation.

AKA the HappyOtter at https://tools4vegas.com/. System 1: Intel i7-8700k with HD 630 graphics plus an Nvidia RTX4070 graphics card. System 2: Intel i7-3770k with HD 4000 graphics plus an AMD RX550 graphics card. System 3: Laptop. Dell Inspiron Plus 16. Intel i7-11800H, Intel Graphics. Current cameras include Panasonic FZ2500, GoPro Hero11 and Hero8 Black plus a myriad of smartPhone, pocket cameras, video cameras and film cameras going back to the original Nikon S.

Adi-W wrote on 9/17/2023, 1:53 PM

Since I can't afford @john_dennis's solution, I have installed an A770 in my old i7-8700K system. Before the A770, I had successfully used an A380. My system doesn't support Resizable BAR, but the card works nonetheless. However, if feasible, go with @john_dennis's recommendation.

Thanks, I'm not ready to upgrade my CPU first either, so that's exactly what I wanted to know! Did you buy the Intel® Arc™ A770 Limited Edition or another one?

wwaag wrote on 9/17/2023, 2:20 PM

I'm not at my machine now, but I believe it was the LE.

wwaag wrote on 9/17/2023, 2:27 PM

Just checked--yes, it was the LE.

Adi-W wrote on 9/17/2023, 3:53 PM

Just checked--yes, it was the LE.

Ok, thanks for letting me know.

Howard-Vigorita wrote on 9/17/2023, 5:06 PM

I have the original 16 GB Intel Arc A770 in my 11900K system and Vegas sees it fine. I've been using it for decoding and QSV rendering since it came out. I had to disable the iGPU to use the Arc for QSV rendering.

RogerS wrote on 9/17/2023, 5:50 PM

I think you'd get much better performance with no discrete GPU and a 12th or 13th gen i5 with iGPU.

Custom PC (2022) Intel i5-13600K with UHD 770 iGPU with latest driver, MSI z690 Tomahawk motherboard, 64GB Corsair DDR5 5200 ram, NVIDIA 2080 Super (8GB) with latest studio driver, 2TB Hynix P41 SSD, Windows 11 Pro 64 bit

Dell XPS 15 laptop (2017) 32GB ram, NVIDIA 1050 (4GB) with latest studio driver, Intel i7-7700HQ with Intel 630 iGPU (latest available driver), dual internal SSD (1TB; 1TB), Windows 10 64 bit

VEGAS Pro 19.651
VEGAS Pro 20.411
VEGAS Pro 21.208

Try the
VEGAS 4K "sample project" benchmark: https://forms.gle/ypyrrbUghEiaf2aC7
VEGAS Pro 20 "Ad" benchmark: https://forms.gle/eErJTR87K2bbJc4Q7

Adi-W wrote on 9/18/2023, 12:00 AM

I think you'd get much better performance with no discrete GPU and a 12th or 13th gen i5 with iGPU.

Yes, for sure, but for the moment I just have a Sandy Bridge CPU with its HD 3000 iGPU as the main video card and, among other things, I am limited to HD resolution while my BenQ PD32 supports 4K. I am planning to get a 13th gen i7 CPU in a few months and will then use both the UHD 770 iGPU + ARC 770 with Vegas. I am also planning to add a Radeon RX 6900 or 7900 XT for better timeline playback with fx. Does it make sense?

RogerS wrote on 9/18/2023, 11:52 AM

I understand your situation. I got the i5-13600K myself, and my initial plan was not to get a discrete GPU and just use the iGPU, but then I found used NVIDIA cards were pretty reasonably priced.

The ARC 770 is fine, but its big benefit is Intel QSV decoding and encoding, which you'd get with a K-series CPU anyway. You might get better performance for not much more with a used NVIDIA 2080 Ti, 3070, 3080, etc. The 6900 and 3080 should be similar in performance; I personally have a use for CUDA (AI transcriptions), so I went with NVIDIA. If you get an AMD or NVIDIA GPU, you don't need an Intel ARC for anything, so I'd save your money.

We don't have much data but 2 benchmarks are in my signature and there are several Arcs on the list.

Adi-W wrote on 9/19/2023, 1:55 AM

The ARC 770 is fine, but its big benefit is Intel QSV decoding and encoding, which you'd get with a K-series CPU anyway. You might get better performance for not much more with a used NVIDIA 2080 Ti, 3070, 3080, etc. The 6900 and 3080 should be similar in performance; I personally have a use for CUDA (AI transcriptions), so I went with NVIDIA. If you get an AMD or NVIDIA GPU, you don't need an Intel ARC for anything, so I'd save your money.

This is a good point, but the ARC 770 still has a few advantages, like AV1 encoding, which the iGPU doesn't do if I am not mistaken, and the ARC 770 is also much faster at encoding than the UHD 770. The ARC also seems to be at the top in encoding quality for AVC/HEVC/AV1 (though the UHD 770 is almost equal).
Since that is not currently the case with the RX 7900 XT, I would use the RX for timeline playback/acceleration and the ARC for rendering.
My priorities go like this:
1. Best timeline playback/acceleration possible (with fx, AI, etc.)
2. The fastest Dynamic RAM Preview
3. Rendering quality (considering that I can always get it with the CPU)
4. Rendering speed comes last
As usual, upgrading a system is never an easy task!

Howard-Vigorita wrote on 9/19/2023, 3:27 AM
ARC 770 has a few advantages like AV1 encoding which the iGPU doesn't do if I am not mistaken ...

Here are the latest Intel encode/decode tables...

https://www.intel.com/content/www/us/en/docs/onevpl/developer-reference-media-intel-hardware/1-0/features-and-formats.html

Looks like the 11th & 12th gen iGPUs only decode AV1. I just tried an AV1 encode via my UHD 750 with VoukoderPro and it failed. It succeeds via my Arc A770 as well as the Arc A380, with almost identical render times.
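The gist of those tables and the tests above can be captured in a small lookup. This is a sketch summarizing the linked Intel reference, not an Intel API; the device labels and the helper function are hypothetical:

```python
# Hypothetical summary of Intel AV1 hardware support, based on the
# oneVPL feature tables linked above and the VoukoderPro test results.
AV1_SUPPORT = {
    # device: (hw_decode, hw_encode)
    "UHD 750 (11th gen iGPU)": (True, False),  # decode only
    "UHD 770 (12th gen iGPU)": (True, False),  # decode only
    "Arc A380": (True, True),                  # full encode + decode
    "Arc A770": (True, True),                  # full encode + decode
}

def can_encode_av1(device: str) -> bool:
    """Return True if the device's media engine can encode AV1."""
    return AV1_SUPPORT.get(device, (False, False))[1]

print(can_encode_av1("UHD 750 (11th gen iGPU)"))  # False: encode fails
print(can_encode_av1("Arc A770"))                 # True: encode succeeds
```

This matches the test result above: the UHD 750 encode attempt fails, while both Arcs succeed.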

RogerS wrote on 9/19/2023, 6:31 AM

The AMD 7000 and NVIDIA 4000 series also have AV1 encoding so why get an ARC if you plan to get one of them?

I haven't seen many comparisons on render speed of ARC vs IGPUs in VEGAS.

Howard may have tests on quality.

Personally, I still do software encodes for final renders, as a 13th gen CPU can use all its cores and get it done reasonably quickly with good quality.

GJeffrey wrote on 9/19/2023, 7:46 AM

@RogerS

You can find a comprehensive CPU/GPU encoding comparison at the link below:

https://rigaya.github.io/vq_results/

It's not done with Vegas, but it gives an idea of what to expect in terms of speed and render quality.

ARC is faster than the iGPU but the same quality-wise.

Howard-Vigorita wrote on 9/19/2023, 11:57 AM

The AMD 7000 and NVIDIA 4000 series also have AV1 encoding so why get an ARC if you plan to get one of them?

I haven't seen many comparisons on render speed of ARC vs IGPUs in VEGAS.

Howard may have tests on quality.

The Arcs add more decoding options, and load-splitting improves performance. And the Arc A380 is dirt cheap and seems to encode at the same speed as the A770 in the test I performed last night on my 3-GPU machine... and that was with the A380 connected via a PCIe 4.0 riser to my 3rd slot, which is only x4. My only warning is that if your PCIe lanes are oversubscribed or your PCIe slots are all occupied, there may be issues adding a 2nd GPU.
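For a rough sense of why an x4 slot isn't necessarily a bottleneck for encoding, here is the back-of-the-envelope bandwidth arithmetic (real-world throughput is somewhat lower due to protocol overhead; the figures below are raw line-rate limits):

```python
# Back-of-the-envelope PCIe bandwidth vs. video frame traffic.
def pcie_gbps(gt_per_s: float, lanes: int, encoding: float = 128 / 130) -> float:
    """Usable GB/s for a PCIe link: GT/s per lane x line-code efficiency x lanes."""
    return gt_per_s * encoding / 8 * lanes

x4_gen4 = pcie_gbps(16.0, 4)    # e.g. an A380 on a x4 PCIe 4.0 riser
x16_gen4 = pcie_gbps(16.0, 16)  # a full-width slot

# Uncompressed 4K 8-bit 4:2:0 at 60 fps (1.5 bytes per pixel):
frame_traffic = 3840 * 2160 * 1.5 * 60 / 1e9  # GB/s

print(f"x4 Gen4:   {x4_gen4:.2f} GB/s")        # ~7.88 GB/s
print(f"x16 Gen4:  {x16_gen4:.2f} GB/s")       # ~31.51 GB/s
print(f"4K60 feed: {frame_traffic:.2f} GB/s")  # ~0.75 GB/s
```

Even a x4 Gen4 link has roughly 10x the bandwidth needed to feed uncompressed 4K60 frames to an encoder, which is consistent with the A380 matching the A770's encode speed from the x4 slot.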

I'm thinking of doing a new quality-analysis layout that includes render speed. The problem is that right now only Voukoder and FrameServer are able to render AV1 from Vegas, and they are both constrained in quality by the limited feed Vegas gives all 3rd-party plugins. So I'm waiting for Vegas to add AV1 and/or VP9 as native render presets before getting into that. I could start with Arc vs. the others for AVC and HEVC, however...

RogerS wrote on 9/19/2023, 12:32 PM

1. Best timeline playback/acceleration possible (with fx, AI, etc.)
2. The fastest Dynamic RAM Preview
3. Rendering quality (considering that I can always get it with the CPU)
4. Rendering speed comes last

1. Fast NVIDIA or AMD GPU for Fx and AI + Intel iGPU or ARC card for decoding
2. Just a function of 1 and decoding so not an independent variable
3. Just do CPU renders, as #4 doesn't seem to be concerned with speed anyway. x264, x265, ProRes and MagicYUV do what I need.

The next generation of ARC is something to keep an eye out for as it may be more capable for #1 while also being great for decoding. I wish Intel success as NVIDIA pricing is absurd and AMD's drivers are a mess.

Adi-W wrote on 9/20/2023, 2:02 PM

The AMD 7000 and NVIDIA 4000 series also have AV1 encoding so why get an ARC if you plan to get one of them?

Yes, you are right, and my plan was to go directly with an RX 7900 XT, but it will not fit in my mid-tower PC. That's why I'm looking at the ARC (380 or 770) to replace my HD 3000 iGPU until I upgrade my whole system + case, probably Q1 next year. And it doesn't seem that the next generation of ARC will be out before that.

1. Fast NVIDIA or AMD GPU for Fx and AI + Intel iGPU or ARC card for decoding

2. Just a function of 1 and decoding so not an independent variable

I have always wondered how and what Vegas decides to use for "Dynamic RAM Preview". The GPU (the one selected in Preferences), the CPU, or both, depending on the kind of video and fx?

RogerS wrote on 9/20/2023, 2:30 PM

Dynamic RAM preview is just like playing back the timeline into RAM (not realtime), so decoding and Fx both play into how quickly it is generated. The CPU and GPU are both used. The GPU is controlled by Preferences / Video and File I/O.
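As a rough illustration of the RAM side of this, here is the arithmetic for how many uncompressed frames fit in a given Dynamic RAM Preview allocation. This is a sketch assuming 32-bit RGBA frames at preview resolution; Vegas's internal frame format may differ:

```python
def preview_frames(ram_mb: int, width: int, height: int, bytes_per_px: int = 4) -> int:
    """How many uncompressed frames fit in a Dynamic RAM Preview allocation."""
    frame_bytes = width * height * bytes_per_px
    return (ram_mb * 1024 * 1024) // frame_bytes

# e.g. a 4096 MB preview allocation:
full_hd = preview_frames(4096, 1920, 1080)  # 517 frames, ~17 s at 30 fps
uhd_4k = preview_frames(4096, 3840, 2160)   # 129 frames, ~4 s at 30 fps
print(full_hd, uhd_4k)
```

The takeaway: the allocation bounds how long a preview you can build, while decoding and Fx speed (CPU + GPU) determine how fast it fills.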

Adi-W wrote on 9/20/2023, 7:10 PM

Good to know that, thank you.

RobertoL wrote on 2/19/2024, 9:38 PM

Hi everyone. I am wondering if you could give me some recommendations. I have a 13th gen i7 CPU, but I haven't bought a good GPU yet. Would you recommend the Arc A750 or Arc A770 to pair with this CPU? I usually work at 1080p resolution alone, but I would like to have 4K capability for the future. The idea is to buy something not that expensive that could improve my timeline, rendering times, and AV1 encoding, or should I save money and buy an expensive NVIDIA GPU?

fr0sty wrote on 2/19/2024, 10:57 PM

You're better off buying a low-end NVIDIA RTX 4000 series GPU, as it can also do AV1 encoding, and your CPU can do everything else the A750/A770 can do.

RogerS wrote on 2/20/2024, 12:05 AM

I'd recommend the best GPU that you can afford, and if you already have an Intel iGPU with your CPU, go for NVIDIA or AMD. ARC is great for video decoding but not so much for other computation. I bought a used NVIDIA GPU myself as it's better value. See some GPU performance benchmarks in my signature.

RobertoL wrote on 2/22/2024, 11:08 AM

Thanks for your suggestions.

Here is an Intel Arc A770 and A750 Content Creation review from Puget Systems, which shows some interesting information. I wonder how it performs with Vegas Pro.

https://www.pugetsystems.com/labs/articles/intel-arc-a770-and-a750-content-creation-review-sept-2023-update/

Howard-Vigorita wrote on 2/22/2024, 6:39 PM

I didn't get good results in Vegas adding an Arc to an 11900K system and using it as the main GPU with the UHD 750 iGPU also active. I got better results selecting the A770 as the main GPU in Vegas video prefs compared to the A380, but overall performance on Vegas benchmarks like Red Car and Sample Project was similar to an NVIDIA 1660, which I also tried. My observation was that my laptop with an NVIDIA 3060 GPU was clearly faster as the main GPU than the Iris Xe in the laptop or the Arc A770 in my 11900K machine; no comparison to using an AMD 5700 XT, Vega 64, or Radeon VII instead. But that was a while ago, and I haven't rerun the tests lately to see if Intel driver improvements change anything for Vegas.

I got great results with both the A770 and A380 for decoding 4K HEVC 4:2:0, which is pretty much all I shoot. I got into the Arcs because the 11900K iGPU was worse for me than the 9900K iGPU. The Arc A770 fixed that for the 11900K machine, and an A380 also gave the 9900K a nice boost. The Arcs also added 4:2:2 decoding.

QSV rendering by the Arcs in Vegas is also quicker than the iGPUs, but it can only be used with the iGPUs disabled in the BIOS. Resolve, ffmpeg, and QSVEncC64, however, support Intel Deep Link Hyper Encode QSV rendering, which uses multiple Arc GPUs and Iris Xe iGPUs together. It would be nice if Vegas did that too. Btw, Vegas has always had an issue discerning between multiple GPUs of the same vendor for rendering... maybe if they are both PCIe boards, switching slots might help.

Btw, Intel is no longer sourcing new Arcs itself. There's supposed to be a new Arc line coming that's more powerful; the latest rumor is that they're not coming till Q2 2025... the previous rumor was this summer.