AMD RYZEN 7000 Series iGPU

MH7 wrote on 7/19/2023, 9:08 AM

Hi guys,

I just have a question that I hope someone can answer concerning the iGPU in the AMD RYZEN 7000 series CPUs. For anyone who has an AMD RYZEN 7000 series CPU in their PC, does VEGAS Pro 20 recognise it, and if so, how does it perform?

Thanks in advance for any help re this!

John 14:6 | Romans 10:9-10, 13, 10:17 | Ephesians 2:8-9
————————————————————————————————————

Aussie VEGAS Post 20 User as of 9th February 2023 — Build 411 (Upgraded from VEGAS Pro 18)

VEGAS Pro Help: VEGAS Pro FAQs and TROUBLESHOOTING GUIDES

My YouTube Channel: https://www.youtube.com/@TechWiredGeek

Video Cameras: Sony FDR-AX700 and iPhone 12 Pro Max (iOS 17)

============================================

My New Productivity Workstation/Gaming PC 2024

CPU: AMD R7 7800X3D

Motherboard: ASRock X670E Steel Legend (AM5)

RAM: Corsair Vengeance 32 GB (2 x 16 GB) DDR5-6000 CL30 Memory

Main SSD: Samsung 980 Pro 1 TB SSD
Storage SSD: Western Digital Black SN850X 2 TB SSD

GPU: Asus TUF GAMING OC Radeon RX 7800 XT (16 GB)

OS: Windows 11 (Build: 23H2)

Main Monitor: LG 27UD88-W 4K IPS

Secondary Monitor: LG 27UL850 4K HDR IPS

Comments

Reyfox wrote on 7/25/2023, 5:46 AM

I too am interested. Just curious, but I'm thinking about a possible upgrade a couple of years from now.

Newbie😁

Vegas Pro 22 (VP18-21 also installed)

Win 11 Pro always updated

AMD Ryzen 9 5950X 16 cores / 32 threads

32GB DDR4 3200

Sapphire RX6700XT 12GB Driver: 25.3.1

Gigabyte X570 Elite Motherboard

Panasonic G9, G7, FZ300

RogerS wrote on 7/25/2023, 5:58 AM

For the VP 20 benchmark we have 4 results using a 7000 series CPU with an iGPU. However, neither user is actually using it for decoding: one is using their NVIDIA GPU and the other is using legacy AVC for some reason.

One of them did use the iGPU for VCE encoding and got the fastest time of any AMD computer, just one second behind the fastest computer tested.

https://docs.google.com/spreadsheets/d/1j8x8w3wYjtEQt1Jg4UFVzA20HCPI9bUlW6uNnQ5bd6Q/edit?usp=sharing

Reyfox wrote on 7/25/2023, 6:02 AM

Impressive results, to the point of asking why have a graphics card at all. The user did note one freeze across six runs, but the numbers speak for themselves.

Something to really consider for the future as it matures.

RogerS wrote on 7/25/2023, 6:34 AM

Without the high-end GPU doing preview acceleration, the render time would be poor. The CPU is handling decoding and the iGPU the encoding, but the Fx are handled by the GPU.

My only point is that it seems to work in VEGAS, and I would love more data if anyone buys one!

Reyfox wrote on 7/25/2023, 6:40 AM

So FX are handled exclusively by the GPU?

RogerS wrote on 7/25/2023, 7:02 AM

Hard to say "exclusively", but if you run one of the benchmarks yourself with Preferences > Video > GPU acceleration of video processing toggled on and off, you'll see a significant difference (around 5x on my laptop, less of a difference on my more powerful desktop CPU).

Fx, moving text or objects around, etc. are helped by the GPU. The iGPU can also help but is much weaker. The CPU still tells it what to do.

Reyfox wrote on 7/25/2023, 7:20 AM

I don't have an iGPU on my CPU to run a valid test. I do know that using the CPU only is slower than using the CPU and GPU together, for everything from preview to rendering. I'm just wondering what "level" of discrete GPU the iGPU in the AMD 7000 series CPUs is equivalent to.

RogerS wrote on 7/25/2023, 7:37 AM

I know; it wasn't a request to you but to whoever wants to get such a CPU. I can only test the systems in my signature (and have, in both benchmarks, though I don't see a real point in using the iGPU as the main one).

Even Intel Arc systems, which are faster than Intel iGPUs, don't perform that well for timeline processing (with our limited data), though they do a great job for plain encoding and decoding.

Custom PC (2022) Intel i5-13600K with UHD 770 iGPU with latest driver, MSI z690 Tomahawk motherboard, 64GB Corsair DDR5 5200 ram, NVIDIA 2080 Super (8GB) with latest studio driver, 2TB Hynix P41 SSD and 2TB Samsung 980 Pro cache drive, Windows 11 Pro 64 bit https://pcpartpicker.com/b/rZ9NnQ

ASUS Zenbook Pro 14 Intel i9-13900H with Intel graphics iGPU with latest ASUS driver, NVIDIA 4060 (8GB) with latest studio driver, 48GB system ram, Windows 11 Home, 1TB Samsung SSD.

VEGAS Pro 21.208
VEGAS Pro 22.239

Try the benchmarks:
VEGAS 4K "sample project" benchmark (works with VP 16+): https://forms.gle/ypyrrbUghEiaf2aC7
VEGAS Pro 20 "Ad" benchmark (works with VP 20+): https://forms.gle/eErJTR87K2bbJc4Q7

Howard-Vigorita wrote on 7/25/2023, 6:58 PM

I cannot imagine how an iGPU with only 2 compute units could possibly handle being selected as the main GPU in the Vegas video preferences, which is what you'd have to do for it to handle video timeline compositing or fx processing. The AMD iGPU in my aging NUC (Vega M) does a reasonably decent job, but it has 20 compute units, and it does a better job in that capacity than my Arc A380, which has 8 compute units. My Arc A770 is actually pretty fast in comparison, with 32 compute units, and its 16 GB of VRAM turned in a pretty respectable 1:32 running solo on the AI torture test the other day.

Regarding how a Ryzen 7000 might do on display acceleration, it would have to drive the monitor to participate at all. For a desktop 7000, the monitor would have to be plugged into the motherboard HDMI rather than a GPU board. I imagine the display of a mobile/laptop part might be hard-wired to it. I cannot infer much from my Vega M display performance because the NUC's display is internally hard-wired to its other iGPU, an HD 630.

Since the 7000 iGPU seems too under-powered to do much other than decoding and rendering, I would imagine that's probably the best way to make the most of it: try to take as much load as possible off the main GPU. Generally, GPU/iGPU makers use the same logic arrays on all their hardware of a similar generation.

Wolfgang S. wrote on 7/26/2023, 1:56 AM

Please be aware that the technical specifications for Vegas recommend only an Intel® HD Graphics 630 series GPU or higher as the iGPU, not an AMD iGPU. So I would not build a system based on an AMD iGPU.

I also have a desktop with an AMD processor, but I combined it with a powerful Nvidia GPU (see my signature). Yet when I compare its performance with my laptop, which has an iGPU whose timeline decoding outperforms the RTX 3070 Ti in the same machine, I would not buy an AMD processor again today: the performance gain both in decoding HEVC footage and in rendering is significant.

Besides that, the Intel iGPU's decoding advantage for H.265 is still significant (and is not matched by the Nvidia or AMD GPUs across the full range of formats). Have a look at this article, and understand that this is not a Resolve-specific feature but a hardware capability of the GPUs, so it applies in Vegas as well:

https://www.pugetsystems.com/labs/articles/what-h-264-and-h-265-hardware-decoding-is-supported-in-davinci-resolve-studio-2122/

The capabilities of Intel's iGPUs are also well documented:

https://www.intel.com/content/www/us/en/docs/onevpl/developer-reference-media-intel-hardware/1-0/overview.html

I do not know whether the decoding capabilities of the AMD and Nvidia cards will be improved in the future. But they have not done so for years now, so the sure choice is to go with an Intel iGPU.
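If you want to check what your own hardware accepts, one simple sanity test outside of any NLE is to push a clip through ffmpeg with a hardware decoder and a null output; if the driver or silicon cannot handle the format, the decode fails immediately. A minimal sketch in Python, assuming ffmpeg is on your PATH and using a placeholder file name (this is not how VEGAS itself decodes, it only tells you what the hardware will accept):

# Probe which hardware decode paths accept a given clip.
# "clip_10bit422.mov" is a placeholder - point it at your own footage.
import subprocess

def try_hw_decode(src: str, hwaccel: str) -> bool:
    """Decode the whole clip to a null sink with the given hwaccel.
    Returns True if ffmpeg exits cleanly, i.e. the decoder accepted it."""
    cmd = [
        "ffmpeg", "-v", "error",
        "-hwaccel", hwaccel,   # "qsv" (Intel), "d3d11va" (any Windows GPU), "cuda" (NVIDIA)
        "-i", src,
        "-f", "null", "-",     # discard the output; we only care about the decode
    ]
    return subprocess.run(cmd, capture_output=True).returncode == 0

for accel in ("qsv", "d3d11va", "cuda"):
    ok = try_hw_decode("clip_10bit422.mov", accel)
    print(f"{accel}: {'hardware decode OK' if ok else 'not supported / failed'}")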

Sorry to all the AMD-processor fans (I was one too).

Desktop: PC AMD 3960X, 24 x 3.8 GHz * RTX 3080 Ti (12 GB) * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb LAN * Resolve Studio 18 * Edius X * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * i9 12900H with Iris Xe iGPU * 32 GB RAM * GeForce RTX 3070 Ti 8 GB * internal HDR preview on the laptop monitor * Blackmagic Ultrastudio 4K Mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX)-card in an old XP-System. Edius 4.6 and other systems

Wolfgang S. wrote on 7/26/2023, 2:02 AM

New version: https://www.intel.com/content/www/us/en/docs/onevpl/developer-reference-media-intel-hardware/1-1/overview.html

MH7 wrote on 7/26/2023, 8:50 PM

Hmm…I didn’t think this would’ve been a popular question. I was just curious. Thanks to all of you for replying. From my own previous research, it appears that the main reason for the existence of the iGPU in AMD’s RYZEN 7000 series of CPUs is for diagnostic reasons.

For example, if your dedicated GPU, like an AMD RX 6700 XT or NVIDIA RTX 3060 Ti, suddenly stopped working, thus rendering you with no display output, you’d be able pull out your monitor’s display cable out of the back of the dGPU (dedicated graphics card), plug it into one of your AMD AM5 motherboard’s display outputs, regaining back a functional display output, then go about diagnosing why your dGPU is not working or working correctly.

This, I believe, is the main function and reason why AMD put iGPUs in their AMD RYZEN 7000 series CPUs and not for gaming or any kind of productivity work. Nevertheless, anyone is welcome to correct me if I’m wrong. 🙂
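On that note, a quick way to confirm that Windows actually sees the iGPU (and which driver it's running) before you ever need it as a fallback is simply to list the display adapters. A minimal sketch, assuming Windows with PowerShell available; nothing VEGAS-specific:

# List the display adapters Windows reports, to confirm the Ryzen 7000 iGPU
# is enabled in the BIOS and has a working driver alongside the dGPU.
import json, subprocess

ps = ("Get-CimInstance Win32_VideoController | "
      "Select-Object Name, DriverVersion, Status | ConvertTo-Json")
out = subprocess.run(["powershell", "-NoProfile", "-Command", ps],
                     capture_output=True, text=True).stdout
adapters = json.loads(out)
if isinstance(adapters, dict):   # PowerShell emits a bare object when there is only one adapter
    adapters = [adapters]
for a in adapters:
    print(f"{a['Name']}  |  driver {a['DriverVersion']}  |  status {a['Status']}")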

Wolfgang S. wrote on 7/27/2023, 1:00 AM

The use of an (Intel) iGPU is a major improvement, and that is what was implemented in the File I/O preferences in Vegas. So it IS an important question.

Intel iGPUs work. I cannot really say anything about AMD iGPUs.

RogerS wrote on 7/27/2023, 1:55 AM

AMD decoding with the iGPU also seems to work in VEGAS, based on the benchmark results. There have also been numerous issues with mobile Radeon Graphics drivers completely breaking the preview (it seems to work at the moment). Whether there's a performance boost similar to an Intel iGPU I can't say, and there isn't support for 10-bit 4:2:2 HEVC.

Being able to boot without a GPU is handy for diagnostics and initial install.

Reyfox wrote on 7/27/2023, 7:26 AM

...and there are the AMD APUs that are supposed to be coming down the road.

I find it wonderful that there are so many hardware choices out there.

Wolfgang S. wrote on 7/27/2023, 7:41 AM

Support in Vegas is one question, but more important is the hardware support itself. Roger is right: the modern Intel iGPUs also support decoding of 10-bit 4:2:2 HEVC, and that may matter in the future as more and more cameras add 10-bit 4:2:2 HEVC recording.

It was tested here for Resolve, but the findings are valid for other NLEs too:

https://www.pugetsystems.com/labs/articles/what-h-264-and-h-265-hardware-decoding-is-supported-in-davinci-resolve-studio-2122/

https://www.intel.com/content/www/us/en/docs/onevpl/developer-reference-media-intel-hardware/1-1/overview.html#DECODE-OVERVIEW-11-12

Howard-Vigorita wrote on 7/27/2023, 3:46 PM

It was tested here for Resolve, but the findings are valid for other NLEs too:

Don't overlook some differences between Resolve Studio and Vegas. Resolve uses ffmpeg 6 for its decoding and QSV rendering engines, which Vegas does not. Thus Resolve can bring to bear Intel's Hyper Encode for QSV, which bonds multiple Intel Arc GPUs/Iris iGPUs together for rendering, spreading and accelerating that load while decoding on one of the devices. The testing I've done indicates Resolve and ffmpeg 6 use the fastest Arc for decoding on my multi-Intel-GPU systems.

When Vegas sees multiple Intel iGPU/GPU processors, it ignores all but the slow iGPU for rendering, though it does allow picking the fastest Arc for decoding in the I/O panel. For that reason I disable all but my fastest Arc when using Vegas. When using Resolve Studio, ffmpeg 6, or QSVEnc, I enable them all (UHD 750, A380, and A770). I really wish Vegas would catch up with its Intel support, or step up its render-plugin quality pipeline to match its own render presets, so that Voukoder, MagicYUV, and AviSynth/ffmpeg/QSVEnc/FrameServer don't take such a quality hit when used as Vegas render plugins.
When Vegas sees multiple Intel igpu/gpu processors, it ignores all but the slow igpu for rendering. But does allow picking the fastest Arc for decoding in the i/o panel. For that reason I disable all but my fastest Arc when using Vegas. When using Resolve Studio, ffmpeg6, or QsvEnc, I enable them all (uhd750, a380, and a770). I really wish Vegas would catch up with it's Intel support. Or step up its render-plugin quality pipeline to match it's own render presets. So that Voukouder, MagicYUV, and AviSynth/ffmpeg/QsvEnc/FrameServer don't take such a quality hit when used as Vegas render plugins.