100% GPU Utilisation in Vegas Pro, is it normal?

MH7 wrote on 11/5/2023, 4:42 AM

I believe I might’ve talked about this quite a while back on this forum. You can see my current PC specs below in my sig. The thing is, today, when messing around with VEGAS Pro 20 (Build 411), I noticed that when I rendered out my 4K 25p project using the MAGIX H.264 codec, I saw in Task Manager that my RX 580 (8GB) was being 100% utilised by VEGAS Pro 20 and my CPU utilisation (IIRC) was anywhere from 16-20%.

Now, in the past, I would’ve thought VEGAS Pro using 100% of your GPU was a good thing: it meant the CPU could, so to speak, sit back and relax while the GPU did most of the heavy lifting.

However, in a post a while back on here, someone seemed to indicate that, for a balanced system, you want 50/50 utilisation: 50% of your CPU and 50% of your GPU. Would this be correct in an ideal situation and setup?

Thanks in advance for any help re this!

John 14:6 | Romans 10:9-10, 13, 10:17 | Ephesians 2:8-9
————————————————————————————————————

Aussie VEGAS Post 20 User as of 9th February 2023 — Build 411 (Upgraded from VEGAS Pro 18)

VEGAS Pro Help: VEGAS Pro FAQs and TROUBLESHOOTING GUIDES

My YouTube Channel: https://www.youtube.com/@TechWiredGeek

Video Cameras: Sony FDR-AX700 and iPhone 12 Pro Max (iOS 17)

============================================

My New Productivity Workstation/Gaming PC 2024

CPU: AMD R7 7800X3D

Motherboard: ASRock X670E Steel Legend (AM5)

RAM: Corsair Vengeance 32 GB (2 x 16 GB) DDR5-6000 CL30 Memory

Main SSD: Samsung 980 Pro 1 TB SSD
Storage SSD: Western Digital Black SN850X 2 TB SSD

GPU: Asus TUF GAMING OC Radeon RX 7800 XT (16 GB)

OS: Windows 11 (Build: 23H2)

Main Monitor: LG 27UD88-W 4K IPS

Secondary Monitor: LG 27UL850 4K HDR IPS

Comments

Dexcon wrote on 11/5/2023, 4:59 AM

Regardless of the CPU/GPU % usage, the primary question, I would have thought, is whether or not you're happy with the rendering result you're currently getting.

Cameras: Sony FDR-AX100E; GoPro Hero 11 Black Creator Edition

Installed: Vegas Pro 15, 16, 17, 18, 19, 20, 21 & 22, HitFilm Pro 2021.3, DaVinci Resolve Studio 19.0.3, BCC 2025, Mocha Pro 2025.0, NBFX TotalFX 7, Neat NR, DVD Architect 6.0, MAGIX Travel Maps, Sound Forge Pro 16, SpectraLayers Pro 11, iZotope RX11 Advanced and many other iZ plugins, Vegasaur 4.0

Windows 11

Dell Alienware Aurora 11:

10th Gen Intel i9 10900KF - 10 cores (20 threads) - 3.7 to 5.3 GHz

NVIDIA GeForce RTX 2080 SUPER 8GB GDDR6 - liquid cooled

64GB RAM - Dual Channel HyperX FURY DDR4 XMP at 3200MHz

C drive: 2TB Samsung 990 PCIe 4.0 NVMe M.2 PCIe SSD

D: drive: 4TB Samsung 870 SATA SSD (used for media for editing current projects)

E: drive: 2TB Samsung 870 SATA SSD

F: drive: 6TB WD 7200 rpm Black HDD 3.5"

Dell Ultrasharp 32" 4K Color Calibrated Monitor

 

LAPTOP:

Dell Inspiron 5310 EVO 13.3"

i5-11320H CPU

C Drive: 1TB Corsair Gen4 NVMe M.2 2230 SSD (upgraded from the original 500 GB SSD)

Monitor is 2560 x 1600 @ 60 Hz

RogerS wrote on 11/5/2023, 5:19 AM

I wouldn't worry too much about the percentage one way or another; it's like driving your car while watching the tachometer: who cares, so long as it's in a reasonable range.

Also, Windows is just giving you a rough idea of activity; with NVIDIA it doesn't even include CUDA computing in the % calculation. Take a look at the AMD GPU, and if you see 3D and encoding activity (assuming VCE is selected under MAGIX AVC), you are fine. That should take much of the load off the CPU so it's no longer a bottleneck.

As to whether the system is working as it should, you can try one of the sample projects in my signature and compare it to similar systems.
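For anyone who wants more than Task Manager's single number, Windows exposes the same per-engine GPU counters that Task Manager aggregates, and you can query them directly. Below is a minimal sketch of my own (not anything built into VEGAS), assuming Windows 10+ with Python 3 and PowerShell available; run it while a render is in progress, since the counter instances only exist for processes actively using the GPU, and note that number formatting can vary with system locale:

```python
import subprocess

# Sum the "3D" and "VideoEncode" GPU engine counters across all
# processes -- roughly what Task Manager's GPU graphs display.
# Run this during a render; with nothing encoding, the VideoEncode
# instances may not exist and Get-Counter will report an error.
PS = r"""
foreach ($eng in '3D', 'VideoEncode') {
    $samples = (Get-Counter "\GPU Engine(*engtype_$eng)\Utilization Percentage").CounterSamples
    $sum = ($samples | Measure-Object CookedValue -Sum).Sum
    '{0}: {1:N1} %' -f $eng, $sum
}
"""

result = subprocess.run(
    ["powershell", "-NoProfile", "-Command", PS],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```

If the 3D and VideoEncode sums are high while overall CPU sits at 16-20%, that's hardware encoding doing its job, not a fault.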

MH7 wrote on 11/5/2023, 7:24 AM

Regardless of the CPU/GPU % usage, the primary question, I would have thought, is whether or not you're happy with the rendering result you're currently getting.

Well, to be honest, I actually (accidentally) started rendering out the video using just the CPU, because it’s been a while since I’ve used VEGAS Pro. I went back into the settings, saw the dropdown menu where you can select AMD VCE (or something like that), and the render was decently faster.

But something interesting I did discover: whilst I knew that rendering out a video using just the CPU can be quite a bit slower compared to my RX 580 rendering out the same video, the R7 1700 was decently slower.

Nevertheless, to answer your question: it’s decent, but I know improvements could be made, which is why I’ve seriously considered upgrading my six-year-old system next year. If you’re talking about the quality of the rendered video, though, then yeah, it’s pretty good. I am happy with it.
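Outside of VEGAS, the same CPU-versus-VCE gap is easy to reproduce with ffmpeg, which makes a quick sanity check of the hardware encoder. A rough sketch, assuming an ffmpeg build with AMD AMF support (the `h264_amf` encoder) on the PATH; the clip name and bitrate are placeholders:

```python
import subprocess
import time

SOURCE = "clip_4k25.mp4"  # placeholder: any 4K 25p test clip

def encode(encoder: str, output: str) -> float:
    """Encode SOURCE with the given video encoder; return elapsed seconds."""
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-c:v", encoder, "-b:v", "40M",  # bitrate is a stand-in for a 4K template
         "-c:a", "copy", output],
        check=True, capture_output=True,
    )
    return time.perf_counter() - start

cpu = encode("libx264", "out_cpu.mp4")   # software encode on the CPU
vce = encode("h264_amf", "out_vce.mp4")  # AMD VCE/AMF hardware encode
print(f"CPU (libx264):  {cpu:6.1f} s")
print(f"GPU (h264_amf): {vce:6.1f} s")
```

On a system like an R7 1700 + RX 580, the hardware encode typically finishes in a fraction of the software encode's time, matching what was seen in VEGAS.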


RogerS wrote on 11/5/2023, 8:00 AM

I'd highly recommend going Intel for the CPU. Since the K models have an iGPU that does most of what an Arc can do for decoding and encoding, I'd get a dedicated GPU from NVIDIA or AMD instead; they're still more powerful for calculations, which helps with the timeline, FX, and other software.

14th gen Intel is hardly an improvement over the 13th generation, so if the latter is on sale I'd get that. I'm very happy with my 13600K and VEGAS.

Howard-Vigorita wrote on 11/5/2023, 10:17 AM

14th gen Intel, with its novel design shrink and split-up, should yield higher thermal limits and significantly better performance than its predecessors, particularly for the iGPU, which for the first time is not on the same tile as its Intel 4 (7nm) CPU. The iGPU is still designated UHD 770 but uses an even smaller TSMC 5nm lithography, so its capabilities should be unchanged while its performance should go way up. This all contrasts with 13th gen single-tile CPUs, which are Intel 7 (10nm) throughout. Looking forward to seeing Vegas benches on it.

RogerS wrote on 11/5/2023, 9:13 PM

I've seen no reviews that concluded the 14th generation is any real improvement over the 13th. It is more power hungry (less power efficient) and runs even hotter. My 13600K has enough headroom to overclock it and easily surpass the marginal improvements of the base 14600K.

I haven't read any tests of the UHD 770 and would be interested to see that for VEGAS or in general.

There are some areas (like CPU rendering on the Core i7-14700K) where we did see a decent 7-9% improvement, but for those same workloads, the Core i9-14900K only saw a 2-3% performance gain. This means that the 14700K only saw an improvement due to the additional four E-cores that Intel added and the small frequency bump – not from any sort of architecture improvements.

https://www.pugetsystems.com/labs/articles/14th-gen-intel-core-processors-content-creation-review/

Howard-Vigorita wrote on 11/5/2023, 9:58 PM

I don't see any mention in there of the new split-chip design, the lithography, its impact on thermals, or anything about iGPU impact on their benches... makes me wonder if they leave out anything related to iGPUs since Ryzen desktops usually don't have one.

RogerS wrote on 11/5/2023, 10:30 PM

[Edit: I think I figured out the disconnect. Meteor Lake "Core Ultra", with these power efficiency improvements, is for mobile CPUs and will release this December. 14th gen desktop appears to be a Raptor Lake refresh.]

Puget Systems tests with Resolve and Premiere and commented favorably on QSV vs Ryzen systems which lack it.

Compared to AMD, we are overall looking at a remarkably consistent ~8% performance lead with the new 14th Gen processors compared to AMD. Intel’s lead is larger (to the tune of ~15%) for LongGOP codecs like H.264 and HEVC, where Intel Quick Sync gives these processors a boost, and smaller for Intraframe and RAW codecs where the higher number of full performance cores helps AMD close the gap.

The other things- well if there's no measurable performance difference then does it matter?

Gamers Nexus tests thermals, wattage and efficiency for the i7, i9 and i5 CPUs and compares them against AMD and older-generation Intel CPUs.

Intel's own marketing talks about slightly higher clocks, extra cores, application optimization and more cache. https://www.intel.com/content/www/us/en/newsroom/news/intel-core-14th-gen-desktop-processors.html#gs.712pe8

Real-world differences appear marginal and Intel representatives are not promising more than that:

“There is no architecture change,” Patel said. “So the IPC [instructions per clock] is exactly similar to what we had before.”

Intel, surprisingly, did not release gen-over-gen comparisons for any games played on the 14th-gen Core. According to Chandler, the improvement would be in the “mid single digit, maybe upper single digit” range, excluding the effects of APO.

https://www.pcworld.com/article/2103293/intels-14th-gen-core-chips-hit-6ghz-but-performance-stalls.html

We've known for a while that the 14th-gen Core were going to be a Raptor Lake Refresh, so in other words an update of the previous generation, and this time around that means essentially the exact same chip with a bump in clock frequencies. There's no IPC improvement here, no real tweaks to the silicon, it's a pretty straightforward refresh...

The Raptor Lake Refresh is pretty much just that, a refresh, and the newly released chips certainly don't deserve to be called a 14th generation. Apart from the 14700K which gets a few extra E-cores, this is a simple rebranding with some software optimizations for games.

https://www.techspot.com/review/2749-intel-core-14th-gen-cpus/


Custom PC (2022) Intel i5-13600K with UHD 770 iGPU with latest driver, MSI z690 Tomahawk motherboard, 64GB Corsair DDR5 5200 ram, NVIDIA 2080 Super (8GB) with latest studio driver, 2TB Hynix P41 SSD and 2TB Samsung 980 Pro cache drive, Windows 11 Pro 64 bit https://pcpartpicker.com/b/rZ9NnQ

ASUS Zenbook Pro 14 Intel i9-13900H with Intel graphics iGPU with latest ASUS driver, NVIDIA 4060 (8GB) with latest studio driver, 48GB system ram, Windows 11 Home, 1TB Samsung SSD.

VEGAS Pro 21.208
VEGAS Pro 22.239

Try the
VEGAS 4K "sample project" benchmark (works with VP 16+): https://forms.gle/ypyrrbUghEiaf2aC7
VEGAS Pro 20 "Ad" benchmark (works with VP 20+): https://forms.gle/eErJTR87K2bbJc4Q7

Howard-Vigorita wrote on 11/6/2023, 11:00 AM

So much for groupthink. Specs indicate 14th gen clocks go from 5.5 GHz to 6.0 GHz, which I would consider more than slight. My 9900K system has a max of 5.0 compared to 5.3 in my 11900K, which is in fact a much better performer in Vegas, and actually draws less power. That flat out contradicts the sentiment expressed in that article about the Noctua air cooling in my 9900K vs the AIO cooling in my 11900K. I also think the 14th gen separation of the iGPU from the CPU onto its own tile is potentially huge... maybe making the installation of a 2nd Arc GPU, like I've had to do, unnecessary... but we'll see when more Vegas users report, and when the new Arcs land next year.

mark-y wrote on 11/6/2023, 9:06 PM

I saw that my RX 580 (8GB) was being 100% utilised by VEGAS Pro 20 and my CPU utilisation (IIRC) was anywhere from 16-20%.

Are you sure it isn't the other way around?

That would be the most often reported (normal) scenario.

RogerS wrote on 11/6/2023, 9:48 PM

Is the iGPU separated in the 14th gen Raptor Lake refresh, or in 14th gen Meteor Lake? I can't find a source that says the die has changed between Raptor Lake and the Raptor Lake refresh. It appears to be the same Intel 7 (10nm) process node.

If independent tests come to the same conclusion using different methodologies, I think we can rule out groupthink. Intel's own presentation to the media uses the Puget Systems benchmark and shows nominal improvements (see the media presentation PDF): https://www.intel.com/content/www/us/en/newsroom/news/intel-core-14th-gen-desktop-processors.html

The 13th gen i9 is at 5.8 GHz; the 14th gen i9 goes up to 6.0 GHz, a ~3.4% increase. The i5 is only 100 MHz higher on the E-cores. I can achieve greater clock speeds with my 13600K than the 14600K's; a few hundred MHz is really a rounding error in terms of render performance on the same architecture. I already did those tests in VEGAS for render times.
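The "rounding error" claim is easy to put in numbers: with no architecture change, the best-case render speedup scales with the clock bump. A quick back-of-the-envelope check using the boost clocks quoted above:

```python
# Best-case speedup from the clock bump alone (same architecture, same IPC).
gen13_i9, gen14_i9 = 5.8, 6.0  # i9 boost clocks in GHz
gain = (gen14_i9 / gen13_i9 - 1) * 100
print(f"Theoretical ceiling: {gain:.1f} %")  # ~3.4 %, before power/thermal limits
```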


Howard-Vigorita wrote on 11/7/2023, 12:31 PM

14th gen is Meteor Lake, and the CPU got a 4-way split enabling varying die widths, with fabrication of the iGPU tile subcontracted to TSMC, who did it with 4nm. I think the reporting is that Intel did the rest with Intel 4 (7nm). The split design is similar in concept to AMD's chiplets. Most of the info, graphics, and timelines being reported are from the Intel presentations at the Hot Chips 34 conference (aka Hot Chips 2023) this past August. Just keep in mind that future plans are always subject to change, and usually are. But gen 14 was likely already in production and testing at the time, so everything about that is solid. If you want more info for your own edification, you can sign up for Intel insider and beta groups where Intel internal papers are often made available subject to NDA... which, btw, covers nothing I've mentioned.

Now, this is pure speculation on my part, but if you're wondering why Intel seems to be downplaying gen 14 performance improvements, my theory is that they're trying to protect offerings for the workstation and data center markets. They publicly refer to the gen 14 market as targeting enthusiasts. The new W-series Xeons for workstations and data centers were only recently released with split-CPU designs, but are all 10nm throughout. Some, myself included, might think twice about their likely performance relative to gen 14. Based on their timelines, Intel 4 (7nm) Xeon fabrication is at least 6 months out. As are the new (3 or 4nm?) Arcs, whose public plan calls for release starting with a data center model.

fr0sty wrote on 11/7/2023, 2:04 PM

However, in a post a while back on here, someone seemed to indicate that, for a balanced system, you want 50/50 utilisation: 50% of your CPU and 50% of your GPU. Would this be correct in an ideal situation and setup?

No. That would assume the GPU and CPU are equal at all tasks, which they are not. Some things the GPU does way better, others the CPU does better, so you'll never see a perfect balance between the two. Usually you'll see the GPU doing more when it comes to video-related tasks, as it handles encoding (for some formats) and decoding, plus accelerating many of the effects used.
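One practical way to apply this: rather than chasing a 50/50 split, log utilization over the course of a render and see which resource is pinned while the other idles; that one is your bottleneck. A minimal sketch for the CPU side, assuming the third-party psutil package (pair it with a GPU engine query such as the earlier counter sketch):

```python
import psutil  # third-party: pip install psutil

# Sample overall CPU utilization once per second during a render.
# A CPU parked at 15-20% while the GPU's encode engine is pinned
# means the GPU is doing the work -- expected, not a problem.
for _ in range(30):
    print(f"CPU: {psutil.cpu_percent(interval=1.0):5.1f} %")
```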

john_dennis wrote on 11/7/2023, 2:16 PM

Posted without further comment:

https://www.vegascreativesoftware.info/us/forum/system-upgrade-2023--137456/?page=3#ca892625

RogerS wrote on 11/7/2023, 6:31 PM

Meteor Lake will be released in December for mobile and in 2024 for desktop, and I look forward to reading about its performance then.

https://www.pcworld.com/article/2081277/intel-confirms-a-desktop-version-of-meteor-lake-is-coming.html#:~:text=In%20an%20interview%20with%20Michelle,%E2%80%9CYes%2C%E2%80%9D%20Holthaus%20replied.

Former user wrote on 11/7/2023, 8:46 PM

@RogerS It's most likely coming to desktops the same way Tiger Lake did: not for wide mainstream adoption, but soldered in place (or in an actual socket) for prebuilds like all-in-ones.

Arrow Lake looks like the next release to replace the Raptor Lake refresh, in late 2024.

https://www.tomshardware.com/news/intel-arrow-lake-desktop-mobile-to-have-different-isas

RogerS wrote on 11/7/2023, 9:08 PM

Thanks for confirming. It looks like it will be utilized for its extreme power efficiency, which would lend itself well to compact prebuilds.

I'm interested in what they do with this mobile architecture. If they can get near 12th/13th gen performance with less power and heat as tradeoffs, you might be able to pair it with higher-wattage GPUs and get much better overall performance. My laptop will soon be 6 years old, so I will upgrade at some point.

So for performance desktops we'll have to wait until Arrow Lake for an actual boost over Raptor Lake.

MH7 wrote on 11/8/2023, 12:50 AM

I saw that my RX 580 (8GB) was being 100% utilised by VEGAS Pro 20 and my CPU utilisation (IIRC) was anywhere from 16-20%.

Are you sure it isn't the other way around?

That would be the most often reported (normal) scenario.

Yeah, I am sure. I checked Task Manager. I’d forgotten that I needed to select AMD’s VCE option in the MAGIX 4K 25p render template, and I saw my CPU doing most of the work with the render quite slow. I then cancelled the render, went back into the render template, selected AMD VCE, and saw the render speed increase by quite a decent amount. I checked Task Manager both times: when rendering with just the CPU, and then with the GPU.


MH7 wrote on 11/8/2023, 12:57 AM

However, in a post a while back on here, someone seemed to indicate that, for a balanced system, you want 50/50 utilisation: 50% of your CPU and 50% of your GPU. Would this be correct in an ideal situation and setup?

No. That would assume the GPU and CPU are equal at all tasks, which they are not. Some things the GPU does way better, others the CPU does better, so you'll never see a perfect balance between the two. Usually you'll see the GPU doing more when it comes to video-related tasks, as it handles encoding (for some formats) and decoding, plus accelerating many of the effects used.

Well, that’s kinda what I thought. I did think that if you render out a project with the GPU, you’d see the CPU not being used much, because GPU rendering is selected.


MH7 wrote on 11/8/2023, 1:21 AM

I want to say that, whilst I’ve had no experience with Intel’s latest hardware (apart from 15 years ago, when I had an HP pre-built with an Intel Core 2 Quad Q6600 CPU, an NVIDIA GT 430 with 1GB of VRAM that I upgraded to, and 4GB of DDR2 system RAM, upgraded from 2GB), several people have commented on how hot Intel CPUs get compared to AMD’s.

Although, with that said, it would seem that if they’re set to Intel’s recommended settings, these Intel CPUs (12th, 13th, 14th generation) will operate at cooler temperatures. I think the settings are both in the motherboard’s BIOS and in Intel’s software. To be honest, I can’t quite remember which specific settings, but most probably wattage settings, or TDP (Thermal Design Power) settings.


RogerS wrote on 11/8/2023, 2:10 AM

I built a desktop last year (see my signature) and I'm happy to advise on that.

If you limit voltage and wattage in the BIOS, you can keep temperatures down without much of a performance penalty. Of course, a good fan and a high-airflow case are very important as well. That was also the reason I chose the i5 over the i7 and i9: you get a good number of cores and multithreading, but it's a lot easier to cool. Take the money you save on the CPU and invest it in RAM, the GPU, etc.