Low CPU usage on rendering

a1base wrote on 12/10/2020, 9:48 AM

I'm running Vegas Pro 17.0 (build 452) on a Ryzen 3700X system, rendering 1080p videos at 60 fps. Until recently the software performed as expected, loading all 16 threads to 100% and chewing through the rendering process.

Recently, however, CPU usage tops out at 50%, and it's not even loading the cores evenly: the last 4 cores are maxed out, while the first 4 sit below 10%, and the others vary incrementally in between.

I'm using a saved project template and am rendering to the MAGIX AVC/AAC MP4 format with the Internet 1080p 59.94 fps template, so I'm not sure how anything could have changed.

 

I'd really appreciate any ideas about what might be going on. My rendering times have doubled, which is a bit of an issue for production. The system is performing no other tasks at render time that would be using the CPU.

 

Thanks

 

Will

Comments

LongIslander wrote on 12/10/2020, 1:07 PM

 

If all your settings are set as above, the CPU will be the only thing being used. Check your thread count, make sure you're using the Mainconcept encode mode, and shut off hardware decoding.
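If you want to go beyond eyeballing Task Manager while checking this, here's a minimal Python sketch for logging the per-core load during a render (it assumes the third-party psutil package is installed; the sample count and interval are arbitrary):

```python
# Minimal per-core CPU logger (assumes: pip install psutil).
# Run it in a second window while a render is in progress.
import time
import psutil

SAMPLES = 30      # how many readings to take
INTERVAL = 2.0    # seconds between readings

for _ in range(SAMPLES):
    # percpu=True returns one utilisation figure per logical core (thread)
    per_core = psutil.cpu_percent(interval=INTERVAL, percpu=True)
    total = sum(per_core) / len(per_core)
    print(f"avg {total:5.1f}% | " + " ".join(f"{c:5.1f}" for c in per_core))
```

If the first few cores stay near zero while the last few are pegged, you'll see it straight away in the log.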

RogerS wrote on 12/10/2020, 8:14 PM

If your goal is speed, use a GPU render template (NVENC or VCE), and have GPU preview acceleration and hardware decoding enabled. Can you share your exact render settings?

The settings from LongIslander should utilize your CPU fully if that's your goal.

walter-i. wrote on 12/11/2020, 7:12 AM

 


Understanding question for @LongIslander:
"GPU acceleration of video processing" does not have to be set to "Off" - is this setting only for the video preview on the timeline?

 

a1base wrote on 12/11/2020, 10:19 AM

Morning all!

Thanks for the replies.

 

I'm just creating daily videos for my family back home, so speed of rendering is definitely the priority.

My PC build has an 8-core, 16-thread processor and 32GB of 3600MHz RAM, but only a mediocre video card - an RX 580X - so I figured prioritizing the CPU was the best option. I'm an absolute beginner at this, though, so I may be on the wrong track. Any suggestions on my settings would be gratefully received!

 

@RogerS here are my render settings:

[Screenshots: Video Settings, File I/O settings, Render settings, Render Template]

 

If you need to see something else please let me know!

 

Thanks in advance for all your advice.

 

Will

RogerS wrote on 12/11/2020, 10:37 AM

The short answer is that even a mediocre GPU is still faster than a fast CPU for certain tasks, so I encourage you to try to use it. I'll review the rest later.

lenard wrote on 12/11/2020, 3:41 PM

I don't have a 12+ core CPU, but from what I've seen, with Vegas you have two options: turn the GPU on and Vegas no longer uses all your CPU (and you get the problems you describe), or turn the GPU off and Vegas uses all your CPU. That said, it has also been said that Vegas using your GPU, even with the lower CPU use, is still faster than the CPU running at 100% - even with a basic edit.

The ideal would be the GPU turned on and 100% CPU, but apparently that's not possible.

 

RogerS wrote on 12/12/2020, 2:24 AM

First, use these current settings to do a CPU render of anything. Make a new project with just one video of some length and render it. Note how long it takes.

Next, let's see if we can beat that time while staying stable. Looking at your settings: for File I/O, select AMD decoding for now.

Then I would enable GPU processing in the main tab and set dynamic RAM preview to the default of 200.

Finally, choose a render template that is GPU-enabled - for example MAGIX AVC/AAC MP4 with Internet HD 1080p (AMD VCE), or something like that.

Render the exact same video at the exact same frame rate and resolution, only this time with GPU doing encoding, decoding and timeline processing. It should go much faster and your CPU will be loaded less.

Now if you get black frames or other problems, you might want to turn off GPU decoding and GPU preview acceleration and set dynamic RAM preview to 0 while still doing a VCE render. Compare speeds - it should still beat CPU-only. And if nothing works well, you can always return to the settings you shared above.
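To keep the timing honest between runs, a throwaway stopwatch like this helps (plain Python, nothing VEGAS-specific - the labels are just examples; you press Enter when you click Render and again when it finishes):

```python
# Simple stopwatch for A/B render tests: press Enter when you click
# Render, press Enter again when the render completes, repeat per test.
import time

results = {}
for label in ("CPU only", "GPU (VCE) enabled"):
    input(f"[{label}] press Enter the moment you start the render... ")
    start = time.monotonic()
    input(f"[{label}] press Enter the moment it finishes... ")
    results[label] = time.monotonic() - start

base = results["CPU only"]
for label, secs in results.items():
    print(f"{label}: {secs:.0f} s ({base / secs:.2f}x vs CPU only)")
```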

LongIslander wrote on 12/12/2020, 3:29 AM

 

@walter-i. asked: "GPU acceleration of video processing" does not have to be set to "Off" - is this setting only for the video preview on the timeline?

Yes, you are correct 👍. That is only for preview.

walter-i. wrote on 12/12/2020, 3:46 AM


Thanks @LongIslander

RogerS wrote on 12/12/2020, 5:05 AM

I think it is also active during rendering, and is what processes things like GPU-enabled Fx.

lenard wrote on 12/12/2020, 6:52 AM


The GPU is very necessary for the preview; with an Nvidia card you can see this by watching the 3D and CUDA GPU graphs with preview on and preview off - there is much less activity with it off. But the GPU is also used when rendering frames, for levels, scaling, simple fade in/out transitions, and edit points without transitions.

For timeline rendering (playback) you can tell by the extra CPU it uses and possible dropped frames, but for rendering (encoding) you can better see whether the GPU is being used by the render time. In my test the render time was half with GPU on compared to GPU off, on a very simple project with no GPU-accelerated FX, where only levels, edit points, fade transitions, pan/crop and scaling were used.

I then tested a situation where having a GPU isn't necessary: a basic transcode, doing nothing you would typically do in an NLE, not even edit points. CPU alone was 10% faster than GPU+CPU, as Vegas was then free to consume more CPU.
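If anyone wants to watch the individual engines rather than the combined Task Manager number, here's a rough Python sketch for NVIDIA cards (it assumes the NVML bindings, pip install nvidia-ml-py; AMD cards like the RX 580 need a different API, so there you're stuck with Task Manager's graphs):

```python
# Sample GPU compute vs encoder vs decoder utilisation once per second.
# Assumes an NVIDIA card and the NVML bindings (pip install nvidia-ml-py).
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    for _ in range(30):
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu)    # 3D/CUDA compute
        enc, _period = pynvml.nvmlDeviceGetEncoderUtilization(gpu)
        dec, _period = pynvml.nvmlDeviceGetDecoderUtilization(gpu)
        print(f"compute {util.gpu:3d}% | encode {enc:3d}% | decode {dec:3d}%")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

On a pure transcode you'd expect encode/decode to be busy while compute sits near zero, which is exactly the distinction being argued about here.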

walter-i. wrote on 12/12/2020, 1:29 PM

@lenard
Interesting - Thanks!

a1base wrote on 12/12/2020, 4:33 PM

Thanks all for the advice.

 

@RogerS I'll do the comparative testing you suggest. Thanks a lot for your time!

a1base wrote on 12/12/2020, 9:55 PM

I thought you'd be interested in my results. I think they show that GPU encoding isn't necessarily the best way to go in Vegas, unless you've got a commercial-grade GPU.

The test video was 27:34 long, rendered at 1080p, 60 fps.

For the GPU encoding test, with the settings @RogerS outlined above, the render took 1 hour 16 minutes and shared the load between the CPU and GPU. It never really loaded the GPU fully.

[Screenshot: GPU rendering]

 

The CPU render took 26 minutes 32 seconds and not only loaded the CPU fully, it actually took advantage of the boost clock - reported usage sat between 113% and 116% on all 16 threads.

For my setup it seems obvious that Vegas prefers a CPU heavy workload.
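For what it's worth, here are those figures as a quick back-of-the-envelope check (plain Python; the times are the ones reported above):

```python
# Back-of-the-envelope check of the render times reported above.
def to_seconds(hms: str) -> int:
    secs = 0
    for part in hms.split(":"):
        secs = secs * 60 + int(part)
    return secs

video = to_seconds("27:34")     # source length
cpu   = to_seconds("26:32")     # CPU-only render
gpu   = to_seconds("1:16:00")   # VCE/GPU render

print(f"CPU render: {cpu / video:.2f}x realtime")   # ~0.96x, faster than realtime
print(f"GPU render: {gpu / video:.2f}x realtime")   # ~2.76x, slower than realtime
print(f"CPU was {gpu / cpu:.1f}x faster than GPU")  # ~2.9x
```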

 

Again, thanks for all the assistance.

 

Will

lenard wrote on 12/12/2020, 10:13 PM

 

I explained the problem with the RogerS example: it doesn't replicate reality for a basic edit, let alone a complex one. It may depend on how simple your edits are, though - no scaling, no levels, no white balance, no colour grading, etc. You could load up an old project and test.

The Vegas problem: it can't fully utilise the CPU with the GPU on, but it doesn't take much for GPU-enhanced rendering to outperform the fully utilised CPU render. When you do a simple transcode you don't use the GPU, but that isn't how people use NLEs. An example project is Red Car (you can find a link on here): 44-second encode with GPU on, 239 seconds with GPU off.

RogerS wrote on 12/13/2020, 2:13 AM

@a1base Glad to see your CPU performance is back to normal. What changed?

You do have a fast CPU, but this GPU performance is worse than expected. Can we confirm it's working right?
Which render template did you end up using? Was it GPU accelerated? Are you using the AMD Pro/Enterprise drivers?

Did you click on the GPU in Task Manager (or other monitoring software) to see exactly what it was doing? Was GPU encode working? If so, I think you should have been above 15% - I see more like 20-30% total GPU usage when both decoding and encoding are working.

Actually, Vegas works better with consumer than commercial-grade GPUs (GeForce vs Quadro, for example), perhaps due to driver support. I don't think you can generalize from this example that GPU is worse than CPU - rather that for this task, with your system configuration, the CPU was faster.

I just did a test with my system:

For a simple 4K transcode (UHD 24p to UHD 24p) using the MagixAVC Mainconcept and NVENC templates, I got:
CPU only: 9:15
GPU: 1:18 (GPU usage was 14-50% on the NVIDIA card, as video encode looks like a sawtooth; Intel did the decoding, NVIDIA the encoding)

Same file, MagixAVC, 4K XAVC-S source to 1080p render:
CPU only: 3:01
GPU: 0:36

lenard wrote on 12/13/2020, 5:52 AM


 

@RogerS your GPU isn't being used. As I like to explain to new Vegas Pro students: when you tell someone how much compute their GPU is using, look at the 3D and CUDA graphs and nothing else - encode/decode/copy is not compute. You can test this by turning your GPU off in Vegas: decode and encode still function, because it's not really the GPU (in the compute sense) doing them.

So what we see is: when a1base engages his GPU it isn't actually used, because RogerS's choice of project is a basic transcode where the GPU has nothing to do - engaging it merely slows his CPU down by two thirds. When we turn his GPU off, Vegas is free to use all his cores again, and his render speed increases by the expected amount plus a little more, because his CPU boosts where it didn't with the GPU engaged.

So, going by the cursory data I'm seeing, I don't see a problem with his computer - it appears to be running as expected. The only problem is Vegas and its failure to utilise high-core-count CPUs when the GPU is engaged. Of course, I welcome the input of Vegas experts who would disagree.

RogerS wrote on 12/13/2020, 6:57 AM


I don't think we are defining the GPU the same way. Encoding and decoding are integral functions that dedicated GPU hardware performs, and they do it faster than a CPU can (in general). The CUDA cores, or AMD equivalents, don't have much to do during a transcode.

I suppose a high-performance CPU could perform decode and encode with plenty of power to spare, but I would like to see more data to confirm that the AMD card is actually being used here, or that the same holds true with other high-powered systems.


walter-i. wrote on 12/13/2020, 12:16 PM

@VEGASDerek
I appreciate the opinion of both @RogerS and @lenard very much.
For me, as an interested reader, this is getting very confusing, and I don't really know which of the two is more or less correct.

Could you make a statement on how exactly Vegas works?
Would be nice - thanks in advance.

michael-harrison wrote on 12/13/2020, 1:01 PM

@walter-i. as far as seeing what's being used at the GPU level, @lenard is more correct. What is being used at the GPU isn't *only* the 3D pipeline but the sum of all the engines at that level.

So while, imo, Vegas *should* be using both the 3D and en/decode pipelines when the GPU is enabled, at least it's getting some use out of the GPU when en/decode is enabled.


walter-i. wrote on 12/13/2020, 2:39 PM

Thank you too, @michael-harrison for your explanation.

You are all really trying hard, with all your expertise - but I think it really is time for a Vegas employee to tell us how the interaction between GPU, CPU and RAM is handled in Vegas Pro, and which settings should be chosen to get the most out of the software.

Since this is fundamental information, it should perhaps be documented in a small technical article, a webinar (as Gary Rebholz did when introducing Vegas Pro 18), in the manual, or in the help directly in the software. Here, between the posts in the forum - I'm afraid - it would soon no longer be found.

Regarding RAM, there are also more guesses than clear statements about what you should really set: "0", "200", or perhaps half of the available RAM?
That could also be clarified on this occasion.

Edit:
When I talk about RAM in this context, I mean dynamic RAM preview (setting in Preferences / Video / Dynamic RAM preview max. (MB))

Many Thanks in advance!
Walter

Howard-Vigorita wrote on 12/13/2020, 4:35 PM

My thinking is more like that of @RogerS - the GPU as the entirety of the video board. Also keep in mind that what's seen in Task Manager's charts is not necessarily complete; it's just what the video driver bothers to interrupt operations to report to the Windows metrics subsystem, which is a lot less when you're talking about the operations of an iGPU tightly bound to the CPU. I tend to consider GPU/CPU utilization charts only a rough approximation of what's happening, and place more value on overall render time, which I consider the bottom line.

RogerS wrote on 12/13/2020, 9:38 PM


@walter-i. RAM isn't the same as dynamic RAM preview. If there were one correct value, Vegas would just set it and lock users out of changing it. You definitely don't want to give half of your system RAM to dynamic RAM preview, or there would be nothing left for the rest of Vegas.
I'd rather not see an announcement about a legacy feature that no longer functions properly with certain systems and media, but rather a replacement for it that works with modern decoding and doesn't leave artifacts in renders.

lenard wrote on 12/13/2020, 11:17 PM


@Howard-Vigorita Power draw is one way of discovering how much compute is being used, and if power draw can't be examined there are the symptoms of GPU temperature and fan speed. The reason to think of the GPU as compute only, with decode/encode as separate entities, is the particular problems Vegas has with each: the problem discussed here (the inability to use the CPU efficiently) occurs when compute is engaged (GPU on) and is unrelated to GPU decode/encode, while the instability/crashing of Vegas is much more severe with GPU decode on, unrelated to compute. Lumping them all together as "GPU" can cause confusion.

Also, describing GPU use as a single number (the number on the non-expanded GPU box) without clarification is a problem, as nobody knows what that means - it could be encode, decode or compute. If decode is at 80% from reading a number of 4K files, compute could be at 0% or 80%; you can't tell unless you look at the GPU graphs. As said, in the case of Nvidia the four engines Vegas uses are 3D, CUDA, decode and encode, and different applications use different engines.
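On NVIDIA cards the power figure is easy to poll alongside the engine numbers - a rough sketch with the same pynvml assumption as above:

```python
# Poll board power draw as a rough proxy for how hard the GPU is working.
# Same assumption as before: NVIDIA card + pip install nvidia-ml-py.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    for _ in range(10):
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000  # reported in mW
        print(f"power draw: {watts:.1f} W")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```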