Vegas Pro 16 - why are my CPU / GPU doing nothing during rendering?

LukasD wrote on 9/26/2020, 11:29 AM

Hi

I have Vegas Pro 16. Whenever I render, I see that the CPU (Ryzen 1700 OC) and GPU (RTX 2060) don't use their full power. The CPU always works at about 50% and the GPU does almost nothing, even when I render a one-track project with no effects or modifications.

For comparison I installed a trial of Adobe Premiere, and there the situation is totally different. The CPU/GPU power is utilized almost to the maximum (the GPU's CUDA cores are used at 100%), and the CPU, depending on the moment, is used up to 90%.

I use the Nvidia NVENC presets at 1080p (2160p does not work with NVENC, any idea why? Only software rendering is possible).

Thanks for the help.

Edit: One issue is solved. I figured out why NVENC didn't work with 2160p: after installing the Studio driver it started working fine.

Comments

michael-harrison wrote on 9/26/2020, 12:34 PM

Have you chosen your video card in Preferences -> Video?

System 1:

Windows 10
i9-10850K 10 Core
128.0G RAM
Nvidia RTX 3060 Studio driver [most likely latest]
Resolution: 3840 x 2160 @ 60 Hz
Video Memory 12G GDDR5

 

System 2:

Lenovo Yoga 720
Core i7-7700 2.8Ghz quad core, 8 logical
16G ram
Intel HD 630 gpu 1G vram
Nvidia GTX 1050 gpu 2G vram

 

LukasD wrote on 9/26/2020, 1:22 PM

Hi 

Everything is selected as it should be - that's obvious :) One issue is solved: I figured out why NVENC didn't work with 2160p. After installing the Studio driver it started working fine.

I made some tests and the results are interesting. I found that the CPU's efficiency depends on the codec of the video file imported into the project.

Some examples and interesting results:

1. Rendering a 1080p output from a 1440p H.264 file on the timeline (generated by NVENC from OBS) brings only 50% CPU usage during the render.

Rendering the same file to a 4K output format brings up to 90% CPU usage. The render time is the same as for 1080p.

2. Rendering 4K, 1080p or 720p output from a 4K or 1080p AVC/HEVC file on the timeline always brings about 95% CPU usage during the render.

Conclusion - CPU render speed depends on the source format of the imported video file. H.264 slows down the render process; the best option is AVC/HEVC source files.
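
A quick way to double-check what codec a source clip actually uses, independent of any NLE, is ffprobe. A minimal sketch, assuming ffprobe (part of ffmpeg) is installed and on PATH; "clip.mp4" is just a placeholder file name:

```python
# Minimal sketch: inspect a source clip's video codec with ffprobe.
# Assumes ffprobe (ships with ffmpeg) is installed and on PATH.
import json
import subprocess

def probe_video(path: str) -> dict:
    """Return codec name, profile, and resolution of the first video stream."""
    cmd = [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=codec_name,profile,width,height",
        "-of", "json",
        path,
    ]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return json.loads(out.stdout)["streams"][0]

if __name__ == "__main__":
    info = probe_video("clip.mp4")  # placeholder file name
    print(f'{info["codec_name"]} {info.get("profile", "")} '
          f'{info["width"]}x{info["height"]}')
```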

The GPU encoder is used only to encode the small packets of data coming from the CPU render process (that's why we can observe usage spikes in Task Manager). There is no heavy use of the CUDA cores as there is in Adobe Premiere - in Vegas they are used only to accelerate some plugins that support the GPU.
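
That "small packets from the CPU" picture can be illustrated with a toy producer/consumer pipeline: a slow CPU stage feeding a fast encoder stage through a small buffer leaves the encoder idle most of the time, which shows up as usage spikes. A minimal sketch with invented rates - this is only a model of the idea, not Vegas's actual internals:

```python
# Toy model of the render pipeline described above: a slow CPU "compositor"
# feeds frames through a small queue to a fast hardware "encoder".
# All rates are invented for illustration; this is not Vegas's actual code.
import queue
import threading
import time

FRAMES = 120
CPU_FRAME_TIME = 0.02    # seconds the CPU spends decoding/compositing a frame
ENC_FRAME_TIME = 0.002   # seconds the hardware encoder spends per frame

frame_queue = queue.Queue(maxsize=4)  # small buffer between the two stages
encoder_busy = 0.0                    # total time the encoder spent working

def compositor():
    for i in range(FRAMES):
        time.sleep(CPU_FRAME_TIME)    # CPU prepares the frame (slow stage)
        frame_queue.put(i)
    frame_queue.put(None)             # end-of-stream marker

def encoder():
    global encoder_busy
    while frame_queue.get() is not None:  # encoder mostly waits here...
        time.sleep(ENC_FRAME_TIME)        # ...then works in a short burst
        encoder_busy += ENC_FRAME_TIME

start = time.time()
workers = [threading.Thread(target=compositor), threading.Thread(target=encoder)]
for w in workers:
    w.start()
for w in workers:
    w.join()

# With these rates the encoder is busy ~10% of the wall time: usage "spikes".
print(f"encoder duty cycle: {encoder_busy / (time.time() - start):.0%}")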


fr0sty wrote on 9/26/2020, 1:23 PM

My 2060 is rendering NVENC in all resolutions just fine; maybe you need a driver update?

Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64GB 3000mhz DDR4

Geforce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

LukasD wrote on 9/26/2020, 1:25 PM

My 2060 is rendering NVENC in all resolutions just fine; maybe you need a driver update?

Hi. What does your GPU usage look like during a render? Can you upload a screenshot?

Mohammed_Anis wrote on 9/26/2020, 1:48 PM

I made some tests and the results are interesting. I found that the CPU's efficiency depends on the codec of the video file imported into the project.

This is basically universal and applicable to all NLEs.

The only software that actually uses every single resource your computer throws at it is Blender, and even that can vary depending on what function is being carried out at the time.

Also, VEGAS PRO 16 still falls behind 17 by a considerable margin, if my memory isn't too fuzzy.

I always kept the GPU disabled back then. 17 is the only version where I actually found GPUs to be useful, since what's under the hood was dramatically changed.

Musicvid wrote on 9/26/2020, 4:00 PM

20-30% GPU is about the maximum, since NVENC encoding doesn't utilize the CUDA cores. Don't expect more.

fr0sty wrote on 9/26/2020, 4:24 PM

I have seen VEGAS max out my GPU before, but in other situations it stays rather low, even though the encode speed remains the same. Having power left over doesn't always mean that power would have been useful for the task at hand, or that it wasn't bottlenecked waiting on some other task to finish first.

Last changed by fr0sty on 9/26/2020, 4:25 PM, changed a total of 1 times.


LukasD wrote on 9/27/2020, 2:19 AM

20-30% GPU is about the maximum, since NVENC encoding doesn't utilize the CUDA cores. Don't expect more.

I did more research and found out that the CUDA cores are used to accelerate Vegas's effects (the ones called GPU-accelerated). So during rendering they also speed up the overall render process.

I could observe that the GPU H.265 encoder in Premiere was used more intensively than in Vegas. That's why I created this post, surprised that Vegas uses only a few percent. The question is - why doesn't Vegas use the full power of the GPU? Perhaps it needs optimization, or there are other factors.

Mohammed_Anis wrote on 9/27/2020, 4:10 AM

20-30% GPU is about the maximum, since NVENC encoding doesn't utilize the CUDA cores. Don't expect more.

I did more research and found out that the CUDA cores are used to accelerate Vegas's effects (the ones called GPU-accelerated). So during rendering they also speed up the overall render process.

I could observe that the GPU H.265 encoder in Premiere was used more intensively than in Vegas. That's why I created this post, surprised that Vegas uses only a few percent. The question is - why doesn't Vegas use the full power of the GPU? Perhaps it needs optimization, or there are other factors.

CPU rendering will always remain more efficient unless it is coupled with special effects that are better suited to GPU processing.

To answer your question though, it depends on each software's architecture.

Hitfilm/VEGAS Effects use algorithms that come from gaming. As such, video playback and effect processing in these applications utilize the GPU more than any other NLE on the market.

When you export, however, the load is only on the CPU.

Why? GPU utilization and efficiency of output are not the same thing.

You also need to analyze your own hardware and whether bottlenecking occurs during specific functions, since GPUs and CPUs talk to each other. So if your CPU is a 2007 model and your GPU is a 2014 model, no matter how powerful the GPU is - there's a chance it won't run efficiently with your CPU.

I have an old Skylake i7 (6700K) paired with a 1080 Ti, and it sometimes bottlenecks even at 1080p editing.

Thankfully, I don't mind editing at a lower resolution, since this is just preview stuff.

But I do plan on getting an AMD Ryzen 7, or at least a higher-generation CPU, to eliminate this issue.


andyrpsmith wrote on 9/27/2020, 5:29 AM

The other thing to remember is that not all codecs are supported by the GPU chip; for example, Sony XAVC S is not encoded on the GPU even though it is H.264 (on my 1080 Ti), while most other H.264 formats are supported.
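
One way to see which NVENC encoders your system can drive at all is to ask a local ffmpeg build. This reflects ffmpeg's compiled-in support, not Vegas's internal codec list, so treat it only as a rough cross-check:

```python
# Minimal sketch: list NVENC encoders available in the local ffmpeg build.
# Shows ffmpeg's compiled-in support, not Vegas's internal encoder list.
import subprocess

out = subprocess.run(
    ["ffmpeg", "-hide_banner", "-encoders"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    if "nvenc" in line:
        print(line.strip())   # typically h264_nvenc, hevc_nvenc
```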

Last changed by andyrpsmith on 9/27/2020, 5:30 AM, changed a total of 1 times.

(Intel 3rd gen i5@4.1GHz, 32GB RAM, SSD, 1080Ti GPU, Windows 10) Not now used with Vegas.

13th gen i9-13900K - water cooled, 96GB RAM, 4TB M2 drive, 4TB games SSD, 2TB video SSD, GPU RTX 4080 Super, Windows 11 Pro

LukasDr wrote on 9/27/2020, 5:57 AM

Hi again. Now I'm writing from my correct account.

I made a small test with exactly the same project in both programs, this time on a PC with a Ryzen 3900X (12 cores) and a GTX 1650 4GB (the cheapest Nvidia Turing card):

1. Adobe Premiere - Hardware encoding - 14 copied AVC 4K clips, no effects - render time 38 sec

2. Adobe Premiere - Software encoding - 14 copied AVC 4K clips, no effects - render time 1:25 min

(I noticed that even during software encoding, Premiere still uses around 30% of the CUDA cores.)

 

3. Vegas Pro 16 - NVENC encoding - 14 copied AVC 4K clips, no effects - render time 3:52 min

 

So, the results lead to a conclusion: wasting the CUDA cores' potential is a huge mistake. The slowest Turing card renders about 6x faster than a 12-core Ryzen 3900X CPU. It really gives food for thought.
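
Comparisons like this are easy to reproduce outside any NLE, which removes the editor's own frame pipeline from the equation. A minimal timing sketch, assuming an ffmpeg build with libx264 and h264_nvenc support; "input.mp4" is a placeholder:

```python
# Minimal benchmark sketch: time a CPU (libx264) encode vs an NVENC encode
# of the same file with ffmpeg, taking the NLE out of the equation.
# Assumes ffmpeg with libx264 and h264_nvenc; "input.mp4" is a placeholder.
import subprocess
import time

def encode(encoder: str, src: str, dst: str) -> float:
    """Run one encode and return the wall-clock time in seconds."""
    cmd = ["ffmpeg", "-y", "-i", src,
           "-c:v", encoder, "-b:v", "12M",   # 12 Mbps, like the tests above
           "-an", dst]
    t0 = time.time()
    subprocess.run(cmd, capture_output=True, check=True)
    return time.time() - t0

if __name__ == "__main__":
    for enc in ("libx264", "h264_nvenc"):
        secs = encode(enc, "input.mp4", f"out_{enc}.mp4")
        print(f"{enc:12s} {secs:6.1f} s")
```

If ffmpeg shows a similar gap, the difference really is the encoder; if not, the gap lives in the NLE's own pipeline.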

RogerS wrote on 9/27/2020, 7:03 AM

What are your render settings in Vegas and Premiere? CPU and GPU render quality also differ, so speed isn't the only factor.

adis-a3097 wrote on 9/27/2020, 7:56 AM

AFAIK, CUDA needs certification, OpenCL doesn't.

LukasDr wrote on 9/27/2020, 8:51 AM

What are your render settings in Vegas and Premiere? CPU and GPU render quality also differ, so speed isn't the only factor.

I set identical settings: resolution, bitrate, codec, etc.

Howard-Vigorita wrote on 9/27/2020, 9:57 AM

I think the only decoding support in Vegas 16 was for Intel iGPUs. I believe Nvidia decoding was not implemented until Vegas 17 was first introduced, and AMD decoding didn't get done until the last v17 update. I'm guessing you're not making use of any GPU decoding in Vegas 16. Since rendering depends on first reading and decoding the file, encoding suffers without GPU-assisted decoding. Not sure when Premiere implemented it, but it appears they got it going in the version you're trying out.

Another factor is load time. Some apps load and decode as much as they can when you load the project, while Vegas uses more of a just-in-time scheme, which tends to be more efficient in the long run. That can make apps that take forever to load and/or import project clips look better at rendering, if you forget all the time you've already spent waiting. You can even out the load-scheme differences if you take load and import time into account using fresh clips.
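
The decode point above can also be tested in isolation with ffmpeg: run the same NVENC encode once with software decoding and once with NVDEC hardware decoding feeding it. A sketch, assuming an ffmpeg build with CUDA/NVDEC support and a placeholder file name:

```python
# Sketch of the decode-bottleneck point: the same NVENC encode, once fed by
# CPU (software) decoding and once by NVDEC hardware decoding.
# Assumes an ffmpeg build with CUDA/NVDEC; "input.mp4" is a placeholder.
import subprocess
import time

def timed(cmd) -> float:
    """Run an ffmpeg command and return its wall-clock time in seconds."""
    t0 = time.time()
    subprocess.run(cmd, capture_output=True, check=True)
    return time.time() - t0

# CPU decodes, GPU encodes: the encoder waits on the CPU.
sw = timed(["ffmpeg", "-y", "-i", "input.mp4",
            "-c:v", "h264_nvenc", "-b:v", "12M", "sw_decode.mp4"])

# GPU decodes and encodes: frames stay on the card.
hw = timed(["ffmpeg", "-y", "-hwaccel", "cuda",
            "-hwaccel_output_format", "cuda", "-i", "input.mp4",
            "-c:v", "h264_nvenc", "-b:v", "12M", "hw_decode.mp4"])

print(f"software decode: {sw:.1f} s, hardware decode: {hw:.1f} s")
```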

LukasDr wrote on 9/27/2020, 10:17 AM

Hi Howard

This is the latest version of Premiere (subscription). I'm impressed by how fast it works there. I love Vegas and its workflow (it's also a kind of DAW software, and my first contact with Vegas was in 2004) and never thought about moving to another video editor, since my main profession is music production. But some aspects could be better optimized, and rendering is one of them. Perhaps the developers have it on their list. I've been here for half a year and don't know how things work on this forum (I'm a beta tester on the Samplitude Pro forum, so I suppose the traditions are similar).

Mohammed_Anis wrote on 9/27/2020, 11:48 AM

Hi Howard

This is the latest version of Premiere (subscription). I'm impressed by how fast it works there. I love Vegas and its workflow (it's also a kind of DAW software, and my first contact with Vegas was in 2004) and never thought about moving to another video editor, since my main profession is music production. But some aspects could be better optimized, and rendering is one of them. Perhaps the developers have it on their list. I've been here for half a year and don't know how things work on this forum (I'm a beta tester on the Samplitude Pro forum, so I suppose the traditions are similar).

Oh, there's lots that the developers can take away from Premiere. You are correct on this, most assuredly.

Although, as others have pointed out, GPU decoding was never as efficient on 16 as it is now on 18 and since the last update of 17. So you're essentially comparing outdated technology with newer technology.

 

LukasDr wrote on 9/27/2020, 1:29 PM

Oh, there's lots that the developers can take away from Premiere. You are correct on this, most assuredly.

Although, as others have pointed out, GPU decoding was never as efficient on 16 as it is now on 18 and since the last update of 17. So you're essentially comparing outdated technology with newer technology.

Hi. I'm fully aware that Vegas 16 does not support hardware decoding - that's why I never mentioned decoding. My post concerns the rendering and encoding process during export (decoding is not a critical process in that case; it's mostly useful while working on the timeline). So I think the results would be quite similar, unless I don't know about some essential changes made in V18 relative to V16.

I also have the trial of Vegas 18 installed (the latest build), but I couldn't run any tests - V18 crashes when I try to use NVENC rendering at any output resolution (any idea how to fix that?). When exporting without NVENC, the render stops/freezes after some random time. So unfortunately it's not possible for me to test V18 at this time.

fr0sty wrote on 9/28/2020, 2:17 PM

In VEGAS 18, go into the help menu, run the driver update utility, download whatever driver it recommends.


LukasDr wrote on 9/29/2020, 12:35 AM

In VEGAS 18, go into the help menu, run the driver update utility, download whatever driver it recommends.

Hi. I have the same driver installed that Vegas 18 suggests. Today I'll check the trial on my other computer with the Ryzen 1700 and RTX 2060.

For now it's been crashing on the computer with the GTX 1650 4GB.

Former user wrote on 9/29/2020, 1:54 AM

 

3. Vegas Pro 16 - NVENC encoding - 14 copied AVC 4K clips, no effects - render time 3:52 min

 

So, the results lead to a conclusion: wasting the CUDA cores' potential is a huge mistake. The slowest Turing card renders about 6x faster than a 12-core Ryzen 3900X CPU. It really gives food for thought.

If you turn GPU processing to OFF and do an NVENC encode, what happens then? Does CPU use go up, and what about the total encode time?

LukasDr wrote on 9/29/2020, 7:30 AM

 

3. Vegas Pro 16 - NVENC encoding - 14 copied AVC 4K clips, no effects - render time 3:52 min

 

So, the results lead to a conclusion: wasting the CUDA cores' potential is a huge mistake. The slowest Turing card renders about 6x faster than a 12-core Ryzen 3900X CPU. It really gives food for thought.

If you turn GPU processing to OFF and do an NVENC encode, what happens then? Does CPU use go up, and what about the total encode time?


Hi.

1. Vegas Pro 16 - CPU (Ryzen 3900X) Software encoding - 14 copied AVC 4K clips, no effects - output format AVC 4K, 29.97 fps, 12,000,000 bps bitrate - render time 4:58 min

2. Vegas Pro 16 - NVENC encoding (Ryzen 3900X, GPU processing OFF in settings) - 14 copied AVC 4K clips, no effects - output format AVC 4K, 29.97 fps, 12,000,000 bps bitrate - render time 3:50 min

It seems the NVENC encoder still works when GPU processing is OFF, so the render time is more or less the same.

andrewcg wrote on 9/29/2020, 11:56 AM

I'm not sure why, but for me the GPU-Z reading shows higher GPU usage during a render compared to Windows Task Manager.
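
One likely reason: Windows Task Manager splits GPU load into separate engines (3D, Copy, Video Encode, Video Decode), and the graph people usually look at is the 3D engine, while GPU-Z samples the driver's own utilization counters. To read the per-engine numbers straight from the NVIDIA driver during a render, something like the following sketch works; it assumes nvidia-smi is on PATH and an export is running at the time:

```python
# Sketch: sample per-engine GPU utilization (SM vs NVENC encoder) during a
# render using nvidia-smi's device monitor. Assumes an NVIDIA driver with
# nvidia-smi on PATH; run this while an export is in progress.
import subprocess

# "-s u" selects utilization (sm/mem/enc/dec columns); "-c 10" = 10 samples.
out = subprocess.run(
    ["nvidia-smi", "dmon", "-s", "u", "-c", "10"],
    capture_output=True, text=True, check=True,
).stdout
print(out)  # the "enc" column is NVENC load; "sm" is what most graphs show
```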

Former user wrote on 9/29/2020, 11:14 PM

 

3. Vegas Pro 16 - NVENC encoding - 14 copied AVC 4K clips, no effects - render time 3:52 min

 

So, the results lead to a conclusion: wasting the CUDA cores' potential is a huge mistake. The slowest Turing card renders about 6x faster than a 12-core Ryzen 3900X CPU. It really gives food for thought.

If you turn GPU processing to OFF and do an NVENC encode, what happens then? Does CPU use go up, and what about the total encode time?


Hi.

1. Vegas Pro 16 - CPU (Ryzen 3900X) Software encoding - 14 copied AVC 4K clips, no effects - output format AVC 4K, 29.97 fps, 12,000,000 bps bitrate - render time 4:58 min

2. Vegas Pro 16 - NVENC encoding (Ryzen 3900X, GPU processing OFF in settings) - 14 copied AVC 4K clips, no effects - output format AVC 4K, 29.97 fps, 12,000,000 bps bitrate - render time 3:50 min

It seems the NVENC encoder still works when GPU processing is OFF, so the render time is more or less the same.

Yes - "GPU processing" is the CUDA compute side; NVENC is the encoder (not CUDA). It may be more difficult to see on such a powerful processor, but you should notice that with GPU processing OFF your CPU use goes up - BUT not by much. So even though the GPU has been uncoupled, and any latency between GPU and CPU can no longer cause a slowdown, Vegas still won't use all of your CPU to increase the frame rate it sends to NVENC.

So where you say the problem is that Vegas won't use the GPU efficiently, it may be more a case that it can't make much use of the GPU because it can't send data to it fast enough. I doubt you're ever worse off using the GPU, even with a 3900X, except for the stability problems. On my slow i7 I tried the sharpen and glow filters with and without GPU processing: Vegas is about 5x faster with GPU processing, but without GPU processing I become 100% CPU-bound - you wouldn't.
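
That "can't send the data to it fast enough" argument is just pipeline arithmetic: overall render speed is capped by the slowest stage, min(frame-preparation rate, encoder rate). A back-of-envelope sketch with invented numbers - only the rough 5x FX speedup mentioned above is taken from the post:

```python
# Back-of-envelope model of the point above: overall render speed is capped
# by the slowest pipeline stage, min(frame-preparation rate, encoder rate).
# All numbers are invented; only the rough 5x FX speedup comes from the post.
ENCODER_FPS = 400.0        # assumed NVENC throughput, frames per second

def render_fps(prep_fps: float) -> float:
    """Pipeline throughput is the slower of frame prep and encoding."""
    return min(prep_fps, ENCODER_FPS)

scenarios = {
    "GPU processing OFF": 30.0,   # CPU alone decodes and applies FX
    "GPU processing ON": 150.0,   # GPU-accelerated FX, ~5x faster prep
}

for label, prep_fps in scenarios.items():
    fps = render_fps(prep_fps)
    print(f"{label:18s} -> {fps:5.1f} fps, encoder busy {fps / ENCODER_FPS:.0%}")
```

Either way the encoder itself is far from saturated, which matches the low NVENC utilization everyone is seeing.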