Is it possible to get VEGAS Pro to use my GPU instead of maxing out my CPU?

JGamerXone wrote on 4/23/2020, 1:41 AM

I'm currently running VEGAS Pro 17 on Windows 10, and my GPU is an NVIDIA GeForce RTX 2080.

For some reason Vegas Pro refuses to use my new GPU. Before I swapped GPUs, Vegas ran fine, but ever since I got the new one, renders have slowed down drastically instead of speeding up, and my CPU is pinned at 100%. The settings menu says it will use the GPU, but it clearly isn't. I've tried everything I could think of, but I can't get it to use the GPU at all.

Comments

fr0sty wrote on 4/23/2020, 1:55 AM

A few things.

1. Not every render format supports GPU acceleration. The GPU can only render the AVC (H.264) or HEVC (H.265) formats. If you do not see "NVENC" next to the render preset you are using, it does not support GPU acceleration.

2. There are two places in the settings to enable GPU acceleration of playback and effects. File I/O in the Preferences tabs controls GPU decoding, which accelerates how fast video plays back on the timeline with no effects applied. In the Video tab of Preferences, GPU acceleration can be enabled to speed up certain effects that support it.

If you have done both of the above and still do not notice any improvement, you may need to update your GPU driver. Do not use the Game Ready drivers; use the Studio drivers from NVIDIA instead.

JGamerXone wrote on 4/23/2020, 2:36 AM

Well, I tried the driver; I don't know if it helped at all. So I had a look in the template I was using and found an option (NV Encoder). It doesn't bring the CPU below 100%, but Vegas is using my GPU now, even if only at 15%, and it doubled the render speed.

fr0sty wrote on 4/23/2020, 3:37 AM

Keep in mind, it isn't using your GPU itself (not the main compute components of it, at least) to do the rendering. Your GPU has an encoder chip on it that is separate from the CUDA/compute cores that do the bulk of your GPU's processing, which may be why you are not seeing 100% usage. Vegas only uses what it needs to accelerate effects on the timeline, then passes the rest of the work to that encoder chip to finish the job. The same applies to GPU decoding; that same chip is used.
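That encoder-vs-compute split is easy to see outside Vegas: NVIDIA's `nvidia-smi -q -d UTILIZATION` reports per-engine utilization, and during an NVENC render the Encoder figure climbs while the main Gpu figure stays low. A rough Python sketch of pulling those numbers apart (the sample text below is mocked up for illustration, not captured output, and the exact layout varies by driver version):

```python
import re

# Illustrative sample of `nvidia-smi -q -d UTILIZATION` output during an
# NVENC render (mocked up for this example; real layout varies by driver).
sample = """
    Utilization
        Gpu                               : 15 %
        Memory                            : 8 %
        Encoder                           : 92 %
        Decoder                           : 0 %
"""

def parse_utilization(text):
    """Pull each engine's utilization percentage into a dict."""
    return {m.group(1): int(m.group(2))
            for m in re.finditer(r"(\w+)\s*:\s*(\d+) %", text)}

util = parse_utilization(sample)

# A busy Encoder alongside a mostly idle Gpu is exactly the pattern
# described above: NVENC does the encode while the compute cores idle.
print(util)
```

So a low overall GPU-usage number in Task Manager doesn't mean the render isn't hardware-accelerated; check the Encoder engine specifically.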

The driver update doesn't necessarily mean better performance, but in my experience it does result in better stability. Most of the time, when users are having GPU issues, updating to the Studio drivers (or Enterprise for AMD) seems to solve it.

Last changed by fr0sty on 4/23/2020, 3:41 AM (edited once).

Systems:

Desktop

AMD Ryzen 7 1800X, 8 cores / 16 threads at stock speed

64 GB 3000 MHz DDR4

GeForce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

JGamerXone wrote on 4/23/2020, 4:10 AM

Well, it doubled the render speed, so it is going at a reasonable speed now. Even if it could go faster, I'm just glad it isn't taking 40 minutes to render a 20-minute video with minimal effects beyond some fades for smooth transitions.

j-v wrote on 4/23/2020, 4:24 AM

Your GPU can also help read files faster (decoding). What are your settings in Preferences / File I/O?

Kind regards,
Marten

Camera: Pan X900, GoPro Hero7 Black, DJI Osmo Pocket, Samsung Galaxy A8
Desktop: MB Gigabyte Z390M, W11 home version 23H2, i7 9700 4.7 GHz, 16 GB DDR4 RAM, GeForce GTX 1660 Ti with Studio driver 560.81 and Intel HD Graphics 630 with driver 31.0.101.2127
Laptop: Asus ROG Strix G712L, W11 home version 23H2, CPU i7-10875H, 16 GB RAM, NVIDIA GeForce RTX 2070 with Studio driver 560.81 and Intel UHD Graphics 630 with driver 31.0.101.2127
Vegas software: VP 10 to 21 and VMS(pl) 10, 12 to 17.
TV: LG 4K 55EG960V

My slogan is: BE OR BECOME A STEM CELL DONOR!!! (because it saved my life in 2016)

fr0sty wrote on 4/23/2020, 4:30 AM

If it's doing real-time renders, that's about what you want to target as far as render speeds go. A 20-minute video takes 20 minutes, right?
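That real-time target is just a ratio of source length to render time; a quick sketch with the numbers reported in this thread (1.0 means real time):

```python
def realtime_factor(video_minutes, render_minutes):
    """Render speed as a multiple of real time (1.0 = real time)."""
    return video_minutes / render_minutes

# Numbers reported earlier in this thread:
before = realtime_factor(20, 40)  # a 20-minute video took 40 minutes: 0.5x
after = realtime_factor(20, 20)   # the real-time target described above: 1.0x
```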

JGamerXone wrote on 4/23/2020, 4:35 AM

> Your GPU can also help read files faster (decoding). What are your settings in Preferences / File I/O?

I just did a test changing the settings in File I/O, and setting it to use NVIDIA seems to slow it down a ton. (Actually, Vegas had reset my template, so it was using the AMD instead of the NVIDIA, oops...)

JGamerXone wrote on 4/23/2020, 4:36 AM

> If it's doing real-time renders, that's about what you want to target as far as render speeds go. A 20-minute video takes 20 minutes, right?

Yeah, it's a decent amount of time; before, it was just super slow. That was the main reason I upgraded GPUs in the first place.

sickpuppy wrote on 4/26/2020, 6:11 AM

> A few things.
>
> 1. Not every render format supports GPU acceleration. The GPU can only render the AVC (H.264) or HEVC (H.265) formats. If you do not see "NVENC" next to the render preset you are using, it does not support GPU acceleration.
>
> 2. There are two places in the settings to enable GPU acceleration of playback and effects. File I/O in the Preferences tabs controls GPU decoding, which accelerates how fast video plays back on the timeline with no effects applied. In the Video tab of Preferences, GPU acceleration can be enabled to speed up certain effects that support it.
>
> If you have done both of the above and still do not notice any improvement, you may need to update your GPU driver. Do not use the Game Ready drivers; use the Studio drivers from NVIDIA instead.

WOW!!! Just simply - WOW!!!
I made those adjustments and did a couple of render tests on an old project.
With the old settings, my GPU was idling without any load at all, but the CPU was running at 100% the whole way, and it took 14 min 35 sec to render.
After changing the settings you mentioned, my GPU was loaded between 15 and 20% during the render, the CPU ran at no more than 50%, and the complete render was done in 2 min 15 sec.
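For what it's worth, those two timings work out to roughly a 6.5x speedup; a quick check of the arithmetic:

```python
def to_seconds(minutes, seconds):
    """Convert a min/sec timing into total seconds."""
    return minutes * 60 + seconds

cpu_only = to_seconds(14, 35)  # 875 s with the old (CPU-only) settings
nvenc = to_seconds(2, 15)      # 135 s after enabling NVENC
speedup = cpu_only / nvenc     # roughly 6.5x faster
```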

That is just... WOW!!! Sorry for crashing your thread, OP, but this was just... And I've been running these old settings for years without figuring out why I couldn't activate the GPU. The "NVENC" tip was probably the most important one I hadn't paid attention to before; I think I remember touching the settings at an earlier stage without any effect and then just changing them back to what they were.