Forcing Vegas to use only the NVIDIA GPU

RealityStudio wrote on 10/30/2021, 1:28 PM

Hey all, I'm using a Dell XPS 15 laptop with an i7-11800H 8-core CPU, Intel integrated graphics, and an NVIDIA 3050 Ti, and I'd like to force Vegas to use only the NVIDIA 3050 Ti for processing. From what I can see, there are two places to configure GPU use:

1) In Options > Preferences, on the "Video" tab, you can choose which GPU is used for acceleration of video processing; I have that set to the NVIDIA.

2) In Options > Preferences, on the "File I/O" tab, you can choose which hardware decoder to use; I have that set to NVIDIA NVDEC.
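
As an aside, a quick way to confirm which adapters and driver versions Windows actually sees, independent of Vegas, is to query them from a script. A minimal Python sketch using the built-in wmic tool (deprecated, but still present on most Windows 10/11 installs):

    # List the display adapters Windows exposes, with driver versions.
    # Both the Intel iGPU and the NVIDIA 3050 Ti should show up here.
    import subprocess

    out = subprocess.run(
        ["wmic", "path", "win32_VideoController", "get", "Name,DriverVersion"],
        capture_output=True, text=True, check=True,
    )
    print(out.stdout)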

Normally I shoot 30 fps 8-bit H.264 on my Sony A7S3 because it plays fast and I get full-frame-rate playback on the timeline, but now I'm toying with its 10-bit 60 fps HEVC codec. The thing is, when I play back video on the timeline and watch GPU/CPU use, it jumps around like this:

CPU: 2% to 66%

Intel: 0% to 34%

NVIDIA: 0% to 18%

...which seems to imply that it's still using the Intel GPU for something while the NVIDIA GPU snoozes, so my 60 fps HEVC footage plays back at only around 27 fps. Is there another setting somewhere in Vegas that I need to configure to make sure Vegas is not using the Intel GPU for processing?

EDIT: In case it comes up, switching to the Intel GPU for decoding on the "File I/O" tab drops timeline playback from ~27 fps to ~3 fps.

EDIT: I'm using the latest NVIDIA Studio drivers.


Comments

j-v wrote on 10/30/2021, 1:43 PM

My opinion: as long as you have installed the newest drivers for both the Intel and the NVIDIA (Studio driver), the best choice is the one VEGAS Pro 18 and 19 make themselves.

With kind regards,
Marten

Camera: Pan X900, GoPro Hero7 Black, DJI Osmo Pocket, Samsung Galaxy A8
Desktop: MB Gigabyte Z390M, W11 Home version 23H2, i7 9700 4.7 GHz, 16 GB DDR4 RAM, GeForce GTX 1660 Ti with Studio driver 560.81 and Intel HD Graphics 630 with driver 31.0.101.2127
Laptop: Asus ROG Str G712L, W11 Home version 23H2, CPU i7-10875H, 16 GB RAM, NVIDIA GeForce RTX 2070 with Studio driver 560.81 and Intel UHD Graphics 630 with driver 31.0.101.2127
Vegas software: VP 10 to 21 and VMS(pl) 10, 12 to 17.
TV: LG 4K 55EG960V

My slogan is: BE OR BECOME A STEM CELL DONOR!!! (because it saved my life in 2016)


RealityStudio wrote on 10/30/2021, 1:44 PM

Oh, I'm using the latest NVIDIA Studio drivers; I'll add that to the original post. If I let Vegas choose the decoder in auto mode, it chooses the Intel GPU, which drops timeline playback from ~27 fps to ~3 fps, so I manually set that to the NVIDIA GPU. Vegas does automatically choose the NVIDIA card for GPU processing under the "Video" tab. Given the high Intel GPU use, I can't help but feel it's still being used for processing somewhere; I'm just not sure where that setting may be hidden.

Howard-Vigorita wrote on 10/30/2021, 5:11 PM

If your screen is hardwired to the Intel GPU, you will not be able to totally disable it. To verify, go into your BIOS and see if there's an option to disable the Intel iGPU. If it's anything like my older Dell XPS 15, it won't be there. Also, if you fire up the Intel Graphics Command Center app and go to the display page, the lower right-hand corner of the screen graphic will show "Intel" if that's how the built-in display is wired. Fwiw, 60 fps 10-bit HEVC doesn't fly for me no matter what I use to decode... I suggest you consider either shooting or transcoding to 30 fps HEVC, keeping your decoder set to Intel, and sticking to the drivers you get from Dell.

RogerS wrote on 10/30/2021, 5:42 PM

For certain formats Vegas will use the Intel GPU regardless of settings, as it's the only one that can decode that footage. That's a good thing, so there's no need to try to stop it.

For 10-bit 60 fps HEVC I would make proxy files.
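
VEGAS can build those itself (right-click the clip in the Project Media window and choose Create Video Proxy, if I remember the menu right), or you can make an external 8-bit H.264 proxy. A rough ffmpeg sketch driven from Python, with placeholder file names:

    # Transcode the 10-bit 60p HEVC original to an edit-friendly proxy.
    import subprocess

    SRC = "C0001.MP4"            # hypothetical A7S3 clip name
    PROXY = "C0001_proxy.mp4"

    subprocess.run([
        "ffmpeg", "-i", SRC,
        "-vf", "scale=1280:-2",           # 720p-ish keeps scrubbing light
        "-c:v", "libx264", "-crf", "20",  # 8-bit 4:2:0 H.264
        "-pix_fmt", "yuv420p",
        "-c:a", "copy",
        PROXY,
    ], check=True)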

Custom PC (2022) Intel i5-13600K with UHD 770 iGPU with latest driver, MSI z690 Tomahawk motherboard, 64GB Corsair DDR5 5200 ram, NVIDIA 2080 Super (8GB) with latest studio driver, 2TB Hynix P41 SSD and 2TB Samsung 980 Pro cache drive, Windows 11 Pro 64 bit

Dell XPS 15 laptop (2017) 32GB ram, NVIDIA 1050 (4GB) with latest studio driver, Intel i7-7700HQ with Intel 630 iGPU (latest available driver), dual internal SSD (1TB; 1TB), Windows 10 64 bit

VEGAS Pro 19.651
VEGAS Pro 20.411
VEGAS Pro 21.208
VEGAS Pro 22.93

Try the
VEGAS 4K "sample project" benchmark (works with VP 16+): https://forms.gle/ypyrrbUghEiaf2aC7
VEGAS Pro 20 "Ad" benchmark (works with VP 20+): https://forms.gle/eErJTR87K2bbJc4Q7

Former user wrote on 10/30/2021, 8:14 PM
but now I'm toying with its 10-bit 60 fps HEVC codec. The thing is, when I play back video on the timeline and watch GPU/CPU use, it jumps around like this:

CPU: 2% to 66%

Intel: 0% to 34%

NVIDIA: 0% to 18%

...which seems to imply that it's still using the Intel GPU for something while the NVIDIA GPU snoozes, so my 60 fps HEVC footage plays back at only around 27 fps. Is there another setting somewhere in Vegas that I need to configure to make sure Vegas is not using the Intel GPU for processing?

Play back your HEVC and take a screenshot of both your NVIDIA and Intel GPU info in Task Manager.


In a laptop situation it is often the case that you only get full access to one of the GPUs; maybe that is why your Intel GPU is not working correctly for decode, but it could also be the BIOS or drivers. Vegas Pro and 60 fps HEVC files aren't really compatible even when the decoder is working fine; it's the high-frame-rate files that most often cause a user to ask why playback is so poor when Vegas isn't using much CPU or GPU.
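
If you want numbers instead of screenshots, the same per-engine counters Task Manager draws from can be sampled with Windows' built-in typeperf tool. A sketch, assuming the GPU Engine counter set that ships with Windows 10 and later (instance names encode which process owns each engine):

    # Sample the Video Decode engines of every GPU, once per second.
    # Run this while the timeline is playing to see which decoder is busy.
    import subprocess

    subprocess.run([
        "typeperf",
        r"\GPU Engine(*engtype_VideoDecode)\Utilization Percentage",
        "-sc", "10",   # ten samples, then exit
    ])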

+++

I missed the bit about the files being 10-bit HEVC. If they are 4:2:0 color, you should try to get your Intel decoder to work (if that's even possible), because Vegas can't decode the 10-bit 4:2:0 files with NVIDIA or AMD cards. If your files are 10-bit 4:2:2 HEVC, neither of your decoders will be able to play them back, and CPU alone does not handle 4K 60 fps HEVC well, which might be what you're seeing.

RealityStudio wrote on 10/30/2021, 11:24 PM

Thanks for the info, everyone! So I set the footage to HEVC 10-bit 150 Mbps 4:2:0 on the Sony A7S3 because, according to this NVIDIA support matrix:

https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new

...that mode is supported by NVDEC, which I suppose would explain why I get ~27 fps when setting Vegas to use NVDEC for decoding compared to ~3 fps when setting it to use Intel Quick Sync for decoding.
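
For anyone checking their own clips against that matrix: ffprobe (it ships with ffmpeg) prints the exact codec, profile and pixel format that decide hardware-decode support. A quick sketch with a placeholder file name; "yuv420p10le" in the output means 10-bit 4:2:0, while "yuv422p10le" means 10-bit 4:2:2:

    # Print the video stream properties that decide NVDEC/Quick Sync support.
    import subprocess

    out = subprocess.run([
        "ffprobe", "-v", "error", "-select_streams", "v:0",
        "-show_entries",
        "stream=codec_name,profile,pix_fmt,r_frame_rate,bit_rate",
        "-of", "default=noprint_wrappers=1",
        "C0001.MP4",   # placeholder clip name
    ], capture_output=True, text=True)
    print(out.stdout)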

My Intel Graphics Command Center does indeed say "Intel" at the lower right of each display, so I guess my laptop is indeed hardwired to use it for certain things no matter what I do. Although if it's true that Vegas doesn't support 10-bit 4:2:0 color at all with NVIDIA cards, then I guess I'm hosed anyway.

I don't want to use proxies for various reasons, so I guess I'll just stick to good ol' 8-bit H.264 for now and revisit this in the future. Thanks for the help, everyone!


Former user wrote on 10/31/2021, 1:00 AM
Although if it's true that Vegas doesn't support 10-bit 4:2:0 color at all with NVIDIA cards, then I guess I'm hosed anyway.

Vegas doesn't support NVIDIA GPU decode for certain 10-bit 4:2:0 HEVC formats, such as what your camera produces. There have been many complaints about that, so it's most likely still the case. Vegas will use NVIDIA GPU decode on some 10-bit 4:2:0 HEVC video, such as what my Samsung camera produces in HDR mode.


Howard-Vigorita wrote on 11/1/2021, 11:51 AM


@RealityStudio I'd suggest you compare the workflows and output results you get shooting 4:2:0 H.264 at 60 fps and HEVC at 30 fps. Although HEVC performs poorly for me at high frame rates, it outperforms H.264 at high bitrates using normal frame rates. On my systems, some of which are lower in performance than yours, I get a larger performance boost with HEVC than H.264 if I split encoding and decoding off onto separate GPUs. The higher frame rate will benefit material with a lot of motion, but HEVC handles motion and higher bitrates much better than H.264, so that might end up a wash depending on subject matter. HEVC compression is also better than H.264, which lets HEVC get away with a lower overall bitrate and smaller files.

A trick you might try is keeping the frame rate normal while bumping up the shutter speed. I was able to stop hummingbird wings at 30 fps with a 1/500 sec shutter, although what I was going for was slow motion, which looked more glassy shot at 240 fps HEVC. To see the slow motion well for edit purposes I either had to render it or patch the 240 fps clip down to a 30 fps frame rate using ffmpeg without runtime compensation. Altering a clip's frame rate like that is another trick, and it's what I ended up going with. It preserves each frame as captured while specifying a lower frame rate, which makes the clip play longer and look like slow motion. Although Vegas has some nice slow-motion FX, it seems better at speeding up a clip manipulated externally like that than at playing the original high-frame-rate clip with slow-motion FX.
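
If anyone wants to try that retag, here is one way to do it (a sketch of the general idea, not necessarily the exact commands; flags may need adjusting for your container): dump the raw HEVC elementary stream, which carries no timestamps, then re-wrap it declaring 30 fps:

    # Step 1: dump the raw HEVC elementary stream (no container timestamps).
    # Step 2: re-wrap it at a declared 30 fps; all 240 frames per captured
    # second survive, so the clip plays back 8x longer as slow motion.
    import subprocess

    subprocess.run(["ffmpeg", "-i", "clip240.mp4", "-c:v", "copy", "-an",
                    "raw.hevc"], check=True)
    subprocess.run(["ffmpeg", "-r", "30", "-i", "raw.hevc", "-c:v", "copy",
                    "slowmo30.mp4"], check=True)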

RealityStudio wrote on 11/1/2021, 12:01 PM

@Howard-Vigorita I use a Sony A7S3, which unfortunately does not support 30 fps HEVC, only 24 and 60 fps. Yeah, kind of odd, but it is what it is. I hate 24 fps, so I was looking at 60 fps to see if it would be acceptable for what I film. I mostly wanted to switch to HEVC to save on archival storage space, figuring it would need less space to store content. 10-bit color was a bonus, although strictly speaking not necessary, as all my color grading involves applying one LUT, so 8-bit color is honestly mostly OK, and I don't film in S-Log either. I do keep the shutter speed locked at 1/60 (I just like the look of it better) and shoot wide open at 20mm f/1.8, so I leave ISO on auto to sort out the rest.

Unfortunately, my HEVC experiment came to an abrupt end with the poor Vegas timeline performance, so that's that; I'm going to stick to 8-bit 30 fps H.264 for now. Hopefully this is something Vegas can improve over time.

Howard-Vigorita wrote on 11/1/2021, 12:36 PM

Odd that 24 fps is the only standard-rate HEVC they support. And if it's strictly long-GOP, it might not be that great for editing anyway. You may get the best editing results with their intra format, but that means large files and no GPU support at all. Maybe a future firmware will bring more options.

Former user wrote on 11/1/2021, 6:45 PM

@RealityStudio I'd suggest you compare the workflows and output results you get shooting 4:2:0 H.264 at 60 fps and HEVC at 30 fps. Although HEVC performs poorly for me at high frame rates, it outperforms H.264 at high bitrates using normal frame rates.

@Howard-Vigorita That is interesting. Can you point to the areas and scenarios where it outperforms? I'd like to compare results.

Howard-Vigorita wrote on 11/2/2021, 11:02 AM

Kind of a moot point here if the A7S3 can't shoot at 30-ish. But if your camera can do both, it's easy enough to shoot two clips and compare. I'll see if I can post some numbers on different systems as soon as I get a chance.

Howard-Vigorita wrote on 11/2/2021, 3:49 PM

Just shot a couple of 1-minute test clips of my patio pond, both with a Z CAM E2 set to high-bitrate 4:2:0.

H.264 8-bit AVC at 60 fps: 1.65 GB (237 Mbps)

H.265 10-bit HEVC at 30 fps: 1.44 GB (203 Mbps)

The H.264 clip plays at 60 fps at Best/Half on the 9900K and at Preview/Auto on the Xeon. Transcode time to Magix AVC with the default 24/48 template: 2:27 (9900K VCE); 1:45 (Xeon VCE); 1:59 (Xeon NVENC). Render time, graded, at 60 fps with Magix AVC 40/80 for YouTube: 1:33 (9900K VCE).

The H.265 clip plays at 30 fps at Best/Full on the 9900K and Best/Half on the Xeon. Transcode time to Magix AVC with the default 24/48 template: 0:50 (9900K VCE); 0:53 (Xeon VCE); 1:07 (Xeon NVENC). Render time, graded, at 30 fps with Magix AVC 40/80 for YouTube: 0:55 (9900K VCE).

Identically graded clips have just been uploaded to YouTube (but wait overnight for processing to complete... see the current optimal resolution in Stats for Nerds).

Btw, the 11900K w/6900k plays them both at Best/Full, but the transcode of the 60 fps AVC is 2:15 compared to 0:36 for the 30 fps HEVC.

EDIT: It took a while, but here's a link to a zip with the two camera clips and Vegas 19 grade projects (approx. 3 GB).

RealityStudio wrote on 11/5/2021, 2:07 AM

I ended up buying the cheapest M1-based Mac I could find, the base-model Mac Mini for $650, just to goof around with. I'm not familiar with Final Cut, but I installed the free trial to see how it handles the footage my laptop struggles with, and to my surprise it handles 10-bit 60 fps 150 Mbps HEVC on the timeline with ease at full frame rate. Even scrubbing was fast and smooth; go figure! I'm not really in the mood to change programs, so hopefully Vegas can improve in performance over time.

Howard-Vigorita wrote on 11/5/2021, 12:08 PM

That looks like a good M1 get-acquainted platform, and I see B&H has it. Not sure which M1 they did this on, but I saw this report on Reddit suggesting the possibility that it might someday, some way, run Vegas:

https://www.reddit.com/r/Windows11/comments/ob8awj/windows_11_arm_is_amazing/

RealityStudio wrote on 11/5/2021, 8:12 PM

I wonder if they could ever get Vegas working on Windows on ARM, given how many decades of code are in there and how many legacy x86 software modules it depends on, but who knows, I guess.

I did two more tests. I bought DaVinci Resolve Studio, since it works on both PC and Mac, and tested the same footage on the same Windows Dell XPS 15 laptop that Vegas struggles on. Once again, surprise surprise: it plays the 60 fps 150 Mbps HEVC footage at full frame rate easily (42% GPU) and also scrubs quick and smooth.

I took it a step further and compared render times using the same laptop, the same render settings, and the same NVIDIA encoder, and this one was a shocker. For a 5:24 video with a 30 fps 4K 100 Mbps H.264 source file, these are the render times to 30 fps 4K 10 Mbps HEVC:

Vegas Pro: 7:07

Resolve: 2:41

Way more of a difference than I ever would have expected. I took a look at CPU/GPU use while Resolve was rendering, and it was 10% CPU with the GPU pegged at 99%.

All this basically proves to me that it's a software issue Vegas needs to work on, not an issue with my laptop. I'll eventually install Resolve on my little M1 Mac Mini out of curiosity to see how long it takes to render the same clip.

Former user wrote on 11/5/2021, 9:30 PM

I took it a step further and compared render times using the same laptop, the same render settings, and the same NVIDIA encoder, and this one was a shocker. For a 5:24 video with a 30 fps 4K 100 Mbps H.264 source file, these are the render times to 30 fps 4K 10 Mbps HEVC:

Vegas Pro: 7:07

Resolve: 2:41

That is most likely correct, except if you're still using VP18 as I do and find NVENC encodes to be much slower than in VP19, although other people haven't found that. The bug, if there is one, lies with 3000-series GPUs and NVENC encoding in VP18. Those times may also be normal for what you're doing. All the other well-known editors are faster, as they are more capable of using both GPU and CPU.

Way more of a difference than I ever would have expected. I took a look at CPU/GPU use while Resolve was rendering, and it was 10% CPU with the GPU pegged at 99%.

If you are looking at the single GPU figure rather than the GPU engines, then that 100% is not necessarily GPU processing; it could be the NVENC encoder working at 100%. That is actually a great indicator: the bottleneck in your system is your hardware encoder, and that's the best possible bottleneck to have, as the editor is pushing data to the encoder faster than the encoder can operate.
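
On the NVIDIA side you can watch the engines separately from the command line: nvidia-smi's dmon mode prints per-second sm/mem/enc/dec utilization columns, so an encoder bottleneck shows up as "enc" near 100 while "sm" stays low. A sketch, assuming a reasonably recent NVIDIA driver; run it while the render is going:

    # Print per-engine NVIDIA utilization once per second, 30 samples.
    import subprocess

    subprocess.run(["nvidia-smi", "dmon", "-s", "u", "-c", "30"])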

All this basically proves to me that it's a software issue Vegas needs to work on, not an issue with my laptop. I'll eventually install Resolve on my little M1 Mac Mini out of curiosity to see how long it takes to render the same clip.

Resolve is fast but difficult to master to its full potential. The M1 has other options that are just as fast and easier to learn, with fewer features.


RogerS wrote on 11/5/2021, 9:43 PM

Can you share screenshots of how the other programs are using the GPU? They're probably using both 3D and encode to get high performance. As Todd mentions, the single figure doesn't tell you the whole story.


RealityStudio wrote on 11/5/2021, 10:13 PM

I'm using Vegas 19 with the latest patch. OK, I'll check the more detailed GPU metrics in Task Manager next time I do some tests so we can get more info on what's going on. Yeah, the appeal of Vegas is how quick and easy it is to use; I find some things in FCP and Resolve infuriating. I love how Vegas simply saves all project settings in a single *.veg file that I can put into the folder of a video shoot. That makes it incredibly simple to archive footage, move footage to other HDDs, copy shoots back to revisit them in the future, etc. With Resolve there's a database, then projects, then events, and all this nested stuff saved who knows where, which makes it really tough for someone like me with 1000+ video shoots to archive stuff off my machine, revisit shoots later, change storage, etc.

Howard-Vigorita wrote on 11/6/2021, 11:05 AM

With Resolve there's a database, then projects, then events, and all this nested stuff saved who knows where, which makes it really tough for someone like me with 1000+ video shoots to archive stuff off my machine, revisit shoots later, change storage, etc.

The database is probably a major key to Resolve's playback performance. They create all these little background pre-renders and cobble them together with the database for playback. Vegas will probably have to do something similar to compete, though a simpler approach like a B-tree library might work. Btw, reports I've seen from other Resolve users say they get 3x the performance on an M1, usually compared to a GTX 1080. It might shake things up if Apple marketed a video board powered by an M1.

RealityStudio wrote on 11/6/2021, 11:26 AM

Well, I ended up uninstalling DaVinci Resolve and asking for a refund; its database requirement and all its baggage simply aren't workable for my needs. A shame, as it seemed like a nice program aside from that, but I still have FCP to check out as well.

I posted a more detailed GPU pic below, taken during my HEVC render in Vegas 19 on my laptop. The problem is all the numbers bounce around so much that it's a bit deceptive; the CPU is at 62% in that screen grab, but often it's a fraction of that. Something is clearly wrong, though, as the machine is mostly underused, with constant stalls and delays, which would explain why it's so slow to render. I know it's been this way for many years, but hopefully this is something the Vegas crew can bring up to speed with what the other NLEs can do.


EDIT: @Howard-Vigorita To your comment about an M1 video board, I think much of the efficiency of their M1 design is because it's all integrated from the ground up and uses a shared memory pool to eliminate many bottlenecks in typical PC design. It's similar to how game consoles are designed and how they can often punch well above their weight in terms of performance.

RogerS wrote on 11/6/2021, 11:14 PM

Screenshots of Vegas aren't as interesting, as we know how that works; I'd like to see other software, especially FCP X (I assume there are Mac-specific tools to show hardware utilization). When I used FCP it created optimized media upon import by default, even for H.264. It took forever to get started but then worked great. Not sure what it's like these days.

For Vegas it's always like this. For more even loading of the hardware, try Voukoder; there are x264, x265, and GPU-accelerated H.264 encoding options.


RealityStudio wrote on 12/4/2021, 5:47 PM

I'm reviving this as I've just done some more tests. I ended up keeping DaVinci Resolve after all; since it's just a one-time fee, I figure why not, and I can watch them improve it over time. For my tests I used the exact same source and output options in all cases:

Source: 4K 100 Mbps H.264 MP4, 7:37 in length, from a Sony A7S3

Output: 4K 10,000 kbps H.265 MP4, 7:37 in length; the raw file with only one color LUT added, nothing more; the NVIDIA encoder in all cases.

I did three render comparisons; here are the results.

1) Vegas with Magix

Render time was 9:28

CPU and GPU use bounces all over the place, with the CPU between 5% and 80% and the GPU between 27% and 87%.


2) Vegas with Voukoder

Render time was 6:47

CPU around 82%, GPU around 66%.


3) Davinci Resolve Studio 17

Render time was 3:43

CPU solid at 10%, GPU solid at 100%.


I attached Task Manager pics for each render to show CPU and GPU usage. Clearly Vegas+Magix is software limited; its use of my hardware is all over the place and extremely inefficient. Vegas+Voukoder is much better, both faster and far more consistent, but still software limited. Resolve is clearly hardware limited; it pegs my GPU at 100% the entire time and is far faster than Vegas with either option.
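
One way to double-check that conclusion is to take the NLE out of the loop entirely and push the same source straight through NVDEC and NVENC with ffmpeg. A rough sketch with placeholder file names; if this finishes far faster than Vegas at the same 10 Mbps HEVC target, the gap is NLE overhead rather than the hardware:

    # Hardware-only baseline: decode on the GPU, encode with NVENC.
    import subprocess

    subprocess.run([
        "ffmpeg", "-hwaccel", "cuda", "-i", "source_4k_h264.mp4",  # placeholder
        "-c:v", "hevc_nvenc", "-b:v", "10M",
        "-c:a", "copy",
        "baseline_nvenc.mp4",
    ], check=True)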

I know Vegas's limited use of hardware has been an ongoing thing for many years now, but hopefully the devs can improve this in Vegas 20; I'll keep my fingers crossed. I'd love to keep using Vegas, as it's still much quicker for my needs to edit in Vegas than in Resolve.

For the pics below: the 1st is Resolve, the 2nd is Vegas+Magix, the 3rd is Vegas+Voukoder.