Hey all, I'm using a Dell XPS 15 laptop with an i7-11800H 8-core CPU, Intel integrated graphics, and an NVIDIA RTX 3050 Ti, and I'd like to force Vegas to use only the 3050 Ti for processing. From what I can see, there are two places to configure GPU use:
1) In Options -> Preferences, the "Video" tab lets you choose which GPU is used to accelerate video processing; I have that set to the NVIDIA card.
2) In Options -> Preferences, the "File I/O" tab lets you choose which hardware decoder to use; I have that set to NVIDIA NVDEC.
Normally I shoot the 8-bit 30fps H.264 codec on my Sony A7S III because it's fast and I get full-frame-rate playback on the timeline, but now I'm toying with Sony's 10-bit 60fps HEVC codec. The thing is, when I play back video on the timeline and watch GPU/CPU usage, it jumps around like this:
CPU: 2% to 66%
Intel: 0% to 34%
NVIDIA: 0% to 18%
...which seems to imply that Vegas is still using the Intel GPU for something while the NVIDIA GPU snoozes, so my 60fps HEVC footage plays back at only around 27fps. Is there another setting somewhere in Vegas that I need to configure to make sure it isn't using the Intel GPU for processing?
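In case it helps to have harder numbers than Task Manager, here's a minimal sketch of how the NVDEC engine on the NVIDIA card could be sampled directly via NVML while the timeline plays. This assumes the nvidia-ml-py package (pip install nvidia-ml-py) and that the 3050 Ti is NVML device index 0; both are assumptions about the setup, not something Vegas reports.

```python
# Minimal sketch (not Vegas-specific): poll the NVIDIA card's 3D/compute,
# memory, and NVDEC utilization once a second during timeline playback.
# Assumes nvidia-ml-py is installed and the 3050 Ti is NVML device index 0.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumed: 3050 Ti is device 0

try:
    for _ in range(30):  # ~30 seconds of samples
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)  # 3D/compute + memory
        dec, _period = pynvml.nvmlDeviceGetDecoderUtilization(handle)  # NVDEC engine
        print(f"gpu {util.gpu:3d}%  mem {util.memory:3d}%  nvdec {dec:3d}%")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

If the nvdec column stays near 0% during HEVC playback, the decode work is presumably landing on the Intel side (or the CPU) regardless of the "File I/O" setting.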
EDIT: In case it comes up, switching to the Intel GPU for decoding in the "File I/O" tab drops timeline playback from 27fps to 3fps.
EDIT: I'm using the latest NVIDIA Studio drivers.

