Specs: Vegas 22 (latest build) / i9-14900KS (not overclocked) / Corsair Dominator 64 GB 5600 (XMP) / AMD Radeon RX 6900 XT / Intel Arc A770 / Windows 11
Timeline: QFHD 24p (3840x2160, 23.976 fps), 32-bit floating point, approximately 8 minutes long.
Media: Mostly 4:2:2 10-bit XAVC HS (H.265) clips from a Sony FX3. A few other clips in the timeline are HEVC 6K clips from a Panasonic S5 II and UHD clips from a DJI drone.
Render settings: I'm using a Magix HEVC UHD 24p template, rendering with the MainConcept encoder.
This render completes in 21 minutes in Vegas 20 but takes 1 hour 40 minutes in Vegas 22 with exactly the same settings. This is most likely the memory-leak issue that has been discussed in other posts. While testing workarounds, I've discovered some odd things.
1) In Vegas 20, my optimal settings were the Arc A770 for decoding (in File I/O) and the Radeon RX 6900 XT for GPU acceleration; this gave the best playback and render times. In Vegas 22, I cannot pick the Arc in File I/O and the Radeon for GPU acceleration at the same time. If I do, all the clips from my Sony FX3 appear black in the timeline and they do not render. In Vegas 22, I must pick the same device for both decoding and GPU acceleration. I can pick separate devices IF I check "enable experimental HEVC decoding", but checking it does not solve the rendering issues.
2) This issue seems to be an Intel GPU memory leak, so I expected it to go away if I used the Radeon as both the decoder and the GPU-acceleration device. It does not. In Performance Monitor I can see the Intel GPU still using all of its memory even though it should not be active, and I have no other programs open (there's a quick way to check the per-adapter memory sketched after this list).
3) If I turn off both GPU acceleration and hardware decoding, my media in the timeline looks like this. This doesn't happen in Vegas 20.
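For anyone who wants to reproduce point 2, here's a minimal sketch of how I could sample per-adapter GPU memory from the command line via Python. It assumes Windows 10/11 and the built-in typeperf "GPU Adapter Memory" counters; the instances are listed by adapter LUID, so you have to match them to the Arc/Radeon yourself (e.g. against Task Manager).

```python
import subprocess

# Sample Windows' per-adapter GPU memory counters once (-sc 1).
# Assumes Windows 10/11; each counter instance corresponds to one adapter
# (identified by LUID), so the Arc A770 and the RX 6900 XT appear as
# separate columns even when only one of them should be in use.
result = subprocess.run(
    ["typeperf", r"\GPU Adapter Memory(*)\Dedicated Usage", "-sc", "1"],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)
```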
All this being said, I did realize I can make the memory leak go away simply by plugging my monitor into the Radeon instead of the Arc. I can then use either Intel or AMD as the decoder or GPU accelerator, and my render times have improved, although they are still much slower than Vegas 20: the project now renders in 30 minutes (Vegas 20 renders it in 21). Why would plugging my monitor into a different port make the memory leak go away?
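For what it's worth, here's a small sketch of how I could confirm which adapter Windows currently treats as the active/primary display device after moving the monitor cable. It's Python calling the Win32 EnumDisplayDevices API directly; the names and flags are just whatever Windows reports.

```python
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICE(ctypes.Structure):
    # Win32 DISPLAY_DEVICEW structure, as used by EnumDisplayDevicesW.
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x4

# Enumerate display adapters and show which one actually drives the desktop.
i = 0
dd = DISPLAY_DEVICE()
dd.cb = ctypes.sizeof(dd)
while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
    flags = []
    if dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
        flags.append("active")
    if dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE:
        flags.append("primary")
    print(f"{dd.DeviceString} ({dd.DeviceName}): {', '.join(flags) or 'inactive'}")
    i += 1
    dd = DISPLAY_DEVICE()
    dd.cb = ctypes.sizeof(dd)
```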