Vegas 22 Memory Leak - One cause identified

jimingo-1 wrote on 10/23/2024, 11:26 AM

Specs: Vegas 22 latest build / i9-14900ks (not overclocked) / Corsair Dominator 64 GB 5600 (XMP) / AMD Radeon RX 6900 XT / Intel Arc A770 / Windows 11

Timeline: QFHD 24p (3840x2160, 23.976 fps), 32-bit float, approx. 8 minutes long

Media: Mostly 4:2:2 10-bit XAVC HS (H.265) clips from a Sony FX3. A few other clips in my timeline are HEVC 6K clips from a Panasonic S5II and UHD clips from a DJI drone.

Render Settings: I'm using a Magix HEVC UHD 24p template, rendering with MainConcept.

This render completes in 21 minutes in Vegas 20. It takes 1 hour 40 minutes in Vegas 22 with the exact same settings. This is most likely because of the memory leak issue that's been discussed in other posts. While testing ways to avoid this issue, I've discovered some odd things.

1) In Vegas 20, my optimal settings were the Arc A770 for decoding (in I/O) and the AMD Radeon RX 6900 XT for GPU acceleration. That combination gave the best playback and render times. In Vegas 22, I cannot pick the Arc in I/O and the Radeon for GPU acceleration at the same time: if I do, all the clips from my Sony FX3 are black in the timeline and they do not render. In Vegas 22, I have to pick the same device for both the decoder and GPU acceleration. I can, however, pick separate devices IF I check "enable experimental HEVC decoding", but checking it does not solve any of the rendering issues.

2) This issue seems to be due to an Intel GPU memory leak, so I expected it to go away if I used the Radeon as both the decoder and the GPU acceleration device. It does not. In Performance Monitor I can see that the Intel GPU is still using all of its memory even though it isn't supposed to be active, and I have no other programs open (see the logging sketch after this list).

3) If I turn off both GPU acceleration and hardware decoding, the media in my timeline looks like this. This doesn't happen in Vegas 20.
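For anyone who wants to log the leak over time instead of eyeballing Task Manager or Performance Monitor, below is a rough, untested sketch of how the Windows GPU memory counters could be polled from Python while a render runs. It relies on the built-in typeperf tool and the "GPU Adapter Memory" counter set; the exact counter path and instance names are assumptions and may differ between Windows builds and drivers (run "typeperf -q" to check the names on your system).

```python
# Rough sketch: poll the Windows "GPU Adapter Memory" performance counters
# with the built-in typeperf tool, so you can watch whether the idle card's
# dedicated memory keeps climbing during a Vegas render.
# The counter path below is an assumption; verify it with "typeperf -q".
import subprocess
import time

COUNTER = r"\GPU Adapter Memory(*)\Dedicated Usage"

def sample_gpu_memory() -> str:
    """Grab one CSV-formatted sample covering every GPU adapter instance."""
    result = subprocess.run(
        ["typeperf", COUNTER, "-sc", "1"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

if __name__ == "__main__":
    # Print a sample every 10 seconds while the render runs; stop with Ctrl+C.
    while True:
        print(sample_gpu_memory())
        time.sleep(10)
```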

All this being said, I did realize that I can make the memory leak go away simply by plugging my monitor into my Radeon instead of my Arc. I can now use either the Intel or the AMD card as my decoder or GPU accelerator, and my render times have improved (although they are still much slower than in Vegas 20). This project now renders in 30 minutes (Vegas 20 renders it in 21). Why would plugging my monitor into a different port cause the memory leak?

Comments

john_dennis wrote on 10/23/2024, 12:27 PM

@jimingo-1 said: "Why would plugging my monitor into a different port cause the memory leak?"

AMD might ask why one would buy a video adapter and have it drawing power in a machine without a display attached to any of its ports. I'm not familiar with every configuration people might run, but it seems that detecting a display on a port would be housekeeping done early in driver configuration, or in some other aspect of running a video adapter. At a minimum, I would think running without an attached output device is not a common scenario.

Is this a "thing"?

I'm open to fact-checking for the next couple of weeks.

Howard-Vigorita wrote on 10/23/2024, 12:32 PM

I think the present state of the vp22 video engine rewrite requires that the GPU selected in video prefs matches the one selected in I/O, which optimally is the one your HDMI plugs into. It's way slower rendering that way but pretty good for playing and editing. I'm starting my own big projects in vp21 b208 for the old rendering performance using multiple GPUs. But again, make sure HDMI is plugged into the video-prefs GPU. Btw, I got the smoothest previews in vp21 with the video-prefs GPU set to, and plugged into, an A770, but a lot of flashing of the status bar at the bottom of the preview display with the 6900xt. It was so annoying I shut the preview status bar off. The same project in vp22 plays on the 6900xt, A770, or 4090 with no status-bar flashing.

jimingo-1 wrote on 10/23/2024, 12:35 PM

I do understand what you're saying, John, but the memory leak happens on computers with iGPUs as well (not just systems with two dedicated video cards). I have two other machines experiencing this issue and both have Intel iGPUs; both also have discrete graphics cards (one NVIDIA and one AMD). That scenario would definitely be more common. It seems like a Vegas issue, though, because I don't have problems in any other programs.

jimingo-1 wrote on 10/23/2024, 12:38 PM

Yeah Howard... right now I have both set to AMD and also have my monitor plugged into the AMD port. For me, it's the only way to get rid of the memory leak. Slightly worse performance, but stable.

jimingo-1 wrote on 10/23/2024, 8:28 PM

I just unplugged my AMD card to test the system with one GPU (the Arc A770). I also made sure my iGPU was turned off. The problem still persisted. I guess it's just a problem with Vegas 22 and Intel graphics.