V22 i-GPU / GPU settings for HEVC 10bit footage

gabgilson wrote on 8/6/2024, 7:59 AM

Hi there. I know this has been dealt with on and off in various other posts, but I can't see a clear answer to follow, and I wonder if V22 has changed best practice?

In short, I use Sony 10-bit HEVC footage (from an A7S III). I've just switched from an AMD desktop to a laptop with an i9-13980 and an Nvidia 4070 GPU.

Super grateful if anyone can advise on playback and render settings for this combination. I think it's the following, but have I got it scrambled?

  • Preferences > Video tab: set GPU acceleration to Auto (Nvidia 4070).
  • Preferences > File I/O: set the hardware decoder to the Intel(R) device (i.e. the internal GPU).
  • Preferences > File I/O: leave the experimental and legacy options both unchecked.
  • Render: choose a QSV-based template. (I tried an NVENC one but got garbage in the render; see the quick QSV sanity check below.)
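
As a sanity check that QSV works at all outside VEGAS, here's a minimal sketch (assuming an ffmpeg build with QSV support is on PATH; the clip path is just a placeholder):

    import subprocess

    SRC = r"C:\footage\a7siii_clip.mp4"  # placeholder path to any short HEVC clip

    # Encode 5 seconds with ffmpeg's hevc_qsv encoder; if the QSV path is
    # broken (driver/BIOS issue), ffmpeg exits non-zero with an MFX error.
    result = subprocess.run(
        ["ffmpeg", "-y", "-t", "5", "-i", SRC,
         "-c:v", "hevc_qsv", "-b:v", "20M", "qsv_test.mp4"],
        capture_output=True, text=True,
    )
    print("QSV encode OK" if result.returncode == 0 else result.stderr[-800:])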

Does that look right? And do I need to do anything in the BIOS to really use the QSV/iGPU combination?

Rendering with the above settings shows about 90% activity on the 4070, a little on the iGPU, and the CPU below 10%, so I don't think I've got QSV working. Speed is nothing dramatic either: similar to the 5900X and 6700 XT combo on my desktop.
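
Overall GPU % doesn't separate the engines, by the way; a quick way to watch the encode/decode engines specifically is nvidia-smi's dmon mode (or Task Manager's Video Encode / Video Decode graphs, for both GPUs). A tiny wrapper, assuming nvidia-smi is on PATH:

    import subprocess

    # -s u = utilization columns (sm, mem, enc, dec); -c 10 = ten samples.
    # Run this while a render is going; the enc/dec columns are NVENC/NVDEC.
    subprocess.run(["nvidia-smi", "dmon", "-s", "u", "-c", "10"])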

Very happy to try things and post again if I've scrambled the info. I'd been using AMD CPUs and GPUs for years before this, so this is a bit of a mind switch to make!

Thanks in advance.

Comments

Howard-Vigorita wrote on 8/6/2024, 8:48 AM

Yes, the optimal VEGAS setup has changed. VP21 build 208 was the last release where using an iGPU for decoding was optimal. Since then, the Automatic setting in File I/O has been changed to match the GPU selected in Video preferences, which is now marked "Optimal" there. I've tried selecting the iGPU or a 2nd GPU in VP22's File I/O, and performance was a little worse than the default of letting them match. Overall, the new approach seems to favor playback/editing over file rendering. I think it's still under development, so hopefully they'll find a way to optimize both in future VP22 releases.

gabgilson wrote on 8/6/2024, 9:04 AM

Thanks @Howard-Vigorita, that explains some of the odd render times I'm getting. Have you found the best render settings to use in V22? (I'm trying to render 10-bit HEVC footage to 4K.) If the iGPU route is no longer worth using, then I'm guessing Intel QSV and NVENC are equally fine? Or is MainConcept better now? (This is for the i9 + 4070 laptop combo.)

Howard-Vigorita wrote on 8/6/2024, 9:38 AM

I leave everything mostly at defaults. I shoot almost all 4:2:0 10-bit HEVC and render the same for YouTube finals, with either QSV or NVENC if I'm in a hurry, or MainConcept overnight for the best results. I render VCE on my desktop for the quickest drafts.
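
If you want to compare the three routes on your own footage first, a rough stand-in test with ffmpeg gives relative speeds (hevc_qsv/hevc_nvenc approximating the QSV and NVENC templates, libx265 approximating a CPU render; the path and bitrate are placeholders, and ffmpeg's encoders won't exactly match VEGAS's output):

    import subprocess, time

    SRC = r"C:\footage\a7siii_clip.mp4"  # placeholder test clip

    # hevc_qsv ~ a QSV template, hevc_nvenc ~ NVENC, libx265 ~ a CPU render
    for enc in ("hevc_qsv", "hevc_nvenc", "libx265"):
        t0 = time.perf_counter()
        subprocess.run(
            ["ffmpeg", "-y", "-i", SRC, "-c:v", enc, "-b:v", "40M",
             "-an", f"test_{enc}.mp4"],
            capture_output=True,
        )
        print(f"{enc}: {time.perf_counter() - t0:.1f}s")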

I tried shooting 4:2:2 HEVC, but the Iris Xe iGPU in my 12900H laptop didn't help at all. Yours may be different, though. I can't do anything with my laptop iGPU in the BIOS... I think my display is hard-wired to it.
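
If you want to test whether your 13th-gen iGPU handles 4:2:2 decode any better, you can benchmark it outside VEGAS; a sketch assuming ffmpeg with QSV support and a placeholder clip path:

    import subprocess

    SRC = r"C:\footage\a7siii_422_clip.mp4"  # placeholder 4:2:2 clip

    # Decode to nothing with the hevc_qsv hardware decoder vs. plain software
    # decode, and compare the "speed=" figure ffmpeg reports at the end.
    for label, extra in (("qsv", ["-hwaccel", "qsv", "-c:v", "hevc_qsv"]),
                         ("software", [])):
        r = subprocess.run(["ffmpeg", *extra, "-i", SRC, "-f", "null", "-"],
                           capture_output=True, text=True)
        lines = [l for l in r.stderr.splitlines() if "speed=" in l]
        print(label, lines[-1] if lines else r.stderr[-200:])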

RogerS wrote on 8/6/2024, 6:41 PM

If it's not a 10-bit 4:2:2 HEVC source, I'd leave File I/O decoding set to NVDEC. If you have to use 4:2:2, only the Intel iGPU can decode it.
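
If you're not sure which you shot, ffprobe reports the chroma subsampling directly (assuming ffmpeg/ffprobe is installed; the path is a placeholder):

    import subprocess

    SRC = r"C:\footage\a7siii_clip.mp4"  # placeholder clip

    # yuv420p10le = 10-bit 4:2:0 (NVDEC handles it);
    # yuv422p10le = 10-bit 4:2:2 (only the Intel iGPU can decode it).
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,pix_fmt",
         "-of", "default=noprint_wrappers=1", SRC],
        capture_output=True, text=True,
    ).stdout
    print(out)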

For rendering to AVC, I find QSV too low quality to use in VEGAS, so I recommend NVENC. If you want to use QSV, do so through Voukoder. MainConcept or Voukoder x264 are great if you have time for a CPU-only render.
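
If you'd rather put a number on the quality gap than eyeball it, ffmpeg's libvmaf filter can score a test render against the original, assuming an ffmpeg build with libvmaf enabled (filenames are placeholders, and both files need matching resolution and framerate):

    import subprocess

    REF = r"C:\footage\a7siii_clip.mp4"  # original clip (placeholder)
    DIST = "test_render.mp4"             # a test render of the same clip (placeholder)

    # libvmaf takes the distorted file first and the reference second; ffmpeg
    # prints "VMAF score: NN.NN" on stderr (roughly 93+ reads as transparent).
    r = subprocess.run(
        ["ffmpeg", "-i", DIST, "-i", REF, "-lavfi", "libvmaf", "-f", "null", "-"],
        capture_output=True, text=True,
    )
    print([l for l in r.stderr.splitlines() if "VMAF" in l])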