Hardware decoder to use (under Vegas File I/O option)

James-Segwell wrote on 12/6/2023, 5:16 AM

My laptop has an NVIDIA RTX 3060 plus embedded AMD Radeon graphics. By default, Vegas (v21) chooses the AMD Radeon as the 'hardware decoder to use' and the RTX 3060 as the 'RAW processor to use'. Would it not be better to use the RTX 3060 as the 'hardware decoder to use', or even to turn it off so the CPU is used instead? Feedback appreciated.

Comments

Wolfgang S. wrote on 12/6/2023, 5:43 AM

In Preferences > Video > GPU acceleration I would use the NVIDIA RTX 3060. That is the GPU used to calculate transitions, color corrections and so on.

In Preferences > File I/O > Hardware decoder to use, you decide which GPU is used to decode your source files. A lot of people use Intel iGPUs here; especially for HEVC this is important. I use my Iris Xe GPU, which is part of the CPU. I do not know how well your AMD GPU performs at file decoding, so I would recommend trying it - you will see the difference mainly in the playback behaviour, i.e. how many frames per second you achieve.
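If you want a rough comparison of the decode paths outside of Vegas, a small sketch like the one below can help. It only times how fast ffmpeg can chew through a clip with each decoder, which is not the same as Vegas playback but gives a hint. It assumes ffmpeg is on the PATH, and "clip.mp4" is just a stand-in for a sample of your own footage.

import subprocess
import time

CLIP = "clip.mp4"  # assumed sample file of your own footage

def time_decode(hwaccel=None):
    # Decode the whole clip, discard the frames, return elapsed seconds.
    cmd = ["ffmpeg", "-hide_banner", "-loglevel", "error"]
    if hwaccel:
        cmd += ["-hwaccel", hwaccel]            # e.g. "d3d11va" or "cuda" (NVDEC)
    cmd += ["-i", CLIP, "-f", "null", "-"]      # decode only, no output file
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

for accel in (None, "d3d11va", "cuda"):
    print(f"{accel or 'cpu (software)'}: {time_decode(accel):.1f} s")

If "d3d11va" or "cuda" finishes clearly faster than the software run, the hardware decoder is probably worth using for that codec.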

The RAW processor to use setting only matters if you work with RAW files - but I do not know whether you do.

So, what type of footage do you edit?

Desktop: PC AMD 3960X, 24x 3.8 GHz * RTX 3080 Ti (12 GB) * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb LAN * Resolve Studio 18 * Edius X * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * i9 12900H with iGPU Iris Xe * 32 GB RAM * GeForce RTX 3070 Ti 8 GB * internal HDR preview on the laptop monitor * Blackmagic Ultrastudio 4K Mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX)-card in an old XP-System. Edius 4.6 and other systems

James-Segwell wrote on 12/6/2023, 5:59 AM

Thanks for the comment, Wolfgang S. For GPU acceleration I always use the NVIDIA graphics. So it seems the integrated GPU is the preferred option for File I/O hardware decoding, since, as you mentioned, it is part of the CPU. I will try that and compare it with the other options.

I don't shoot in RAW at present, though it's good to know.

Thanks

Wolfgang S. wrote on 12/6/2023, 6:33 AM

So it seems the integrated GPU is the preferred option for File I/O hardware decoding, since, as you mentioned, it is part of the CPU. I will try that and compare it with the other options.

Sure, but I do not know how well the AMD-based iGPUs really perform. The Intel-based ones are certainly good. But try it out.

James-Segwell wrote on 12/6/2023, 8:04 AM

I will compare the results of using the integrated GPU versus the main graphics card and report back. Thanks.

Howard-Vigorita wrote on 12/6/2023, 10:24 AM

@James-Segwell You might try going into Windows Settings under System > Display > Graphics and adding Vegas as an app. That gives you the opportunity to assign Vegas either the high-performance or the power-saving GPU. The choice you make there seems to affect what Vegas assigns by default in Video Prefs... on my laptop it gets Vegas to mark the Nvidia GPU as Optimal there and use it by default.
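For what it's worth, Windows appears to store that per-app choice as a string value under HKCU\Software\Microsoft\DirectX\UserGpuPreferences, so the same setting can also be written directly. The sketch below assumes a typical Vegas Pro 21 install path - adjust it to wherever your Vegas executable actually lives.

import winreg

# Assumed install path - change this to your actual Vegas executable.
VEGAS_EXE = r"C:\Program Files\VEGAS\VEGAS Pro 21.0\vegas210.exe"
KEY_PATH = r"Software\Microsoft\DirectX\UserGpuPreferences"

# "GpuPreference=1;" = power saving GPU, "GpuPreference=2;" = high performance GPU
with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
    winreg.SetValueEx(key, VEGAS_EXE, 0, winreg.REG_SZ, "GpuPreference=2;")

Deleting that value (or writing "GpuPreference=0;") should put the app back on "Let Windows decide".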

James-Segwell wrote on 12/6/2023, 11:09 AM

Hey Howard-Vigorita, I never knew about that tip. Many thanks!

Wolfgang S. wrote on 12/6/2023, 11:37 AM

True, this can be done. I tried that a long time ago too, and it does have an impact on which GPU Vegas shows as optimized. But I never understood what it helps to assign the power-saving GPU to Vegas. Things like transitions should be calculated by the most powerful GPU, which is the 3060 in this system. Otherwise you will see a drop in playback performance.

James-Segwell wrote on 12/6/2023, 12:20 PM

True, this can be done. I tried that a long time ago too, and it does have an impact on which GPU Vegas shows as optimized. But I never understood what it helps to assign the power-saving GPU to Vegas. Things like transitions should be calculated by the most powerful GPU, which is the 3060 in this system. Otherwise you will see a drop in playback performance.

I'll try it out and see what difference it makes.