Comments

fr0sty wrote on 8/9/2019, 2:34 PM

Depends on how you have it set up... for instance, you can have timeline rendering set to use your Intel QuickSync while you use AMD VCE or Nvidia NVENC to do the encoding, which spreads the load across two devices. This can produce better results, assuming neither device bottlenecks the other in the process. Test both configurations (using just one GPU, and using both) to find out.
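A quick way to run that kind of A/B test outside of Vegas, assuming you have an ffmpeg build with both NVENC and QSV support, is to time the same clip through each hardware encoder (the filenames and bitrate here are placeholders, not anything from the thread):

```shell
# A/B timing of the two hardware HEVC encoders on the same source clip.
# Requires an ffmpeg build with NVENC and QSV support compiled in.
time ffmpeg -y -i clip.mp4 -c:v hevc_nvenc -b:v 20M nvenc_out.mp4
time ffmpeg -y -i clip.mp4 -c:v hevc_qsv  -b:v 20M qsv_out.mp4
```

The wall-clock times give a rough like-for-like comparison, though Vegas's own timeline pipeline will add its own overhead on top.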

Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64GB 3000MHz DDR4

Geforce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

D7K wrote on 8/9/2019, 3:09 PM

Thanks, have several projects to test.

Howard-Vigorita wrote on 8/9/2019, 11:32 PM

I am noticing that both the Intel and AMD GPUs show usage, both while watching the timeline and while rendering. Is this good, bad, or ugly? :) What would cause this? VP17.


@D7K there's an option in general settings to enable Intel QSV encoding and decoding when available. Good idea to check it. Then you don't have to select the Intel GPU in video settings; select the AMD one there instead and Vegas will still call the Intel iGPU for QSV work. I get really bad results if I select the Intel GPU in Settings/Video. Another thing you might try, if you have a motherboard HDMI port, is to plug your monitor into that but still select the AMD card in Vegas; it seems to render a bit faster for me when I do that. Also be aware that Vegas is not in full control of the Intel iGPU usage... the CPU hands some work off to its iGPU through its back-door direct connection all on its own. It's all good.

fr0sty wrote on 8/9/2019, 11:48 PM

One more option: you can use your main GPU for timeline acceleration and for encoding (VCE or NVENC), and you can go into the File I/O tab of preferences and have QuickSync decode the video. Then you have Intel decoding, GPU rendering using OpenCL on its processor cores, and GPU encoding on its encoder chip: three chips all working in tandem.
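Outside of Vegas, the decode-on-one-chip/encode-on-another split can be demonstrated with ffmpeg. This is just a sketch, assuming an ffmpeg build with both QSV and NVENC support; the filenames and bitrate are placeholders:

```shell
# Decode on the Intel iGPU via QuickSync, encode on the NVIDIA encoder chip.
# The CPU (and any filters) sit between the two hardware blocks.
ffmpeg -hwaccel qsv -i input.mp4 -c:v hevc_nvenc -b:v 20M output.mp4
```

It's the same division of labor fr0sty describes, minus Vegas's OpenCL compositing step in the middle.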

With that in mind, it seems like the fastest way to take advantage of Vegas 17 is an Intel CPU, as AMD CPUs have no built-in decoding hardware and therefore no way to get the three-chip arrangement working as described above. That said, I don't know whether using QuickSync for decoding instead of your GPU will give superior results over just using the GPU for all three; this would make for some interesting benchmarking.

Last changed by fr0sty on 8/9/2019, 11:50 PM, changed a total of 1 times.


bitman wrote on 8/10/2019, 4:46 AM

@fr0sty keep in mind that using QSV alongside Nvidia (or using both) may look interesting as a way to share the load and balance things out, but I assume it will heat up the CPU more, and that could trigger thermal throttling (or stop boosting) of the CPU clock speed, negating the profit of QSV (unless you have a more elaborate and fancy cooling system).

APPS: VIDEO: VP 365 suite (VP 22 build 194) VP 21 build 315, VP 365 20, VP 19 post (latest build -651), (uninstalled VP 12,13,14,15,16 Suite,17, VP18 post), Vegasaur, a lot of NEWBLUE plugins, Mercalli 6.0, Respeedr, Vasco Da Gamma 17 HDpro XXL, Boris Continuum 2025, Davinci Resolve Studio 18, SOUND: RX 10 advanced Audio Editor, Sound Forge Pro 18, Spectral Layers Pro 10, Audacity, FOTO: Zoner studio X, DXO photolab (8), Luminar, Topaz...

  • OS: Windows 11 Pro 64, version 24H2 (since October 2024)
  • CPU: i9-13900K (upgraded my former CPU i9-12900K),
  • Air Cooler: Noctua NH-D15 G2 HBC (September 2024 upgrade from Noctua NH-D15s)
  • RAM: DDR5 Corsair 64GB (5600-40 Vengeance)
  • Graphics card: ASUS GeForce RTX 3090 TUF OC GAMING (24GB) 
  • Monitor: LG 38 inch ultra-wide (21x9) - Resolution: 3840x1600
  • C-drive: Corsair MP600 PRO XT NVMe SSD 4TB (PCIe Gen. 4)
  • Video drives: Samsung NVMe SSD 2TB (980 pro and 970 EVO plus) each 2TB
  • Mass Data storage & Backup: WD gold 6TB + WD Yellow 4TB
  • MOBO: Gigabyte Z690 AORUS MASTER
  • PSU: Corsair HX1500i, Case: Fractal Design Define 7 (PCGH edition)
  • Misc.: Logitech G915, Evoluent Vertical Mouse, shuttlePROv2

Former user wrote on 8/10/2019, 5:19 AM

The reason laptop manufacturers all use movie playback as a benchmark for battery life is that in reality there's very little happening on the CPU or GPU; QuickSync (or the equivalent) is efficiently doing all the work instead. The iGPU in Intel chips is often said to use about 15 W max, but I vaguely recall the UHD 630 graphics on an i9-9900K in its 170 W configuration using more like 35 W (that sounds unlikely, but so does a 95 W chip drawing 170 W without being overclocked). I'm assuming graphics benchmarking was used to produce those wattages.

I do know gamers will turn off their iGPU, as there is either evidence or just speculation that having it enabled, even when doing nothing, contributes enough heat output to reduce performance.

D7K wrote on 8/10/2019, 11:13 AM

With 5 high-performance fans and liquid cooling I'm not sure I have to worry about overheating, but with a total of 40 GB of RAM (32 GB system memory + 8 GB GPU) I guess it could happen. When I have some time I'll do some testing.

D7K wrote on 8/10/2019, 12:55 PM

When rendering 4K video with color correction and the new lens correction making drastic changes, using the CPU (80% or so), Intel GPU (70-90%), and AMD GPU (17%), the Intel HEVC encoder is the speed champ versus AMD HEVC (which took 4 times as long, utilizing the GPU 100% but the CPU only 20%). A 4K render as described finishes in less than 80% of the clip's native length. My machine is an i7-7700 at 4.2 GHz (auto OC to 4.4 GHz) with 32 GB of memory and a Radeon RX 480 8 GB.

Former user wrote on 8/10/2019, 7:04 PM

With 5 high-performance fans and liquid cooling I'm not sure I have to worry about overheating, but with a total of 40 GB of RAM (32 GB system memory + 8 GB GPU) I guess it could happen. When I have some time I'll do some testing.

It's more of a laptop problem. But on many laptops that use a discrete GPU, the iGPU is turned off and there is no way of activating it, which might otherwise help reduce thermal throttling of the CPU.

bitman wrote on 8/11/2019, 2:21 AM

@Former user indeed, laptops and video editing are not an ideal marriage...

Last changed by bitman on 8/11/2019, 2:21 AM, changed a total of 1 times.
