I am noticing that both my Intel and AMD GPUs show usage, both while watching the timeline and while rendering. Is this good, bad, or ugly? :) What would cause this? VP17.
Depends on how you have it set up... for instance, you can have timeline rendering set to use your Intel QuickSync while you use AMD VCE or Nvidia NVENC to do the render encoding, which spreads the load across two devices. This can produce better results, assuming neither device bottlenecks the other in the process. Test both configurations, using just one GPU and using both, to find out.
@D7K There's an option in general settings to enable Intel QSV encoding and decoding when available; it's a good idea to check it. Then you don't have to select the Intel GPU in Video settings. Select the AMD one there instead, and Vegas will still call the Intel iGPU for QSV work. I get really bad results if I select the Intel GPU in Preferences/Video. Another thing you might try, if you have a motherboard HDMI port, is to plug your monitor into that but still select the AMD card in Vegas. It seems to render a bit faster for me when I do that. Also be aware that Vegas is not in full control of the Intel iGPU usage... the CPU hands some work off to its iGPU through its direct back-door connection all on its own. It's all good.
One more option: you can use your main GPU for timeline acceleration and for rendering (VCE or NVENC), and you can go into the File I/O tab of Preferences and have QuickSync decode the video. Then you have Intel decoding, GPU rendering using OpenCL on its processor cores, and GPU encoding on its encoder chip: three chips all working in tandem.
With that in mind, it seems like the fastest way to take advantage of Vegas 17 is an Intel CPU, as AMD CPUs do not have any built-in decoding capability and therefore no way to get the three-chip solution working as described above. That said, I don't know whether using QuickSync for decoding instead of your GPU will give better results than just using the GPU for all three; this would make for some interesting benchmarking.
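Vegas sets all of this up through its GUI, but if it helps to see the same decode-on-one-chip, encode-on-another idea outside the editor, here is a minimal sketch. It assumes an ffmpeg build with QSV and NVENC (or AMF) support on the same machine; the file names and bitrate are placeholders, not anything from Vegas itself.

```python
import subprocess

# Rough analogy to the split described above, done with ffmpeg instead of Vegas.
# Check your build's support first with `ffmpeg -hwaccels` and `ffmpeg -encoders`.
cmd = [
    "ffmpeg",
    "-hwaccel", "qsv",        # decode on the Intel iGPU (QuickSync)
    "-i", "input_4k.mp4",     # hypothetical source clip
    "-c:v", "hevc_nvenc",     # encode on the discrete GPU (use hevc_amf for AMD VCE)
    "-b:v", "40M",            # placeholder bitrate
    "output_4k.mp4",
]
subprocess.run(cmd, check=True)
```

Timing that command with one hardware device and then with both would give a crude version of the benchmark suggested above.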
@fr0sty Keep in mind that using QSV instead of Nvidia (or using both) may look attractive as a way to share the load more evenly, but I assume it will heat up the CPU more, and that could trigger thermal throttling (or stop the CPU from boosting), negating the benefit of QSV (unless you have a more elaborate and fancy cooling system).
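One rough way to check whether that is actually happening is to log the CPU clock alongside load while a render is running and watch for the clock dropping under sustained load. A minimal sketch, assuming Python with the psutil package installed:

```python
import time
import psutil

# Sample CPU clock and overall load every 5 seconds for ~5 minutes while rendering.
# A clock that sags while load stays high suggests thermal throttling (note that
# psutil.cpu_freq() reports only the base clock on some systems).
for _ in range(60):
    freq = psutil.cpu_freq()
    load = psutil.cpu_percent(interval=None)
    print(f"{freq.current:7.0f} MHz  {load:5.1f}% load")
    time.sleep(5)
```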
Former user wrote on 8/10/2019, 5:19 AM
The reason laptop manufacturers all use movie playback as a benchmark for battery life is that in reality there's very little happening on the CPU or GPU; QuickSync or the equivalent is efficiently doing all the work instead. The iGPU in Intel chips is often said to use about 15 W max, but I vaguely recall the UHD 630 graphics on an i9-9900K in its 170 W configuration using more like 35 W (that sounds unlikely, but so does a 95 W chip drawing 170 W without being overclocked). I'm assuming graphics benchmarking was used to produce those wattages.
I do know gamers will turn off their iGPU; there is either evidence or just speculation that having it enabled, even when it's doing nothing, contributes to heat output that can reduce performance.
With 5 high-performance fans and liquid cooling I'm not sure I have to worry about overheating, but with a total of 40 gigs of RAM (32 system memory + 8 GPU) I guess it could happen. When I have some time I'll do some testing.
When rendering 4K video with color correction and the new lens correction making drastic changes, the load splits across the CPU (80% or so), Intel GPU (70-90%), and AMD GPU (17%), and the Intel HEVC encoder is the speed champ vs. the AMD HEVC encoder (which took 4 times as long, utilizing the GPU at 100% but the CPU at only 20%). A 4K render as described takes less than 80% of the clip's native length (so, for example, a 10-minute clip finishes in under 8 minutes). My machine is an i7-7700 at 4.2 GHz (auto OC to 4.4 GHz) with 32 gigs of memory and a Radeon RX 480 8 gig.
Former user wrote on 8/10/2019, 7:04 PM
With 5 high-performance fans and liquid cooling I'm not sure I have to worry about overheating, but with a total of 40 gigs of RAM (32 system memory + 8 GPU) I guess it could happen. When I have some time I'll do some testing.
It's more of a laptop problem. In many laptops that use a discrete GPU, the iGPU is turned off and there is no way of activating it, possibly to help reduce thermal throttling of the CPU.