Rendering GH5 5k vs GH4 4K with AMD Radeon VII and Intel HD GPUs

NCARalph wrote on 1/1/2020, 5:45 PM

Rendering 5K video (4992x3744) is MUCH slower than rendering 4K (3840x2160), so I tried some tests, all on 30-second clips cropped to 1080. All files are on the NVMe SSD and the Video GPU was set to the AMD.

All these numbers are approximate because the usage figures bounced all over the place, although I did run the tests 5 times to get a better idea of what was going on.
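Since the readings bounce around, a simple way to report repeated runs like this is a mean plus standard deviation. A minimal sketch (the timing values below are hypothetical, not the actual measurements from these tests):

```python
import statistics

def summarize_runs(times):
    """Return (mean, sample standard deviation) for repeated benchmark timings in seconds."""
    mean = statistics.mean(times)
    stdev = statistics.stdev(times) if len(times) > 1 else 0.0
    return mean, stdev

# Hypothetical timings from 5 render runs, in seconds
runs = [102, 99, 104, 101, 103]
mean, stdev = summarize_runs(runs)
print(f"mean {mean:.1f}s +/- {stdev:.1f}s")  # prints: mean 101.8s +/- 1.9s
```

Reporting the spread alongside the mean makes it obvious when two configurations differ by more than run-to-run noise.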

First, just playing the clips on the timeline at 1080 gave me about 0.8 fps on the 5K and about 26 fps on the 4K at Best Full. Interestingly, on the 5K clips almost all the work was done on the Intel HD GPU (about 40% utilization vs. about 6% on the AMD). This reversed on the 4K clips: the HD went down to about 20% and the AMD up to about 40%.

Similarly, VCE rendering the 5K took 1:42 with about 70% of the HD used, 6% of the AMD, and 40% of the CPU. Rendering the 4K clip the same way took 0:25 with 35% of the HD used, 50% of the AMD, and 70% of the CPU. In both cases the AMD video memory was barely used at all: about 500 MB out of 16 GB.

I tried disabling the HD GPU in Device Manager, and the 5K render was slightly faster (1:31) with AMD usage about the same, but CPU usage dropped to under 50%. The 4K render took the same time (0:25) with the same GPU and CPU usage. For rendering, it seems the HD GPU is mostly getting in the way; for timeline playback, though, it's essential.

My guess is that a couple of things are going on. First, as has been discussed, the AMD decode support hasn't been released yet; I can't wait. But I think it's more than that: the fact that so little of the video memory is in use seems suspicious. There's also something odd with the decoding, since the HD GPU reports heavy use of its 3D engine but nothing for its video decoding hardware, while the AMD reports heavy use of its video encoder and only a few percent for 3D.

I'm hoping that an upcoming release will make more use of the AMD hardware; it's not even getting warm now.

NOTE: for anyone playing with this kind of thing, Vegas does something in the background when it first starts that eats a huge percentage of the CPU. Sometimes that lasts for a few minutes and really messes up the timing measurements. Keep watching Task Manager until the CPU usage for Vegas goes to 0%.
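The "wait until Vegas settles" step can be automated rather than eyeballed in Task Manager. Here is a minimal sketch of a generic polling loop; `sample_cpu` is a hypothetical caller-supplied function that returns the process's current CPU percentage (in practice you might back it with a third-party library such as psutil's `Process.cpu_percent`, which is an assumption here, not something Vegas or Windows provides out of the box):

```python
import time

def wait_until_idle(sample_cpu, threshold=1.0, stable_samples=3,
                    interval=1.0, timeout=300.0):
    """Poll sample_cpu() until it stays below `threshold` percent for
    `stable_samples` consecutive readings, or until `timeout` seconds pass.
    Returns True if the process settled, False on timeout."""
    deadline = time.monotonic() + timeout
    quiet = 0
    while time.monotonic() < deadline:
        if sample_cpu() < threshold:
            quiet += 1
            if quiet >= stable_samples:
                return True  # process has been quiet long enough
        else:
            quiet = 0  # activity resumed; restart the quiet streak
        time.sleep(interval)
    return False
```

Requiring several consecutive quiet samples avoids starting a timing run during a brief lull in Vegas's background indexing.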


TheRhino wrote on 1/1/2020, 6:29 PM

If you have a way to post your 30-second source clips & VEG on a file-sharing site, I will run them on my 9900K with VEGA 64 for comparison.

IMO your 4-core 6700K @ 4 GHz is going to struggle with 5K regardless of the GPU. Before I upgraded to the 9900K, I was trying to edit 4K on a 6-core Xeon @ 4 GHz, which performs similarly to the newer 6700K, and I had difficulty previewing & rendering 4K intermediates. On the 8-core 9900K & VEGA 64, 4K work is as fluid as 1080p was on my Xeon. A 1-minute 4K project using 4K intermediates takes about 1 minute to render to intermediates & is even faster rendering to HEVC / MP4...

Note that on my system, for Vegas to utilize the onboard QSV & VEGA 64 in combination, I need a monitor connected to the internal GPU/motherboard connector. I have not looked for a work-around since I use 3 monitors, but without onboard QSV I get slower results... Also, I would like Magix to release a V17 update that fully utilizes AMD... The Radeon VII & VEGA have a lot of compute potential that is unused by Magix...

Workstation C with $600 USD of upgrades in April, 2021
--$360 11700K @ 5.0 GHz
--$200 ASRock W480 Creator (onboard 10G net, TB3, etc.)
Borrowed from my 9900K until prices drop:
--32GB of G.Skill DDR4 3200 ($100 on Black Friday...)
Reused from same Tower Case that housed the Xeon:
--Used VEGA 56 GPU ($200 on eBay before mining craze...)
--Noctua Cooler, 750W PSU, OS SSD, LSI RAID Controller, SATAs, etc.

Performs VERY close to my overclocked 9900K (below), but at stock settings with no tweaking...

Workstation D with $1,350 USD of upgrades in April, 2019
--$500 9900K @ 5.0 GHz
--$140 Corsair H150i liquid cooling with 360mm radiator (3 fans)
--$200 open box Asus Z390 WS (PLX chip manages 4/5 PCIe slots)
--$160 32GB of G.Skill DDR4 3000 (added another 32GB later...)
--$350 refurbished, but like-new Radeon Vega 64 LQ (liquid cooled)

Renders the Vegas 11 "Red Car Test" (AMD VCE) in 13s when clocked at 4.9 GHz
(note: BOTH onboard Intel & Vega64 show utilization during QSV & VCE renders...)

Source Video1 = 4TB RAID0--(2) 2TB M.2 on motherboard in RAID0
Source Video2 = 4TB RAID0--(2) 2TB M.2 (1) via U.2 adapter & (1) on separate PCIe card
Target Video1 = 32TB RAID0--(4) 8TB SATA hot-swap drives on PCIe RAID card with backups elsewhere

10G Network using used $30 Mellanox2 Adapters & Qnap QSW-M408-2C 10G Switch
Copy of Work Files, Source & Output Video, OS Images on QNAP 653b NAS with (6) 14TB WD RED
Blackmagic Decklink PCie card for capturing from tape, etc.
(2) internal BR Burners connected via USB 3.0 to SATA adapters
Old Cooler Master CM Stacker ATX case with (13) 5.25" front drive-bays holds & cools everything.

Workstations A & B are the 2 remaining 6-core 4.0 GHz Xeon 5660 or i7 980X machines on Asus P6T6 motherboards.

$999 Walmart Evoo 17 laptop with i7-9750H 6-core CPU, RTX 2060, (2) M.2 bays & (1) SSD bay...