2nd gpu in 11900k system

Howard-Vigorita wrote on 7/20/2022, 11:00 PM

Got a little annoyed at the low HEVC decoding performance of the Intel HD 750 iGPU in my 11900K system, so I just threw an Nvidia (Asus brand) 1660 Ti in alongside an AMD 6900 XT. The Red Car 4K HEVC benchmark went from 1:00 down to 46 seconds doing a VCE render with the Nvidia decoding instead of the Intel.

Other source media formats like AVC and ProRes are about the same either way, but I'll leave Nvidia as my decoder. I think Intel sacrificed some HEVC 4:2:0 performance to do 4:2:2, which isn't all that great either.

I should mention, however, that before I added the extra GPU board I was having trouble with Vegas 19 not making either the GPU or the iGPU available in the I/O screen as a decoder. Oddly, both the AMD GPU and the Intel iGPU showed up in the Video preferences pane. I had to uninstall the latest Intel driver and install an earlier driver from the Asus motherboard site to fix that. After adding the Nvidia GPU I now have all 3 to pick from in Video and I/O preferences. Still getting best overall performance with AMD as the main GPU and VCE for renders. I/O prefs look like this:


Comments

Former user wrote on 7/21/2022, 1:36 AM

Other source media formats like AVC and ProRes are about the same either way, but I'll leave Nvidia as my decoder. I think Intel sacrificed some HEVC 4:2:0 performance to do 4:2:2, which isn't all that great either.

Maybe a Vegas bug. I have no idea about the capabilities of the HD 750 decoder, but I doubt it would be the bottleneck in that benchmark project due to hardware limitations. The Nvidia 20- and 30-series can decode 20x 4K30 8-bit HEVC streams simultaneously, or 19x 10-bit HEVC; if the Intel iGPU were only half as capable, it should not cause a slowdown with the Red Car HEVC benchmark.

Doesn't the GPU decoder only need to decode 2x 4K25 videos at once in the benchmark, or is it more?
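A rough sanity check of that headroom, as back-of-envelope arithmetic only (stream counts taken from the figures in this thread; nothing here is measured):

```python
# Back-of-envelope decoder headroom check (figures from the posts above).
nvidia_streams = 20          # claimed concurrent 4K30 8-bit HEVC streams
fps_per_stream = 30
benchmark_streams = 3        # the benchmark plays up to 3 clips concurrently

throughput = nvidia_streams * fps_per_stream   # 600 frames/s aggregate
demand = benchmark_streams * fps_per_stream    # 90 frames/s needed
# Even a decoder only half as capable would still have ample headroom:
print(throughput // 2 >= demand)
```

If those published stream counts are anywhere near right, raw decoder throughput shouldn't be the limiting factor, which points back at the driver or the application.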

Howard-Vigorita wrote on 7/21/2022, 11:40 AM

Doesn't the GPU decoder only need to decode 2x 4K25 videos at once in the benchmark, or is it more?

2, with 1 in a moving PIP which transitions to a 3rd. Plus multiple sections with 2 and 3 full-screen composite envelopes. However, I simplified the HEVC 4K 30p Red Car with media substitution to use the same 8-bit HEVC clip for all 3, which the bench as designed plays concurrently with different offsets. But it's easy enough for anyone to swap in their own camera clips. I've done that myself with larger 10-bit 4K camera clips, with similar results.

Howard-Vigorita wrote on 7/21/2022, 12:20 PM

Maybe a Vegas bug. I have no idea about the capabilities of the HD 750 decoder, but I doubt it would be the bottleneck in that benchmark project

I think the disappearing VP19 decoders are definitely a Vegas bug, probably relating to their AMD Navi implementation... I had a Vega 64 in the Xeon and it didn't happen there till I swapped in the 5700 XT, which uses the same driver. But VP18 on the same machine was fine. VP18 has a PIP bug with the 5700 XT that blacks out the entire background around the PIP window... this was fixed (except for a small border) in VP19. I suspect the original VP19 Navi fix is the source of the occasional disappearing-decoders issue.

I already know that Vegas HEVC decoded on the 11900K/HD 750 is substantially slower than on the 9900K/HD 630... which is why I threw in the 2nd GPU, HEVC being mostly what I shoot. When I get a chance I'll also test the same with ffmpeg -hwaccel on both machines to see if it's app-independent.

Former user wrote on 7/21/2022, 6:12 PM

@Howard-Vigorita Would you know of any GPU decoder benchmark software? I've tried searching but never found anything. You would have thought they should be out there.

Btw, I was looking at quality encoding tests by a user, and noticed that HEVC encoding with your series of GPUs is noted as having a 40 Mbit/s bug: it only encodes at 40 Mbit/s or higher. Is that an old bug that was fixed long ago?

The other interesting thing is that at 40 Mbit/s its video quality is alleged to be the highest of any GPU, better than a 2-pass placebo encode. What do you make of that? It surely can't be accurate?

It also shows how truly terrible AVC GPU encoding is on both the AMD 6000-series cards and the 480/580. But all of these tests are for gaming, not the settings we'd use for NLE GPU encodes, and we still have the problem of GPU renders showing as higher quality than software renders, so I'm not sure how credible this is.

Howard-Vigorita wrote on 7/22/2022, 5:42 PM

@Former user Don't know of any GPU decoder benchmark software. Ffmpeg can do a decoder-only measurement for Nvidia with this PowerShell command, but it has not implemented decoding for AMD:

measure-command { ffmpeg -hwaccel cuda -i CLIPNAME -f null - }
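To turn the elapsed time measure-command reports into a decode rate you can compare between machines, divide the clip's frame count by the wall-clock seconds. A minimal sketch of that arithmetic (the clip figures below are hypothetical, not from my tests):

```python
def decode_fps(frame_count: int, elapsed_seconds: float) -> float:
    """Average frames decoded per wall-clock second."""
    return frame_count / elapsed_seconds

# Hypothetical example: a 60-second 4K30 clip (1800 frames) decoded
# in 12 seconds sustains 150 fps, i.e. 5x faster than real time.
print(decode_fps(1800, 12.0))  # 150.0
```

The same calculation works on any machine or decoder, so the fps figures are directly comparable even when the test clips differ in length.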

With Vegas, I find that the highest HEVC bitrate I can set in the Magix HEVC template for CBR is 240 Mbps. But I could only get 70 with various AMD boards (6900, 5700, Radeon VII) at 30 fps, and 100 at 60 fps. Also, AMD boards only render 8-bit. With Nvidia I get 237 at 8-bit and 240 at 10-bit. Intel: 240 all around.

Former user wrote on 8/4/2022, 3:07 AM

I already know that Vegas HEVC decoded on the 11900K/HD 750 is substantially slower than on the 9900K/HD 630... which is why I threw in the 2nd GPU, HEVC being mostly what I shoot. When I get a chance I'll also test the same with ffmpeg -hwaccel on both machines to see if it's app-independent.

@Howard-Vigorita A benchmark that compares your Intel iGPU decode with others. It appears to be up to 150% faster than your 6900 XT GPU, with the Intel Arc 380 decoder being much faster than everything else. I wonder if it's the same decoder as in the 12700K etc., or if it's been improved. It might be the card to get if you're only interested in its decoder and encoder, to be used as a 2nd GPU. Depends on how Vegas handles it.

Howard-Vigorita wrote on 8/4/2022, 10:03 AM

@Former user I heard they corrected a coding error in the Arc A380 driver that led to the boost. Right now the Gunnir one can be gotten from China on eBay for around $425, which is around 2x list price. I'm very interested in that board because the Nvidia one does nothing for legacy HEVC in Vegas... it still uses my HD 750 unless I disable it in the BIOS, in which case it uses the CPU, which is even worse. The new 101.3222 driver helps a little, however, taking about 10 sec off the Red Car HEVC bench... I'm guessing the coding error affected 11th- and 12th-gen drivers too, and that this latest update fixes it.