HEVC performance/quality with VP19 build 643

Howard-Vigorita wrote on 6/24/2022, 8:21 PM

I generally expect to get the fastest renders of hevc projects from my newer 11900k system with its water-cooled 6900xt. But this is what I got from the RedCar project using three 4k hevc source clips and rendering them to 4k hevc to simulate a multicam project:

Oddly, the faster 11900k system runs over 60% slower than the 9900k if legacy-hevc decoding is selected. But with legacy-hevc unchecked, the faster 11900k system outperforms the older system as expected. Not sure why this is, but whatever library is being used for legacy-hevc, it doesn't seem to like something about the UHD 750 igpu in the 11900k very much.

To see the quality differences from these decoding choices, I had to stick to a straight-up transcode in order to have a solid input-to-output comparison to feed to the ffmetrics utility. For this I used a single 30-second clip I shot with a zcam e2 in zraw, which their converter let me save as lossless hevc; Vegas seems to handle it pretty well. The zcam can do what it calls a "partial debayering," which means it only reorders the chroma data while preserving the 4:2:0 subsampling characteristic of the Sony Exmor sensor in the camera, writing it to media with a bit depth of 10 bits. Once the clip is exported out of the camera, the app can also do a full debayering with an upscale to 4:2:2 or 4:4:4, but I didn't do that. For this test I measured the elapsed time for Vegas to do the transcode from hevc 4:2:0 to a Magix Hevc render using the MainConcept encoder preset configured for 10-bit maximum-quality output at a maximum bitrate of 240 Mbps. It was pretty slow, but I think it approaches Vegas' possible upper limit of quality.
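For anyone who wants to reproduce the measurement without the ffmetrics gui, here's a rough Python sketch of the same source-vs-render comparison using ffmpeg's libvmaf filter. It assumes ffmpeg is on the PATH and built with libvmaf; the file names are just placeholders, not my actual clips.

```python
# Rough sketch of the comparison ffmetrics runs under the hood: score the
# rendered file against the original with ffmpeg's libvmaf filter.
# Assumptions: ffmpeg on PATH and built with libvmaf; "source.mov" and
# "render.mp4" are placeholder names for the real clips.
import json
import subprocess

SOURCE = "source.mov"   # hevc-lossless reference clip
RENDER = "render.mp4"   # the transcode being scored

subprocess.run([
    "ffmpeg", "-hide_banner",
    "-i", RENDER,        # first input = distorted/rendered clip
    "-i", SOURCE,        # second input = reference clip
    "-lavfi", "[0:v][1:v]libvmaf=log_fmt=json:log_path=vmaf.json",
    "-f", "null", "-",   # decode and score only, write no output file
], check=True)

# Pull the mean VMAF out of the JSON log (key layout as in libvmaf 2.x).
with open("vmaf.json") as f:
    vmaf = json.load(f)["pooled_metrics"]["vmaf"]["mean"]
print(f"mean VMAF: {vmaf:.2f}")
```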

Curiously, the 11900k system outperforms the 9900k in this single-clip transcode. All my previous tests like this show a similar dramatic quality improvement for hevc-legacy over igpu decoding going as far back as v17, so that's nothing new. Based on prior tests I did with v16 of Vegas, the current hevc-legacy option seems to have been its default. I've been doing my recent final renders for deliverables with hevc-legacy on my 9900k since I first burned in the 11900k, so I guess I'll keep doing that.

Out of curiosity, I also ran my little quality test on a xeon system, which has no igpu but uses two GPUs instead. At the moment I have an AMD Vega64 and an Nvidia 1660 in it, and I normally set the AMD as the main gpu and the Nvidia for decoding. For this test I rendered with the Nvidia so I could compare its 10-bit output, which AMD doesn't support. I tend to go by the VMAF quality, which, although not quite as high as MainConcept's, looks pretty good.

The interesting thing here is that I got identical quality from both the AMD and Nvidia decoders... suggesting the onboard chips used for decoding might be identical. I should also note that in every case igpu selection made no difference to either speed or quality when hevc-legacy was selected, but there was definitely activity happening in the secondary gpu even when none was selected... I didn't try pulling the Nvidia board or disabling the Intel igpu in the bios, but it seems that's what it would take.
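If anyone wants to sanity-check the identical-decoder hunch outside of Vegas, something like this rough sketch should do it: decode the same clip through the software path and a GPU path with ffmpeg and hash every decoded frame. The clip name and the d3d11va path are assumptions for a Windows box with ffmpeg on the PATH.

```python
# Rough sketch to test whether two decode paths are bit-identical: decode the
# clip through each path with ffmpeg and hash every decoded frame (framemd5).
# Assumptions: Windows, ffmpeg on PATH; "clip.mov" is a placeholder name.
import subprocess

CLIP = "clip.mov"

def frame_hashes(hwaccel=None):
    """Return one MD5 per decoded frame, via ffmpeg's framemd5 muxer."""
    cmd = ["ffmpeg", "-hide_banner", "-v", "error"]
    if hwaccel:
        cmd += ["-hwaccel", hwaccel]    # e.g. "d3d11va" (AMD/Nvidia) or "cuda" (Nvidia)
    cmd += ["-i", CLIP, "-an",
            "-pix_fmt", "yuv420p10le",  # force one pixel format so hashes are comparable
            "-f", "framemd5", "-"]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    return [line.split()[-1] for line in out.splitlines() if not line.startswith("#")]

software = frame_hashes()            # pure CPU decode as the baseline
gpu      = frame_hashes("d3d11va")   # hardware decode path
print("GPU decode matches software decode:", gpu == software)
```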

Comments

Former user wrote on 6/24/2022, 9:44 PM
 

The interesting thing here is that I got identical quality from both the AMD and Nvidia decoders... suggesting the onboard chips used for decoding might be identical.

We know the SO4 (HEVC) decoder is single-core rather than multi-core; compared to the legacy HEVC decoder it's very low-powered. So what's one way of reducing the load on the decoder?

Send the decoder a compressed signal from the GPU decoder

Did you ever buy DaVinci Resolve Studio, or do you have another NLE?

It would be interesting for you to run the quality tests with and without the GPU decoder in another quality NLE. Is this quality problem you're seeing with GPU decoders a Vegas-only thing?

Another interesting thing to try: use the SO4 decoder but untick GPU decoding. What happens then?

 

Former user wrote on 6/24/2022, 11:54 PM

So I did some testing and the results are clear: SO4 is bad at everything. It's slow, incapable of using the CPU resources available to it, responsible for most Vegas crashes, and inefficient, and thanks to your research we also know it produces low-quality encodes. These are my results:

Your idea that GPU decoders inherently produce low-quality results is not correct. In fact, it has nothing to do with GPU decoding, as proven by using SO4 with the GPU turned off and still getting the same low-quality results.

As owners of Vegas, we should begin a campaign to get rid of the Magix SO4 decoder. It's caused so much misery.

* My original file was 4K hevc 4:2:0, and all encodes were done via Voukoder. This was done so I can more accurately compare results with DaVinci Resolve, which also uses Voukoder.

Howard-Vigorita wrote on 6/26/2022, 1:46 PM

I only got average quality from Voukoder if Vegas legacy-hevc was not enabled. Perhaps manipulating the SO4 internal setting is related to flipping legacy-hevc decoding.

I do have DaVinci Resolve Studio but don't know it very well. However, a straight-up transcode for quality/performance measurements should be easy enough. It should be much the same doing it with ffmpeg directly. If you want to try the same project and test clip I used, they're zipped on a cloud drive here:

https://drive.google.com/file/d/1a0r86Ux1yFcGY4-ub8NnEr0V7MUI-u1F/view?usp=sharing

Note that the zcam captures raw with zlog2, but I didn't bring a LUT into play for my tests.
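If you'd rather skip Vegas entirely, a straight ffmpeg transcode of the clip should give a roughly comparable render to score against. Here's a rough sketch with libx265 standing in for MainConcept (10-bit, capped around 240 Mbps like my Vegas preset); file names are placeholders.

```python
# Rough ffmpeg-only version of the same transcode, for comparing against a
# Vegas/MainConcept or Resolve render. libx265 stands in for MainConcept;
# 10-bit 4:2:0 output, bitrate capped similarly to the Vegas preset.
# Assumptions: ffmpeg on PATH with libx265; "source.mov" is a placeholder.
import subprocess

subprocess.run([
    "ffmpeg", "-y", "-i", "source.mov",
    "-c:v", "libx265", "-preset", "slow",
    "-pix_fmt", "yuv420p10le",                       # 10-bit 4:2:0 output
    "-b:v", "240M", "-maxrate", "240M", "-bufsize", "480M",
    "-an",                                           # video only, keeps the comparison simple
    "render_ffmpeg.mp4",
], check=True)
```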

Former user wrote on 6/26/2022, 8:00 PM

I was wondering how you were even able to use a 10-bit file, because in the past I've seen Vegas force SO4 while in legacy mode. With HEVC it seems to depend on the codec: Sony a7sIII 10-bit HEVC is forced, but not your zcam media or some other 10-bit hevc I tried, which is a good thing. This leaves the possibility of getting quality encodes from legacy, except when using the Sony codec.

But if people don't know that the SO4 HEVC decoder gives low-quality encodes, they may never experiment. It's becoming more obvious why the legacy hevc decoder is the default even in VP19: SO4 hevc decode is really bad in all aspects.

I tried SO4 AVC decode and it works fine as far as quality goes. Not sure why NVENC is registering as higher quality than the software encode; I need to check my software encode settings.

Howard-Vigorita wrote on 6/27/2022, 9:19 AM

It's becoming more obvious why the legacy hevc decoder is the default even in VP19: SO4 hevc decode is really bad in all aspects.

I get the fastest renders and smoothest playback with legacy-hevc unchecked, so I keep that as my default until the final render.

Former user wrote on 6/27/2022, 8:14 PM

It's becoming more obvious why the legacy hevc decoder is the default even in VP19: SO4 hevc decode is really bad in all aspects.

I get the fastest renders and smoothest playback with legacy-hevc unchecked, so I keep that as my default until the final render.

That's the problem; I also did that. Until now, most people did not understand why a legacy decoder that doesn't use GPU decode was made the default, and thanks to your research we now know why: SO4 produces low-quality encodes. With the right file it will give smooth playback, and depending on the file, smoother playback than legacy for formats that allow GPU decoding on your hardware. At 4K, playback is mostly unusable for any serious work with HEVC files; the exception I've seen is some DJI media that doesn't use any B frames, and that demonstrates the HEVC SO4 decoder problem: it's extremely weak.

I'd actually be interested to see whether any fixes relating to performance increases in HEVC SO4 playback resulted in the low-quality playback seen in VP19. It's a bit like encoding in x264: using ULTRAFAST will encode faster, but the quality will be much lower than if MEDIUM or PLACEBO was used. Is the HEVC SO4 decoder using ULTRAFAST-style settings for decoding because it has to, due to its limited processing ability (not multicore)?
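The preset analogy is easy to check for the encode side with a rough sketch like the one below: encode the same clip with ultrafast and medium, time each run, then score each against the source with libvmaf (the filter prints a "VMAF score" line when it finishes). The clip name is a placeholder, and it assumes ffmpeg was built with libx264 and libvmaf.

```python
# Rough sketch of the ultrafast-vs-medium tradeoff: encode with both presets,
# time each run, and score each result against the source with libvmaf.
# Assumptions: ffmpeg on PATH with libx264 + libvmaf; "clip.mov" is a placeholder.
import subprocess
import time

CLIP = "clip.mov"

for preset in ("ultrafast", "medium"):
    out = f"x264_{preset}.mp4"
    t0 = time.perf_counter()
    subprocess.run([
        "ffmpeg", "-y", "-hide_banner", "-v", "error", "-i", CLIP,
        "-c:v", "libx264", "-preset", preset, "-crf", "18", "-an", out,
    ], check=True)
    print(f"{preset}: encoded in {time.perf_counter() - t0:.1f}s")

    # Score this encode against the source; libvmaf logs the mean score.
    subprocess.run([
        "ffmpeg", "-hide_banner", "-i", out, "-i", CLIP,
        "-lavfi", "[0:v][1:v]libvmaf", "-f", "null", "-",
    ], check=True)
```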


Former user wrote on 6/27/2022, 10:20 PM

I had a look at your test file. It's over 600MB/s, 4K HEVC, and very complex, with 80% B frames.

You would expect the Magix SO4 GPU decoder to have a problem with this file, and it does, but I made a comparison with Resolve's GPU decoder turned off. It could have been Premiere or MEP used in this comparison; there's nothing special about Resolve. Playback of 4K HEVC is better on a competing NLE even when it's handicapped by turning its GPU decoder off, while also producing a much higher-quality encode.

Howard, we need SO4 fixed or replaced. What's your opinion?

 

Howard-Vigorita wrote on 6/28/2022, 8:26 AM

I don't ever use the internal setting to change so4. I'm thinking turning legacy-hevc off might turn it off as well... I seem to recall one of the Vegas guys saying something like that in a post once. I don't mind turning hevc-legacy on to get the highest-quality decode ahead of a render. Of course, it would be nice if display performance were even faster with it off. Legacy-hevc is what I'd prefer to see fixed first, to remove the penalty of using it on an 11900k, so I don't have to switch machines to a 9900k for final renders. Perhaps if buffering or internal processing paths are the cause of the penalty, fixing that might benefit both. Another possibility is Vegas providing an entry point for external 3rd-party decoders, as they do for encoders like Voukoder.

Former user wrote on 6/28/2022, 9:14 PM

Legacy-hevc is what I'd prefer to see fixed first, to remove the penalty of using it on an 11900k, so I don't have to switch machines to a 9900k for final renders.

Or better still, have one decoder that does GPU decoding and is good at everything, and let the small Vegas Creative team focus on one decoder, not two.

You would think that, being legacy, there is no further development of mxhevcplug, but on the other hand the legacy decoder is the default HEVC decoder, so is it really legacy? And if all the 11th- and 12th-gen people are getting this slowdown, then it should be looked at. However, the slow hardware encoding that only affects NVIDIA GPUs has never been fixed in the past 5 years, and that's not legacy.

Btw, I've been looking into various HEVC encoding methods. Best so far is 4K60 with B frames, which edits like a knife through butter. The problem is, as soon as I go from 8-bit to 10-bit I start getting a lagging frame rate at edit points, but it fully recovers quite quickly, with no need to pause/play to get it working again. My thinking is that the only reason to transcode to HEVC is to use 10-bit at 4K to allow for GPU decoding, which isn't available for AVC.

But this seems largely pointless now that you've revealed SO4-GATE. I wouldn't want anyone encoding with it and producing a subpar encode, most likely without even noticing at the time. Your technique of editing with SO4, then changing decoders before the encode, restarting, and encoding sounds farcical as far as trying to explain the need for it to a new user, even if it is the best compromise for certain media.

RogerS wrote on 6/28/2022, 9:20 PM

Maybe it will be addressed in VP 20? The VCS team have said on this forum that stability and performance are priorities, and the issues with NVDEC and HEVC are known to them, so that's what I'm holding out hope for. I agree users shouldn't even have to think about any of this.