NVENC AVC Rendering Quality?

NickHope wrote on 9/25/2019, 4:57 AM

I badly need to replace my PC displays, maybe for something like this 43" UHD monitor (or even a UHD TV). To drive it I'll need to replace my ancient AMD Radeon HD6970. In view of the NVDEC support in VP17, I'm thinking of getting an NVIDIA GPU rather than AMD, perhaps an RTX2070 or RTX2080. But NVENC rendering might be useful to me too.

So has anyone studied the quality of NVENC AVC renders in comparison to any of these alternatives?

  • CPU-only MAGIX AVC
  • AMD VCE
  • Intel QSV
  • x264

RTX (Turing)-generation comparisons would of course be most relevant, but earlier generations are also of interest. And I'll only be rendering 8-bit for the foreseeable future.

Note that the frame rate of Vegas projects needs to be matched to the NVENC-rendered file to avoid repeated frames. That is crucial when doing comparison testing on the Vegas timeline, by eye or with the video scopes. The same forum thread also shows favorable results for VCE rendering, to the extent that I would probably use it over other methods if I had a recent AMD GPU.

I do realise that there are at least 2 other issues with NVENC rendering:

There's the HOS Render Quality Metrics tool here, if anyone wants to use it for a test.
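
For anyone without the HOS tool, here is a minimal sketch of the same idea using ffmpeg's ssim and psnr filters instead. It assumes ffmpeg is on the PATH, the filenames are placeholders, and the source and render must have matching resolution and frame rate (per the note above about repeated frames):

```python
# Hypothetical sketch, not the HOS Render Quality Metrics tool itself:
# score a rendered clip against its source with ffmpeg's ssim and psnr filters.
import subprocess

def measure_quality(reference: str, rendered: str) -> str:
    """Return ffmpeg's stderr, which contains the SSIM and PSNR summary lines."""
    filtergraph = (
        "[0:v]split[d1][d2];"      # rendered (distorted) clip, used twice
        "[1:v]split[r1][r2];"      # reference (source) clip, used twice
        "[d1][r1]ssim;"
        "[d2][r2]psnr"
    )
    cmd = [
        "ffmpeg", "-hide_banner",
        "-i", rendered,            # input 0: the render under test
        "-i", reference,           # input 1: the source it is compared against
        "-lavfi", filtergraph,
        "-f", "null", "-",         # no output file; we only want the metrics
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.stderr

if __name__ == "__main__":
    print(measure_quality("source.mp4", "nvenc_render.mp4"))
```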

Comments

j-v wrote on 9/25/2019, 5:35 AM

I do realise that there are at least 2 other issues with NVENC rendering:

I don't see either of them, not on the laptop and not on the desktop in my signature, not for Magix AVC and not for Magix HEVC.

 

Last changed by j-v on 9/26/2019, 3:03 AM, changed a total of 1 times.

Kind regards,
Marten

Camera : Pan X900, GoPro Hero7 Hero Black, DJI Osmo Pocket, Samsung Galaxy A8
Desktop :MB Gigabyte Z390M, W11 home version 24H2, i7 9700 4.7Ghz,16 DDR4 GB RAM, Gef. GTX 1660 Ti with driver
566.14 Studiodriver and Intel HD graphics 630 with driver 31.0.101.2130
Laptop  :Asus ROG Str G712L, W11 home version 23H2, CPU i7-10875H, 16 GB RAM, NVIDIA GeForce RTX 2070 with Studiodriver 576.02 and Intel UHD Graphics 630 with driver 31.0.101.2130
Vegas software: VP 10 to 22 and VMS(pl) 10,12 to 17.
TV      :LG 4K 55EG960V

My slogan is: BE OR BECOME A STEM CELL DONOR!!! (because it saved my life in 2016)

 

Former user wrote on 9/25/2019, 6:12 AM

@NickHope This Benchmarking thread has a good mixture of systems, may be of use in evaluating your choices ... https://www.vegascreativesoftware.info/us/forum/benchmarking--116285/

These are also some render quality tests I did; they include ffmpeg.

There's RTX and GTX NVENC used in some of the tables below.

fifonik wrote on 9/25/2019, 6:24 AM

I did such a comparison using the MSU Quality Measurement Tool when I investigated the MainConcept/Magix AVC encoder quality issue.

In short: for FullHD at the bitrate I used (20 Mbps), NVENC was the best of NVENC/QSV/VCE, and all of them were behind x264.

As I did not have an NVIDIA GPU or an Intel CPU, I was not able to encode with those encoders myself, so I asked others to do it.

I should still have the results and can try to share them if you are interested.

P.S. I did it in VP15, and I do not know whether the encoders have been improved since then in terms of quality, or whether quality is the same for different GPUs of the same brand.
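
As a rough illustration only (not the Vegas/Magix render templates used in that test), the same kind of matched-bitrate encode can be scripted with ffmpeg; libx264 stands in for the software encoder and h264_nvenc for NVENC, and the 20 Mbps figure matches the FullHD test above. Filenames and presets are assumptions:

```python
# Sketch of a matched-bitrate comparison encode; requires ffmpeg on the PATH
# and, for the NVENC case, an NVIDIA GPU with its driver installed.
import subprocess

SOURCE = "source_fullhd.mp4"   # placeholder source clip
BITRATE = "20M"                # 20 Mbps, as in the FullHD test described above

ENCODERS = {
    "x264":  ["-c:v", "libx264",    "-preset", "medium"],
    "nvenc": ["-c:v", "h264_nvenc", "-preset", "slow"],
}

for name, codec_args in ENCODERS.items():
    cmd = [
        "ffmpeg", "-y", "-i", SOURCE,
        *codec_args,
        "-b:v", BITRATE,
        "-an",                     # drop audio; only video quality is compared
        f"test_{name}.mp4",
    ]
    subprocess.run(cmd, check=True)
```

The resulting files can then be scored against the source with a metrics tool (MSU, HOS, or the ffmpeg ssim/psnr sketch earlier in the thread).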

Last changed by fifonik on 9/25/2019, 6:25 AM, changed a total of 1 times.

Camcorder: Panasonic X1500 + Panasonic X920 + GoPro Hero 11 Black

Desktop: MB: MSI B650P, CPU: AMD Ryzen 9700X, RAM: G'Skill 32 GB DDR5@6000, Graphics card: MSI RX6600 8GB, SSD: Samsung 970 Evo+ 1TB (NVMe, OS), HDD WD 4TB, HDD Toshiba 4TB, OS: Windows 10 Pro 22H2

NLE: Vegas Pro [Edit] 11, 12, 13, 15, 17, 18, 19, 22

Author of FFMetrics and FFBitrateViewer

Musicvid wrote on 9/25/2019, 7:38 AM

Thanks again to @Former user for running tests on such a wide variety of encoders. These are a valuable reference that will stay relevant for some time.

Now, to switch hats completely, my subjective impression of QSV h264 (the only HW encoder I can test) is that it is absolutely horrid. I had to set it at CQ 14 to approximate the visual clarity of x264 at RF 20. By then, the QSV files are 10-15% larger than x264. File sizes are nearly the same at CQ 15, and the QSV file is watchable, but softer side by side. And all this pronouncement from a half-blind guy who intends to surrender his license in a couple of years.

So yeah, I'm encoding the latest Ken Burns series with QSV, and plan on redoing it in x264, sometime. I'm still convinced that hardware encoders' only redeeming quality is that they are fast.

Former user wrote on 9/25/2019, 7:48 AM

If you want the highest quality and the fastest encoding, I would not think anyone would disagree that Turing NVENC gives the best results. The speed advantage is not as noticeable in Vegas due to the delays in serving frames to the NVENC encoder. I don't have VP17, so can't speak on that, but I haven't heard anyone talking about an improvement in speed.

Former user wrote on 9/25/2019, 8:19 AM

@Musicvid @Former user Thank you, Musicvid. The objective results in the tables bear out both of your observations: QSV fares poorly, and the RTX NVENC scores higher than the GTX NVENC.

Musicvid wrote on 9/25/2019, 9:31 AM

Yes, I have found that objectifying results causes more agreement than do schoolyard bragfests, although I am getting lazier about it, without feeling too much "wronger."

I'm glad you have taken up the task of documenting this stuff including hardware encoders, since there are a whole new set of "ifs" with which to deal.

What I'm getting from reputable internet sources is that NVENC is less bad.

j-v wrote on 9/25/2019, 9:51 AM

@Former user

I don't have VP17, so can't speak on that, but I haven't heard anyone talking about an improvement in speed.

When you also enable NVDEC in Preferences > File I/O, I see a lot of speed improvement.

Kind regards,
Marten

NickHope wrote on 9/25/2019, 10:13 AM

Thanks very much everyone. I had completely missed that benchmarking thread.

@j-v It would be useful if you post your findings on those 2 threads, to show that those bugs do not apply universally.

@Former user In the "BruceUSA" image that shows VCE 3.1, do you know what generation the NVENC was (that has a slightly poorer result than the VCE)?

2 further questions. Apologies if these are already answered within the threads linked above:

  1. Anyone got a feel for the comparative AVC quality of the very latest NVENC vs VCE? i.e. (RTX/Turing) NVENC vs (Radeon VII) VCE 4.1.
  2. Anyone got a feel for the rendering speed of Turing NVENC vs x264 at similar quality?
Former user wrote on 9/25/2019, 10:46 AM

@NickHope This BruceUSA test was my first foray into this testing when the HO util became available. Before that, if you recollect, we were putting source and target one above the other on two tracks and observing the "difference".

To try and answer your question, it was probably before last November, when I upgraded my PC, so I was using my i7 4790K + GTX 1080. RE: your NVENC query: not so; correction, see the red asterisk * in the BruceUSA table; only the CUDA and CPU tests were done using the i7 4790K.

Of more significance is that it's called BruceUSA because it was his source material that was used for this test. I used a mate's laptop to test the VCE, as I have no VCE capability. Lower data rates were used in these tests because I couldn't get VCE to render at a higher data rate. Note that there's a lower-data-rate VCE (no version number) that's quite poor. I cannot recollect exactly, but maybe I had access to BruceUSA's rendered-out VCE clip and used it for one of the two. I just now retested the two VCE clips using ffmpeg instead of HO and the results are in line with the previously reported tests. My mate's VCE was the VCE 3.1, at a slightly higher data rate. The other VCE may also have been 3.1, but the data rate difference is also important in evaluating the results.

In all of the other tables/tests I used my own 27s clip with much higher rendered-out data rates. I would also have had the old PC available, and so was able to get both GTX and RTX results.

Former user wrote on 9/25/2019, 10:48 AM

@NickHope 

“I had completely missed that benchmarking thread”

Feel free to become the 13th warrior🤣

j-v wrote on 9/25/2019, 10:53 AM

@j-v It would be useful if you post your findings on those 2 threads, to show that those bugs do not apply universally.

Why?
I don't feel the need to show my experiences with that new decoder and encoder again and again; I have done that many times already, and especially not in posts from others unless they ask for a reaction, as you did.
I'm not going to argue them there. They can read my posts, with enough proof, if they are interested.

Kind regards,
Marten

Former user wrote on 9/25/2019, 11:25 AM

@NickHope "Anyone got a feel for the comparative AVC quality of the very latest NVENC vs VCE?"

I believe that it's important for any quality checking that you test at the type of data rates that interest you (what you are likely to use), as there may be a bigger variation at lower data rates.

fr0sty wrote on 9/25/2019, 5:15 PM

Nick, I have a Radeon 7, I would be happy to encode a clip for you for testing purposes... then you can find someone with an RTX to encode the same clip, and that'll give you a good comparison.

However, I am NOT happy with my GPU. It is super fast at what it does properly, but AMD has lagged behind in support from media creation apps, such as Vegas still not having decoding working, among multiple other apps having various issues. I have my GPU connected to a 50 inch 1080p as the main monitor, and a 50 inch 4K OLED HDR as the secondary. When I turn on my computer, the image is overscanned on the 50 inch, and the only way to fix it is to turn on the OLED, which causes the 1080p to flash and get a new signal sent to it. After that, it is not overscanned anymore, but the OLED screen, when I turn it on, flickers to static about 50% of the time and eventually loses signal entirely. From time to time it will also randomly lose signal in the middle of me editing, and I have to switch Windows HDR mode off and then back on to get it back (or make a resolution change, anything that makes the GPU resend the video signal).

I've read reports of others having similar issues, and there doesn't seem to be anything being done to fix it. Also, I've heard reports of issues with the HBM2 RAM not lying flush against the heat sink, which can cause overheating issues on what should be perfectly good GPUs.

I would go for the Nvidia GPU if I were you.

wwaag wrote on 9/25/2019, 9:19 PM

@NickHope

Like fr0sty recommended, go with Nvidia. You have a lot more rendering options, especially if you use the command line capability of HOS.

AKA the HappyOtter at https://tools4vegas.com/. System 1: Intel i7-8700k with HD 630 graphics plus an Nvidia RTX4070 graphics card. System 2: Intel i7-3770k with HD 4000 graphics plus an AMD RX550 graphics card. System 3: Laptop. Dell Inspiron Plus 16. Intel i7-11800H, Intel Graphics. Current cameras include Panasonic FZ2500, GoPro Hero11 and Hero8 Black plus a myriad of smartPhone, pocket cameras, video cameras and film cameras going back to the original Nikon S.

Musicvid wrote on 9/25/2019, 10:29 PM

Lacking some burning deadline, or profound impatience, or a nongiveashitical project, it's still x264 at this house. After all these years, I'm still ruled by outcomes, even if expediency usually wins.

NickHope wrote on 9/26/2019, 1:41 AM

@j-v It would be useful if you post your findings on those 2 threads, to show that those bugs do not apply universally.

Why?
I don't feel the need to show my experiences with that new decoder and encoder again and again; I have done that many times already, and especially not in posts from others unless they ask for a reaction, as you did.
I'm not going to argue them there. They can read my posts, with enough proof, if they are interested.

Why? So that anyone who suffers those issues, or even a developer who is trying to troubleshoot them, can find your results on the relevant thread. They won't find them otherwise.

Anyway don't bother; your FHD demonstration is not really relevant to either bug, since they only appear to affect 4K rendering.

I mentioned the bugs to "get them out of the way" and avoid a discussion about them here, so we can concentrate on the quality. Obviously that backfired.

NickHope wrote on 9/26/2019, 1:55 AM

@NickHope "Anyone got a feel for the comparative AVC quality of the very latest NVENC vs VCE?"

I believe that it's important for any quality checking that you test at the type of data rates that interest you (what you are likely to use), as there may be a bigger variation at lower data rates.

@Former user UHD 3840x2160 30p YouTube upload is my priority. Currently using x264 AVC via HOS Render+. I was using crf18 and am set to start using crf20 with less compression (some discussion here). In either case the data rate tends to be around 100 Mb/s. Pretty high for AVC, and perhaps OTT for YouTube upload, but I prefer to give my videos the best chance of a decent quality outcome.
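
For reference, a rough ffmpeg equivalent of that kind of x264 CRF render (this is not the actual HOS Render+ template; the intermediate filename, preset, pixel format and audio settings are assumptions):

```python
# Sketch of a UHD 30p x264 CRF encode for YouTube upload, run via ffmpeg.
import subprocess

cmd = [
    "ffmpeg", "-y",
    "-i", "timeline_export.mov",   # hypothetical intermediate exported from Vegas
    "-c:v", "libx264",
    "-crf", "18",                  # or 20, as discussed above
    "-preset", "slow",
    "-pix_fmt", "yuv420p",         # 8-bit 4:2:0
    "-c:a", "aac", "-b:a", "320k",
    "upload_uhd_30p.mp4",
]
subprocess.run(cmd, check=True)
```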

Former user wrote on 9/26/2019, 3:02 AM

@NickHope Pretty much what I would do if uploading to YouTube. The data rates for my testing were 100 Mbps also, UHD 25fps, so hopefully it may have some relevance. As you can see, the ffmpeg test did quite well at crf 18.

Former user wrote on 9/26/2019, 5:09 AM

@NickHope "Anyone got a feel for the comparative AVC quality of the very latest NVENC vs VCE?"

I believe that it's important for any quality checking that you test at the type of data rates that interest you (what you are likely to use), as there may be a bigger variation at lower data rates.

@Former user UHD 3840x2160 30p YouTube upload is my priority. Currently using x264 AVC via HOS Render+. I was using crf18 and am set to start using crf20 with less compression (some discussion here). In either case the data rate tends to be around 100 Mb/s. Pretty high for AVC, and perhaps OTT for YouTube upload, but I prefer to give my videos the best chance of a decent quality outcome.

Is there a reason you don't want to use HEVC NVENC for your 4K final export? The AMD cards from the RX480 onwards have poor H.264 encoding but much better HEVC quality/speed. On the subject of YT, MKV files are encoded to 1080p faster than MP4: my 45 min MKV videos take 25 mins, while the MP4 equivalent takes 41 min.

Also, I think this is the best no-compromise use of hardware encoding; as long as you have a fat pipe to the internet, it doesn't matter if you're using a much higher encoding bitrate. You could make the argument that it's an inefficient use of local storage.
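
If HEVC NVENC were used for such a 4K upload, a minimal ffmpeg sketch might look like the following (hevc_nvenc needs an NVIDIA GPU; the filename and bitrate are assumptions in the "fat pipe" spirit above, not tested values):

```python
# Sketch of a UHD HEVC NVENC encode for upload; requires ffmpeg built with NVENC support.
import subprocess

cmd = [
    "ffmpeg", "-y",
    "-i", "timeline_export.mov",   # hypothetical intermediate exported from Vegas
    "-c:v", "hevc_nvenc",
    "-preset", "slow",
    "-b:v", "80M",                 # generous, assumed bitrate for an upload master
    "-pix_fmt", "yuv420p",
    "-c:a", "aac", "-b:a", "320k",
    "upload_uhd_hevc.mp4",
]
subprocess.run(cmd, check=True)
```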

 

john_dennis wrote on 9/26/2019, 9:38 AM

"Is there a reason you don't want to use HEVC NVENC for your 4K final export?"

Nick said, "I did try MAGIX HEVC but it was painfully slow to render, so I won't be using that."

NickHope wrote on 9/26/2019, 9:45 AM

"Is there a reason you don't want to use HEVC NVENC for your 4K final export?"

Nick said, "I did try MAGIX HEVC but it was painfully slow to render, so I won't be using that."

But that was CPU-only. I'd definitely revisit it if I had a GPU that could do it.

Musicvid wrote on 9/26/2019, 9:52 AM

GPU-Enabled HEVC is still painfully slow, relatively speaking.

I recall it coming in around the same as software x264 in a single test.

In terms of quality it seems "less bad" than hardware AVC. That's jmo.

john_dennis wrote on 9/26/2019, 12:02 PM

I rendered 30 seconds from an XAVC UHD 100 Mb/s source of high-action water polo using an AMD Radeon RX480. The times are as follows:

AMD AVC - Default

AMD HEVC - Default

HOS GOP-15 Zero Latency (my preferred CCL)

Clearly there is a render time advantage to hardware encoding. DUH!

I saved the output with the intent of measuring quality later as my lunch/drinking plans allow. I also have a shoot later this afternoon.

Out of curiosity, I doubled the bit rate of the AMD HEVC render template with the following results:

AMD HEVC - 48 Mbps Target, 96 Mbps Peak

Edit: Updated Picture Quality and Mediainfo Reports for Rendered Outputs