"don't use the ... GPU decoder due to the low quality results"

bvideo wrote on 7/10/2022, 11:36 AM

I've seen a similar thing said in various threads. How exactly does one decoder differ from another in "quality"? Specifically, given one previously encoded video decoded by any two decoders, would there even be any difference in the results?

Comments

fr0sty wrote on 7/10/2022, 11:48 AM

GPU decoding shouldn't produce inferior results, but GPU encoding does.

Systems:

Desktop

AMD Ryzen 7 1800X, 8 cores/16 threads at stock speed

64 GB 3000 MHz DDR4

GeForce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

Musicvid wrote on 7/10/2022, 11:53 AM

Agreed.

bvideo wrote on 7/10/2022, 12:02 PM

Encoding: it's a given that there will be differences, including differences in measurable or perceptible quality. I am wondering why anyone would report differences in decoding (as I have seen more than once in this forum).

Grazie wrote on 7/10/2022, 12:03 PM

Eh? So when I use the NVENC it isn't good?

Musicvid wrote on 7/10/2022, 12:20 PM

@bvideo This forum is rife with speculation without accountability. It would be ridiculously easy to design a quantitative test to rule such a difference in or out, yet I am not aware that anyone has done it. I have little interest myself, since my visual (subjective) impressions already answer it for me. Since you have taken an interest, perhaps you are the right guy to test it.
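For what it's worth, such a test can be scripted in a few lines: render the same source once through each decoder to a lossless intermediate, then measure the difference between the two renders. A minimal sketch, assuming FFmpeg is on the PATH (the file names are placeholders):

```python
import subprocess

def psnr_between(render_a: str, render_b: str) -> str:
    """Compare two lossless renders with FFmpeg's psnr filter; return the summary line."""
    result = subprocess.run(
        ["ffmpeg", "-hide_banner", "-i", render_a, "-i", render_b,
         "-lavfi", "psnr", "-f", "null", "-"],
        capture_output=True, text=True, check=True)
    # FFmpeg writes the PSNR summary to stderr; identical inputs report "inf".
    return [line for line in result.stderr.splitlines() if "PSNR" in line][-1]

print(psnr_between("decode_gpu.avi", "decode_legacy.avi"))
```

Anything short of "inf" on two same-length renders means the decoders disagree somewhere.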

bvideo wrote on 7/10/2022, 1:18 PM

@Musicvid it will be even easier to just go with my gross understanding of the nature of decoding, eminently enhanced by your and @fr0sty's comments. If anyone has uncovered bugs in a decoder, I hope they will post something accountable about it.

Musicvid wrote on 7/10/2022, 2:13 PM

Until then, I wouldn't give it much thought unless you see something unexpected.

john_dennis wrote on 7/10/2022, 3:57 PM

@bvideo

I actually test this crap from time to time. I tested the idea today with the HEVC sample from this thread.

Executive Summary

  • At the micro level, decoding with my AMD Radeon RX480 using UVD/VCN does not produce inferior results compared to decoding with the legacy HEVC decoder in Vegas Pro 19-643.
  • At the macro level, decoding with my AMD Radeon RX480 using UVD/VCN does produce different results compared to decoding with the legacy HEVC decoder in Vegas Pro 19-643.
  • For each decode method, the output compares favorably with the contents of the Vegas timeline.
  • The output from the different decode methods is not identical.

I'll show my unedited work, hangs and all.
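For anyone who wants to run the same kind of difference composite outside Vegas, here is a minimal FFmpeg sketch (file names are placeholders; this illustrates the technique, not the exact procedure used above). Pixels where the two decodes agree come out black; anything visibly non-black is a decoder disagreement:

```python
import subprocess

# Difference composite of two renders; FFV1 keeps the result lossless.
subprocess.run(
    ["ffmpeg", "-hide_banner", "-i", "decode_gpu.avi", "-i", "decode_legacy.avi",
     "-filter_complex", "[0:v][1:v]blend=all_mode=difference",
     "-c:v", "ffv1", "difference.avi"],
    check=True)
```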

Disclaimer

I have never made any claims about Vegas decoder effectiveness. I only care about truth, justice and the Amurikin way, to paraphrase LBJ.

@Musicvid @fr0sty

Howard-Vigorita wrote on 7/10/2022, 4:55 PM

@bvideo It's only an issue with HEVC source clips, unless you also select "Legacy HEVC decoding". But it's not that big a deal, as the quality only drops from great to good. Which is actually better for previewing because it's quicker. Not an issue at all with AVC source clips.

Btw, none of the GPU/iGPU hardware decoders differ in quality... I suspect they all buy their decoder chips from the same folks. The only thing I suggest is to spread the load if you have both a GPU and an iGPU: select the GPU in Video preferences and the iGPU in File I/O for decoding.

RogerS wrote on 7/10/2022, 7:16 PM

Bvideo, it's either a bug or a limitation of the HEVC GPU decoding implementation in Vegas, and Howard did a quantitative analysis if you care to search for it.

For me, performance with non-legacy HEVC decoding is bad, so the legacy option stays checked anyway.

Custom PC (2022) Intel i5-13600K with UHD 770 iGPU with latest driver, MSI z690 Tomahawk motherboard, 64GB Corsair DDR5 5200 ram, NVIDIA 2080 Super (8GB) with latest studio driver, 2TB Hynix P41 SSD, Windows 11 Pro 64 bit

Dell XPS 15 laptop (2017) 32GB ram, NVIDIA 1050 (4GB) with latest studio driver, Intel i7-7700HQ with Intel 630 iGPU (latest available driver), dual internal SSD (1TB; 1TB), Windows 10 64 bit

VEGAS Pro 19.651
VEGAS Pro 20.411
VEGAS Pro 21.208

Try the
VEGAS 4K "sample project" benchmark (works with VP 16+): https://forms.gle/ypyrrbUghEiaf2aC7
VEGAS Pro 20 "Ad" benchmark (works with VP 20+): https://forms.gle/eErJTR87K2bbJc4Q7

Musicvid wrote on 7/10/2022, 8:42 PM

Thanks, @john_dennis, for confirming my earlier unscientific impressions. I did notice some minor differences, but nothing I could label as "inferior."

Former user wrote on 7/10/2022, 8:51 PM

@Musicvid it will be even easier to just go with my gross understanding of the nature of decoding, eminently enhanced by your and @fr0sty's comments. If anyone has uncovered bugs in a decoder, I hope they will post something accountable about it.


@bvideo Look at Howard's post here, and also my replies that expand on the problem. It is a problem with the Vegas HEVC GPU decoder. It is not a problem with GPU decoders in general, nor with Vegas's AVC GPU decoder. In my opinion, it should not be used.

https://www.vegascreativesoftware.info/us/forum/hevc-performance-quality-with-vp19-build-643--135951/

Many people in the past have complained about low quality HEVC encoding with no further info given. Possibly their source material was HEVC decoded with the GPU decoder, and they encoded to HEVC, so they wrongly blamed MagixHEVC, software or GPU encode, for the low quality. That most likely isn't true, unless Howard proves otherwise in more groundbreaking research.


bvideo wrote on 7/10/2022, 9:31 PM

@john_dennis (your test) That's not just a few bits here and there. That's a big mismatch in either edges or levels.

I tried related experiments on two HEVC sources, each decoded both by QSV and by Legacy HEVC w/ no GPU. I made sure my source files were set to "full" range. I rendered to uncompressed AVI. I don't have NVENC to try.

In the experiment with source 'A' the two decoded->rendered outputs matched as far as any compositing difference could show.

In the experiment with source 'B', some difference frames were black, while most showed signal, seemingly at edges. But this time, my source was variable frame rate, 28.249 to 32.040 fps, which is admittedly a poor choice for a clean decoding comparison, but revealing anyway. The two rendered files were identical in several frames but differed in most. I can only assume that frame interpolation was done differently by the two decoders; I would have thought frame-rate matching would be done by the common engine in Vegas. My conclusion is that anything in the decoder that does frame-rate matching (or resampling?), or that causes Vegas to deliver frame sequences differently to its rendering engine, could definitely create differences between decoders. This has no real bearing on quality, because two such renders do not truly match up frame for frame and can't be compared for quality. In my first experiment, the source frame rate exactly matched the render, so that comparison was easy.
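A frame-level way to see exactly which frames two renders agree on, and whether one render is offset against the other, is to hash every decoded frame. A minimal sketch, assuming both renders are lossless and FFmpeg is on the PATH (file names are placeholders):

```python
import subprocess

def frame_md5s(path: str) -> list:
    """Return one MD5 per decoded video frame, via FFmpeg's framemd5 muxer."""
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-i", path, "-an", "-f", "framemd5", "-"],
        capture_output=True, text=True, check=True).stdout
    # Data lines end with the hash; lines starting with '#' are headers.
    return [line.split()[-1] for line in out.splitlines()
            if line and not line.startswith("#")]

a = frame_md5s("render_qsv.avi")
b = frame_md5s("render_legacy.avi")
print(f"{len(a)} vs {len(b)} frames")
# Try small offsets to spot skipped leading frames on either side.
for k in range(-3, 4):
    x, y = (a[k:], b) if k > 0 else (a, b[-k:])
    print(f"offset {k:+d}: {sum(p == q for p, q in zip(x, y))} matching frames")
```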

Former user wrote on 7/10/2022, 10:08 PM
  • At the macro level, decoding with my AMD Radeon RX480 using UVD/VCN does produce different results compared to decoding with the legacy HEVC decoder in Vegas Pro 19-643.
  • For each decode method, the output compares favorably with the contents of the Vegas timeline.
  • The output from the different decode methods is not identical.

@john_dennis That's a problem. It should not be "different but acceptable" (to you). The problem must come down to the HEVC GPU decode only being capable of single-core decoding. It is not reading the full uncompressed data from the GPU decoder, because it simply can't; it already has enough of a problem trying to play smoothly while giving inferior results.

I speculate you could have quality as good as the legacy decoder's, but you would pay in timeline performance. It's a strange thing that there are so many threads of users guessing why a technical problem is the way it is, when a developer could step in, give us the facts, and deliver a roadmap to a fix. They should also be correcting any misinformation.

bvideo wrote on 7/10/2022, 10:56 PM

A followup on my experiment above: when I rendered from the 29.97 HEVC source file to 29.97 AVI, both renders came out two frames shorter than the source. The QSV decode skipped the first two frames and the legacy decode skipped the last two frames. I don't trust anything any more.

Edit: correction. When I switch between "Enable legacy HEVC decoding" checked w/ no GPU and "... legacy ..." unchecked with auto QSV, the timeline ruler notches change. Why this happens, and whether it causes my render-length anomalies, I have no idea.
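One way to confirm the missing frames independently of any NLE is to have ffprobe decode each file and count the frames it actually delivers. A sketch, with placeholder file names:

```python
import subprocess

def count_frames(path: str) -> int:
    """Decode the file and count the video frames ffprobe actually reads."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-count_frames", "-select_streams", "v:0",
         "-show_entries", "stream=nb_read_frames",
         "-of", "default=nokey=1:noprint_wrappers=1", path],
        capture_output=True, text=True, check=True).stdout
    return int(out.strip())

for f in ("source.mp4", "render_qsv.avi", "render_legacy.avi"):
    print(f, count_frames(f))
```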

Last changed by bvideo on 7/10/2022, 11:16 PM, changed a total of 1 times.

Dedicated: Vegas 19, 20, & 21, i7 8700k (UHD 630), 32 gig ram, ssd & 2 spinning units.

General purpose: i5 11400 (UHD 730), 32 gigs, ssd, 1 spinning unit (Vegas 20 & 21)

Former user wrote on 7/10/2022, 11:56 PM

A followup on my experiment above: when I rendered from the 29.97 HEVC source file to 29.97 AVI, both renders came out two frames shorter than the source. The QSV decode skipped the first two frames and the legacy decode skipped the last two frames. I don't trust anything any more.


I'm not sure where you're going with this testing. Maybe you're saying that comparing videos for quality analysis in software is flawed, because frames in the original camera file are not in sync with encodes made through the HEVC SO4 decoder, but are in sync with encodes made through the Legacy decoder, producing different results?

That's worth proving or disproving.


john_dennis wrote on 7/11/2022, 12:11 AM

@bvideo

"That's not just a few bits here and there. That's a big mismatch in either edges or levels."

You are absolutely correct. That's a big mismatch.

"The QSV decode skipped the first two frames and the legacy decode skipped the last two frames."

I have a hunch that's the root of the discontent with Vegas HEVC decoding.

@Former user

I never made any comment about what I find acceptable. I just stated my observations. Since the HEVC that I used in the experiment was camera video, I have no way of knowing which decode more accurately matches the source. The only valid experiment is to produce HEVC from a known source, then compare the output from each decode methodology to that known source.

I can't stress enough how little I care about HEVC. This is all a thought experiment to me.

Former user wrote on 7/11/2022, 1:59 AM

I have a hunch that's the root of the discontent with Vegas HEVC decoding.

@Former user

I never made any comment about what I find acceptable. I just stated my observations. Since the HEVC that I used in the experiment was camera video, I have no way of knowing which decode more accurately matches the source.

@john_dennis I misread this line

For each decode method, the output compares favorably with the contents of the Vegas timeline.

You compared the encode with what is seen on the timeline after decoding, not with the original file. "Compares favorably" sounds subjective in this situation, based on viewing alone. I don't yet have a detailed subjective opinion on quality between the two encodes.

If you used the HEVC software encoder with the SO4 decoder at high quality settings, and kept bringing the encode back onto the timeline for re-encoding, doing that 10x for both SO4 and Legacy, then any problem with SO4 should be subjectively evident to everyone by viewing. This would also rule out MagixHEVC hardware encoding as a contributing factor.
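A rough command-line analogue of that generational test is sketched below. Note that it exercises FFmpeg's HEVC decoder and the x265 encoder rather than Vegas's SO4/legacy paths, so it only illustrates the method, not the Vegas result (the file name and settings are placeholders):

```python
import subprocess

src = "generation_0.mp4"
for gen in range(1, 11):
    dst = f"generation_{gen}.mp4"
    # Decode the previous generation and re-encode it to HEVC.
    subprocess.run(
        ["ffmpeg", "-hide_banner", "-y", "-i", src,
         "-c:v", "libx265", "-crf", "18", "-an", dst],
        check=True)
    src = dst
# After 10 generations, any systematic decode/encode defect should be
# amplified enough to judge by eye, which is the point of the test.
```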

bvideo wrote on 7/11/2022, 10:05 AM

A followup on my experiment above: when I rendered from the 29.97 HEVC source file to 29.97 AVI, both renders came out two frames shorter than the source. The QSV decode skipped the first two frames and the legacy decode skipped the last two frames. I don't trust anything any more.


I'm not sure where you're going with this testing. Maybe you're saying that comparing videos for quality analysis in software is flawed, because frames in the original camera file are not in sync with encodes made through the HEVC SO4 decoder, but are in sync with encodes made through the Legacy decoder, producing different results?

That's worth proving or disproving.


@Former user Originally I wanted to reproduce @john_dennis' results to have a closer look at the differences at the frame level. In doing that I came across multiple phenomena that I wasn't expecting. The result in my post at #14, where I decoded a variable-frame-rate HEVC and found differences in frame sequencing, hints at how HEVC decoders could legitimately differ. And yes, software quality measures are bound to be misleading in this case, but what can you expect when the source frame rate differs from the render frame rate?

The results after that (using a source file with a constant frame rate matching the project and render frame rate), such as shortened renders and timeline-ruler anomalies, have been mysterious, and the exact recipe to reproduce them is hard to pin down. I was able to determine that, in spite of these anomalies, when the two renders, QSV and Legacy HEVC, are properly lined up frame to frame, their frames match exactly, as far as the timeline composite/difference can show. Software quality measures would be misleading here, too, if anyone stumbles on the same anomalies I did.

I haven't come up with a precise sequence of steps to reproduce it, but it seems to go something like this: if I save a project with my HEVC source on the timeline, then go to Options -> File I/O and change the legacy HEVC setting and the GPU setting, then exit, then launch Vegas by clicking the saved project, the time ruler comes up different* and the event on the timeline has been extended by two extra frames by way of looping. If I open Vegas first, then open the project or recreate it, it looks like it did when I saved it. It doesn't make sense and I can't reliably reproduce it. So now I don't know where I am going with this testing.

* When I say the timeline ruler comes up different: e.g., there are only 4 frames from 0:0:12:25 to 0:0:13:00 (the file is 390 frames long).

Musicvid wrote on 7/11/2022, 11:47 AM

But this time, my source was variable frame rate

In empirical testing, you have introduced what is known as an "untrapped variable," which is significant in terms of its effect on the outcome.

In order to process VFR frames of varying durations, they must be resampled using some form of interpolation, thus invalidating the data for that portion of your experiment.

In the experiment with source 'A' the two decoded->rendered outputs matched as far as any compositing difference could show.

I believe that, and not just because it agrees with my visual impressions; it also agrees with my instinct, not knowing all of the variables of course.

A better way is to use metrics: PSNR, SSIM, and VMAF, in increasing order of complexity. I have found, however, that most people do not use them correctly, because the metrics introduce their own variables. Your observation of different frame offsets between samples is the perfect example that nobody pays attention to, and it leads the savants and their pronouncements into la-la land. We have seen that several times on this forum; I have pointed it out and was met with denial, and I simply won't trust others' data until I have attempted to replicate their test environment or, better yet, created my own.
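The frame-offset trap is avoidable once you know it's there: align the clips before scoring. A sketch, assuming an FFmpeg build that includes libvmaf, and using the two-frame offset observed earlier in this thread (file names are placeholders):

```python
import subprocess

result = subprocess.run(
    ["ffmpeg", "-hide_banner",
     "-i", "render_qsv.avi",  # distorted render (placeholder)
     "-i", "reference.avi",   # known-good reference (placeholder)
     # Drop the reference's first two frames so both streams start on the
     # same picture; libvmaf takes (distorted, reference) in that order.
     "-lavfi", "[1:v]trim=start_frame=2,setpts=PTS-STARTPTS[ref];[0:v][ref]libvmaf",
     "-f", "null", "-"],
    capture_output=True, text=True, check=True)
print([l for l in result.stderr.splitlines() if "VMAF score" in l][-1])
```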

"The QSV decode skipped the first two frames and the legacy decode skipped the last two frames."

I have a hunch that's the root of the discontent with Vegas HEVC decoding.

me quoque

@john_dennis

john_dennis wrote on 7/11/2022, 3:10 PM

@bvideo @Musicvid @Former user @Howard-Vigorita

For those who would like to tinker with a "known quantity" HEVC source, you can download the file from here:

https://drive.google.com/file/d/10Kq8P-DB3eKR-YGAlteEfdRiyHwqur0z/view?usp=sharing

I'll show my results when I get a round tuit.

Created with Shutter Encoder. Audio corrected by 44 ms because of the silly offset in FFmpeg for PCM audio in an AVI wrapper.

bvideo wrote on 7/11/2022, 3:44 PM

Here's something new: I've created an HEVC source file using Vegas, same as above (see the MediaInfo below). It was made from a timeline 13 seconds long (13.013xxx), so 390 frames. When I bring that file back into Vegas, Vegas reports it as 00:00:12.28, so two frames shorter; the first two frames are missing on the timeline. Then I set "Enable legacy HEVC decoding" on, quit, and restart Vegas. When I drag the same file onto the timeline, it appears as 00:00:13.00. I can see those frames in the file using an external viewer, and they are not missing when I drag it in with "Enable legacy ...". I believe this is the source of the rendering problems I reported in #16.

General
Complete name                            : C:\Users\....\Documents\wheels decode tc.mp4
Format                                   : MPEG-4
Format profile                           : Base Media / Version 2
Codec ID                                 : mp42 (isom/mp42)
File size                                : 20.6 MiB
Duration                                 : 13 s 13 ms
Overall bit rate mode                    : Variable
Overall bit rate                         : 13.3 Mb/s
Encoded date                             : UTC 2022-07-11 20:28:31
Tagged date                              : UTC 2022-07-11 20:28:31

Video
ID                                       : 1
Format                                   : HEVC
Format/Info                              : High Efficiency Video Coding
Format profile                           : Main@L4@High
Codec ID                                 : hvc1
Codec ID/Info                            : High Efficiency Video Coding
Duration                                 : 13 s 13 ms
Bit rate                                 : 13.1 Mb/s
Width                                    : 1 920 pixels
Height                                   : 1 080 pixels
Display aspect ratio                     : 16:9
Frame rate mode                          : Constant
Frame rate                               : 29.970 (30000/1001) FPS
Standard                                 : Component
Color space                              : YUV
Chroma subsampling                       : 4:2:0
Bit depth                                : 8 bits
Bits/(Pixel*Frame)                       : 0.211
Stream size                              : 20.3 MiB (99%)
Language                                 : English
Encoded date                             : UTC 2022-07-11 20:28:31
Tagged date                              : UTC 2022-07-11 20:28:31
Color range                              : Limited
Color primaries                          : BT.709
Transfer characteristics                 : BT.709
Matrix coefficients                      : BT.709
Codec configuration box                  : hvcC

Audio
ID                                       : 2
Format                                   : AAC LC
Format/Info                              : Advanced Audio Codec Low Complexity
Codec ID                                 : mp4a-40-2
Duration                                 : 12 s 992 ms
Bit rate mode                            : Variable
Bit rate                                 : 192 kb/s
Channel(s)                               : 2 channels
Channel layout                           : L R
Sampling rate                            : 48.0 kHz
Frame rate                               : 46.875 FPS (1024 SPF)
Compression mode                         : Lossy
Stream size                              : 299 KiB (1%)
Language                                 : English
Encoded date                             : UTC 2022-07-11 20:28:31
Tagged date                              : UTC 2022-07-11 20:28:31


Last changed by bvideo on 7/11/2022, 3:45 PM, changed a total of 1 times.

Dedicated: Vegas 19, 20, & 21, i7 8700k (UHD 630), 32 gig ram, ssd & 2 spinning units.

General purpose: i5 11400 (UHD 730), 32 gigs, ssd, 1 spinning unit (Vegas 20 & 21)

bvideo wrote on 7/11/2022, 4:20 PM

@john_dennis Your reference file: both of my renders matched up. (QSV and "Enable legacy decoding" w/ no GPU.)

One thing was a little odd: when I added your reference file to Vegas configured to "Enable legacy ..." it showed the red border at the end, and the endpoint was not aligned with the frame boundary. A script showed the length to be 0.5999 ms short of the next frame boundary. When I added it to Vegas configured to "Auto GPU QSV ..." it showed no red boundary, and the script said it was exactly 00:00:13.18.

Evidence is showing that the two decoder configs drop the same HEVC event on the timeline differently.

Edit 7/13/2022: On my other PC, I also saw the red border, and the script showed the length slightly off from an exact number of frames. Two differences between my PCs: the second PC still has an older build of Vegas 19 (550) and has Intel UHD 630 graphics. The PC in this thread is on build 643 and has Intel UHD 730. All other behaviors I reported in this thread are the same on both PCs.

Last changed by bvideo on 7/13/2022, 9:31 PM, changed a total of 3 times.

Dedicated: Vegas 19, 20, & 21, i7 8700k (UHD 630), 32 gig ram, ssd & 2 spinning units.

General purpose: i5 11400 (UHD 730), 32 gigs, ssd, 1 spinning unit (Vegas 20 & 21)

Musicvid wrote on 7/11/2022, 7:02 PM

Your reference file: both of my renders matched up. (QSV and "Enable legacy decoding" w/ no GPU.)

Without feeling compelled to verify your results, the apparent score is Tortoises 1, Hares 0.

when I added your reference file to Vegas configured to "Enable legacy ..." it showed the red border at the end, and the endpoint was not aligned with the frame boundary. 

I'm not able to duplicate that. Are you sure QTF (quantize to frames) was enabled when you added John's file to your timeline? Is it the same with your own sample file? FWIW, my only GPU is also Intel.

Thanks again for taking a nonpassive approach to answering your question.