VEGAS PRO 22 HAS THE WORST AV1 DECODING: CHROMA SUBSAMPLING 4:1:0

vagnerdesouza wrote on 12/27/2024, 1:43 PM

Magix Vegas Pro 22 has a serious problem: it decodes AV1 videos with 4:2:0 chroma subsampling as if they had 4:1:0 chroma subsampling.

It is imperative that the Vegas Pro developers know about this. This issue simply ruins the work of professionals who use AV1-encoded files in Magix Vegas Pro.

I have been a professional videographer since 2000 and have been using Vegas since 2004. It took me many days to compile this material due to my limited time. Please be kind and think carefully before making destructive criticisms.

Please watch the private video hosted on YouTube; in its description you can download a ZIP file with the samples from the cloud.
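For anyone who wants to verify the subsampling of a source file independently, here is a minimal sketch using ffprobe from Python (assuming ffprobe is on the PATH; the sample filename is a placeholder):

  import json
  import subprocess

  # Ask ffprobe for the codec and pixel format of the first video stream.
  # A pix_fmt of yuv420p means 4:2:0 chroma subsampling; yuv410p would be 4:1:0.
  out = subprocess.run(
      ["ffprobe", "-v", "quiet", "-print_format", "json",
       "-show_streams", "-select_streams", "v:0", "sample_av1.mkv"],
      capture_output=True, text=True, check=True)
  stream = json.loads(out.stdout)["streams"][0]
  print(stream["codec_name"], stream["pix_fmt"])  # e.g. "av1 yuv420p"

If the file itself reports yuv420p but the decoded frames show quarter-width chroma, the loss is happening in the decoder, not in the encode.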

Comments

RogerS wrote on 12/27/2024, 10:05 PM

@vagnerdesouza Thank you for these very detailed tests; great work on them! It must have taken a long time to prepare all this.

First, I highly recommend against encoding to AV1 using the built-in encoder in VEGAS Pro. It has blocking and other artifacts where there are color gradients, making it less useful for testing. The AV1 encoders that you can access through Voukoder in VEGAS work much better.

1. Which decoder are you using?

Right click on the media, hold down shift and go to properties and then general. See what it says under streams.
If you are using the GTX 1050 Ti GPU, it doesn't decode AV1 media, so you will be using a CPU fallback decoder.

I think this is the source of the problems.

2. VP 22.194 test with NVIDIA decoding

In VP 22 under media properties I have:

Streams (debug)
  Raw stream: 0
  Video stream: 0
  Codec name: 'AV1'
  Codec vendor: 'NVIDIA NVDEC AV1'
  Codec fourcc: 0x3231564e [NV12]
  Data rate: 0 (bytes/sec)
  Text data rate: ''

I don't see the color artifacts with NVDEC.
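As an aside, the fourcc in these debug dumps is just four ASCII characters packed into a 32-bit integer, least significant byte first; a quick Python sketch using only the values shown in this thread:

  # Unpack a packed fourcc integer into its four-character code.
  def fourcc(value: int) -> str:
      return value.to_bytes(4, "little").decode("ascii")

  print(fourcc(0x3231564E))  # NV12, semi-planar 4:2:0 (the hardware decoders here)
  print(fourcc(0x32315659))  # YV12, planar 4:2:0 (the DAV1D decoder later in the thread)

Both NV12 and YV12 are 4:2:0 layouts, so the buffer format itself isn't where the precision is lost.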

3. VP 21 CPU decoder

With VP 21.208, AV1 hardware decoding wasn't yet implemented; it was introduced with build 300.

Here the AV1 files look terrible while the HEVC source looks fine.

Streams (debug)
  Raw stream: 0
  Video stream: 0
  Codec name: 'AV1'
  Codec vendor: 'Magix AV1 Codec'
  Codec fourcc: 0000000000 error
  Data rate: 0 (bytes/sec)
  Text data rate: ''

HEVC decodes with

Streams (debug)
  Raw stream: 0
  Video stream: 0
  Codec name: 'HEVC'
  Codec vendor: 'Magix HEVC Codec'
  Codec fourcc: 0000000000 error
  Data rate: 0 (bytes/sec)
  Text data rate: ''

4. VP 22.194 Intel hardware decoder

In Preferences > File I/O in VP 22, I changed the hardware decoder to Intel QSV with my Xe iGPU (it comes with the 13th-gen CPU). Now it says for AV1:

Streams (debug)
  Raw stream: 0
  Video stream: 0
  Codec name: 'AV1'
  Codec vendor: 'ONEVPL AV1 (GPU)'
  Codec fourcc: 0x3231564e [NV12]
  Data rate: 0 (bytes/sec)
  Text data rate: ''

Still no artifacts.

I also don't see the major issues 21 has with the balls.

21.208 (left, artifacts) vs. 22.194 (right, fairly clean)

5. VP 22.194 Hardware decoder set to none

CPU decoding: still not good.

Streams (debug)
  Raw stream: 0
  Video stream: 0
  Codec name: 'AV1'
  Codec vendor: 'DAV1D AV1'
  Codec fourcc: 0x32315659 [YV12]
  Data rate: 0 (bytes/sec)
  Text data rate: ''

6. Conclusion

There is an issue with chroma precision of the Magix AV1 and DAV1D AV1 software decoders.

I'd recommend that users without GPUs capable of decoding AV1 avoid this format for now.

Hopefully the developers can address it.

vagnerdesouza wrote on 12/28/2024, 6:24 AM

Hello, RogerS. Thank you very much for analyzing and responding, and so quickly.
I do use an NVIDIA GTX 1050 Ti. I did an AV1 decoding test in Vegas Pro 22 with the GPU on and the GPU off, and there really is a difference; I compared them zoomed in with Affinity Photo. Even so, the quality is still poor.

When I render to AV1 with HandBrake, it uses the CPU, not the GPU. I use a powerful AMD Ryzen 9 processor with 32 threads running at 4 GHz. When I render to H.264 or H.265, I use TMPGEnc Video Mastering Works with the CPU only (x264 and x265). NVENC does not use B-frames, only IPPP... and this drastically influences the quality of the final H.264 or H.265 video.
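Whether a given encode actually contains B-frames is easy to check; a minimal sketch with ffprobe (assuming it is installed; the filename is a placeholder):

  import subprocess
  from collections import Counter

  # List the picture type (I, P, or B) of every frame in the first video stream.
  out = subprocess.run(
      ["ffprobe", "-v", "quiet", "-select_streams", "v:0",
       "-show_entries", "frame=pict_type", "-of", "csv=p=0", "render.mp4"],
      capture_output=True, text=True, check=True)
  print(Counter(out.stdout.split()))  # an IPPP-only encode shows no 'B' entries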

My MKV videos with AV1/Opus codecs have been playing perfectly on more modest computers, Android and iOS phones, and even on MacBooks. I use AV1 (SVT) with HandBrake, rendered by CPU. Please see https://t.me/vagnervideo/170 in my Telegram channel.

Regarding my use of an NVIDIA GeForce GTX 1050 Ti, the question is: why can other programs on the same hardware decode and display AV1 perfectly (DaVinci Resolve, CapCut, VLC Media Player, MPC-BE, PotPlayer, etc.) while VEGAS Pro cannot?

I have a perpetual license for Vegas Pro 21 and I intend to acquire an NVIDIA RTX 3060 soon, but for now these are just plans. My country, Brazil, currently has the most devalued currency on the planet.

RogerS wrote on 12/28/2024, 6:58 AM

@vagnerdesouza I don't want to be rude, but the GTX 1050 isn't capable of decoding AV1, so there is something wrong with your decoding test: https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new

I am not recommending a new GPU. (I had a GTX 1050 mobile myself until recently).

As I wrote, there is a problem with CPU software decoding in VEGAS. I showed you how to find the decoder: hold down Shift and go into the media properties. Other programs use their own software decoders. It appears the Magix decoder used in VEGAS has low precision for chroma. Thanks to your tests we can make the developers aware of this, and hopefully they'll have time to fix it later in VP 22 or VP 23.

I would recommend using HEVC/H.265 with Handbrake if you intend to edit in VEGAS with the GTX 1050 GPU. There's no substantial difference between HEVC and AV1 in file size and quality, and the VEGAS HEVC decoder works better today in both 21 and 22. You can save your money and get a new GPU later, as the GTX 1050 can decode H.265 in hardware.

Also, VEGAS can render to the same codecs that Handbrake can if you use Voukoder: x264, x265, AV1 with SVT, etc. NVIDIA GPUs newer than the 1050 can use B-frames as well.

Last changed by RogerS on 12/28/2024, 7:50 AM, changed a total of 1 times.

Custom PC (2022) Intel i5-13600K with UHD 770 iGPU with latest driver, MSI z690 Tomahawk motherboard, 64GB Corsair DDR5 5200 ram, NVIDIA 2080 Super (8GB) with latest studio driver, 2TB Hynix P41 SSD and 2TB Samsung 980 Pro cache drive, Windows 11 Pro 64 bit https://pcpartpicker.com/b/rZ9NnQ

ASUS Zenbook Pro 14 Intel i9-13900H with Intel graphics iGPU with latest ASUS driver, NVIDIA 4060 (8GB) with latest studio driver, 48GB system ram, Windows 11 Home, 1TB Samsung SSD.

VEGAS Pro 21.208
VEGAS Pro 22.239

Try the
VEGAS 4K "sample project" benchmark (works with VP 16+): https://forms.gle/ypyrrbUghEiaf2aC7
VEGAS Pro 20 "Ad" benchmark (works with VP 20+): https://forms.gle/eErJTR87K2bbJc4Q7

Howard-Vigorita wrote on 12/28/2024, 11:08 PM

On my Intel 11900k system, VP22 AV1 decoding came in a smidgen better than HEVC, which surprised me. Last time I did this, in the early days of VP21, it came out the other way, as I recall. I'm using the lossless HEVC clip linked in my signature as the reference. It's something I shot in ZRaw, which is natively based on lossless HEVC; the ZRaw converter outputs MOV. I transcoded the lossless HEVC to AV1 with ffmpeg 6 using its AOM codec, which I think is still the only codec that can make lossless AV1... lossless AV1 is almost 10x the size of lossless HEVC, btw. I then loaded each clip into VP22 and rendered with the Magix HEVC MainConcept preset at 240 Mbps CBR to take deviation measurements from the original HEVC. The only default I changed was the decoder in File I/O, trying each of the 3 GPUs in my system: AMD 6900 XT, NVIDIA 4090, and Intel Arc A380. Results are all spectacular.
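For reference, that kind of lossless AV1 transcode can be approximated with ffmpeg's libaom wrapper; a sketch assuming an ffmpeg build with libaom-av1 (filenames are placeholders):

  import subprocess

  # Transcode a lossless HEVC master to lossless AV1 via libaom.
  # lossless=1 is passed straight through to the AOM encoder;
  # cpu-used trades encoding speed against compression effort.
  subprocess.run(
      ["ffmpeg", "-i", "master_hevc.mov",
       "-c:v", "libaom-av1", "-aom-params", "lossless=1",
       "-cpu-used", "4", "lossless_av1.mkv"],
      check=True)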

I think that Vegas defaults to hybrid decoding... some of it is CPU-based, with the GPU being called on to speed up calculations. Maybe the old 1050 doesn't calculate so great. If you've got a free slot, try dropping in a dirt-cheap A380... I got mine in an x4 slot and it works fine for decoding and QSV rendering.

RogerS wrote on 12/29/2024, 12:26 AM

@Howard-Vigorita Did you test AV1 playback with the hardware decoder set to off? Try to get the fallback software decoder to play back AV1 and check both visually and numerically. As the OP's video shows, the results are dismal.

Howard-Vigorita wrote on 12/29/2024, 10:06 AM

@RogerS I can only measure and compare decoding quality as reflected in a render. Visual judgment involves a lot of other things, like the monitor, display adapter, its connection, sync, resolution, and RGB pixel conversion settings, for instance. Playback for either lossless clip is about 2 fps via my BenQ 4K HDMI-KVM'd to my 6900 XT, and not improved by hardware decoding. But I just tried setting the Vegas decoder to OFF and upgraded ffmpeg and ffmetrics to the latest and greatest, and the measured results are the same.

One difference I did notice with the decoder OFF is that the AV1 version of the 30-second lossless project took 3 minutes to render, with CPU utilization spiking up and down to 100% and low utilization on the GPU connected to the HDMI, compared to 2 minutes for the HEVC one with CPU utilization at a solid 100%. With hardware decoding, both took about 1.5 minutes, with spiking CPU utilization and a lot of 3D utilization on the GPU set for decoding.
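This kind of deviation measurement can be reproduced directly with ffmpeg's psnr filter; a minimal sketch (filenames are placeholders). Since the complaint here is specifically about chroma, note that the filter reports the U and V planes separately:

  import subprocess

  # Compare a render against its reference; ffmpeg prints an overall PSNR
  # summary to the log, including separate y, u, and v (chroma) values.
  subprocess.run(
      ["ffmpeg", "-i", "render.mp4", "-i", "reference.mov",
       "-lavfi", "psnr", "-f", "null", "-"],
      check=True)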

RogerS wrote on 12/29/2024, 10:11 AM

What about saving a single frame with the screenshot feature, or rendering an image sequence, to avoid the vagaries of video compression and playback? Judging still images doesn't depend on resolution, sync, etc.

I am starting to wonder if these quantitative tests are fit for purpose if they can't discern large visual artifacts resulting from encoding and decoding. From the AV1 render blocks to this AV1 chroma smear, these are visible image flaws that, to my eyes, are a much bigger issue than mere compression artifacts.
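If the built-in screenshot feature is awkward for this, here is a sketch of dumping every frame of a short test render to lossless PNGs with ffmpeg (filenames are placeholders):

  import subprocess

  # Extract all frames as PNGs so stills can be compared in an image
  # editor, independent of monitor, sync, or playback settings.
  subprocess.run(
      ["ffmpeg", "-i", "test_render.mp4", "-vsync", "0", "frame_%05d.png"],
      check=True)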

Last changed by RogerS on 12/29/2024, 10:15 AM, changed a total of 1 times.


Former user wrote on 12/29/2024, 7:40 PM

@RogerS VP 22.194: the results are identical, to the point that when encoding with Voukoder and CRF the file sizes are identical. Visually speaking they're identical too. Obviously that's a bit suspicious, so I did the encodes again while checking GPU decode; everything checked out, and the only difference is that with the GPU decoder OFF the encode takes longer.

RogerS wrote on 12/29/2024, 8:00 PM

Can you look at the media properties while holding Shift and see what decoder is being used? If it's still using a hardware decoder, the test isn't meaningful.

You might have to disable the GPU in device manager or test on a system incapable of AV1 decoding.

Last changed by RogerS on 12/29/2024, 8:19 PM, changed a total of 1 times.


Former user wrote on 12/29/2024, 8:23 PM

Unless you're saying Vegas can use the GPU decoder in the GPU-decoder-OFF position while showing no GPU decode activity, then it was off. Your instructions for finding the decoder don't work.

RogerS wrote on 12/29/2024, 11:22 PM

Yes, VEGAS does override user choices at times, though I would think it would show up under decoding activity.

Sorry about the poor instructions for finding the decoder. While holding down Shift, right-click on the media, go to Properties, click on General, and then let go of Shift. Below Streams you'll find Streams (debug).

Here in VP 22.194 on my laptop I just set the hardware decoder to off in File I/O. Looking at the streams, it changed from NVDEC AV1 to:

Streams (debug)
  Raw stream: 0
  Video stream: 0
  Codec name: 'AV1'
  Codec vendor: 'DAV1D AV1'
  Codec fourcc: 0x32315659 [YV12]
  Data rate: 0 (bytes/sec)
  Text data rate: ''

I took the user's Handbrake AV1 SVT file and output frames from MPC Black; I don't see obvious chroma artifacts on object edges in this file.
I loaded it into VEGAS in a 1080p 29.970p 8-bit (full range) project.
To match Howard's settings, I then rendered to the Magix HEVC MainConcept preset at 240 Mbps CBR.

Could you please compare this vs the user's source file?
https://www.dropbox.com/s/a7wb4hn3d9iri0i/Fish%20AV1%20to%20HEVC%20CBR240%20MC%20in%20VEGAS.mp4?dl=0

As a control I also did HEVC to HEVC in VEGAS and visually it looks identical.

Next, for my own sanity, I did a test with an old printer test image. With ShutterEncoder I converted it to AV1 and HEVC, both with a CQ of 17 (a rough ffmpeg equivalent is sketched after the links below). I loaded both into VEGAS Pro 22.194. The edges of the faces have significant color artifacts with AV1, far fewer with HEVC, and none in the source.

Top track is AV1: https://www.dropbox.com/scl/fi/zk2c9xf7jpqa223se431a/PrinterEvaluationImage_V002_Rec709-AV1.mp4?rlkey=7w66o3bm6dutsn8ovh0aqikdh&dl=0

Bottom track is HEVC: https://www.dropbox.com/scl/fi/t8h1ps06s1sflas1757nr/PrinterEvaluationImage_V002_Rec709_H.265.mp4?rlkey=xim6w2cc9dauzliqi2f05oyle&dl=0

Source jpg: https://www.dropbox.com/scl/fi/kvzc8xpb201xbfoufcxzf/PrinterEvaluationImage_V002_Rec709.jpg?rlkey=k0bh9sn0iydovgiwehq5biidv&dl=0
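For anyone without ShutterEncoder, roughly equivalent test clips can be made from the still with ffmpeg; a sketch assuming a build with libsvtav1 and libx265 (the CRF values only approximate ShutterEncoder's CQ 17):

  import subprocess

  # Loop the still image into short 4:2:0 AV1 and HEVC test clips
  # at comparable quality settings.
  for codec, outfile in [("libsvtav1", "test_av1.mp4"),
                         ("libx265", "test_hevc.mp4")]:
      subprocess.run(
          ["ffmpeg", "-loop", "1", "-i",
           "PrinterEvaluationImage_V002_Rec709.jpg",
           "-t", "10", "-r", "29.97", "-pix_fmt", "yuv420p",
           "-c:v", codec, "-crf", "17", outfile],
          check=True)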

If you can't enable this AV1 decoder, tag me and I can do a test render for you of whatever you like.

Howard-Vigorita wrote on 12/30/2024, 10:58 AM

I am starting to wonder if these quantitative tests are fit for purpose if they can't discern large visual artifacts resulting from encoding and decoding. From the AV1 render blocks to this AV1 chroma smear, these are visible image flaws that, to my eyes, are a much bigger issue than mere compression artifacts.

@RogerS There are limitations in just going by clip-wide stats. They're most useful for detecting systemic issues, and also suggestive as an encoding/decoding signature when all the metrics come up identical. For instance, older versions of Vegas with legacy HEVC checked returned high metrics matching newer versions with experimental HEVC unchecked, and matching but much lower metrics for the opposite settings.

The limitation is that if one clip has a decoding or render anomaly affecting only one frame or a small group of frames, its overall metric will be lower, but you won't know where to look unless you compare the numbers side by side for each frame. And even then, you won't know where in the frame to look, because ffmpeg only calculates on whole frames. I know ffmpeg can write the individual frame metrics into a folder, and I assume that the ffmetrics tool uses those individual scores to make its charts. But the tool doesn't have a function to highlight atypical deviations on single frames or sections. I suppose one could eyeball two charts and look for downward spikes or trends not common to both. Some better way to help zoom in on glitches would be a useful enhancement to the tool (see the sketch below).
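Something along those lines is easy to script: a sketch that has ffmpeg write per-frame PSNR scores to a stats file and then flags frames well below the median (filenames and the 3 dB threshold are arbitrary choices here, not anything ffmetrics does):

  import statistics
  import subprocess

  # Write a per-frame PSNR log while comparing render to reference.
  subprocess.run(
      ["ffmpeg", "-i", "render.mp4", "-i", "reference.mov",
       "-lavfi", "psnr=stats_file=psnr_frames.log", "-f", "null", "-"],
      check=True)

  # Each log line looks like: n:1 mse_avg:... psnr_avg:41.61 ...
  scores = []
  with open("psnr_frames.log") as log:
      for line in log:
          fields = dict(item.split(":", 1) for item in line.split())
          scores.append((int(fields["n"]), float(fields["psnr_avg"])))

  # Flag frames whose quality drops far below the typical frame.
  median = statistics.median(score for _, score in scores)
  for n, score in scores:
      if score < median - 3.0:
          print(f"frame {n}: psnr_avg {score:.2f} (median {median:.2f})")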

RogerS wrote on 1/1/2025, 7:56 AM

Great points; it's good to keep the limitations of such metrics in mind.

It sounds like on your and UltraVista's systems VEGAS may be using the same AV1 decoder despite the user settings. MxCompound seems to be able to call upon multiple GPU and software decoding libraries available to it, potentially making it harder to compare apples to apples. The OP's and my software apple is not so tasty!

I just tested on my RTX 2080 desktop and it also uses this decoder, with the same edge artifacts:

Streams (debug)
  Raw stream: 0
  Video stream: 0
  Codec name: 'AV1'
  Codec vendor: 'DAV1D AV1'
  Codec fourcc: 0x32315659 [YV12]
  Data rate: 0 (bytes/sec)
  Text data rate: ''