Why is HEVC legacy enabled by default?

bitman wrote on 9/11/2022, 7:16 AM

I was testing my new system (and freshly installed Vegas Post 19 and 365 v20) with a GoPro clip of mine, and playback was not smooth (to put it mildly). I opened up File I/O in Preferences and noticed that HEVC legacy decoding was enabled by default, and that the HW decoder default is auto (Intel UHD 770).

After disabling HEVC legacy decoding and restarting Vegas, playback was smooth for the same clip.

So why would Vegas be holding back performance on state-of-the-art new HW?

I know there is/was a lot of discussion in this forum on the So4reader, I assume it is related to that?

I understand there could be legacy systems out there (prior Intel UHD versions, NVIDIA decode, or issues with AMD) that justify enabling legacy HEVC decoding by default. But since Vegas knows which processor and graphics card are on board (it can even search for drivers), it could at least adapt the default setting on the fly to one better suited to your system. But maybe I am wrong and setting defaults on the fly according to your system is not implemented...

Maybe an idea to improve Vegas would be to add a tool that optimizes your system and flips some switches...
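Just to illustrate the idea: a rough sketch (entirely my own, in Python, assuming Windows PowerShell is available; the decision rule is a made-up placeholder, not anything Vegas actually does) of how such a tool could detect the installed GPUs and suggest a decode default:

    # Hypothetical sketch: enumerate GPUs on Windows and suggest a decoder default.
    # The heuristic below is a placeholder, not Vegas logic.
    import subprocess

    def installed_gpus():
        # Ask PowerShell/CIM for the names of all video controllers.
        out = subprocess.run(
            ["powershell", "-Command",
             "Get-CimInstance Win32_VideoController | "
             "Select-Object -ExpandProperty Name"],
            capture_output=True, text=True, check=True)
        return [line.strip() for line in out.stdout.splitlines() if line.strip()]

    def suggest_hevc_default(gpus):
        # Made-up rule: turn legacy HEVC off on recent Intel graphics,
        # keep it on as a conservative fallback everywhere else.
        if any(("UHD" in g) or ("Arc" in g) or ("Iris" in g) for g in gpus):
            return "legacy HEVC: off"
        return "legacy HEVC: on"

    if __name__ == "__main__":
        gpus = installed_gpus()
        print(gpus, "->", suggest_hevc_default(gpus))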

APPS: VIDEO: VP 365 suite (VP 22 build 194) VP 21 build 315, VP 365 20, VP 19 post (latest build -651), (uninstalled VP 12,13,14,15,16 Suite,17, VP18 post), Vegasaur, a lot of NEWBLUE plugins, Mercalli 6.0, Respeedr, Vasco Da Gamma 17 HDpro XXL, Boris Continuum 2025, Davinci Resolve Studio 18, SOUND: RX 10 advanced Audio Editor, Sound Forge Pro 18, Spectral Layers Pro 10, Audacity, FOTO: Zoner studio X, DXO photolab (8), Luminar, Topaz...

  • OS: Windows 11 Pro 64, version 24H2 (since October 2024)
  • CPU: i9-13900K (upgraded my former CPU i9-12900K),
  • Air Cooler: Noctua NH-D15 G2 HBC (September 2024 upgrade from Noctua NH-D15s)
  • RAM: DDR5 Corsair 64GB (5600-40 Vengeance)
  • Graphics card: ASUS GeForce RTX 3090 TUF OC GAMING (24GB) 
  • Monitor: LG 38 inch ultra-wide (21x9) - Resolution: 3840x1600
  • C-drive: Corsair MP600 PRO XT NVMe SSD 4TB (PCIe Gen. 4)
  • Video drives: Samsung NVMe SSD 2TB (980 pro and 970 EVO plus) each 2TB
  • Mass Data storage & Backup: WD gold 6TB + WD Yellow 4TB
  • MOBO: Gigabyte Z690 AORUS MASTER
  • PSU: Corsair HX1500i, Case: Fractal Design Define 7 (PCGH edition)
  • Misc.: Logitech G915, Evoluent Vertical Mouse, shuttlePROv2

Comments

Former user wrote on 9/11/2022, 8:03 AM

@bitman Hi, I untick the Legacy HEVC; I don't use that type of file, so I don't want it possibly getting in the way 🤷‍♂️

It'd be interesting to see your results from the Benchmark test. You have a faster CPU & RAM, and you have Intel while I have AMD, but we both have 3090s (slightly different versions).

Going by the Benchmark test, I would have bought an RX GPU rather than an RTX if I were gearing my PC for Vegas; I don't think Vegas uses the RTX GPU very well. @todd-b can tell you more, but last time I looked my PC would be 13th or 14th on that test list 😒 (specs in my signature)

Dexcon wrote on 9/11/2022, 8:11 AM

I read @bitman's comment as asking why legacy HEVC is enabled as a default setting. IMO, legacy settings should be optional rather than being default settings. If it's needed for a specific reason, then it can be enabled.

Cameras: Sony FDR-AX100E; GoPro Hero 11 Black Creator Edition

Installed: Vegas Pro 15, 16, 17, 18, 19, 20, 21 & 22, HitFilm Pro 2021.3, DaVinci Resolve Studio 19.0.3, BCC 2025, Mocha Pro 2025.0, NBFX TotalFX 7, Neat NR, DVD Architect 6.0, MAGIX Travel Maps, Sound Forge Pro 16, SpectraLayers Pro 11, iZotope RX11 Advanced and many other iZ plugins, Vegasaur 4.0

Windows 11

Dell Alienware Aurora 11:

10th Gen Intel i9 10900KF - 10 cores (20 threads) - 3.7 to 5.3 GHz

NVIDIA GeForce RTX 2080 SUPER 8GB GDDR6 - liquid cooled

64GB RAM - Dual Channel HyperX FURY DDR4 XMP at 3200MHz

C drive: 2TB Samsung 990 PCIe 4.0 NVMe M.2 PCIe SSD

D: drive: 4TB Samsung 870 SATA SSD (used for media for editing current projects)

E: drive: 2TB Samsung 870 SATA SSD

F: drive: 6TB WD 7200 rpm Black HDD 3.5"

Dell Ultrasharp 32" 4K Color Calibrated Monitor

LAPTOP:

Dell Inspiron 5310 EVO 13.3"

i5-11320H CPU

C Drive: 1TB Corsair Gen4 NVMe M.2 2230 SSD (upgraded from the original 500 GB SSD)

Monitor is 2560 x 1600 @ 60 Hz

fr0sty wrote on 9/11/2022, 9:48 AM

Perhaps they consider the new decoder for HEVC to be in "beta" form, not stable enough across many systems to be on by default. I am not seeing reports of enough people having trouble with the new decoder to think that may be true, but I'm also not the one reading the crash reports they get, nor do I deal with the customers as part of QA, so I wouldn't call that an educated guess.

Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64GB 3000mhz DDR4

Geforce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

Howard-Vigorita wrote on 9/11/2022, 10:45 AM

I had the same issue with my 11900k/uhd750 system. I had to keep legacy HEVC decoding unchecked until my final deliverable render, which takes much longer but yields significantly higher quality. The legacy HEVC lib probably needs to be updated to recognize and use Intel 11th- and 12th-gen iGPUs... but the fact that they call it legacy suggests that's not going to happen. The good news is that it did recognize Intel Arc when I threw one in as a second GPU and disabled the iGPU in the BIOS.

Former user wrote on 9/11/2022, 4:41 PM

@bitman wrote: "I was testing my new system (and freshly installed Vegas Post 19 and 365 v20) with a GoPro clip of mine, and playback was not smooth (to put it mildly). I opened up File I/O in Preferences and noticed that HEVC legacy decoding was enabled by default, and that the HW decoder default is auto (Intel UHD 770). After disabling HEVC legacy decoding and restarting Vegas, playback was smooth for the same clip. So why would Vegas be holding back performance on state-of-the-art new HW? I know there is/was a lot of discussion in this forum on the So4reader, I assume it is related to that?"

Another question: why is SO4 the default for AVC, but not for HEVC?

The likely answer: the SO4 AVC decoder is multithreaded and will use all of your CPU if required to decode a file, while the SO4 HEVC decoder can only use a single CPU core. On top of that, when using Nvidia and AMD GPUs as the HEVC decoder, many 10-bit files don't get any GPU decode at all; this is a Vegas bug. So if you chose the GPU decoder, you'd have no GPU decoding and only one CPU core doing the decoding, whereas if you chose the Legacy decoder, all your CPU cores would do the decoding.

Some 10-bit files can't be decoded by the Legacy decoder, so the SO4 decoder is used even if the Legacy decoder is activated. This makes sense; I'd rather Vegas do that for me instead of changing the decode options manually myself. But it also means that, for many Vegas users, many 10-bit HEVC files get no GPU decode and are decoded on a single CPU core.
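To make that concrete, here is a rough sketch of the selection logic as I understand it from the behaviour above (the function names and the capability check are mine, not Vegas internals):

    # Illustrative model (not Vegas code) of the decoder selection described above.
    def legacy_can_decode(bit_depth):
        # Placeholder for whatever internal capability check Vegas performs.
        return bit_depth == 8

    def pick_decoder(codec, bit_depth, legacy_enabled):
        if codec == "AVC":
            return "SO4, multithreaded (all CPU cores)"
        if codec == "HEVC":
            if legacy_enabled and legacy_can_decode(bit_depth):
                return "Legacy (all CPU cores)"
            # Silent fallback: SO4 HEVC is single-threaded, and GPU decode
            # is often broken for 10-bit files on Nvidia/AMD.
            return "SO4, single CPU core (+ GPU decode if it works)"
        return "unknown"

    print(pick_decoder("HEVC", 10, legacy_enabled=True))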

@bitman also wrote: "Vegas knows which processor and graphics card are on board (it can even search for drivers); it could at least adapt the default setting on the fly to one better suited to your system. But maybe I am wrong and setting defaults on the fly according to your system is not implemented."

That makes more sense for modern Intel CPUs, but your GPU decoding is still limited by the GPU decoder's inability to access enough CPU resources. As silly as it seems, especially when editing (rather than doing a simple playback test), Legacy can give an overall better result with more complex HEVC files, such as those that have no P-frames but are 87% B-frames, because they are more computationally taxing.

The short answer: HEVC GPU decoding is handicapped by a lack of access to the CPU, and this possibly causes more instability and crashing compared to AVC GPU decode.
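If you want to check how B-frame-heavy one of your clips is, ffprobe can list the picture types. A quick sketch, assuming ffprobe is on your PATH (the 500-frame cap is just to keep it fast):

    # Count picture types (I/P/B) in the first 500 frames of a clip with ffprobe.
    import subprocess, sys
    from collections import Counter

    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-read_intervals", "%+#500",  # inspect only the first 500 frames
         "-show_entries", "frame=pict_type", "-of", "csv=p=0",
         sys.argv[1]],
        capture_output=True, text=True, check=True)

    counts = Counter(line.strip() for line in out.stdout.splitlines() if line.strip())
    total = sum(counts.values())
    for pict_type, n in sorted(counts.items()):
        print(f"{pict_type}: {n} ({100 * n / total:.0f}%)")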

fr0sty wrote on 9/11/2022, 10:15 PM

Former user wrote: "When using Nvidia and AMD GPUs as the HEVC decoder, many 10-bit files don't get any GPU decode at all; this is a Vegas bug."

If it is a 10-bit 4:2:2 file, GPUs do not support decoding it in hardware.

NVENC/NVDEC only supports 4:2:0 and 4:4:4 for AVC and HEVC.

Intel Quick Sync on 11th- and 12th-gen processors can only do 4:2:0 for AVC, but can do 4:2:0, 4:2:2, and 4:4:4 for HEVC.
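Putting those rules into a small lookup table (based only on what I listed above for these GPU generations; illustrative, so double-check against the vendors' own support matrices):

    # Hardware decode support for chroma subsampling, per the rules above.
    # Keys: (decoder, codec, chroma) -> decodable in hardware?
    HW_DECODE = {
        ("NVDEC", "AVC", "4:2:0"): True,
        ("NVDEC", "AVC", "4:2:2"): False,
        ("NVDEC", "AVC", "4:4:4"): True,
        ("NVDEC", "HEVC", "4:2:0"): True,
        ("NVDEC", "HEVC", "4:2:2"): False,
        ("NVDEC", "HEVC", "4:4:4"): True,
        ("QuickSync 11/12th gen", "AVC", "4:2:0"): True,
        ("QuickSync 11/12th gen", "AVC", "4:2:2"): False,
        ("QuickSync 11/12th gen", "AVC", "4:4:4"): False,
        ("QuickSync 11/12th gen", "HEVC", "4:2:0"): True,
        ("QuickSync 11/12th gen", "HEVC", "4:2:2"): True,
        ("QuickSync 11/12th gen", "HEVC", "4:4:4"): True,
    }

    def hw_decodable(decoder, codec, chroma):
        return HW_DECODE.get((decoder, codec, chroma), False)

    print(hw_decodable("NVDEC", "HEVC", "4:2:2"))                 # False
    print(hw_decodable("QuickSync 11/12th gen", "HEVC", "4:2:2")) # True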

Former user wrote on 9/11/2022, 10:33 PM

@fr0sty The classic example of the HEVC 10-bit 4:2:0 color GPU decode bug is Sony XAVC-HS. It's not a limitation of the GPU's decoder. The OP is using his iGPU (UHD 770) as the decoder, which doesn't have this problem and does have 4:2:2 10-bit decode, but he's hindered by the extra CPU processing requirements necessary for GPU decoding of that extra information. As with 4:2:0 color, some 4:2:2 codecs will behave worse than others due to complexity.
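To see which case a given clip falls into, you can read its pixel format with ffprobe (XAVC-HS 10-bit 4:2:0 typically reports yuv420p10le). A minimal sketch, assuming ffprobe is installed:

    # Print the pixel format of the first video stream
    # (e.g. yuv420p10le = 10-bit 4:2:0, yuv422p10le = 10-bit 4:2:2).
    import subprocess, sys

    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=pix_fmt", "-of", "default=nw=1:nk=1",
         sys.argv[1]],
        capture_output=True, text=True, check=True)
    print(out.stdout.strip())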

VEGASDerek wrote on 9/12/2022, 8:21 AM

Our new HEVC decoder is not stable enough at the moment for us to comfortably be able to turn it on by default. We are working on it and hope to have this option turned on by default soon so our users can experience the performance benefits of this new decoder.

bitman wrote on 9/13/2022, 5:48 AM

@VEGASDerek Thanks for the update! Much appreciated.

FYI, the GoPro clip was HEVC, 8-bit 4:2:0 color.

Former user wrote on 9/13/2022, 6:13 AM

@bitman Hi, would you share one of those clips, just out of curiosity?

rgr wrote on 1/28/2024, 12:36 PM

VEGASDerek wrote: "Our new HEVC decoder is not stable enough at the moment for us to comfortably be able to turn it on by default. We are working on it and hope to have this option turned on by default soon so our users can experience the performance benefits of this new decoder."

I have a question: after selecting this option for H.264, the audio track in loaded MP4 (H.264) files is shifted by 1 frame. I'd rather avoid that. In which configuration (with or without the option checked) is the audio track loaded correctly? The problem affects both Vegas 19 and 21.

Is there any sample MP4 file I can use to check it myself?
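One way to look for an offset outside of Vegas is to compare the audio and video stream start times with ffprobe. A sketch of my own (assuming ffprobe is available; it only catches container-level offsets, not whatever Vegas does internally on import):

    # Compare audio vs. video stream start times (container-level offsets only).
    import json, subprocess, sys

    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_entries",
         "stream=codec_type,start_time", "-of", "json", sys.argv[1]],
        capture_output=True, text=True, check=True)

    starts = {}
    for s in json.loads(out.stdout)["streams"]:
        starts.setdefault(s["codec_type"], s.get("start_time"))

    print("video start:", starts.get("video"), "| audio start:", starts.get("audio"))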

rgr wrote on 1/28/2024, 1:07 PM

Option unchecked.

After the first re-encode (a 5-frame test), the soundtrack does not look like the original, but there is something there.
After the second pass, there are already holes (see picture).
After the third pass, it was empty.

rgr wrote on 1/28/2024, 1:12 PM

Option checked.

Now even after three passes the soundtrack is in place.