I was testing my new system (with freshly installed VEGAS Post 19 and VEGAS Pro 365 v20) with one of my GoPro clips, and playback was not smooth (to put it mildly). I opened File I/O in Preferences and noticed that legacy HEVC decoding was enabled by default, and the HW decoder was on its default of Auto (Intel UHD 770).
After disabling legacy HEVC decoding and restarting Vegas, playback of the same clip was smooth.
So why would Vegas hold back performance on state-of-the-art new hardware?
I know there is (or was) a lot of discussion on this forum about the So4reader; I assume this is related to that?
I understand there could be legacy systems out there (older Intel UHD generations, NVIDIA decode quirks, or issues with AMD) where enabling legacy HEVC decoding makes sense. But since Vegas knows which processor and graphics card are on board (it can even search for drivers), it could at least adapt the default setting on the fly to one better suited to your system. A rough sketch of what I mean is below. But maybe I am wrong and setting defaults on the fly according to your system is simply not implemented...
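Just to illustrate the idea, here is a minimal sketch in Python of what such hardware-aware defaults could look like. To be clear, this is purely my own illustration: the function names, the settings dictionary, and the decision rule are all my assumptions, not anything from Vegas internals.

```python
# Hypothetical sketch of hardware-aware File I/O defaults -- NOT Vegas code.
# All helper names and the decision logic are my own assumptions.
import platform
import subprocess

def detect_gpus() -> list[str]:
    """Return GPU names reported by the OS (Windows example via PowerShell/CIM)."""
    if platform.system() == "Windows":
        out = subprocess.run(
            ["powershell", "-Command",
             "(Get-CimInstance Win32_VideoController).Name"],
            capture_output=True, text=True,
        ).stdout
        return [line.strip() for line in out.splitlines() if line.strip()]
    return []

def default_hevc_settings(gpus: list[str]) -> dict:
    """Pick File I/O defaults based on detected GPUs (assumed decision rule)."""
    # Conservative fallback: legacy decoding on, decoder on Auto.
    settings = {"legacy_hevc_decoding": True, "hw_decoder": "auto"}
    for name in gpus:
        # Recent Intel iGPUs (e.g. UHD 770) handle HEVC fine in hardware,
        # so the legacy software path would only slow playback down.
        if "UHD" in name or "Arc" in name:
            settings["legacy_hevc_decoding"] = False
            settings["hw_decoder"] = name
    return settings

if __name__ == "__main__":
    print(default_hevc_settings(detect_gpus()))
```

Something along those lines, run once at first launch (or after a hardware change), would land most users on a sensible default instead of the slow path.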
Maybe an idea to improve Vegas would be to add a tool that analyzes your system and flips those switches for you...