Decoding 4:2:2 10-bit HEVC (I believe it's HEVC) footage from my Canon R5 in Vegas is nearly impossible when the clip was shot at 60fps: sluggish playback and constant crashes. The same footage at 24fps is fine. Based on help I've received on this forum in the past, I believe the issue is that Vegas itself isn't optimized for this type of footage in general, or isn't optimized for Intel CPUs, as someone mentioned.

Can anyone else confirm, perhaps someone on a Ryzen system, or even a 13900K? Do you see any performance difference when decoding this type of footage on the timeline? Or is this just typical Vegas lack of optimization regardless of platform? And yes, I know I can transcode or make proxies; I'm just trying to see if that can be avoided.
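Side note, since I'm only assuming it's HEVC: if anyone wants to double-check what their R5 is actually writing before comparing, a quick ffprobe call shows the codec, pixel format, and frame rate. Just a minimal sketch, assuming ffprobe (part of FFmpeg) is installed and on PATH; "clip.mp4" is a placeholder filename.

```python
import json
import subprocess

def probe_video(path: str) -> dict:
    """Return codec, pixel format, and frame rate of the first video stream via ffprobe."""
    result = subprocess.run(
        [
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",
            "-show_entries", "stream=codec_name,pix_fmt,avg_frame_rate",
            "-of", "json",
            path,
        ],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)["streams"][0]

if __name__ == "__main__":
    info = probe_video("clip.mp4")  # placeholder path, swap in an actual R5 clip
    # For 4:2:2 10-bit HEVC you'd expect codec_name == "hevc" and a pix_fmt like "yuv422p10le"
    print(info)
```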