As of a couple years ago, Intel was the only graphics card offering hardware acceleration for HEVC/H.265 422 10bit Encoding & Decoding. I haven't been paying attention lately...are there any cards from AMD or NVIDIA that offer this now?
Decoding HEVC 4:2:2 is Intel's claim to fame for its 11th-gen+ desktop iGPUs and Arc GPUs; as far as I know they're still the only processors that can do it in hardware. I don't recommend shooting 4:2:2 HEVC, though, unless you have a camera whose sensor geometry can actually capture the extra chroma samples that 4:2:2 carries. I have two 3-CCD HD cameras like that, which were my bread and butter for about a decade, but they went out of production when 4K sensors came along because the tradeoff in chip density wasn't worth it: more pixels look better to most viewers than fewer pixels with more color resolution than they can discern. I don't think any 3-CCD cameras have been made since then by anyone, except maybe for TV and film studios whose budgets are way beyond the means of ordinary mortals.
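If you're not sure which flavor your camera actually writes, ffprobe will tell you. Here's a minimal Python sketch; it assumes ffprobe is on your PATH, and the filename is just a placeholder to swap for one of your own clips:

```python
# Check whether a clip is the HEVC 4:2:2 10-bit flavor that currently needs an
# Intel 11th-gen+ iGPU or an Arc card for hardware decode.
import json
import subprocess

def probe_video(path: str) -> dict:
    """Return codec name, profile, and pixel format of the first video stream."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,profile,pix_fmt",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"][0]

info = probe_video("clip.mov")  # placeholder filename
print(info)
# 4:2:2 10-bit HEVC usually reports pix_fmt "yuv422p10le";
# 4:2:0 10-bit reports "yuv420p10le", which most recent GPUs can decode.
if info["codec_name"] == "hevc" and info["pix_fmt"] == "yuv422p10le":
    print("4:2:2 10-bit HEVC: hardware decode needs Intel Gen11+/Arc, otherwise software.")
```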
Btw, I have an Arc A770, and my 4K cameras give me the option to shoot HEVC 4:2:0 or 4:2:2, so I've tried both. Ultimate quality ends up the same after edit and render, because those cameras all use sensors that only capture the chroma resolution of 4:2:0 and synthetically interpolate the missing samples to output 4:2:2. Doing that upscale later in the edit chain is apparently just as good, if not better. The result was expected and intuitively obvious. With no offsetting quality gain, the larger storage space required by 4:2:2 footage adds insult to injury, as does the Arc's decoding performance: although it's faster than CPU decoding, 4:2:2 is noticeably slower than 4:2:0 for both playback and rendering.
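If you want to measure the decode gap yourself rather than take my word for it, here's a rough timing sketch; the filenames are placeholders, and it assumes an ffmpeg build with Quick Sync (QSV) support:

```python
# Compare hardware-decode throughput of a 4:2:0 clip vs a 4:2:2 clip by decoding
# each to the null muxer and timing it. Filenames below are placeholders.
import subprocess
import time

def decode_seconds(path: str) -> float:
    """Decode the whole file with the QSV HEVC decoder, discard the frames, return elapsed seconds."""
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-v", "error", "-hwaccel", "qsv", "-c:v", "hevc_qsv",
         "-i", path, "-f", "null", "-"],
        check=True,
    )
    return time.perf_counter() - start

for clip in ["scene_420.mp4", "scene_422.mp4"]:  # placeholder test clips
    print(clip, f"{decode_seconds(clip):.1f}s")
```

Shoot the same scene in both modes at matching length and bitrate settings so the comparison is apples to apples.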