The DJI Mavic 4 Pro drone seems to only create 10-bit video files, and they are too slow to edit or render. So are there any options other than converting them to 8-bit before using them in VEGAS Pro?
Strange, my DJI Osmo Pocket 3 produces 10-bit HEVC 4K MP4 files, which I assume are the same as your DJI Mavic 4 Pro video files. Those files are playable and editable in VP22/VP23 without any restrictions compared to the 8-bit files from my earlier Osmo Pocket.
According to a Google AI comparison, the 2080 has a dedicated processor for 10-bit HEVC encode/decode, which makes it faster for those tasks even though the 6800 XT has more raw power. I guess it's time for an upgrade! Thanks for the question, Roger.
My pleasure. I just upgraded from an NVIDIA RTX 2080 Super to an RTX 5070. Once VEGAS adds support for it, the 50XX GPUs will be able to accelerate even 10-bit 4:2:2 media (currently only Intel GPUs/iGPUs are capable of that), so I thought the card would be future-proof.
I had a good experience with MSI for my motherboard and last GPU, so I went with their basic model as I wanted to keep costs down.
Maybe check reviews of build quality on YouTube, etc. I don't believe there are any significant performance differences among models, so it comes down to construction and cooling ability.
I did the same and went with ASUS because I have a good history with them. So if VEGAS Pro doesn't support the 5000 series yet, do you think there will be any improvement until they do?
So by "advanced decoding" you mean encode/decode 4:2:2? I'm primarily interested in 4:2:0, in both 4K and 6K at 10-bit. The 5080 I ordered will take a month to arrive, but I'll post whether 10-bit HEVC encode/decode is being used. I mostly need it about a year from now, when I'll be shooting a trip on the Dempster Highway to the Arctic Ocean. So hopefully there will be support by then, if not now.
By advanced decoding I mean decoding 4:2:2; all modern NVIDIA GPUs do 10-bit 4:2:0 (my 2080 did that). The 5080 and 5090 also have multiple decoders, I believe, to handle even more media streams simultaneously.
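If it helps put numbers to the 4:2:0 vs 4:2:2 distinction, here's a small sketch of the uncompressed per-frame data each scheme carries at UHD resolution (raw sample math only; real HEVC bitrates depend on the encoder, but the relative overhead is the point):

```python
# Uncompressed per-frame size for UHD (3840x2160) video at different
# chroma subsampling schemes and bit depths. Shows why 4:2:2 carries
# ~33% more data than 4:2:0 before any codec compression.

def frame_bytes(width, height, bit_depth, samples_per_pixel):
    """samples_per_pixel: 1.5 for 4:2:0, 2.0 for 4:2:2, 3.0 for 4:4:4."""
    total_samples = width * height * samples_per_pixel
    return total_samples * bit_depth / 8

uhd = (3840, 2160)
size_420 = frame_bytes(*uhd, 10, 1.5)
size_422 = frame_bytes(*uhd, 10, 2.0)

print(f"10-bit 4:2:0 frame: {size_420 / 1e6:.1f} MB")   # 15.6 MB
print(f"10-bit 4:2:2 frame: {size_422 / 1e6:.1f} MB")   # 20.7 MB
print(f"4:2:2 overhead: {size_422 / size_420 - 1:.0%}")  # 33%
```

The 1.5/2.0 factors come from the chroma planes: 4:2:0 subsamples chroma in both dimensions (two quarter-size planes), while 4:2:2 subsamples horizontally only (two half-size planes).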
Personally I don't see a point to 4:2:2 encoding: if you need to keep the image pristine for compositing, use an intermediate format like ProRes. There's no need for that precision for viewing.
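For anyone wanting to go the intermediate route, this is a sketch of how one might drive ffmpeg from Python to transcode a 10-bit HEVC camera file to ProRes 422 HQ. The file names are made-up examples, and actually running it requires ffmpeg on your PATH; the snippet just builds the command:

```python
# Hypothetical sketch: build an ffmpeg command to transcode a 10-bit HEVC
# camera clip to ProRes 422 HQ for smoother editing. File names below are
# examples, not real media.
import subprocess

def prores_transcode_cmd(src, dst):
    return [
        "ffmpeg", "-i", src,
        "-c:v", "prores_ks",   # FFmpeg's ProRes encoder
        "-profile:v", "3",     # 3 = ProRes 422 HQ
        "-c:a", "pcm_s16le",   # uncompressed PCM audio for the MOV container
        dst,
    ]

cmd = prores_transcode_cmd("DJI_0001.MP4", "DJI_0001_prores.mov")
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually transcode
```

The trade-off is file size: ProRes 422 HQ files are much larger than the camera's HEVC originals, but they decode with far less CPU/GPU effort.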
The AMD 6800 XT has 10-bit HEVC decoding, but I guess VEGAS never used it, because it is terribly slow. So the good news is VEGAS does support it for NVIDIA. I don't need 4:2:2.
32-bit color for 10-bit media still looks to be slow in VP23: for color-graded 4K footage I get 30 fps in 32-bit, 130 fps in 8-bit. So that's disappointing. Is the engine improvement complete, or is there still more work to do?
When working with media I had flipped 180 degrees horizontally, it would only play at 7 fps in 32-bit mode (color graded).
Is that what you see in Windows Task Manager > Performance? (I assume there's a place to see decoding activity for AMD; there is for Intel QSV and NVIDIA NVDEC.) If disabling hardware acceleration under Preferences > File I/O gives you a different playback framerate than having the AMD acceleration on, then it's doing something.
Of the three, I believe NVIDIA is the most powerful for decoding.
@Alan-Smithee Engine development is ongoing. Hopefully 32-bit and HDR modes get further development next.