Comments

RogerS wrote on 9/13/2025, 9:14 AM

Do none of the resolution or quality settings perform any better in VEGAS?

A proxy file in VEGAS should at least edit okay (render will still be slow).
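
VEGAS builds its own proxies, but if those still stutter you can pre-render an external 8-bit H.264 stand-in with ffmpeg and cut with that instead. A rough sketch (the filenames are placeholders; assumes ffmpeg is on your PATH):

```python
import subprocess

# Hypothetical filenames; assumes ffmpeg is installed and on PATH.
SOURCE = "DJI_0001.MP4"        # 10-bit HEVC original
PROXY = "DJI_0001_proxy.mp4"   # edit-friendly stand-in

# Transcode to 1080p 8-bit H.264, which nearly any CPU/GPU decodes easily.
subprocess.run([
    "ffmpeg", "-i", SOURCE,
    "-vf", "scale=1920:-2",      # 1080p width, keep aspect ratio
    "-c:v", "libx264", "-crf", "18", "-preset", "fast",
    "-pix_fmt", "yuv420p",       # force 8-bit 4:2:0
    "-c:a", "copy",              # keep the original audio
    PROXY,
], check=True)
```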

3POINT wrote on 9/13/2025, 1:58 PM

Strange, my DJI Osmo Pocket 3 produces 10-bit HEVC 4K MP4 files, which I assume are the same as your DJI Mavic 4 Pro video files. Those files play and edit in VP22/VP23 without any restrictions compared to the 8-bit files from my earlier Osmo Pocket.

RogerS wrote on 9/13/2025, 8:38 PM

Hmm, in that case is the difference down to NVIDIA 2080 Super vs. AMD RX 6800 XT decoding performance with 10-bit in VEGAS?

wilri001 wrote on 9/13/2025, 10:16 PM

According to a Google AI comparison, the 2080 has a dedicated processor for HEVC 10-bit encode/decode, which makes it faster at those tasks even though the 6800 XT has more raw power. I guess it's time for an upgrade!
Thanks for the question, Roger.

RogerS wrote on 9/13/2025, 10:30 PM

My pleasure. I just upgraded from an NVIDIA RTX 2080 Super to an RTX 5070. Once VEGAS adds support for it, the 50XX GPUs will even be able to accelerate 10-bit 4:2:2 media (until now only Intel GPUs/iGPUs could do that), so I thought the card would be future-proof.

wilri001 wrote on 9/13/2025, 10:46 PM

Roger, I think I'll go with the 5080. But there are so many manufacturers; what has your research shown to be the best?

RogerS wrote on 9/14/2025, 12:23 AM

I don't think I know enough to answer that.

I had a good experience with MSI for my motherboard and last GPU, so I went with their basic model since I wanted to keep costs down.

Maybe see reviews of build quality on YouTube, etc. I don't believe there are any significant performance differences among models, so it's just construction and cooling ability.

wilri001 wrote on 9/14/2025, 8:12 AM

I did the same and went with Asus because I have a good history with them.
So if Vegas Pro doesn't fully support the 5000 series yet, do you think there will be any performance improvement before it does?

RogerS wrote on 9/14/2025, 8:50 AM

I'm using a 5070 with VEGAS, so it is supported. You can see its performance in the benchmarks in my signature.

I think support for its advanced decoding features is just a matter of time.

wilri001 wrote on 9/14/2025, 12:16 PM

So by "advanced decoding" you mean encode/decode 4:2:2? I'm primarily interested in 4:2:0 in both 4 and 6k in 10bit.
The 5080 I ordered will take a month to come, but I'll post whether 10 bit HEVC encode/decode is being used.
I mostly need it about a year from now when I'll be shooting a trip on the Dempster Highway to the Arctic Ocean. So hopefully there will be support by then, if not now.

RogerS wrote on 9/14/2025, 9:19 PM

By advanced decoding I mean decoding 4:2:2; all modern NVIDIA GPUs do 10-bit 4:2:0 (my 2080 did). I believe the 5080 and 5090 also have multiple decoders, so they can handle even more media streams simultaneously.
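
If you want to confirm exactly what a clip uses, ffprobe reports the codec profile and pixel format: yuv420p10le is 10-bit 4:2:0, yuv422p10le is 10-bit 4:2:2. A quick sketch (the filename is a placeholder; assumes ffprobe, which ships with ffmpeg, is installed):

```python
import json
import subprocess

# Hypothetical filename; assumes ffprobe (part of ffmpeg) is on PATH.
out = subprocess.run([
    "ffprobe", "-v", "error", "-select_streams", "v:0",
    "-show_entries", "stream=codec_name,profile,pix_fmt",
    "-of", "json", "DJI_0001.MP4",
], capture_output=True, text=True, check=True)

stream = json.loads(out.stdout)["streams"][0]
# e.g. "hevc Main 10 yuv420p10le" -> 10-bit 4:2:0 (NVDEC handles this);
# "yuv422p10le" would be 10-bit 4:2:2 (Intel-only until the 50-series).
print(stream["codec_name"], stream.get("profile"), stream["pix_fmt"])
```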

Personally, I don't see the point of 4:2:2 encoding: if you need to keep the image pristine for compositing, use an intermediate format like ProRes. There's no need for that precision just for viewing.
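
For what it's worth, a minimal sketch of that kind of intermediate transcode using ffmpeg's prores_ks encoder (filenames are placeholders; assumes ffmpeg is installed):

```python
import subprocess

# Hypothetical filenames; assumes ffmpeg is on PATH.
subprocess.run([
    "ffmpeg", "-i", "graded_source.mp4",
    "-c:v", "prores_ks", "-profile:v", "3",  # profile 3 = ProRes 422 HQ
    "-c:a", "pcm_s16le",                     # uncompressed audio, typical for intermediates
    "intermediate.mov",
], check=True)
```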

wilri001 wrote on 9/15/2025, 12:26 AM

The AMD 6800 XT has 10-bit HEVC decoding, but I guess Vegas never used it because it is terribly slow.
So it's good news that Vegas does support it for NVIDIA. I don't need 4:2:2.

Alan-Smithee wrote on 9/15/2025, 1:46 AM

32-bit color for 10-bit media still looks to be slow in Vegas 23: with color-graded 4K footage I get 30 fps in 32-bit mode vs. 130 fps in 8-bit. That's disappointing. Is the engine improvement complete, or is more work still coming?

When working with media I had flipped 180 degrees horizontally, it would only play at 7 fps in 32-bit mode (color graded).

RogerS wrote on 9/15/2025, 5:58 AM

@wilri001 said:
"The AMD 6800 XT has 10-bit HEVC decoding, but I guess Vegas never used it because it is terribly slow. So it's good news that Vegas does support it for NVIDIA. I don't need 4:2:2."

Is that what you see in Windows Task Manager > Performance? (I assume there's a place to see decoding activity for AMD; there is for Intel QSV and NVIDIA NVDEC.) If disabling hardware acceleration under Preferences > File I/O gives you a different playback framerate than having it enabled on AMD, then it's doing something.
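
You can run the same experiment outside VEGAS: time a decode-only pass with and without hardware acceleration and see whether the numbers move. A sketch assuming an ffmpeg build with hardware decode support (the filename is a placeholder; "cuda" selects NVDEC, "d3d11va" covers AMD/Intel on Windows):

```python
import subprocess
import time

# Hypothetical filename; assumes an ffmpeg build with hw decode support.
CLIP = "DJI_0001.MP4"

def decode_time(hwaccel=None):
    """Decode the clip, discard the frames, and return elapsed seconds."""
    cmd = ["ffmpeg", "-v", "error"]
    if hwaccel:
        cmd += ["-hwaccel", hwaccel]        # "cuda" = NVDEC, "d3d11va" = AMD/Intel on Windows
    cmd += ["-i", CLIP, "-f", "null", "-"]  # decode only, no output file
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

print(f"software decode: {decode_time():.1f}s")
print(f"hardware decode: {decode_time('cuda'):.1f}s")
```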

Of the three, I believe NVIDIA is the most powerful for decoding.

@Alan-Smithee Engine development is ongoing. Hopefully the 32-bit and HDR modes are next in line for improvement.