- What is your Sony(?????) Vegas version and build number?
- What hardware are you using?
- What are your sources (MediaInfo, please)?
- What is your project format?
- What (customized?) render template are you trying?
- What driver version is your Nvidia GPU using?

If possible, use screenshots to show us.
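If you'd rather grab the source details from the command line than the MediaInfo GUI, here's a minimal Python sketch, assuming the MediaInfo CLI is installed and on your PATH, and with `clip.mp4` standing in for your actual file:

```python
import subprocess

# Dump the full MediaInfo report for a source clip so it can be
# pasted into a forum post. Assumes the MediaInfo CLI is installed.
clip = "clip.mp4"  # replace with your actual source file
result = subprocess.run(
    ["mediainfo", "--Full", clip],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```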
@peter-d5816 Make sure you uncheck the box for legacy HEVC decoding. Not sure if Nvidia will decode 8K 4:2:0 HEVC, but that setting will at least give it a shot.
Btw, I've found 8K on the S23 disappointing. It's limited to 30p and 24p with only one lens option... 4K goes up to 60p, which looks better, decodes more easily, and gives the option to choose any of the four lenses. And if you can get by with 4K 30p, there's double the low-light capability and sharper focus compared to 8K 30p.
Does the NVIDIA decoder do 8K HEVC? If not, it will all be on the CPU, and everything else will have to wait for the decoding.
Apparently Nvidia does decode 8K HEVC, just not particularly well on my Xeon using a 1660 for decoding. Here I was trying to play an 8K 30p Fuji clip with no proxy and getting 0.626 fps, compared to about 6 fps on the Arc machine.
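For anyone who wants to separate raw NVDEC speed from Vegas's playback pipeline, a rough benchmark is to decode the clip with ffmpeg and discard the output. A minimal Python sketch, assuming an ffmpeg build with CUDA/NVDEC support on your PATH and with `8k_clip.mov` as a stand-in filename:

```python
import subprocess

# Decode the clip on the GPU (NVDEC) and throw the frames away;
# ffmpeg's -benchmark flag reports elapsed time, and the progress
# line shows fps/speed, so you can work out raw decode throughput.
clip = "8k_clip.mov"  # stand-in for the actual 8K source
subprocess.run([
    "ffmpeg", "-hide_banner", "-benchmark",
    "-hwaccel", "cuda",   # ask for NVDEC hardware decoding
    "-i", clip,
    "-f", "null", "-",    # decode only, write nothing
])
```

Dropping the `-hwaccel cuda` pair gives a CPU-only run for comparison, which makes it easy to tell whether a slowdown is the decoder itself or the app sitting on top of it.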
@Howard-Vigorita It's Vegas's implementation; nothing is wrong with your Nvidia GPU or its decoder. Interesting to see the Arc decoder working better in Vegas, though.
[Screenshot: 8K30 project]
You can see from the decode graphs what it should look like when it's working correctly in Resolve, compared with those jagged lines in Vegas, which appear when Vegas can't process all the frames and drops them instead.
I just tried it in Resolve... will post details in the other thread. Also tried playing the 8K clip in the default Windows Media Player... next to no GPU utilization and very little CPU either... but it looks spectacular full screen.
@Howard-Vigorita In Resolve, the 8K timeline used 10GB of VRAM, getting up to 12GB when playing, so you'll potentially have problems due to your card's 6GB of VRAM. But that would be unrelated to the Nvidia decoder; it's more to do with swapping memory between VRAM and system memory.
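If you want to watch VRAM while the timeline plays, you can poll `nvidia-smi` in the background. A quick sketch, assuming the Nvidia driver's `nvidia-smi` tool is on your PATH; stop it with Ctrl+C:

```python
import subprocess
import time

# Poll dedicated GPU memory once a second while you scrub the
# timeline, to see whether playback is bumping against the VRAM cap.
while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    used, total = out.split(", ")
    print(f"VRAM: {used} MiB / {total} MiB")
    time.sleep(1)
```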