Up until now, all my source material has been 1080p, and I've been very happy with the performance in both Vegas 14 and now Vegas 18.
Lately I've started filming in 4K, and sh*t, the performance hit it took in Vegas 18. It's not the rendering of plain clips that suffers, it's transitions. If I take a 10-second clip, cut into it, remove a chunk in the middle, and overlap the remaining pieces (i.e. slide the two ~4-second clips over each other so I get a cross transition), rendering of that part (the transition) slows to 1-2 fps.
So... I thought, wow, what a difference between 4K and 1080p. Then I tried filming in 2.7K, and guess what: a nice 40-100 fps even during transitions! Basically no difference from 1080p.
Now, maybe this is logical, but it seems weird: going from 1080p to 2.7K makes no difference, but 2.7K -> 4K is a huge, pretty much unusable difference.
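To sanity-check the "more data" theory, here's a quick back-of-envelope sketch of pixels per frame (assuming 2.7K means 2704x1520, which is what my camera labels it; other cameras may differ):

```python
# Back-of-envelope: pixels per frame at each resolution, relative to 1080p.
# Assumption: "2.7K" = 2704x1520 (GoPro/DJI-style); adjust for your camera.
resolutions = {
    "1080p": (1920, 1080),
    "2.7K": (2704, 1520),
    "4K UHD": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP per frame, {pixels / base:.1f}x 1080p")
```

That prints roughly 2.1 MP for 1080p, 4.1 MP for 2.7K, and 8.3 MP for 4K. So 4K is about 4x the pixels of 1080p but only about 2x the pixels of 2.7K, and a cross transition means decoding and blending two streams at once. A mere doubling of data wouldn't obviously explain a drop from 40-100 fps down to 1-2 fps, which is part of why I wonder about the next question.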
Is this a 4K Vegas bug, or is it just the basic logic of more data to process?
My computer is no slouch for an amateur: Ryzen 3800, 32 GB of fast RAM, GeForce GTX 1070 Ti, and separate source, render, and system drives, all on NVMe.
What's your verdict? (I'll just film in 2.7K for now; 4K is unusable, and 2.7K at least gives me a bit more detail out of my camera to play with.)