I am just wondering why Vegas does not offer 16-bit-deep color (per color channel, I suppose) like Magix introduced in their other NLE, VPX (Video Pro X). We are currently stuck with 8-bit integer (fine if your source is 8-bit and you do not do much color correction) or the more professional, but more system-taxing, 32-bit floating point in its two flavors. 16 bits would seem like a good compromise between computational speed and quality, although I am not sure whether VPX's 16-bit color calculations are integer or float (I could not find info on that).
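To make the quality side of that compromise concrete, here is a rough Python sketch (just an illustration I put together, nothing to do with Vegas's or VPX's actual internals; the gamma value and the processing chain are made up for demonstration). It quantizes a smooth gradient at a given bit depth, applies a strong gamma lift, re-quantizes, and measures how far the result drifts from an exact float calculation:

```python
import numpy as np

def process_at_depth(signal, levels):
    """Quantize, gamma-correct, and re-quantize at a given bit depth."""
    q = np.round(signal * (levels - 1)) / (levels - 1)   # quantize input
    q = q ** (1 / 2.2)                                   # illustrative gamma lift
    return np.round(q * (levels - 1)) / (levels - 1)     # quantize output

ramp = np.linspace(0.0, 1.0, 100_000)   # ideal smooth gradient
reference = ramp ** (1 / 2.2)           # "exact" float result

for bits in (8, 16):
    err = np.abs(process_at_depth(ramp, 2 ** bits) - reference).max()
    print(f"{bits}-bit integer: max error = {err:.2e}")
```

The 8-bit run comes out with a much larger maximum error than the 16-bit one (worst in the shadows, where the gamma curve is steepest), which is exactly the banding/posterization you see after heavy grading of 8-bit material. 16-bit integer keeps that error far smaller while still avoiding the full cost of 32-bit float.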
As a side note, I found timeline editing in the latest VPX (the 10-year anniversary edition) stuttering compared with Vegas Pro 15 on the same machine, using the same source file, under certain preview window layouts. It could be a bug, but it could also be that the 16-bit processing is simply more taxing …