I hate to have to revisit this topic. I had a long post about it here and thought it was resolved. It turns out it's not resolved, just weirder.
Now I've noticed that preview performance gets worse and worse, and eventually the glitches I wrote about come back. It gets worse the more passes playback makes over stills that have color grading applied. So now I'm testing with no pan motion envelope and all defaults in the color grading. Also, instead of looping over and over a short timeline of two stills, I've tried a timeline with 16 different stills, no overlaps, with and without panning motion. The current test is a 1080 project with 16 stills, 5 seconds per still.
It's all about GPU measurements now, and I'm only testing stills at this point. If I don't have the color grading panel assigned to anything, the GPU uses about 0.5 GB of its 7.8 GB of so-called shared RAM (Intel UHD 630) and very little of its processing power. When I assign the color grading panel to just one event, GPU memory goes up to 0.8 GB after closing the CG panel and before starting playback. I start playback, and as soon as it reaches that event, GPU memory jumps to 1.2 GB. OK, so color grading needs GPU RAM.
Now I enable color grading on all 16 events. As playback runs through them, the GPU starts out showing 45-50% busy in 3D. As it goes on, GPU utilization gradually drops below 20% while memory usage climbs to 7.2 GB by the time it reaches the 8th event. When it reaches the 12th event, GPU RAM drops back to 1.0 GB! Rewinding and playing from the beginning, GPU utilization stays around 20% and GPU RAM cycles all the way up and down again. Because I have no motion in this version of the project, I can't tell whether the GPU is really doing all the work it should be, but I am suspicious.
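To take Task Manager eyeballing out of the equation, Windows exposes the same per-process GPU numbers as performance counters, which `typeperf` can log to CSV (e.g. `typeperf "\GPU Process Memory(*)\Shared Usage" -si 5 -o gpu.csv`). Here's a rough Python sketch I'd use to total those samples; the counter path and CSV layout are my assumptions about how typeperf reports them on Windows 10, not anything from Vegas itself:

```python
import csv
import io

def shared_usage_gb(typeperf_csv):
    """Sum the per-process 'Shared Usage' columns (bytes) of a typeperf CSV
    and return one total, in GB, per sample row."""
    rows = list(csv.reader(io.StringIO(typeperf_csv)))
    totals = []
    for row in rows[1:]:            # skip the counter-name header row
        vals = []
        for cell in row[1:]:        # skip the timestamp column
            try:
                vals.append(float(cell))
            except ValueError:
                pass                # typeperf leaves blanks for vanished instances
        totals.append(sum(vals) / 1024 ** 3)
    return totals
```

Graphing that total over a few playback loops should show the same climb-and-drop cycling I'm seeing in Task Manager, with timestamps to line up against the event boundaries.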
Now I rebuild the project with a mild panning envelope on each still and no CG. The GPU needs a constant 1.3 GB and between 32% and 45% utilization to play all the way through with just panning, and it looks mostly smooth. The whole system is using 4.8 GB.
Then I add CG to all 16 again and close the CG panel, again with null settings. I see 1.3 GB of GPU RAM and 5.0 GB of system RAM in use. When I play this, it starts out OK, then gradually drops more and more frames while eating GPU RAM and using a lower GPU processing %. Now also watching system RAM, it appears to grow by the same amount as GPU RAM, so it really is shared RAM. I run it in a loop over those 16 stills. Even as the GPU RAM usage cycles back down, the GPU processing % keeps falling slowly until, after several minutes, it reaches 17-18% with jumpy playback and more of those glitches I reported in the previous thread. But I now notice that system RAM did not go down with the GPU RAM. So as the loop goes on, system RAM keeps growing.
It's not a simple memory leak, because the shared memory as seen by the GPU does get periodically reclaimed somehow, though not in any obvious way; e.g. not when simply stopping playback. But system RAM is never reclaimed. And it's also some kind of performance leak, because the GPU becomes less utilized and playback becomes more jumpy.
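To put numbers on the system-RAM side of this instead of watching Task Manager, a script can poll `tasklist /fo csv` and total the working set of every Vegas process. A stdlib-only sketch (the "vegas" image-name prefix is my assumption; adjust for your version's actual exe name):

```python
import csv
import io
import subprocess
import time

def parse_tasklist_csv(text):
    """Parse `tasklist /fo csv` output into {image name: memory in KB}."""
    usage = {}
    for row in csv.reader(io.StringIO(text)):
        if len(row) < 5 or row[0] == "Image Name":
            continue                # skip the header and malformed lines
        # Mem Usage looks like "1,234,567 K"; keep only the digits
        digits = "".join(ch for ch in row[4] if ch.isdigit())
        if digits:
            usage[row[0]] = usage.get(row[0], 0) + int(digits)
    return usage

def watch(image_prefix="vegas", samples=30, interval=10.0):
    """Print total Vegas working set every `interval` seconds, with deltas."""
    prev = None
    for _ in range(samples):
        out = subprocess.run(["tasklist", "/fo", "csv"],
                             capture_output=True, text=True).stdout
        total_kb = sum(kb for name, kb in parse_tasklist_csv(out).items()
                       if name.lower().startswith(image_prefix))
        delta = "" if prev is None else f"  (change {(total_kb - prev) / 1024:+.0f} MB)"
        print(f"{total_kb / 1024:.0f} MB{delta}")
        prev = total_kb
        time.sleep(interval)
```

If the leak is what it looks like, the deltas should stay positive across playback loops even while the GPU's shared-memory figure cycles back down.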
As one more data point regarding RAM, I deleted the timeline and invoked "New Project" without exiting Vegas. System RAM usage stayed at 8.7 GB, almost 4 GB higher than when I started the CG portion of the previous project.
One more unfortunate observation: after closing Vegas various times to restart the project fresh, a Vegas process would often stay in memory with no windows, eating a whole CPU core until I killed it with Task Manager. I think this only happened after I had added CG to several events, played through them a bit, and later (even much later) closed Vegas.
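For anyone hitting the same stuck process, `tasklist /v` reports a Window Title of "N/A" for windowless processes, which makes the orphan easy to find and kill without hunting through Task Manager. A stdlib sketch; the nine-column verbose CSV layout and the "vegas" prefix are assumptions from my own Windows 10 setup:

```python
import csv
import io
import subprocess

def stale_vegas_pids(tasklist_v_csv, prefix="vegas"):
    """From `tasklist /v /fo csv` output, return PIDs of matching images
    that report no window title ("N/A"), i.e. likely orphaned instances."""
    pids = []
    for row in csv.reader(io.StringIO(tasklist_v_csv)):
        if len(row) < 9 or row[0] == "Image Name":
            continue                # skip the header and malformed lines
        if row[0].lower().startswith(prefix) and row[8] == "N/A":
            pids.append(int(row[1]))
    return pids

def kill_stale(prefix="vegas"):
    """Force-kill every windowless Vegas process found by tasklist."""
    out = subprocess.run(["tasklist", "/v", "/fo", "csv"],
                         capture_output=True, text=True).stdout
    for pid in stale_vegas_pids(out, prefix):
        subprocess.run(["taskkill", "/f", "/pid", str(pid)])
```

Obviously only run this after Vegas has been closed, since a minimized-but-live instance could conceivably also report no window title.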