I've never had any luck using 32 bit to render except on very small projects. I think I tried it once in V9 and had the same problem so I gave up. It seems like false advertising to list it as a feature since I've heard others complain about the same problem.
Because my renders would at least start, it made me think it was a memory issue, so to reduce the memory usage I disabled cores in Task Manager. That got the renders to work, sometimes on 2 cores, or on more intensive projects only 1 core.
I achieved the same result with the render threads setting in the Vegas preferences.
So this is probably a known issue. Why isn't it published somewhere easy for users to find, so they can experiment with workarounds?
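For anyone wondering why fewer cores/threads would make a difference: if the failures really are memory related, a rough back-of-envelope calculation shows how quickly 32-bit float frame buffers add up per render thread. This is a sketch with assumed numbers (the frames-in-flight count is a pure guess, not Vegas internals):

```python
# Rough estimate of why fewer render threads can keep a 32-bit render
# inside memory limits. Assumed numbers, not Vegas internals.

FRAME_BYTES_32 = 1920 * 1080 * 4 * 4   # RGBA, 4 bytes per channel (32-bit float)
FRAMES_IN_FLIGHT_PER_THREAD = 4        # assumption purely for illustration

for threads in (8, 4, 2, 1):
    peak = threads * FRAMES_IN_FLIGHT_PER_THREAD * FRAME_BYTES_32
    print(f"{threads} threads -> roughly {peak / 2**20:.0f} MB of frame buffers")
```

At 8 threads that is on the order of a gigabyte of frame buffers alone, which would explain why dropping to 1 or 2 threads lets the render finish.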
I suspect what's happening is that 32-bit mode is pushing up the saturation, which gives the keyer a better chance of working. You should be able to achieve the same outcome by using, say, the Secondary Color Corrector before the Chroma Keyer. Going back to 8-bit mode will also make Vegas more responsive and less painful to work with as you adjust the keyer.
For badly lit screens I've found it best not to sample the screen color. Instead, use the appropriate preset and work from that. A badly lit screen, when sampled, will return values that can be all over the place; the presets set the keyer to work with a color. Kind of like the difference between using a spear and using a net.
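To see why both of these tips help, here is a minimal sketch of a distance-based green key (just the general idea, not Vegas's actual Chroma Keyer). Boosting saturation first pushes an under-lit screen closer to the key color so it keys out more cleanly, and keying against a fixed "pure green" preset is more forgiving than keying against one noisy sampled value:

```python
import numpy as np

# Minimal sketch of a distance-based green key, not Vegas's Chroma Keyer.
# Pixels close to the key color go transparent (alpha 0); pixels far from it
# stay opaque (alpha 1).

KEY_COLOR = np.array([0.0, 1.0, 0.0])  # the "pure green" preset idea

def boost_saturation(rgb, amount=1.5):
    """Crude saturation boost: push each pixel away from its own gray value."""
    gray = rgb.mean(axis=-1, keepdims=True)
    return np.clip(gray + (rgb - gray) * amount, 0.0, 1.0)

def green_key_alpha(rgb, threshold=0.65, softness=0.15, key=KEY_COLOR):
    """Alpha ramps from 0 (keyed out) to 1 (kept) with distance from the key color."""
    dist = np.linalg.norm(rgb - key, axis=-1)
    return np.clip((dist - threshold) / softness, 0.0, 1.0)

# Two sample pixels: an under-lit patch of green screen and a foreground skin tone.
frame = np.array([[[0.35, 0.55, 0.35],     # dull, badly lit green
                   [0.80, 0.60, 0.50]]])   # skin tone
print(green_key_alpha(frame))                    # green pixel only partly keyed
print(green_key_alpha(boost_saturation(frame)))  # green keyed cleanly, skin untouched
```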
I have noticed it's best to use pure green with the keyer, and I've been cleaning it up further with the Secondary Color Corrector.
I have been experimenting with it a lot because of the advice here, but I really don't understand it. It's not intuitive, and while I get results, I don't understand why!
Grazie:
Putting it another way: Why ARE you using 32-bit processing?
Even though I could ask the same question, a better one might still be: why does this setting not work?
I have not seen anyone on this forum be able to render using 32bit.
"I have not seen anyone on this forum be able to render using 32bit."
I edited and rendered a 3 hour stage show in 32 bit. Source footage was from V1P. Afterwards I looked at what I'd achieved and realised I could have just bumped the saturation up and saved a lot of rendering time.
As for why it doesn't work too well even when you wrangle the issues: my guess is that it's because it uses more memory and more CPU. This isn't something unique to Vegas either; AE slows down dramatically in 32-bit float. What's sorely missing in Vegas is a 16-bit integer mode for working with 10-bit YUV footage.
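To put rough numbers on the memory side of that, here is what one uncompressed 1080p RGBA frame costs at each precision (generic per-frame arithmetic, not Vegas's actual pipeline), which also shows why a 16-bit integer mode would be a nice middle ground for 10-bit YUV sources:

```python
# Per-frame memory for one uncompressed 1920x1080 RGBA frame at different
# internal precisions. Generic arithmetic, not Vegas's actual pipeline.
WIDTH, HEIGHT, CHANNELS = 1920, 1080, 4

for label, bytes_per_channel in [("8-bit integer", 1),
                                 ("16-bit integer", 2),
                                 ("32-bit float", 4)]:
    mb = WIDTH * HEIGHT * CHANNELS * bytes_per_channel / 2**20
    print(f"{label:>15}: {mb:5.1f} MB per frame")

# 8-bit integer :   7.9 MB per frame
# 16-bit integer:  15.8 MB per frame (enough headroom for 10-bit sources)
# 32-bit float  :  31.6 MB per frame
```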