Sure, I will very likely have the 10-bit equipment in the near future, and will then be able to compare the output on a 10-bit UHD display in both Edius and Vegas, with a Quadro card and hopefully also an Intensity 4K. I also expect that the preview from Vegas will very likely be 8-bit. However, I do not know whether anybody else has done tests or measurements here. Beyond that, I wonder what your experience with this topic is in more detail.
Wolfgang, the topic you raise comes up every other month in the forums... people have commented ad nauseam... search the term 10 bit in two years' worth of forum pages.
The topic is more convoluted to answer than you might think. Trust Vegas for 8-bit output.
Vegas is 10-bit capable. Vegas supports EXR in a 16-32 bit floating-point color space, which is so much more than 10 bit it's not even funny. The question really comes down to how well the workflow chain is maintained: is the camera producing the correct signal, are the correct codecs being used and processed in the correct mode, and do the GPU, display and cabling maintain the 10-bit signal?
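To put rough numbers on that gap (a minimal back-of-the-envelope sketch, not from this thread, and not a claim about how Vegas implements its internal processing):

```python
# Rough comparison of per-channel precision at 8-bit, 10-bit, and 32-bit float.
import numpy as np

levels_8bit = 2 ** 8     # 256 code values per channel
levels_10bit = 2 ** 10   # 1024 code values per channel

print(f"8-bit:  {levels_8bit} levels, step size = {1 / levels_8bit:.6f}")
print(f"10-bit: {levels_10bit} levels, step size = {1 / levels_10bit:.6f}")

# float32 has a 24-bit significand, so the spacing between representable
# values just below 1.0 is 2**-24 -- far finer than any fixed 10-bit step.
print(f"float32 spacing near 1.0: {2.0 ** -24:.9f}")

# Quantizing a smooth ramp shows where 8-bit banding comes from:
ramp = np.linspace(0.0, 1.0, 4096, dtype=np.float32)
print("distinct 8-bit values in ramp: ", len(np.unique(np.round(ramp * 255))))
print("distinct 10-bit values in ramp:", len(np.unique(np.round(ramp * 1023))))
```

The point is simply that a float pipeline only loses precision at the step where it is finally quantized for a codec or a display, which is why the rest of the chain (camera, codec mode, GPU, cabling, monitor) matters so much.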
Don't believe everything you read. EXR, DPX, TIFF, HDCAM, CineForm and XAVC are all 10-bit capable in Vegas. If you are stuck in the past on some old version of Vegas, or stuck on DNxHD, QuickTime, Huffyuv or MagicYUV as a necessity, then you will have problems.
If Vegas were 8-bit only, how would you explain its ACES capability to the forum?
It helps if you add the free LUT plug-in. To show 10-bit color, you need a GPU with 10-bit support (Quadro or FirePro) and a suitable monitor (e.g., an HP DreamColor). But you can support 10-bit material on an 8-bit display using LUTs.
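As a hypothetical illustration of that idea (this is not the free LUT plug-in mentioned above, just a Python sketch under the assumption that the grade is applied at 10-bit/float precision and quantized to 8 bits only at the preview step):

```python
# Apply a 1D LUT to 10-bit code values, then round to 8-bit for an 8-bit display.
# The grading math keeps the extra precision even though the monitor path is 8-bit.
import numpy as np

def make_gamma_lut(size: int = 1024, gamma: float = 2.2) -> np.ndarray:
    """Simple 1D tone-curve LUT with 1024 entries (10-bit resolution)."""
    x = np.linspace(0.0, 1.0, size)
    return np.power(x, 1.0 / gamma)

def preview_on_8bit_display(frame_10bit: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Look up 10-bit code values in the LUT, then quantize to 8-bit for preview."""
    graded = lut[np.clip(frame_10bit, 0, 1023)]        # stays in float precision
    return np.round(graded * 255).astype(np.uint8)     # final 8-bit quantization

# Example: a 10-bit grayscale ramp previewed on an 8-bit display.
ramp_10bit = np.arange(1024).reshape(1, -1)
preview = preview_on_8bit_display(ramp_10bit, make_gamma_lut())
print(preview.shape, preview.dtype, preview.min(), preview.max())
```

The 8-bit preview still can't show more than 256 levels per channel, of course; the benefit is that the LUT and the grade operate on the full-precision data, so the output and any 10-bit export are not limited by the preview path.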
Vegas has used 10-bit precision internally for quite some time.