Let me start by saying I have at best a basic understanding of how to interpret a Vectorscope. I'm using VP18 for all of these examples:
1) Blank 8-bit video levels (legacy) project; add a video track, then generated media / solid color. I'll pick blue. I interpret this as a (0, 0, 255) RGB image; Vegas calls it (0.0, 0.0, 1.0, 1.0). The Vectorscope with "studio RGB" *unchecked* in its settings shows a blue saturation of 90 (%?), which seems odd -- wouldn't this be 100% saturation?
If I change the generated media to solid color --> red, the Vectorscope now shows a red saturation reading of 120 (%?), which mystifies me twice: why is the saturation reading for a 100% red color numerically different from the one for a 100% blue color? And again, why isn't either color reading 100% saturated?
I'm assuming the graticule targets/boxes indicate broadcast-safe saturation levels, which apparently vary depending on the color? (I tried some napkin math on this below.)
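To sanity-check myself, here's a minimal Python sketch of where pure colors land on a YUV-style vectorscope. It assumes Rec.601 luma weights and the classic U/V scale factors; I don't actually know which matrix or percentage scale Vegas uses, so the absolute numbers are illustrative, not a prediction of the scope's readout:

```python
# Where do pure RGB colors land on a YUV vectorscope?
# Assumes Rec.601 luma weights and BT.601 U/V scaling; Vegas may use
# Rec.709 and/or a different percentage scale.
import math

def vectorscope_position(r, g, b):
    """RGB in 0..1 -> (u, v) chroma coordinates plus vector magnitude."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # Rec.601 luma
    u = 0.492 * (b - y)                    # scaled B - Y
    v = 0.877 * (r - y)                    # scaled R - Y
    return u, v, math.hypot(u, v)

for name, rgb in [("blue", (0, 0, 1)), ("red", (1, 0, 0))]:
    u, v, mag = vectorscope_position(*rgb)
    print(f"{name:4s}: u={u:+.3f}  v={v:+.3f}  |C|={mag:.3f}")
```

That prints |C| ≈ 0.447 for blue and ≈ 0.632 for red, so even "fully saturated" primaries sit at different radii on the scope, which at least rhymes with blue and red giving different readings (90 vs 120), even if I can't reproduce the exact percentages.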
2) Add a computer RGB to studio RGB levels conversion to the video output bus. Now the solid-color saturation readings shift to within their respective (I assume) broadcast-safe targets, which makes sense to me even if the numbers themselves don't.
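My understanding of the computer-to-studio conversion is a linear squeeze of 0..255 into 16..235. A quick sketch (my assumption about what the Levels preset does; I haven't verified it against the plugin):

```python
def computer_to_studio(code):
    """Map a full-range 8-bit code (0..255) into studio range (16..235)."""
    return 16 + code * 219 / 255

print(computer_to_studio(0), computer_to_studio(255))   # 16.0 235.0

# The +16 offset cancels in color differences, so chroma (B-Y, R-Y)
# shrinks by exactly 219/255 ~ 0.86, pulling every vector toward center.
b, y = 255, 29  # e.g. pure blue: B=255, Y ~ 0.114*255 ~ 29 under Rec.601
print((computer_to_studio(b) - computer_to_studio(y)) / (b - y))  # 0.8588...
```

If that's right, the levels FX should shrink both blue's and red's chroma vectors by ~14%, which would match the colors dropping into their targets here while the hue-dependent difference between them stays.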
3) Change the Vectorscope settings by checking the studio RGB box. Now the 100% solid colors jump back outside the target boxes, which I take to mean a 100% solid color corrected to video levels is still outside the safe range once the scope reinterprets it. I guess that makes sense.
4) Change the pixel format to 8-bit (full range), uncheck the studio RGB box in the scope settings, and remove the levels correction from the video output bus. Solid blue now reads 90 (%?) and solid red reads 120 -- the same readings we initially got in #1, which seems strange to me. Shouldn't these readings match #2 (i.e., with the levels correction added)?
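Putting #1-#4 together, my working mental model (an assumption on my part, not documented Vegas behavior) is that the scope just plots whatever codes it's handed, and the studio RGB checkbox only changes how it normalizes them. Generated media produces the same 0/255 codes regardless of the project pixel format, which would make #4 feed the scope exactly the same numbers as #1:

```python
# Pipeline sketch for the four cases: generate full-scale blue, optionally
# apply the computer->studio levels map, then let the scope normalize the
# codes either as full range (0..255) or as studio range (16..235).

def computer_to_studio(c):
    return 16 + c * 219 / 255

def scope_normalize(c, studio_rgb_checked):
    return (c - 16) / 219 if studio_rgb_checked else c / 255

blue = (0, 0, 255)  # generated solid blue: same codes in either pixel format

case1 = [scope_normalize(c, False) for c in blue]                      # no FX, unchecked
case2 = [scope_normalize(computer_to_studio(c), False) for c in blue]  # FX, unchecked
case3 = [scope_normalize(computer_to_studio(c), True) for c in blue]   # FX, checked
case4 = [scope_normalize(c, False) for c in blue]                      # full range, no FX

print(case1)  # [0.0, 0.0, 1.0]          -> full scale, outside the targets
print(case2)  # [~0.063, ~0.063, ~0.922] -> pulled inward, inside the targets
print(case3)  # [0.0, 0.0, 1.0]          -> checkbox undoes the FX: back outside
print(case4)  # [0.0, 0.0, 1.0]          -> identical input to case 1
```

If that model holds, #4 can't match #2: the inward shift in #2 came from the levels FX actually rescaling the pixel data, while switching the pixel format (with the FX removed) leaves the generated codes untouched, so the scope sees exactly what it saw in #1.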
Any help understanding this is much appreciated.