The Vegas Creative Software Support Community – Find help here
What Values for 235 W and 16 Black
Former user
wrote on 12/18/2011, 9:30 PM
With the new media generator in V11 the range is 0 to 1 (math is not my strong point). What values do I put in to get a 7.5 black and a 100 white (16 and 235)?
I'm finally trying to do real work on a post-10 version of Vegas and I can't get a true [16,16,16] Solid Color preset to stick, no matter what I try (including giving it a new name).
If I match a solid color to something that's truly RGB 16 with the eyedropper, it will stay that way in a saved project, but when I save it as a preset and retrieve it, it rounds to 0.06, which is RGB 15.3. TakeColor measures it at RGB 15 and the waveform shows it at around -1%.
>> wonder whether this might be part of the OFX standard? Every OFX FX I have (Sony & third party) does its color picking in the same manner, i.e. to 2 decimal places. <<
In the OFX Presets in %userprofile%\Documents\OFX Presets\com.sonycreativesoftware_solidcolor\Generator the values are stored to 6 decimal places (and they are correct):
What I've done is to create a video white and video black .png file in Photoshop since Adobe never went to this weird color value system. Here's a dropbox link if you want to download them. They are 1080p.
As I operate at all sorts of resolutions and aspect ratios, I've made a PNG just 1px by 1px which will fill the display in any project when the "Maintain Aspect Ratio" switch is turned off.
Most of this thread consists of old posts. I am currently on Vegas Pro 13 build 373, and my understanding is that this color selection behavior has been silently tweaked in the code SCS has built up to this point. I could be wrong, and I may not fully understand what the previous posters are trying to do.
Here is what I see today, as mentioned at the top of this thread: going into ProType Titler, where the eyedropper color selector values can be readily evaluated in a choice of two different value systems, you get the exact equivalent values. Select the value scheme that you need to work with inside the color media generator and it holds.
So how does any of this make sense? Sometimes I'm given RGB values by clients for a specific color. Who in their bright mind came up with this color scheme, and what advantage does it hold over the old way?
Ok, but do any other editing programs use it? None of my Adobe programs use it. I just don't see the use for it in the graphics/editing world. It seems like they took something completely useful and broke it.
Actually many excellent graphics programs, and NLE workflows particularly with color grading do use a variety of translations for setting values. Most of the best interfaces are designed to report to the user which selected scheme they are using.
What has changed in recent versions of SCS software is that they have NOT communicated it very well in the interface. ProType Titler continues to use the best-documented scheme in its interface, whereas other parts of recent Vegas Pro versions have merely stripped the interface down. All of the backend code, however, remains the same.
I agree with you completely Marc. It's a step backwards and illogical.
Today I've found that if I type 3 decimal places into the box, as musicvid suggests, that will hold in a preset. But if I enter more than 3 decimal places, it rounds back to 2 decimal places:
0.063 - preset sticks at 0.063 (but only shows 0.06 in the box)
0.0627 - rounds to 0.06
0.06274 - rounds to 0.06
0.062745 - rounds to 0.06
match with eyedropper - rounds to 0.06
Weird. I can't understand how code would accept 3 decimal places but round anything longer than 3 decimal places back to 2.
So the closest you can get to true [16,16,16] is when the preset xml file shows these values:
Here's my handy and free RGB<->Decimal converter I posted the last time we had this discussion:
There is compelling historical precedent for both scales (and that's all they are).
Neither is going away any time soon.
The 0-1 decimal scale comes from analog photo densitometry, dating all the way back to the 1920s.
0-255 binary (RGB) dates to the earliest days of 8-bit digital imaging (BMP and AVI), simply because binary representations were needed for color and luminance.
So, we're mapping [0,255] to [0,1] and vice versa.
DEC = RGB/255
RGB = DEC*255
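The two formulas above can be sketched in a few lines of Python. This is just an illustration of the mapping, not anything Vegas-specific:

```python
# Map 8-bit RGB channel values to the 0-1 decimal scale and back.
def rgb_to_dec(rgb):
    return rgb / 255.0

def dec_to_rgb(dec):
    return round(dec * 255.0)

# Studio black and white discussed in this thread:
print(round(rgb_to_dec(16), 6))   # 0.062745
print(round(rgb_to_dec(235), 6))  # 0.921569

# Rounding the decimal to 2 places loses a whole RGB step:
print(dec_to_rgb(0.06))   # 15 (not 16)
print(dec_to_rgb(0.063))  # 16
```

This also shows why 3 decimal places are enough to hit an exact 8-bit value (0.063 × 255 ≈ 16.07, which rounds back to 16), while 0.06 lands on 15.3 and drops to RGB 15, as observed with TakeColor above.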
Ah, so the SCS developers were pandering to all the analog photo densitometrists. Cynical marketing move ;)
Thanks for that link. And in that thread Marco linked to something that I spent ages searching for yesterday, finally convincing myself that I'd dreamed it. Antares/Satevis' application extension CorrectColorInput.dll which I found a copy of on a Russian Vegas forum. Install the dll in %userprofile%\Documents\Vegas Application Extensions. This allows you to enter RGB values in the form 16/16/16 or 16/16/16/0. It still won't come back accurately as a preset though.
Now here's a thing (again discovered via Antares/Satevis on the German forum). Did you know that even without this extension you can type hexadecimal strings straight into Vegas' colour box? If you type 0x101010 it gives you accurate RGB[16,16,16]. And 0xEBEBEB gives you RGB[235,235,235]. Again it's not accurate when saved and retrieved as a preset. Apparently entering values in the form 16;16;16 also works on some English operating systems, but not here in Windows 8.1.
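As an illustration only, here is a small Python helper (hypothetical, not part of Vegas or the extension) showing how those hex strings map to 8-bit RGB and then to the 6-decimal values the preset files store:

```python
# Hypothetical helper: parse a hex colour string like the 0x101010 values
# that can be typed into Vegas' colour box, and show the exact 0-1
# decimals (to 6 places) that correspond to it.
def hex_to_dec(s):
    s = s.removeprefix("0x").removeprefix("#")
    r, g, b = (int(s[i:i + 2], 16) for i in (0, 2, 4))
    return tuple(round(c / 255.0, 6) for c in (r, g, b))

print(hex_to_dec("0x101010"))  # (0.062745, 0.062745, 0.062745)
print(hex_to_dec("0xEBEBEB"))  # (0.921569, 0.921569, 0.921569)
```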
Well... at least give us the option of black and white defaulting to 16-235, since most software players/browsers expect studio levels (16-235) when playing back video and do the expansion to 0-255 automatically in software. I believe most other video editors also default black and white to 16-235 (I know Premiere does).
Looks like floating point math, but that's not the origin. It's a log function (base 2 iirc?).
It seems right for Vegas to work with decimal input, but not necessarily by default. One should be able to pick an input scale type and precision in preferences and go with it. It's worth noting that the first digitally-shot movies didn't start showing up until 2001, so anyone who worked in the industry up until then thinks in the decimal scale. I was lucky enough to see digital scan converters in their earliest days in film labs, so thinking in either decimal or 8-bit RGB is not difficult for me.
It's worth mentioning that defaulting the input to two rounded decimal places in recent versions of Vegas appears to limit us to only about a million colors (roughly 100^3) of the 16+ million (256^3) available in an 8-bit project. Is my math right?
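Roughly, yes. A quick arithmetic check (note that two decimal places actually gives 101 steps per channel, 0.00 through 1.00 inclusive, so the count comes out a bit over a million):

```python
# Counting representable colours per input scheme.
two_dp_steps = 101           # 0.00, 0.01, ..., 1.00
eight_bit_steps = 256        # 0 .. 255

print(two_dp_steps ** 3)     # 1030301  (~1 million colours)
print(eight_bit_steps ** 3)  # 16777216 (16+ million colours)
```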
A step backwards, indeed.
It's the same type of historical albatross we've had making the transition from VU->dBFS, IRE->RGB, English->Metric, and a few others. It's going to be around for a while.
[I]"Ah, so the SCS developers were pandering to all the analog photo densitometrists"[/I]
Not quite, because film density is non-linear, and in a different way from how 8-bit video uses gamma. Kodak's 10-bit log Cineon system defined reference black as 95 and reference white as 685. Doing a simple conversion using a factor of 4 is going to lead to issues.
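A trivial check of why a plain factor-of-4 scaling of those Cineon code values fails (this assumes naive linear scaling, which ignores the Cineon log curve entirely):

```python
# Cineon 10-bit reference levels, naively divided by 4 to "fit" 8 bits.
cineon_black, cineon_white = 95, 685

print(cineon_black / 4)  # 23.75  - nowhere near video black at 16
print(cineon_white / 4)  # 171.25 - nowhere near video white at 235
```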
No one uses Cineon files today; however, standard DPX files typically use the same curve. To confound the issue even more, many cameras today use a variety of "film gamma" curves; even the very cheap BMPC has that option.