10-bit color displays - impossible in Vegas 12?

Comments

Steve Mann wrote on 2/26/2014, 8:05 AM
I was looking up something else and came across this article. http://www.imagescience.com.au/

From the article:

This is a confusing issue for a lot of people. This article tries to simplify and explain the issue. (In some ways it's over-simplified for the sake of clarity rather than getting bogged down in too much detail).

The key thing is not to confuse the bit depth of your video card's output signal with the bit depth of your monitor's Look Up Tables (LUTs - 6, 8, 10, 12 or even 14 bit with the very latest Eizo CG and NEC PA monitors). Your digital image files also have their own bit depth, which is a separate issue again.

First, what is bit depth in this particular context? Well, it's a measure of how many discrete values the system can do its processing with - and more is better. For example, a 6 bit system has just 64 signal levels to play with - meaning there are only 64 possible adjustments you can make to that signal. Put very simply, you can choose 31 (might be too red) or 32 (might be too blue), but there's no concept of 31.5 (which might be just right). With 8 bit systems, you have 256 signal levels to play with - and this is the normal scenario for video card output signals - your computer can only ever output a value between 0 and 255 for each of Red, Green and Blue, which combined form a specific colour. 99.99% of computers and monitors on the planet work this way.
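To make the arithmetic concrete, here is a small Python sketch (not from the article, just an illustration) showing how many levels each bit depth gives and why an in-between value like 31.5 simply can't be expressed:

# An n-bit system can only represent 2**n levels, so any "in between" value
# has to be rounded to the nearest available step.
for bits in (6, 8, 10, 12, 14, 16):
    print(f"{bits:2d}-bit: {2**bits:6d} levels per channel")

def quantize(value, bits):
    """Snap a 0.0-1.0 value to the nearest level an n-bit system can express."""
    steps = 2**bits - 1
    return round(value * steps)

ideal = 31.5 / 63            # the in-between red we actually want, as a fraction of full scale
print(quantize(ideal, 6))    # 32 - a 6 bit system has to pick 31 or 32, never 31.5
print(quantize(ideal, 8))    # 128 - an 8 bit system lands closer to the intended tone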

Once the signal actually reaches the monitor - for example, let's choose (128, 0, 0), which is a medium strength red - the monitor uses its LUTs (look up tables) to choose which colour to actually display for this signal. With, for example, a 6 bit LUT, there are only 64 shades of red available. So the choices for the actual tone the monitor displays are very, very limited and it's basically impossible for the monitor to choose a correct colour (as odds are Red 31 is not right, and neither is Red 32). Move up to an 8 bit monitor, and the choice is improved somewhat as there are now 256 reds to play with - this increases accuracy, as Red 127 might be a touch too weak, 129 a touch too strong, and 128 closest, so 128 is chosen as the best option. But odds are this is still not enough finesse to get exactly the right colour. (When you calibrate a monitor, this is what is happening - the calibrator tells the monitor to display Red 127, Red 128, Red 129 etc., measures them, and creates a table of what actual colours these values represent - this table is then used to know what signal to send the monitor to later get the right colour.)
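Very roughly, a 1D LUT is just a table indexed by the incoming 8 bit value. The sketch below uses made-up numbers purely to show the idea of a calibration tweak, not real measurement data:

def build_identity_lut(input_bits=8, panel_bits=10):
    """Map every possible 8 bit input code to the nearest native panel code."""
    in_max = 2**input_bits - 1       # 255
    panel_max = 2**panel_bits - 1    # 1023
    return [round(i * panel_max / in_max) for i in range(in_max + 1)]

lut = build_identity_lut()

# After calibration the table gets tweaked: if the panel's red is measured as a
# touch too strong around mid-grey, entry 128 might be pointed at panel level
# 510 instead of 514. (Hypothetical correction, purely to show the idea.)
lut[128] = 510

signal = (128, 0, 0)                 # the "medium strength red" example above
displayed = tuple(lut[c] for c in signal)
print(displayed)                     # (510, 0, 0) in panel code values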

This is the way it works in almost all normal scenarios - except for quite cheap or quite expensive monitors, 8 bit video cards and 8 bit LUTs are standard. Also, these monitor LUTs are one-dimensional - that is, they only work on one colour channel at a time. Good monitors now have so-called 3D LUTs, which allow them to adjust R, G and B simultaneously - this helps, as colour error is rarely along just one axis.
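The difference between a 1D and a 3D LUT, again as a sketch with invented values: a 1D LUT can only map each channel on its own, while a 3D LUT is indexed by the whole (R, G, B) triple, so the same red level can get a different correction depending on what the other channels are doing:

# Sketch of the 1D vs 3D LUT difference - all numbers invented for illustration.
# (Real 3D LUTs store a sparse lattice of nodes and interpolate between them.)

# A 1D LUT can only say "this red in -> that red out", regardless of G and B:
red_1d_lut = {128: 126}              # pull mid-red down slightly, everywhere

# A 3D LUT is indexed by the whole (R, G, B) triple:
lut_3d = {
    (128, 0, 0):     (126, 0, 2),       # saturated red nudged toward blue
    (128, 128, 128): (128, 127, 129),   # the same red level in a grey, handled differently
}

print(red_1d_lut[128])
print(lut_3d[(128, 0, 0)])
print(lut_3d[(128, 128, 128)])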

As monitors get more expensive, the LUTs get better, with 10 and 12 bit being most common in higher-end monitors. This means the signal quality (really the amount of signal finessing that can be done) moves up - 10 bit means 1,024 levels, 12 bit means 4,096, 14 bit means 16,384, and the best is 16 bit with over 65,000 levels. Basically, vastly more precision is available in the mapping of input tones from the computer to output tones on the monitor (remember - because the video card signal is 8 bit, the input values still range from 0 to 255).

However, the bottleneck in this system is the video card - it can only output 8 bit signals (three channels (RGB) at 256 levels each gives a total palette of approximately 16.7 million colours - which sounds like a lot, but there are still only 256 pure greys in there). To solve this bottleneck, systems are moving toward 10 bit output from the video card - meaning 1,024 possible signals for each of R, G and B, or a palette of over 1 billion colours (with 1,024 steps along the pure grey axis).
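A quick back-of-envelope check of those totals:

# Total palette is (levels per channel) cubed, but the pure-grey axis only has
# as many steps as a single channel does.
for bits in (8, 10):
    levels = 2**bits
    print(f"{bits}-bit RGB: {levels**3:,} colours, {levels} pure greys")

# 8-bit RGB:  16,777,216 colours, 256 pure greys
# 10-bit RGB: 1,073,741,824 colours, 1024 pure greys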
OldSmoke wrote on 2/26/2014, 8:41 AM
I don't think the video card is the bottleneck. http://www.geforce.com/Active/en_US/en_US/pdf/GTX-580-Web-Datasheet-Final.pdf The GTX 570/580 are 10-bit capable, but can the Vegas-Windows pipeline handle 10-bit?

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A
RAM: G.Skill 4x8GB DDR4 2666 XMP
CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970 Pro NVMe, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap bay, 1x LG BluRay burner
PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

Lovelight wrote on 3/4/2014, 10:58 PM
I suspect something is up with Vegas. Adobe does not have this problem on the same machine.
riredale wrote on 3/5/2014, 12:14 AM
The Imagescience article referenced a few comments above ignores an elephant in the room.

I use a Dell 2412 monitor. It has a 6-bit processing system. Does that mean it can only show 64 different levels per channel, as the article states? No. The monitor uses a form of temporal dithering, which is invisible for all practical purposes. So it displays 16M colors, just like a conventional 8-bit processor does.
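Roughly, the trick works like this (a simplified sketch with a made-up frame pattern; real FRC hardware is more sophisticated):

# Simplified sketch of temporal dithering (FRC): a 6-bit panel fakes an
# in-between tone by alternating the two nearest displayable levels across
# successive frames, so the eye averages them out.

def frc_frames(target, lo, hi, n_frames=4):
    """Alternate two displayable levels so their running average tracks the target."""
    frames, total = [], 0
    for i in range(1, n_frames + 1):
        # Pick whichever level keeps the running average closest to the target.
        err_lo = abs((total + lo) / i - target)
        err_hi = abs((total + hi) / i - target)
        frames.append(lo if err_lo <= err_hi else hi)
        total += frames[-1]
    return frames

# A 6-bit panel can only drive levels 31 and 32; flashing them 50/50 over
# successive frames is perceived as the in-between tone 31.5.
pattern = frc_frames(31.5, 31, 32)
print(pattern, "average =", sum(pattern) / len(pattern))   # [31, 32, 31, 32] average = 31.5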

So a 6-bit processor need not be dismissed out of hand. Depends on what else the processor can do.

Dell calls this process A-FRC.
Steve Mann wrote on 3/5/2014, 2:11 PM
I am curious - why is 10-bit preview so important??