10-bit color displays - impossible in Vegas 12?


DiDequ wrote on 1/31/2014, 9:42 AM
atom12, just do a Google image search for "sunset" and look at all the first pictures.
I cannot see banding. You could answer that those are 10-bit pictures, but my monitors are only 8-bit capable.
Can you read my grayscale text any better on your 10-bit monitor than on an 8-bit screen?
Your camera showing banding on a sunset does not imply all cameras will produce banding.
The eye is somewhat strange: if the steps do not form straight lines, you will hardly notice the banding.
Horizontal and vertical banding are different too, but that is a screen manufacturing problem.

I do not know what is true and what is false; none of your messages here have helped me yet.

larry-peter wrote on 1/31/2014, 9:56 AM
I'm confused by your answer. I do have a 10-bit capable CRT display. Any still image that has been resized or manipulated in any way can have some form of dithering applied during processing that will minimize banding when displayed in 8 bit. It's easy to create a 32-bit gradient in Photoshop that will show banding in an 8-bit Vegas project. Change to 32-bit float and the banding goes away. How could my eyes see the difference if their color resolution were less than 8 bits? Or are you saying this is all the fault of the monitor?

Edit: And yes, I can see the text on either my 8-bit or 10-bit monitor. When I open it in Photoshop it says it's an 8-bit/channel .png. ??? Why would you expect a 10-bit monitor to display it any differently?

DiDequ wrote on 1/31/2014, 10:08 AM
Just because I used an 8-bit picture: I did not use JPEG compression, which is known to add banding.
The background has a level of 125-125-125
and the text 126-126-126.
Some of us say this difference is big enough to show very easily on 10-bit screens, and I tend to believe the people saying our eyes can hardly see this difference. That is why I asked you the question.

You can get pretty good results removing the banding in 8 bits using a mask with monochromatic noise at a 3% level... Human eyes see lines, as I wrote, and the added noise "breaks" those lines.
Also reduce your final compression if possible (YouTube and banding are friends). But that is another subject...
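The mask-plus-noise trick can be sketched in a few lines (a hedged illustration, not Vegas's or Photoshop's actual processing, and with a smaller noise amplitude than the 3% mentioned above): quantizing a smooth float gradient straight to 8 bits leaves long flat runs of one code value, which the eye reads as bands, while adding a little uniform noise before quantizing breaks those straight-edged runs up.

```python
import random

def quantize8(v):
    """Round a 0.0-1.0 float to an 8-bit code value, clamped."""
    return max(0, min(255, round(v * 255)))

def longest_run(codes):
    """Length of the longest flat run of one code -- a proxy for band width."""
    best = cur = 1
    for a, b in zip(codes, codes[1:]):
        cur = cur + 1 if a == b else 1
        best = max(best, cur)
    return best

width = 4096
gradient = [x / (width - 1) for x in range(width)]

# Straight quantization: ~16 pixels per code, i.e. wide flat bands.
plain = [quantize8(v) for v in gradient]

# Add roughly 0.6% monochromatic noise before quantizing (a stand-in
# for the 3% noise mask described above): band edges dissolve into dither.
random.seed(1)
dithered = [quantize8(v + random.uniform(-1.5, 1.5) / 255) for v in gradient]

print(longest_run(plain), longest_run(dithered))
```

The average level over each band is preserved, but neighbouring pixels now straddle several codes instead of sitting in one flat run, so no straight step edges remain for the eye to latch onto.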

DiDequ wrote on 1/31/2014, 10:20 AM
[I]"Edit: And yes, I can see the text in either my 8 bit or 10 bit monitor. When I open it in Photoshop it says it's an 8 bit/channel .png. ??? Why would you expect a 10 bit monitor to display it any differently?"[/I]

Because, according to the wider gamut (see the A, B and C points for the same color on the gamut picture above), you should get a little more "contrast?"
larry-peter wrote on 1/31/2014, 10:23 AM
I understand what you're saying. I have also done a lot of experimenting with gradients and noise to reduce banding. It is something I have become sensitive to because my attention is on it. I have had clients with me while I fretted over a banding issue they couldn't see. When I pointed out the areas of my concern, they were able to see it also (maybe not smart on my part). My point is, if our perception were limited to 8-bit color or less, why would we even be attempting methods to reduce 8-bit banding?

This happens even with 8-bit cameras recording to an uncompressed file. It's not always due to compression, as the Photoshop gradient test will show.

Edit: And I don't claim to be an expert on imaging technology, but logic tells me that the ability to see the text is more a function of the contrast ratio available to the monitor than of the bit depth. If your monitor is properly calibrated from black to white, the 10-bit monitor allows three luminance steps to exist between the two values of your background and text, but I don't see why a 10-bit monitor would inherently provide more contrast. A 24-bit audio signal is not inherently louder than a 16-bit signal.
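The three-steps arithmetic is easy to check. The bit-replication expansion below is a common hardware convention for widening 8-bit codes to 10 bits, assumed here purely for illustration:

```python
def to_10bit(code8):
    """Expand an 8-bit code (0-255) to 10 bits (0-1023) by bit replication,
    a common hardware convention (assumed here for illustration)."""
    return (code8 << 2) | (code8 >> 6)

background = to_10bit(125)   # the 125-125-125 background
text = to_10bit(126)         # the 126-126-126 text

# exactly three 10-bit codes fit between the two 8-bit values
print(text - background - 1)
```

Note that the mapping preserves the endpoints (0 stays 0, 255 becomes 1023), so it adds intermediate steps without changing black, white, or the overall contrast, which is the point being made above.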
larry-peter wrote on 1/31/2014, 11:23 AM
The more I think about it, I believe the only reason we can live with 8 bit at all is because of the compressed contrast available in our viewing methods. On a bright day I can see the detail in the concrete reflecting the sun as well as in the shadows under a vehicle. If our monitors displayed these same luminance levels and corresponding contrast ratio, 8 bit video would be intolerable to watch.

Marco. wrote on 1/31/2014, 11:51 AM
Of course it only works because our perception of dynamic range is logarithmic. But 8 bits isn't sufficient anyway. You need 9 to 10 bits.
riredale wrote on 1/31/2014, 11:58 AM
At risk of beating this poor horse to death, of course one needs more than 8-bit resolution if one is to faithfully recreate the brightness gamut of human vision. But when was the last time you used a monitor that had black blacks and yet the whitest whites were as bright as snow in sunlight or white sand on the beach? The reality is that what our eyes can discern is immaterial; what matters is what the display device (specifically, the customer's display device) can discern.

That's NOT to say that in some instances it makes sense to do internal NLE processing in more than 8 bits because of the cumulative errors from multiple manipulations resulting in artifacts that ARE visible.
larry-peter wrote on 1/31/2014, 12:13 PM
Riredale, agreed. And my original point was to contest the claim that our eyes couldn't resolve color at the 8 bit level. I didn't even address the statement of a 6.5 stop dynamic range for our eyes. But as Marco, I and others believe, our current display devices have surpassed the dynamic range where 8 bits is enough to satisfy (at least some of) our eyes when watching these displays.
rmack350 wrote on 1/31/2014, 12:42 PM
I think Vegas is simply sending the preview window's output to your display. I assume that the preview window is always 8-bit BUT if you set your project to 32-bit color mode this is what (my) preview window displays:

Project: 1280x720x128, 23.976p
Preview: 1280x720x128, 23.976p

Note the "128". They're saying that the preview is 32 bits/channel (32 x 4 = 128). Now, whether that's true or a conceit...I don't know.

One thing to note about 32-bit float mode is that ANY 8-bit filter in the chain will convert the current frame of picture to 8-bit. So you probably want to eliminate filters in your test, which you've probably done.

Another possible stumbling point is determining what Vegas is doing with your source file. I think Vegas definitely has support for high-bit-depth image files, but I don't know if Tif is one of them. Tif may be handled by QuickTime, which might be complicating things.

Ideally, it'd be nice to have some guidance from SCS.


<Edit>There's not much I can test here, but I created the gradient file, mostly following your instructions except that I created a file in 32-bit float in photoshop. I then saved as PSD and output an EXR file and a 32-bit float Tif. I then brought them all into Vegas. Vegas tells me the PSD and EXR are 128-bit and the Tif is 96-bit (no alpha channel).

The PSD displays dark. Toss that as an option. The EXR and Tiff are identical. As for banding...my displays here are definitely 8-bit. Vegas is set to 32-bit float. I can't see banding in these gradients when displayed at full size, either in the preview or on the secondary monitor. If I reset Vegas to 8-bit I see banding in these images.

The fact that these look different is probably a matter of when and how the output becomes 8-bit. In an 8-bit Vegas project the image is probably made 8-bit when Vegas first reads it. In the 32-bit project...either Vegas is making the output 8-bit at the very end of the chain (and doing a better job than it does in the 8-bit project) or it's actually sending out raw pixels to the display and something post-Vegas is doing the conversion.
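A quick sanity check of those bit-depth labels, assuming they are simply channel count times bits per channel:

```python
def bits_per_pixel(channels, bits_per_channel):
    """Total storage per pixel -- the number Vegas shows in Project/Preview."""
    return channels * bits_per_channel

print(bits_per_pixel(4, 32))  # RGBA float: the "128" reported for PSD and EXR
print(bits_per_pixel(3, 32))  # RGB float, no alpha: the "96" reported for the Tif
print(bits_per_pixel(4, 8))   # ordinary 8-bit RGBA for comparison: 32
```

So the 128 vs. 96 difference is exactly the missing alpha channel, consistent with what Vegas reports for these files.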


rmack350 wrote on 1/31/2014, 1:57 PM
A little follow up, AKA "reading the help file"

If you look at the help section titled "Preferences - Preview Device Tab" you'll see that it describes 10-bit output settings for AJA and BMD cards. There's no mention of it for the Windows Secondary Display.

Doesn't say you can't do it, just that you CAN do it with those cards. Actually, I think outputting 10-bit to a 10-bit capable graphics card should be high on the list for the next version of Vegas, which of course will be version 13.

I think Bob already said most of this.

NormanPCN wrote on 1/31/2014, 2:22 PM
Project: 1280x720x128, 23.976p
Preview: 1280x720x128, 23.976p

Note the "128". They're saying that the preview is 32 bits/channel (32 x 4 = 128). Now, whether that's true or a conceit...I don't know.

Vegas is just telling you how they are computing and storing the preview frame. This is their internal render which is independent of what gets sent into the Windows graphics subsystem.

The 32-bit floating point per channel color thing is a video editor thing. Windows uses integer color only. Windows 7 and later do support deep color, up to 16 bits per channel. Using deep color is "special": the application needs to use specific APIs and data structures to get color beyond 8 bits per channel.
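As one illustration of what "specific data structures" means: a common Windows deep-color surface layout packs three 10-bit channels plus a 2-bit alpha into a single 32-bit word, the A2R10G10B10 arrangement (as in Direct3D 9's D3DFMT_A2R10G10B10). A minimal sketch of that packing:

```python
def pack_a2r10g10b10(r, g, b, a=3):
    """Pack 10-bit R, G, B and a 2-bit alpha into one 32-bit word,
    the A2R10G10B10 layout used by some Windows deep-color surface formats."""
    for channel, limit in ((r, 1024), (g, 1024), (b, 1024), (a, 4)):
        if not 0 <= channel < limit:
            raise ValueError("channel value out of range")
    return (a << 30) | (r << 20) | (g << 10) | b

print(hex(pack_a2r10g10b10(1023, 1023, 1023)))  # full white fills all 32 bits
```

Note the trade-off baked into the layout: the alpha channel drops to 2 bits so that the three color channels can have 10 each while the pixel still fits in the same 32 bits as ordinary 8-bit RGBA.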
farss wrote on 1/31/2014, 2:40 PM
[I]"According to wikipedia, it is only 6.5 stops, not 30 - and 8 bits is enough..."[/I]

That's the static ratio; the same Wiki mentions a range of 20 stops as the dynamic contrast ratio. It's not that hard to estimate: consider the lowest level of illumination at which we can see [I]something[/I] and the brightest we can tolerate before damage, and the answer is around 30 stops.
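Since each stop is a doubling of light, the contrast ratios these stop counts imply are easy to put numbers on:

```python
def stops_to_ratio(stops):
    """One photographic stop doubles the light, so the contrast ratio is 2**stops."""
    return 2 ** stops

print(round(stops_to_ratio(6.5)))  # static eye: about 91:1
print(stops_to_ratio(20))          # adapted eye, per the Wiki figure: 1,048,576:1
print(stops_to_ratio(30))          # dimmest-visible to damage threshold: ~1e9:1
```

The gap between roughly 90:1 and a billion to one is why the static and dynamic figures lead to such different conclusions about how many bits are "enough".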

Human vision is complex, and there are several ways in which we try to create a pleasing image. The most basic is tone mapping, but that doesn't work so well with high dynamic range images. An alternate technique uses anchor points, but that too has issues. Both methods' limitations are imposed by our display devices, limitations that may no longer apply.

rmack350 wrote on 1/31/2014, 5:01 PM
Thanks Norman,

Yes, that clarifies some ideas. Those numbers for Preview refer to what's being stored in RAM (and in cache) for the current frame. Then I'd assume that Vegas converts it to 8-bit before sending to the Windows graphics subsystem. It looks to me like Vegas can also send out 10-bit to AJA and BMD cards.

32-bit float is a processing thing. It doesn't have to be limited to video editing, but it seems well suited to nondestructive FX chains like the ones you use in video FX. It'd also work nicely with Photoshop's adjustment layers, for example, but is less useful when you bake an adjustment into an image layer. One of the features of 32-bit float is that you can send values way out of visual bounds with one filter and recover them with the next filter.

Incidentally, when I made a 32-bit float image in photoshop and then created a gradient, the color values in the color picker looked a lot like what Vegas is using, but they also included analogous adjustments in RGB so you could work in the values you like. Vegas should be doing that.

I looked very briefly for info about using 10-bit displays. This was the first thing I found:

It describes how to get Photoshop CS4 and CS5 to output 10-bit color, which might be suggestive of what Vegas *isn't* doing.

NormanPCN wrote on 1/31/2014, 5:28 PM
I have only ever used Photoshop in 16-bit mode. Yes, Photoshop has a 32-bit floating-point HDR mode, but unless you are doing HDR it is overkill, given that even the best DSLRs only capture 14 bits.

As someone who came from the photo world, using Photoshop, to video just this year, it seemed silly to me that video editors jumped from 8-bit integer straight to 32-bit floating point. It turns out to be serious overkill, and a processing hit, compared to supporting a 16-bit mode like Photoshop and After Effects do. It is what it is, and we use what we have available.

In the photo world you can do multiple exposures and create HDR in post, and for that you will need 32-bit floating point. In video it is always going to be a single exposure, and you just don't need the enormous range of 32-bit floating point. One man's opinion.
stephenv2 wrote on 1/31/2014, 5:43 PM
While I appreciate all the discussion, my original post was about whether Vegas can display greater than 8-bit on a secondary display and/or the preview (we know it can process internally at higher depth). I have a working setup that allows Photoshop and Premiere to do so.

My testing shows it cannot be done other than by external AJA/Decklink - unless someone can show it can or SCS can chime in with info, it appears the answer is no. Hopefully this will be addressed in v13.
rmack350 wrote on 1/31/2014, 6:16 PM
I think the answer is "No". I'd suggest that you go here and suggest it.

stephenv2 wrote on 1/31/2014, 6:30 PM
Someone else can make a product suggestion. This is the final reason for my moving away from Vegas as my primary NLE. I've used it since 2000, but I need a number of features not supported in Vegas for my work these days, and unfortunately I'm not a fan of product suggestions. I've asked for surround monitoring greater than 5.1 for 4 years now, to no avail.
rmack350 wrote on 1/31/2014, 6:33 PM
Yes Norman,

I used 32-bit float in Photoshop strictly for poking at this problem. Honestly since I work all day, every day, in Photoshop and nothing I do requires more than an 8-bit workflow, I don't even use 16-bit.

However, 32-bit float is a very different animal from any integer format. In a 32-bit float chain of filters you can slam all your pixel values up to pure white and then pull it all right back down again with the next filter. You can't do that with an 8, 10, or 16-bit integer filter. Not that you'd want to do *that* exactly, but the point is that it's pretty darn close to non-destructive.

And as a bonus, 32-bit float gets you a final integer output at any of the bit depths below it. As a side effect you can also get less steppy gradients in your output, but the point is to store a nearly unbounded range of values, including values outside the visible range.
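The "slam to white and pull it back" point can be demonstrated with a toy two-filter chain (a sketch, not Vegas's actual pipeline): the float chain carries the out-of-range values through and a later filter recovers them exactly, while inserting an 8-bit stage clips them permanently.

```python
def exposure(pixels, gain):
    """A toy exposure filter: multiply every pixel value by a gain."""
    return [p * gain for p in pixels]

def through_8bit(pixels):
    """What inserting an 8-bit integer stage does: quantize and clip to 0-255."""
    return [max(0, min(255, round(p * 255))) / 255 for p in pixels]

scene = [0.2, 0.5, 0.9]

# 32-bit float chain: values sail far past 1.0 and come back intact.
boosted = exposure(scene, 8.0)             # 1.6, 4.0, 7.2 -- way "out of bounds"
recovered = exposure(boosted, 1.0 / 8.0)   # identical to the original scene

# The same chain with an 8-bit stage in the middle: highlights are gone for good.
clipped = through_8bit(exposure(scene, 8.0))
lost = exposure(clipped, 1.0 / 8.0)        # everything collapses to 0.125
```

(The round trip is exact here because multiplying by a power of two only shifts the float exponent; with arbitrary gains the recovery would be near-exact rather than bit-identical, which is still the practical point.)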

A good short explanation:


DiDequ wrote on 2/1/2014, 5:29 AM
stephenv2, I understand your decision.
Could you please tell us which NLE you will choose? Good luck to you.

For all the people thinking we can see many millions of colors, read this page.

All those ellipses cover plenty of colors that our human eyes are unable to differentiate,
even considering that they are plotted at 3x actual size!

And I do not understand why we should use a dynamic contrast ratio in a video world (Wiki mentions a range of 20 stops as the dynamic contrast ratio), because our eyes need several seconds to minutes to adapt: 6.5 stops, not 20! If I am wrong, please give me more arguments; I'm always happy to learn things and I appreciate your help.

farss wrote on 2/1/2014, 7:06 AM
[I]"And I do not understand why we should a dynamic contrast ratio in a video world"[/I]

I'll assume there should be the word "need" in that somewhere.

I could ask why we need a frequency response of 20Hz to 20kHz and a dynamic range of 120dB to reproduce music. In fact, if all the composer wanted us to hear was his tune, a monophonic MIDI file would be adequate.
On the other hand, there are musical instruments whose lowest note is 8Hz, and composers have written music that uses it. Should we ban them from doing that because only a few systems can record or reproduce those notes, or should we strive to give the artist the largest palette possible?

I say we should, in every field of artistic expression. I have no idea how an artist might use it, but history shows that every advance in the science behind an art form has led to greater scope of expression.

As for video specifically, haven't we just come from the decade in which the industry strived to make video as good as film, so that more producers could afford the richer palette film had to offer without the expense?

DiDequ wrote on 2/1/2014, 9:21 AM
[I]"Should we ban them from doing that because only few systems can record or reproduce those notes or should we strive to give the artist the largest pallet possible?"[/I]

Ok, now I understand; using music to explain your point of view was great!
8Hz music can be felt by our bodies.
7.83Hz is the Schumann resonance of the Earth. Some scientists say all animals can feel it.

I forgot the word "use": why we should use a dynamic contrast ratio in a video world... "need" was better. I know it is not easy to understand a foreigner trying to speak your language...
stephenv2 wrote on 2/1/2014, 11:35 AM

I'm moving to Premiere CC with a Pro Tools HD combo. I will miss Vegas's audio features, video track FX, and general UI, but I'm also working more on collaborative projects and need cross-platform support, 10-bit color display, 7.1 (and greater) surround mixing, and better large-project management, especially audio replacement.

I will keep an eye on new versions of Vegas, and will likely keep my copy current, as I'm hoping for a new version at NAB - but at this point I don't think the changes I need are forthcoming. I really do miss the pre-Sony days of Vegas. Look at what has happened to Acid Pro - the Sony days have been a mixed bag. I always liked Sonic Foundry's independence and spirit, and to me that's been lost.
videoITguy wrote on 2/1/2014, 1:18 PM
The OP's original question was not so much answered here as it will be answered by what he sees in the Premiere competition. I suspect that although this thread has offered a lot of food for thought for everyone, there is still a large gap in understanding what solution the OP has come up with for his question.

Of NOTE, however, is the list posted just above of the business concerns the OP has weighed against the competition in the NLE market. One thing I did not see him address is whether the pricing structure of the cloud offering is suitable to his business plan. It is not what I would want to use. For that reason, I remain uncommitted to moving my operation up to the latest iteration of Premiere.