The skinny please on x.v.Color

Streamworks Audio wrote on 4/27/2009, 3:18 PM
I have googled and googled till I cannot google no more!

I am trying to find the low down on x.v.Color - I know that it uses the full 0-255 values in an 8-bit word to define color (as opposed to 16-235 in BT.601 and BT.709) - but what I am trying to figure out is why Vegas needs to be set to a 32-bit project in order to 'expose' all the values in the video stream.

My camera is a simple Sony AVCHD cam... it offers the x.v.Color option, and of course compresses to an AVC stream... but isn't AVC still only an 8-bit codec? If so, why would Vegas need to be changed from 8-bit to 32-bit? If I import an 8-bit MPEG2 stream or an 8-bit DV stream, changing the project from 8 to 32 bit produces no visible changes. However, with my x.v.Color streams you can see a change when switching the project's bit depth. So does that mean the stream from my cam is more than 8 bits (i.e. a 10-bit stream)?

Just trying to get my head around how Vegas deals with x.v.Color - even more so if it is supposed to still be 8-bit.

Chris

Comments

RBartlett wrote on 4/27/2009, 4:24 PM
I thought this was fairly well explained (for x.v.color) whether or not you have a PS3.

http://www.edepot.com/playstation3.html
(scroll down to the section "PS3 Color Space (sRGB to x.v.Color)")

Sony x.v.Color (and the international xvYCC standard) uses non-linear math to make use of the values 0-15 and 236-255.

So much so that while you can try to map x.v.Color back into plain 8-bit, you'll introduce errors that may be noticeable. Working in 32-bit in the NLE helps avoid this.
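
Roughly, the idea in toy Python form (my own made-up linear example; the real xvYCC math is non-linear, which this sketch ignores):

    def decode_to_float(code):
        # 16 -> 0.0 and 235 -> 1.0; codes outside 16-235 land outside
        # 0.0-1.0 instead of being thrown away
        return (code - 16) / 219.0

    def clamp(value):
        # what a plain 8-bit studio-range pipeline effectively does
        return min(max(value, 0.0), 1.0)

    for code in (4, 16, 128, 235, 250):
        f = decode_to_float(code)
        print("code %3d -> float %+.3f (clamped pipeline keeps %.3f)" % (code, f, clamp(f)))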

Deep-color displays that can accept x.v.Color or xvYCC may be able to represent your work correctly:
going to 30, 36 or 48 bits over an HDMI 1.3 hookup, to a display that can be driven correctly by your deck or that can itself scale the 8-bit x.v.Color-encoded data into these linear ranges, would be more representative.

AVCHD camcorders sell this feature but have other challenges, such as maximum bitrate, glass/lens imperfections and the physical characteristics of quite small imager blocks. Nevertheless, the mass market of consumer camcorders assures that the technology can be made available (rather than not) to the professional end over time. Although the prosumer versions usually carry a significant premium, they at least address the failings of the consumer cams.

So it isn't that you shouldn't bother with x.v.Color - especially if you are playing your trimmed work directly from the camcorder, or via the USB slot of your PS3 (via HDMI), into your (possibly x.v.Color-capable) Sony TV/FPD. Delivering x.v.Color through other means could be a challenge. Also, hoping to get a great deal out of any extra latitude you gain will take some appreciation of your entire workflow and quality control. For example, recording a calibration card ought to be part of your thinking, so you can then test how this turns out when Vegas' AVCHD reader has a bash at positioning the values against the floating-point (32-bit) dynamic range.

I've used x.v.Color on a CX6EK into Vegas in 32-bit mode with good results. Well, it is more accurate to say that the results were not bad, considering everything else. You see, I use my AVCHD camera as a point-and-shoot, with little calibration ever being performed (perhaps some backing off if I spot zebra, but that is about it). I don't take additional lighting or a DP with me when I'm using this soda-can camera!

Personally, if I had an x.v.Color plus 'deep color' capable TV, then I might still keep this additional functionality switched off on the camera. I perceive it to be a toy selling feature, not unlike the 5.1 surround or some of the hot shoe accessories. Nice that Sony try, though. It would be a sad day if they ever stopped putting these fancy options into their cheaper units. I had a great time experimenting and contemplating the benefits of the mode.

Maybe higher dynamic ranges (even from 8-bit, non-linearly represented codecs) will catch on in the years ahead and change my perception of their usefulness (beyond experimentation, or perhaps to combat an operator's poor calibration).

Pure armchair-researched info. Do get a second opinion before proceeding and honing your workflow.
blink3times wrote on 4/27/2009, 4:47 PM
"I perceive it to be a toy selling feature, not unlike the 5.1 surround or some of the hot shoe accessories. "

Well I for one ENJOY the 5.1 sound. It works pretty well too.
Streamworks Audio wrote on 4/27/2009, 5:13 PM
Well, I am not sure if my TV does HDMI 1.3 - the manual does not say... nor can I remember if the cables I used are 1.3 compliant... and I'm not keen on taking the TV off the wall to check ;-)

I do use a PS3 to watch videos... with my cam I can just plug it into the PS3's USB and watch it directly off the cam.

That is a great article, but I am still confused - an 8-bit word translates into 256 possible values (just as 8-bit audio gives you 256 possible volume levels). Now, MPEG2 (DVD), MPEG4 (DivX, XviD etc.), VC-1 (Windows Media 9) and the AVC used on Blu-rays will only use the values 16-235. I am not sure if the codecs clip levels outside that range (if there are illegal values outside these levels); I imagine they would. However, from what I read on Wikipedia and other sites, x.v.Color just makes use of all 256 levels (of course, if an x.v.Color source were re-encoded to an MPEG2 stream, then the levels would be clipped or adjusted).

So again, I am not sure why Vegas would need to be set to 32-bit floating point if the colors were still captured as 8-bit words by an 8-bit codec.

Cheers,
Chris
RBartlett wrote on 4/27/2009, 11:23 PM
I believe the AVCHD reader in Vegas is already capable of fully reversing the math used for x.v.Color (possibly following the xvYCC standard too, but not necessarily the Panasonic high-profile implementation of the 'format'). If you interpret a non-linearly represented 8-bit format, then you need more dynamic range if you want to continue working with it. Otherwise you need to normalize it back, or simply crush or clamp it.
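
To make 'normalize it back, or simply crush or clamp it' concrete, here is a toy Python sketch (my own illustration, not what Vegas actually does internally):

    # decoded float values, some outside the nominal 0.0-1.0 range
    samples = [-0.05, 0.0, 0.5, 1.0, 1.07]

    # option 1 - keep the headroom: needs a float (32-bit) pipeline
    kept = samples

    # option 2 - normalize: squeeze the whole used range back into 0.0-1.0
    lo, hi = min(samples), max(samples)
    normalized = [(v - lo) / (hi - lo) for v in samples]

    # option 3 - crush/clamp: throw the out-of-range detail away
    clamped = [min(max(v, 0.0), 1.0) for v in samples]

    print(kept)
    print(normalized)
    print(clamped)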

x.v.Color is an achievement, and I'd imagine that if your TV supported the necessary features it would be documented somewhere useful (a logo on the rear case, or the manual; you may have it saved, or may be able to obtain a PDF version online).

I'm being a bit negative with my perspective on x.v.Color and 5.1 surround on these camcorders. I'm just that kind of Brit. Sorry about that; my underwhelmed gene goes active like it's on a hair trigger. :-) It all works as described and can give pleasing results. Moving on to say that it improves production values? Well, I won't go quite so far as to applaud them in that regard. There goes that hair trigger again...

I'd hope that your TV supports x.v.Color and deep color (DACs/pixel controllers), and that your own PC running Vegas in 32-bit mode is responsive and doesn't impact your use of filters/FX etc. Vegas Pro 9 may be beneficial if you have any reservations continuing on from what you can try today, Chris.

I can't yet find documented evidence that Sony or MainConcept support the x.v.Color in AVC/AVCHD taken from the 8-bit formats off these camcorders (Memory Sticks/HDDs). I can only recall that I saw more dynamic range in the meters I was checking. But it is all too easy to misinterpret what has gone on whenever funky math gets involved.
Streamworks Audio wrote on 4/28/2009, 12:16 AM
Well, I re-learned something today that I had totally forgotten about... after reading some great articles on Glenn Chan's site today: different codecs will decode to different RGB levels depending on the project's bit depth. For example, an MPEG stream, when opened in an 8-bit project, will decode to Studio RGB (16-235), whereas when opened in a 32-bit project it will decode to Computer RGB (0-255). I am guessing the same goes for AVCHD-decoded files. This would be causing the change in the overall color and look of the video when changing from 8-bit to 32-bit.
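
If I have understood Glenn's articles right, that Studio/Computer difference is just a linear stretch per channel - something like this sketch (my own reading of the formula, so take it with a grain of salt):

    # Studio RGB (16-235) <-> Computer RGB (0-255), per channel,
    # as I understand it from Glenn's articles
    def studio_to_computer(v):
        return (v - 16) * 255.0 / 219.0

    def computer_to_studio(v):
        return v * 219.0 / 255.0 + 16

    for v in (16, 128, 235):
        print(v, "->", round(studio_to_computer(v), 1))
    # prints 16 -> 0.0, 128 -> 130.4, 235 -> 255.0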

Now I would have to find out if my TV can display 0-255. The x.v.Color files that I did add to the timeline in a 32-bit project did show 0-255 in the histogram... so I am guessing that Vegas can read all 256 values from the x.v.Color file.

Chris
GlennChan wrote on 4/28/2009, 7:27 AM
It's not really so much the TV being able to decode "0-255" (you don't mention the units... which are important; it's like saying the temperature without saying whether it's Celsius or Fahrenheit).

What xvYCC color does is that it uses illegal values to hold colors that lie outside the normal color gamut. So if the TV has a large color gamut, and it can recognize that the source is wide gamut (I'm not 100% sure how that works), then it will apply signal processing to map those illegal values into wide gamut colors. Some colors will fall outside what the TV can display, so the TV also has to implement processing to map those colors into colors it can display.

2- I think that xvYCC is a bad idea. It will have problems in practice...
Streamworks Audio wrote on 4/28/2009, 10:25 AM
Yeah - either the TV can show those colors or it can't. I am thinking depending on the TV, it will either map the illegal values to ones that are legal (in the case where the TV cannot display values below 16 and above 235). Or perhaps it might just clip those values? I imagine cheaper TVs would clip and units with better engines would re-map.

I don't imagine that the player would play a part in this... i.e. it would not look at the values it is outputting and either clip or re-map if needed? I could be wrong... the PS3 (my player) has a Full RGB HDMI Output option... one would think that this is for the XMB and games... however, perhaps when set to 'Limited', video files that are Computer RGB would be remapped by the PS3 to Studio RGB before going out over HDMI to the TV?

So in short - x.v.Color, when recording with my camera, is going to be full 0-255 (Computer RGB) - and when brought into a Vegas 32-bit project, Vegas will decode it to Computer RGB... whereas if it's an 8-bit project, then it is decoded to Studio RGB?

If my player (PS3) can support x.v.Color (which I assume means it's decoding to and outputting Computer RGB) and if my display (LG Plasma TV) can display Computer RGB - then would it not make sense to keep my work (just home videos and stuff) in Computer RGB all the way through?

I could see that if my videos were being distributed and I did not know what the player/display capabilities would be, then yeah, stick to Studio RGB (converting it in Vegas) - but if it is just for my home viewing, then I cannot see where x.v.Color would be a bad thing?

My 2 cents ;-)

Chris
GlennChan wrote on 4/28/2009, 7:10 PM
I think you have things confused.

xvYCC Color has little to do with studio RGB versus computer RGB. xvYCC has to use Y'CbCr values (Y'CbCr = YCC).

I am thinking depending on the TV, it will either map the illegal values to ones that are legal (in the case where the TV cannot display values below 16 and above 235).
That's confusing.

Just ignore the numbers 16 and 235 for now (especially when you aren't paying attention to the units... it's like people arguing whether water freezes at 0 or 32).

How xvYCC will work is that it will record one or two of the channels below black level, or above white level. On normal TVs, these illegal values will get clipped.
On a TV that supports xvYCC, signal processing will be applied to these values.

As far as what the TV can do, a normal TV will be close to the standard gamut (either SMPTE C or EBU or Rec. 709). It can't display colors more saturated than the standard red, green, and blue. On a wide gamut TV, you call it wide gamut because the TV *can* display colors more saturated than the standard red, the standard green, and the standard blue. However, there will be some colors that cannot be displayed.

So suppose the whole chain supports xvYCC. The source material may contain values that call for any visible color and many colors that can't be displayed on the TV. So some processing has to occur that will map the undisplayable colors into the closest displayable color (or you could just let values clip, but that looks terrible).
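
As a toy contrast between the two behaviors (a real gamut mapper works on colors in a color space, not on a single channel like this; it only shows the difference in spirit):

    # normal TV: hard-clip anything outside the legal range
    def clip(v):
        return min(max(v, 0.0), 1.0)

    # xvYCC-aware TV: compress out-of-range values toward the edge
    # instead of clipping them (assumes input never exceeds 1.2)
    def soft_map(v, knee=0.8):
        if v <= knee:
            return v
        return knee + (1.0 - knee) * min((v - knee) / (1.2 - knee), 1.0)

    for v in (0.5, 0.9, 1.1):
        print("%.2f: clipped=%.3f mapped=%.3f" % (v, clip(v), soft_map(v)))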

1b- xvYCC uses illegal values to represent the colors that lie outside the normal color gamut, e.g. reds that are more saturated than the standard red.

2- In Vegas, you would have to make sure that you don't clip values below whatever the black level is, or above whatever the white level is.

So to do that, you would change into 32-bit mode.

You also have to ensure that everything else in the signal chain won't clip illegal values. e.g. mpeg-2 codec, your media player, your TV, etc.

3- In Vegas, the proper black level might be at 16 RGB or it might be at 0 RGB. Don't get confused there (even though it's kind of stupid... there should be one standard set of levels, not two...).

Values below 16 RGB are not necessarily illegal in Vegas (as in, they won't necessarily be converted into an illegal Y'CbCr value), because it depends on what levels your destination codec is expecting.

------------

The really simple answer to make sure you get xvYCC color: make sure no clipping is happening.

In Vegas, that means you should use (A) a 32-bit float project and (B) render to a codec that won't clip illegal values and records Y'CbCr.
It might be more complicated than that if you start applying color correction.
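
If you wanted to sanity-check the "no clipping" part, the idea would look something like this (pretend frame data; actually getting float pixel values out of your chain is the part you'd have to wire up yourself):

    # if the chain preserved the xvYCC headroom, some decoded values
    # should still sit outside 0.0-1.0 (pretend single-channel data)
    frame = [-0.03, 0.2, 0.7, 1.0, 1.05]

    out_of_range = [v for v in frame if v < 0.0 or v > 1.0]
    if out_of_range:
        print("headroom survived: %d values outside 0-1" % len(out_of_range))
    else:
        print("all values inside 0-1: the chain clipped, or the source "
              "never had anything out of range to begin with")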
Streamworks Audio wrote on 4/28/2009, 10:02 PM
Hi Glenn,

Not really confused, once I realized that a 32-bit project will decode to Computer RGB. I think the terms I was using were incorrect... I was trying to simplify it a little.

So an x.v.Color recording would need a 32-bit project, as an 8-bit project would introduce clipping? I noticed it more with my camera in the whites. I found it interesting that in one of your articles you mention that most consumer cameras record 'Super White', or something like that. I do notice that.

My TV is far from top of the line - so I am sure it only supports Rec.601 or Rec.709, meaning a Gamut of 16-235... not sure what the PS3 can do. Either way - something is going to clip the 'illegal' values... either the PS3 or the TV, I would assume. So no point in having them (x.v.Color) there to begin with, I guess.

If I have x.v.Color files now and wanted to convert them to the Rec.709 gamut, would the preset "Computer RGB to Studio RGB" work? Or should I use the Broadcast Colors plugin?

Sounds like x.v.Color is still a ways away yet.

Chris
GlennChan wrote on 4/28/2009, 11:30 PM
16-235 is not a gamut.

If I have x.v.Color files now and wanted to convert them to the Rec.709 gamut, would the preset "Computer RGB to Studio RGB" work?
That has nothing to do with xvYCC.

Or should I use the Broadcast Colors plugin?
That would clip and/or limit illegal colors... which would get rid of wide gamut colors.

I am trying to find the low down on x.v.Color - I know that it uses the full 0-255 values in an 8-bit word to define color (as opposed to 16-235 in BT.601 and BT.709)
I don't think that's how it works, but I don't have the exact numbers + formulas. It probably uses the 16-235 Y' range for legal values... and then I'm not sure about the chroma.

---

I'm not 100% sure how it works in practice, and I don't have the equipment to find out.

Somehow, the player is supposed to talk to the TV and they figure out if the source material is xvYCC or not, what the primaries of the TV are, and who should do the signal processing. Uh... something like that.

Presumably, your material should be flagged as using xvYCC if that's what it is (otherwise the TV won't know and will assume it's normal material)... I don't know how that works.

Vegas also should be designed to avoid clipping. e.g. make the 32bit mode work well, get 3rd party plugins to be 32-bit.

And then if you want to make DVDs for normal distribution, you should probably make it broadcast safe and this will kill the wide gamut colors. (And you have to make sure the colors on that DVD look good.)

The simpler approach would be to recognize that highly saturated colors are pretty rare in our world and we won't gain much by going down this road... which will cause a mess...
Streamworks Audio wrote on 4/28/2009, 11:41 PM
Yeah, but wouldn't it be better if there were something I could do to these clips in Vegas to make them 'legal' for my TV, instead of the player or TV doing it? As they are not legal now.
GlennChan wrote on 4/28/2009, 11:47 PM
Yeah, but wouldn't it be better if there were something I could do to these clips in Vegas to make them 'legal' for my TV, instead of the player or TV doing it? As they are not legal now.

Well, you can do things the 'old'/normal way... and just ignore all this xvYCC stuff, which is not (AFAIK) primetime yet. Sony is trying to put the TVs in place first and pave the way for xvYCC.

Doing things the normal/old way:
e.g.
http://www.glennchan.info/articles/vegas/color-correction/tutorial.htm
farss wrote on 4/29/2009, 3:34 AM
From what I can see at this stage of the x.v.Color saga, Sony is saying it should only be used if you have a matching display device. I assume their intent is for it only to be used for playback from the native footage to the display.
I'd imagine you could do a cuts-only edit of the footage in Vegas and all the values would be preserved; however, no doubt you'd lose whatever flags are meant to be there to indicate to the display that it should handle the vision differently.

Hmm, a bit more research and I'm thoroughly confused, as we also have 'Deep Color' on HDMI 1.3, which appears to use 12-bit values. Worse, it would seem that the higher data rates require higher-spec HDMI cables - or is that just another way to sell expensive cables?

It'd also seem that displays that are so enabled might indeed do the wrong thing if not fed this deep-color signal, i.e. there is nothing that tells the display it's getting a different kind of video and to handle it correctly.

All in all it sounds like a bit of a mess. Somehow though I'm not so certain how long we can just ignore it.

Bob.
GlennChan wrote on 4/29/2009, 12:08 PM
I don't think deep color will be a problem??? I'm not 100% sure on the technical details, but I suspect that it's backwards compatible with older systems. e.g. if you have a 12-bit signal, you can talk to the TV and ask if it supports 12-bit input... if not, you dither those 12 bits down to 8 bits and send the TV that.
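
A minimal sketch of that dither-down idea (illustrative only; real video pipelines use fancier dithers than uniform noise):

    import random

    # 12-bit (0..4095) -> 8-bit (0..255): one 8-bit step is 16 units of
    # 12-bit code, so add up to +/- half a step of noise before rounding,
    # turning banding into grain
    def dither_12_to_8(v12):
        noisy = v12 + random.uniform(-8, 8)
        return min(max(int(round(noisy / 16.0)), 0), 255)

    random.seed(0)
    # a shallow 12-bit ramp that plain truncation would band into 2 steps
    print([dither_12_to_8(v) for v in range(2040, 2056)])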

2- Deep Color / higher bit depth and xvColor (bigger color gamut) go hand in hand. Because your bits are now spread out over more colors, you want a higher bit depth to go along with that.

Somehow though I'm not so certain how long we can just ignore it.
Here's the thing: Sony is producing wider gamut LCDs right now because they can / LED backlighting makes it easy.

If you throw normal signals at the LCD and do things properly, you won't see any of these wide gamut colors. So... they will oversaturate the colors (except for flesh tones and maybe some other colors) so that the display's entire color gamut can be used. This is a "feature" (they call it Live Color Creation).
Fotis_Greece wrote on 4/29/2009, 12:33 PM
Glenn, just to finalise this confusing issue a little:
Do you suggest we shoot, edit and render (32-bit) using x.v.color, just to have "higher quality" for future reference, even if our current TV does not support it?

Or, in other words, would x.v.color material look worse on a non-x.v.color TV compared to the "standard" color range?
GlennChan wrote on 4/30/2009, 8:10 PM
Or, in other words, would x.v.color material look worse on a non-x.v.color TV compared to the "standard" color range?
The way to find out would be to test???

Sorry I don't have the equipment.

2- If you legalize your output (which makes sense for current TV + player combinations), then you'll clip off all the wide gamut colors / values.

If you shoot with wide gamut colors I don't think it hurts.

(This is my lazy answer:) The simple practical answer would be to ignore this stuff, because the benefits are subtle and the practicality of it is a few years out.