Sony camcorder x.v.Color and Vegas 32-bit projects?

Fotis_Greece wrote on 6/4/2008, 3:03 PM
Hi, I'm an old hand with Vegas but new to HD. I just got the HDV camcorder HC9 and was wondering if I should enable x.v.Color in recordings. Is there any connection between this and Vegas 32-bit projects? (I know 32-bit still has some problems with crashes when rendering, but hopefully there will be a fix.)
I also read in the cam's manual that I should use x.v.Color only if my TV supports it (which my LCD TV doesn't). So, in other words, will using this along with 32-bit get me any better color and other goodies such as blacker-than-black?

Thanks in advance

Comments

GlennChan wrote on 6/4/2008, 11:13 PM
I don't think blacker-than-black is really a goodie, as negative light doesn't exist in the real world and monitors won't produce it.

In xvYCC, values formerly considered illegal / outside the normal range are used to store wide-gamut colors. To have those colors pass through, you need to make sure those values don't get clipped. 32-bit will help with this, as it has a much larger range of values it can represent (see the sketch at the end of this post).

2- You also want to check that any FX aren't clipping colors. Some will clip away xvYCC colors.

3- If your TV doesn't support x.v.Color then you can't see the benefits right now.

*I haven't played with this stuff myself; that's just the theory. There's not a lot of that kind of equipment out now... most broadcast monitors (including those >$20k) don't handle it yet (though it should not be difficult to implement).
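
To make point 1 concrete, here's a minimal sketch (my own illustration in Python, not anything Vegas actually does internally) of how a pipeline that clamps to the legal 16-235 video range throws away the out-of-range code values xvYCC repurposes, while a floating-point pipeline carries them through:

# Minimal sketch (hypothetical code values, not Vegas internals): clipping
# to the legal video range destroys the out-of-range values that xvYCC
# uses for wide-gamut colors; a float pipeline preserves them.
import numpy as np

# Hypothetical 8-bit-style code values, two of them outside 16-235.
samples = np.array([5.0, 16.0, 128.0, 235.0, 250.0], dtype=np.float32)

# Integer-style pipeline that clamps to legal range: wide-gamut values lost.
clamped = np.clip(samples, 16.0, 235.0)

# Floating-point pipeline: values pass through untouched.
preserved = samples.copy()

print("clamped:  ", clamped)    # [ 16.  16. 128. 235. 235.]
print("preserved:", preserved)  # [  5.  16. 128. 235. 250.]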
farss wrote on 6/4/2008, 11:36 PM
I don't know so much about the "There's not a lot of that kind of equipment out now..." bit. Just about every consumer HDV palmcorder from Sony has it. It wouldn't be hard to imagine there are more xvYCC cameras on the planet than XDCAM cameras.

Not that the above means much, but I'm left wondering where Sony are going with this. If they think it's such a great thing, why isn't it in their better kit?

Bob.
GlennChan wrote on 6/5/2008, 1:30 AM
It might be that we don't really have a good way of distributing it.

Using a format with any chroma subsampling would be sketchy: highly saturated colors make chroma subsampling artifacts visible, and going even more saturated than normal would make things look worse. Interlaced 4:2:0 chroma would probably not look so hot (effectively, the chroma performance is about a quarter vertically). And I believe almost all broadcast is interlaced 4:2:0, so that rules that out.
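
As a toy illustration of that (my own sketch, not any codec's actual filtering), here is what 4:2:0-style subsampling does to a small patch of saturated chroma:

# Toy sketch (not any codec's actual filter): 4:2:0-style subsampling
# averages each 2x2 block of chroma, so a saturated chroma detail gets
# diluted and then smeared when the chroma is upsampled for display.
import numpy as np

# 4x4 Cb chroma plane, neutral (128) except one highly saturated sample.
cb = np.full((4, 4), 128.0)
cb[1, 1] = 240.0

# Subsample: average each 2x2 block down to a 2x2 chroma plane.
cb_420 = cb.reshape(2, 2, 2, 2).mean(axis=(1, 3))

# Nearest-neighbour upsample back to 4x4 for display.
cb_up = cb_420.repeat(2, axis=0).repeat(2, axis=1)

print(cb_420)  # the 240 spike has been averaged down to 156
print(cb_up)   # and is smeared across a 2x2 block on reconstruction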

Ideally, for a wide-gamut system you want a 4:4:4 distribution system. For broadcast, that probably won't happen (it would require upgrading the infrastructure from single-link 4:2:2 SDI to 4:4:4, changing the encoders, and changing everybody's set-top boxes to handle 10-bit 4:4:4).
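
For a sense of scale (my own back-of-envelope arithmetic, not from any spec quoted here), the extra chroma simply doesn't fit down the existing single link:

# Back-of-envelope arithmetic (my own numbers): why 10-bit 4:4:4 exceeds
# single-link HD-SDI, which tops out at 1.485 Gb/s.
total_samples_per_line = 2200   # 1080i60 total samples per line (active + blanking)
total_lines = 1125
frames_per_sec = 30
bits_per_sample = 10

per_channel = total_samples_per_line * total_lines * frames_per_sec * bits_per_sample

rate_422 = per_channel * 2   # Y plus half-rate chroma = 2 channels -> 1.485 Gb/s
rate_444 = per_channel * 3   # full-rate chroma = 3 channels -> ~2.23 Gb/s

print(rate_422 / 1e9, "Gb/s for 4:2:2")   # 1.485
print(rate_444 / 1e9, "Gb/s for 4:4:4")   # 2.2275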

But you can do it from the live output off the camera via HDMI, and (I am guessing) with Blu-ray using one of the high-quality profiles played back on something like a PS3 (where you could upgrade the firmware to handle that format). I'm guessing that's how Sony (and other manufacturers) plan on doing things.

2- Right now, a lot of the material feeding these displays is going to be non-wide-gamut material. So what the display manufacturers will likely do is oversaturate the colors (except for memory colors) so that the display's wide gamut gets shown off. You might disagree with this because it completely changes the creative intent of the DP and the director.
farss wrote on 6/5/2008, 4:37 AM
I think you've nailed it. The camera documentation is quite specific about using "X.V.COLOR" only when the footage will be played back on an "x.v.Color-compliant TV".
It further warns that color may not be reproduced correctly on a non-x.v.Color-compliant TV.

Point 2) is interesting: from what you're saying, the display device will have no way of knowing what it's being fed and will get it wrong one way or the other. Possibly for broadcast a flag could be sent to tell displays what they're being fed so they could adjust accordingly. That's probably wishful thinking given the number of TVs that aren't even displaying the correct aspect ratio.

Bob.
Cliff Etzel wrote on 6/5/2008, 9:00 AM
I had assumed that leaving x.v.Color on all the time wouldn't do any harm. Are you saying that if you want to shoot for potential broadcast, web video, etc., it should be turned off?

The info on this is vague and I'd like a more concrete answer, since I currently shoot with a couple of HC7s.

Cliff Etzel - Solo Video Journalist
bluprojekt | SoloVJ.com
Spot|DSE wrote on 6/5/2008, 9:26 AM
Leaving x.v.Color on will not create problems on any display, AFAIK. All our cams (including the EX) always have it enabled, and we've tested images (color charts) with it on and off. On an SD monitor, no difference can be detected. On our HD monitors fed with HDMI, x.v.Color is supported, and there is of course a visible difference with it on/off in the stream.
Eventually, everything will support this format, IMO. So, just as shooting HD for the past few years has given us HD masters to go back to, we've now got xv data embedded in the stream to go back to eventually, if we need it.
There is a lot of information out there on deeper gamuts, Deep Color (which isn't the same as xv), and broader color spaces. BTW, it is possible to oversaturate with x.v.Color if you've set the camera up for saturated color with xv on but are monitoring on a non-xv-capable monitor.
Laurence wrote on 6/5/2008, 9:41 AM
Does enabling xv color change your workflow at all? Will it render fine through generations of m2t, Cineform, mpeg4 and Flash video?
Nx wrote on 6/9/2008, 9:40 AM
Same question as Laurence. I have an HDR-HC7 with x.v.Color enabled and intend to view the final work on a Sony LCD which supports x.v.Color.

What would be the workflow to preserve the x.v.Color data (capture to m2t with Vegas, render with which codec, then print to tape on the HC7 or make a Blu-ray file on DVD for a Blu-ray player that supports x.v.Color)?

Thanks in advance !

Nx