What exactly is this R'G'B' that Vegas uses?

farss wrote on 2/18/2008, 8:41 PM
Pardon the technical question, but this R'G'B' thing gets used all over the place in regard to Vegas, yet there's more than one variant of "RGB", e.g. sRGB and Adobe98. I understand (well, I think I do) the difference between sRGB and Adobe98, but the statement I'm reading here and elsewhere is that RGB is a smaller colorspace than Rec 709's Y'CbCr, and yet Adobe98 is a significantly larger colorspace than anything video can normally display.

Bob.

Comments

GlennChan wrote on 2/18/2008, 9:01 PM
OK, so there are actually variations on R'G'B'. In practice (for video work at least) it's fine to pretend these differences don't exist; I'd actually advocate ignoring them.

Some of the differences are:

Primaries (specified in chromaticity co-ordinates):
The exact shades of red, green, and blue, i.e. what exactly is meant by "red". Each standard specifies this in chromaticity co-ordinates, an objective way of pinning down the exact shade (see the snippet below for concrete numbers).

sRGB and Rec. 709 use the same primaries; Adobe RGB uses the same primaries as the original NTSC.
Other sets of primaries are the EBU ones, SMPTE C, and P3 (digital cinema).
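
To make "chromaticity co-ordinates" concrete, here are two of those sets as a small Python snippet. The numbers are the published ones from the standards; note how small the differences look on paper.

# CIE xy chromaticity co-ordinates of two of the primary sets above.
REC709_sRGB = {
    "red":   (0.640, 0.330),
    "green": (0.300, 0.600),
    "blue":  (0.150, 0.060),
    "white": (0.3127, 0.3290),  # D65
}
SMPTE_C = {
    "red":   (0.630, 0.340),    # a slightly different "red" than Rec. 709's
    "green": (0.310, 0.595),
    "blue":  (0.155, 0.070),
    "white": (0.3127, 0.3290),  # D65
}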

Transfer function
sRGB, Rec. 709, and Rec. 601 all have different transfer functions.
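
For example, here are the sRGB and Rec. 709 curves as a quick Python sketch. The constants come straight from the two specs; treat it as an illustration rather than a reference implementation.

# Opto-electronic transfer functions ("gamma"): linear light in [0,1] -> code value.
# Both are roughly a 1/2.2 power law, but they are genuinely different curves.

def srgb_encode(L):
    # sRGB: linear segment near black, then a 1/2.4 power with offset
    if L <= 0.0031308:
        return 12.92 * L
    return 1.055 * L ** (1 / 2.4) - 0.055

def rec709_encode(L):
    # Rec. 709: linear segment near black, then a 0.45 power with offset
    if L < 0.018:
        return 4.5 * L
    return 1.099 * L ** 0.45 - 0.099

print(srgb_encode(0.18))    # ~0.461
print(rec709_encode(0.18))  # ~0.409: 18% grey lands on different code values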

RGB is a smaller colorspace than Rec 709's Y'CbCr
Yes. See page 5 of this article:
http://www.poynton.com/PDFs/Merging_RGB_and_422.pdf

It's the R'G'B' color space plotted against the Y'CbCr axes. The Y'CbCr cube is bigger.
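
If you want to convince yourself numerically, here's a rough Monte Carlo sketch in Python. I'm using the normalized Rec. 709 conversion; Poynton computes his figures differently, so treat the result as ballpark.

import random

def ycbcr_to_rgb(y, cb, cr):
    # normalized Rec. 709 inverse: Y' in [0,1], Cb/Cr in [-0.5, 0.5]
    r = y + 1.5748 * cr
    g = y - 0.1873 * cb - 0.4681 * cr
    b = y + 1.8556 * cb
    return r, g, b

inside = 0
n = 1000000
for _ in range(n):
    y = random.random()
    cb = random.random() - 0.5
    cr = random.random() - 0.5
    if all(0.0 <= c <= 1.0 for c in ycbcr_to_rgb(y, cb, cr)):
        inside += 1

print(inside / n)  # around 0.24: only ~1/4 of the Y'CbCr cube is legal R'G'B'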

Adobe98 is a significantly larger colorspace than anything video can normally display
Yes, mostly.

Adobe RGB uses a shade of red that is much more pure/saturated than, say, SMPTE C red. An even more saturated red would be that of a laser (or the red you see reflected off a CD).

Adobe RGB uses the same primaries as the original NTSC primaries (quite obsolete). Adobe RGB is a pretty academic color space (IMO) because it doesn't really have a use in video.

As Poynton tells it, he discovered that Photoshop's default color space used the same numbers as the original NTSC primaries. So he sent Adobe an email about it. They couldn't just get rid of it (because that would break old Photoshop files) so they renamed it to "Adobe RGB".

2- On the other hand, some manufacturers are coming out with wide gamut color systems. And they kind of work with existing systems (but you need to turn all the clippers off).

The classic interpretation of Y'CbCr is that negative values (when converted to R'G'B') are illegal and don't really define any colors. Only colors inside the R'G'B' cube are legal.

In the wide gamut interpretation, negative values define colors outside the R'G'B' gamut. So by interpreting the negative values, you can define shades of red that are more pure/saturated than SMPTE C red.
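
A tiny numeric illustration, reusing the normalized Rec. 709 inverse from the sketch above (the input triple is just something I made up):

# A legal-looking Y'CbCr triple whose R'G'B' equivalent leaves the cube.
y, cb, cr = 0.3, -0.3, 0.4
r = y + 1.5748 * cr                # ~0.93
g = y - 0.1873 * cb - 0.4681 * cr  # ~0.17
b = y + 1.8556 * cb                # ~-0.26: negative, outside the R'G'B' cube
print(r, g, b)
# Classic interpretation: illegal, clip b to 0.
# Wide-gamut interpretation: a real colour the R'G'B' cube can't reach.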
I don't know if these new interpretations have necessarily been standardized though.
farss wrote on 2/18/2008, 9:39 PM
I grasp how the "just what is red" part of the puzzle fits. This relates in some ways to the response of the sensors and to what the red pixels in the display actually emit, e.g. using a red LED to display red would produce a different red compared to the red phosphor in most CRTs. All that I grasp.

However from page 5 of the article you referenced:

"Of the 16.7 million colors available in studio R’G’B’, only about
2.75 million are available in Y’CBCR."

Given that in Vegas we can create colors that are out of gamut that would seem to agree.

Ah but, next sentence:

"If R’G’B’ is transcoded to Y’CBCR, then transcoded back to R’G’B’, the resulting R’G’B’ cannot have any more than 2.75 million colors!"

Also :

"Only about 1/4 of the available Y’CBCR volume represents colors; the rest is wasted. This transform is performed before 4:2:2, 4:2:0, or 4:1:1 chroma subsampling."

OR, maybe I'm completely misreading that. More colours doesn't mean a bigger range of colours, it just means finer steps within the same range. Which is why elsewhere you said Y'CbCr could produce banding, although in practice it mostly doesn't.

Looking at Poynton's diagram it's hard to tell what's inside what!

Bob.
craftech wrote on 2/19/2008, 11:17 AM
I have a question about the Debugmode Frameserver settings.

I have assumed that the choice when frameserving from the Vegas timeline to an app should be RGB 24. I seem to recall that this is native for Vegas anyway.

Is that correct?

John
GlennChan wrote on 2/19/2008, 10:10 PM
"If R’G’B’ is transcoded to Y’CBCR, then transcoded back to R’G’B’, the resulting R’G’B’ cannot have any more than 2.75 million colors!"
There he is talking about precision. Due to rounding error, some colors get lost.

The rounding error is related to the different 'volumes' / sizes of the gamuts.

2a- Suppose you had to represent temperature with integers from -128 to 127.

Suppose your units could be either kelvins or degrees Fahrenheit.

When you go from Fahrenheit to Kelvins and back, you get a lot of rounding error.
32 F --> 273.15 K --> (rounds to) 273 K
31 F --> 272.59 K --> (rounds to) 273 K

You can figure the rest out.
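
Here's that collision in a few lines of Python, if you want to watch it happen:

def f_to_k(f):
    # Fahrenheit to kelvins, rounded to the nearest integer
    return round((f - 32) * 5 / 9 + 273.15)

def k_to_f(k):
    # kelvins back to Fahrenheit, rounded to the nearest integer
    return round((k - 273.15) * 9 / 5 + 32)

for f in range(28, 36):
    print(f, "->", f_to_k(f), "->", k_to_f(f_to_k(f)))
# 31 F and 32 F both land on 273 K, which decodes back to 32 F: 31 F is gone.
# A kelvin step is 1.8x a Fahrenheit step, so collisions are guaranteed.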

2b- You can also see that -127 K is not a temperature that exists in the real world (it's below absolute zero), so that value is pretty much wasted. Y'CbCr is like this to some degree: negative light doesn't exist, so many values are wasted.
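
And here's the same rounding effect with the real numbers: a brute-force numpy sketch that round-trips every 8-bit R'G'B' triple through studio-range Rec. 601 Y'CbCr and counts the survivors. This is my own arithmetic, not Poynton's calculation, so treat the count as ballpark (and it needs roughly 1 GB of RAM).

import numpy as np

# every 8-bit R'G'B' triple, packed into one uint32 each
idx = np.arange(1 << 24, dtype=np.uint32)
r = (idx >> 16 & 255).astype(np.float32) / 255
g = (idx >> 8 & 255).astype(np.float32) / 255
b = (idx & 255).astype(np.float32) / 255

# forward: Rec. 601 luma/chroma, quantized to studio-range codes
# (Y' 16-235, Cb/Cr 16-240)
y = 0.299 * r + 0.587 * g + 0.114 * b
Y = np.round(16 + 219 * y)
Cb = np.round(128 + 224 * (b - y) / 1.772)
Cr = np.round(128 + 224 * (r - y) / 1.402)

# inverse, back to 8-bit R'G'B'
y2 = (Y - 16) / 219
pb = (Cb - 128) / 224
pr = (Cr - 128) / 224
r2 = np.clip(np.round(255 * (y2 + 1.402 * pr)), 0, 255).astype(np.uint32)
g2 = np.clip(np.round(255 * (y2 - 0.344136 * pb - 0.714136 * pr)), 0, 255).astype(np.uint32)
b2 = np.clip(np.round(255 * (y2 + 1.772 * pb)), 0, 255).astype(np.uint32)

survivors = np.unique(r2 << 16 | g2 << 8 | b2)
print(len(survivors))  # a couple of million, in the neighbourhood of Poynton's 2.75M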

Ack... the Y'CbCr cube isn't drawn. But you can draw it yourself or use your imagination... it just goes along the axes.
Bill Ravens wrote on 2/20/2008, 5:55 AM
Bob,
Everything you always didn't want to know about RGB to R'G'B' conversion:
http://www.babelcolor.com/download/A%20review%20of%20RGB%20color%20spaces.pdf
Thanx to Steve Mullen for the ref.

In very simplistic terms, R'G'B' is RGB gamma-corrected for the way the human eye sees brightness. So it's interesting to note that the RGB the camera's sensor block "sees" is immediately converted to Y'CbCr for writing to the record media, then converted back to R'G'B' by the display so a human can see it in colors that look familiar. The conversions involve rounding, and the rounding introduces errors.
farss wrote on 2/20/2008, 2:24 PM
Thanks, bookmarked.
I do understand what happens: the camera uses an RGB sensor, the signal gets converted to Y'CbCr, many things happen, and ultimately it gets displayed as RGB. There's something wrong with that picture!
Ideally you'd throw away the Y'CbCr thing; everything would stay RGB if that's what the input and output devices are using. It does require a lot of bandwidth though.
Conclusion: RGB can describe everything the sensor sees and everything the display can display, because that's how those devices work. Y'CbCr is a technical fudge built on the limitations of human vision. Where it can get very messy is when two systems use different primaries. That's one problem I grappled with for a long time; the missing bit of information was that different systems use different primaries, and converting between two of them (e.g. Cineon to 709) can be difficult.

Bob.
Bill Ravens wrote on 2/20/2008, 3:15 PM
Well, not entirely correct. The biggest reason Y'CbCr is used for the storage format, I believe, is that it lends itself to compression, i.e. 4:2:0 or 4:2:2 chroma subsampling. It's easy to throw away some of the chroma info and keep all of the luma. It saves on storage space, as well as being easy, i.e. fast, to encode and decode. Imagine trying to compress pure RGB info... rather cumbersome, if it can be done at all, and certainly computationally expensive.
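
To illustrate the idea, a toy numpy sketch of 4:2:0 averaging (not any particular codec's filtering):

import numpy as np

h, w = 4, 4
Y = np.random.rand(h, w)   # luma: kept at full resolution
Cb = np.random.rand(h, w)
Cr = np.random.rand(h, w)

# average each 2x2 block of chroma -> a quarter of the chroma samples
Cb420 = Cb.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
Cr420 = Cr.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

samples_full = 3 * h * w
samples_420 = h * w + 2 * (h // 2) * (w // 2)
print(samples_420 / samples_full)  # 0.5: half the data before any codec runs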

You're right about the Spyder, tho. Sometimes I write faster than I think..;o)
The best one can do is set their monitor luma according to the PLUGE bars. The latest test patterns seem to be the ARIB multi-format test patterns. Color depiction is set in the application overlay, so it overrides whatever you have in the ICC profile setting anyway.
farss wrote on 2/20/2008, 4:00 PM
"Imagine trying to compress pure RGB info....rather cumbersome, if it can be done at all, and certainly computationally extensive. "

CineForm RAW and REDCODE. Admittedly both of those use wavelet compression, and yes, the decode is very CPU intensive. Recording the raw 14-bit output of, say, the EX1's sensors would sure gobble up a lot of disk space but wouldn't be at all computationally intensive in the camera. Moving the hard work from the camera to post is becoming the norm; Moore's law means more work can be done in post and at a lower cost. Storage space is still expensive, more so if you need high throughput and high reliability, but it too is coming down in price, just not as fast as silicon.

I think the advances in camera technology, though, are one thing; top-shelf optics and all that goes with them are still extremely expensive. We may see advances in image processing combined with very fast silicon brought to bear on those issues as well. We already have in-camera CA correction, and that could be just the beginning. I sure hope so. As more than a few have discovered, a RED or SI-2K is pretty affordable today, but the glass to go on the front makes the saving in the cost of the camera almost irrelevant.

Bob.