studio RGB, computer RGB - still not getting it

Mindmatter wrote on 8/20/2015, 2:40 PM
I apologize in advance - I know this is quite a redundant subject - but I read what I could find about it, only to end up just as confused as before. The fact that my new Eizo monitor has sRGB, Adobe RGB etc. presets doesn't make it easier. I can actually get a normal (good-looking) picture when setting it to sRGB while having the preview set to computer RGB - I'm just not sure if that makes any sense. I just want my footage to look good...

So, this is what I think I understood so far:

In Vegas, I work in computer RGB, and my files from the camera (these days mostly) come in the 0-255 range... right?
Vegas displays that exact range and lets me use superblacks and superwhites, making things look nice and full.
Now, I understand there are devices that can't reproduce that range but stop at 16 and 235, making my nice and full footage look rather washed out and grey.

SO: if I work and grade in sRGB, narrowing my visible range down, my footage looks flat, and I have to compensate somehow, trying to keep a nice, full, contrasty image although my range is narrower? Or do I just "visually" grade my cRGB footage in sRGB to make sure my 0 and 255 don't look too bad in case they are viewed in sRGB, but actually keep the levels at 0-255?

So say I render in sRGB - when the rendered files are viewed on computers, aren't they displayed 0-255 anyway? If so, why should I reduce things to 16-235 only to have them re-expanded to full range afterwards? Why should I cut off my superwhites and superblacks if the camera delivers them in the first place? Isn't sRGB an outdated standard by now? Until now, I never even bothered, as I have never seen a significant enough difference in my renders when viewed on different screens or the internet.

Questions, questions... boy, I really don't get it.
My neurons feel rather entangled right now, but I've ploughed through forums and articles about this and I'm still stuck, so any help is greatly appreciated.
Thanks!

AMD Ryzen 9 5900X, 12x 3.7 GHz
32 GB DDR4-3200 MHz (2x16GB), Dual-Channel
NVIDIA GeForce RTX 3070, 8GB GDDR6, HDMI, DP, studio drivers
ASUS PRIME B550M-K, AMD B550, AM4, mATX
7.1 (8-channel) Surround-Sound, Digital Audio, onboard
Samsung 970 EVO Plus 250GB, NVMe M.2 PCIe x4 SSD
be quiet! System Power 9 700W CM, 80+ Bronze, modular
2x WD red 6TB
2x Samsung 2TB SSD

Comments

musicvid10 wrote on 8/20/2015, 3:41 PM
My solution is simple and foolproof.

-- Do all your editing and grading in Vegas' native RGB preview space.
-- Just prior to YUV rendering (most of it is), apply the Computer RGB -> Studio RGB Levels filter to the output.

You don't have to understand the theory for this to work. Output is also broadcast legal, even for PBS, all else being compliant.

There are now scripts and plugins to do the work for you, but this is simple and WYSIWYG.
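For readers who want the numbers behind that Levels preset, here is a minimal sketch of the Computer RGB -> Studio RGB mapping (assuming a plain linear compression; the actual Vegas filter operates per channel in its 8-bit RGB working space):

```python
def computer_to_studio(v: int) -> int:
    """Compress a full-range 0-255 value into the 16-235 studio range."""
    return round(16 + v * (235 - 16) / 255)

# Full-range black and white land exactly on the studio nominal points:
print(computer_to_studio(0))    # 16
print(computer_to_studio(255))  # 235
print(computer_to_studio(128))  # 126 - mid grey shifts slightly
```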

wwaag wrote on 8/20/2015, 3:50 PM
+1 for musicvid10's suggestion (I learned it from him). It really does make life simple inside of Vegas. Everything on the timeline is 0-255. Then apply the Computer to Studio RGB Levels FX on the output bus and you're done. Rendering using an internal encoder or frameserving to Handbrake is all the same.

wwaag

AKA the HappyOtter at https://tools4vegas.com/. System 1: Intel i7-8700k with HD 630 graphics plus an Nvidia 1050ti graphics card. System 2: Intel i7-3770k with HD 4000 graphics plus an AMD RX550 graphics card. Current cameras include Panasonic FZ2500, GoPro Hero5 Black plus a myriad of smartPhone, pocket cameras, video cameras and film cameras going back to the original Nikon S.

Satevis wrote on 8/20/2015, 4:03 PM
Note that "sRGB" isn't an abbreviation of "Studio RGB". These are very different things. sRGB is a color space definition. It defines primary colors, a color gamut, and a gamma transformation. That's entirely unrelated to the Studio-RGB/Computer-RGB issue which deals with what digital values represent the lowest or highest value in a channel.

As for Studio-RGB/Computer-RGB: That's simply a matter of following the convention of the target format. Chances are your target format says "16 is black, 235 is white". Then it's the content producer's task to export the material in a way that respects this range and it's the player's and the display's task to map this range to whatever their hardware uses. Maybe their black is 0 or their white is 1024 or maybe they're analog and use voltages instead of digital numbers. That needn't concern the producer. On the other hand, if the producer builds, say, a slideshow from JPEGs that have a 0-255 value range, then it's their job to map these to the 16-235 range of the target format.

As for superwhites, these are, by definition, out of range, ("super white" = "over white"). Placing nominal white at 235 creates some headroom that cameras may use to differentiate between pixels that are way overexposed (such as 254) and ones that are just slightly above the maximum (such as 240). The point is that this gives you the option to "repair" some of the overexposure in post, such as by lowering the overall image brightness. If you don't process the image and keep those values above 235, then a player will see them as beyond white and most likely just clip them to white.

Vegas' internal preview doesn't do any conversions, so 235 will not actually be white and 16 will not actually be black. This creates the infamous "washed out" look but does show you whether a bright area is clipped beyond repair or may have some salvageable detail left.

Your two basic options are:
1. Work in Computer RGB: Convert material to Computer RGB, convert back to Studio RGB when rendering.
2. Work in Studio RGB: Keep Studio RGB material as it is but possibly add a temporary conversion to Computer RGB for previewing. Remember to remove that conversion before rendering.

The Preview Levels extension is meant to make option 2 a bit more manageable. This has the advantage that for the final render, material that already is in Studio RGB does not get converted to Computer RGB and back. The disadvantage is that you need to convert Computer RGB sources such as still images or media generators.
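For concreteness, a sketch of the opposite direction - the Studio RGB -> Computer RGB expansion a preview conversion would apply (assuming simple linear scaling with clipping; note that superwhites above 235 just clip to full white here):

```python
def studio_to_computer(v: int) -> int:
    """Expand 16-235 to 0-255 for preview, clipping out-of-range values."""
    out = round((v - 16) * 255 / (235 - 16))
    return max(0, min(255, out))

print(studio_to_computer(16))   # 0   - studio black becomes display black
print(studio_to_computer(235))  # 255 - studio white becomes display white
print(studio_to_computer(250))  # 255 - superwhite clips
```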
Mindmatter wrote on 8/20/2015, 4:05 PM
thanks but...WHY?

Why should I just cut off the extra range, thus making the footage look bad? How can I grade in a specific color space, making decisions according to what I have and see there, and then transform it to a different color space where the colors those decisions were based on don't even exist? Or does the 16 black just look as black in sRGB as the 0 superblack does in cRGB?
Sorry but I won't sleep until I've untied the knot...


Mindmatter wrote on 8/20/2015, 4:10 PM
Thanks Satevis, I only saw your post after I had written my last reply.
So does that mean that my values above 235 and beneath 16 are still somehow encoded in the Studio RGB render, ready to be displayed if a player or device is capable of doing so?

Or are they simply calculated away, cutting off a certain amount of dynamic range in the process and making the image look worse?
If I know for sure that my final output is going to be viewed on computers or beamers only, do I really have to convert it to Studio RGB all the same?


astar wrote on 8/20/2015, 4:46 PM
My insanity is different.

My footage comes in as sRGB, and I edit in Vegas as normal, applying the sRGB to cRGB filter on the external preview device (right-click the preview window > preview device preferences). This gives me two views of the material: the Sony preview window shows sRGB, and the external display shows cRGB. I also have sRGB set on the video scopes, as this mirrors the levels seen on the external display. Then I just render as normal, which maintains the sRGB levels in the output format.

YouTube and most device video players convert sRGB signals to cRGB for display. So when rendered content is uploaded to YouTube or displayed on a device, the conversion that people apply with a filter happens automatically, and I get the levels I intend. Rendering to .WMV does this conversion during the render and plays back the same cRGB, which YouTube also understands, maintaining the cRGB levels in playback.
Satevis wrote on 8/20/2015, 5:25 PM
If you use option 2 from above (or a 32-bit project), then superwhites are encoded into the stream and a playback device would be able to handle these as it sees fit, yes.

Still, Studio vs. Computer RGB isn't a question of the output device but of the video format. DVD, Bluray, and YouTube all place the nominal peak of a channel at 235. If you encode a 235 onto a DVD, your computer software player will display a 255 (because that's the peak value of your computer display). If you encode a 240, it will display 255, and if you encode 254, it will still display 255, so there's little to be gained here. WMV, as astar pointed out, uses a different conversion placing white at 255. When rendering to WMV, you indeed need to convert your Studio RGB footage to Computer RGB for it to come out as intended.

If you output Computer RGB into a format that expects Studio RGB, then that's equivalent to increasing the contrast and saturation until the darkest 6% and the brightest 9% of the image are cut off. Some people do see this as a quality improvement, but that's really just an accidental way of doing color correction.
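To put rough numbers on that clipping, a small back-of-the-envelope check (the exact percentages depend on whether you count relative to the 256 full-range codes or the 219-step studio range, which is presumably where figures like 6% and 9% come from):

```python
# Full-range values written unchanged into a 16-235 format fall out
# of range at both ends and will typically be clipped by the player.
below = 16           # codes 0..15 are "blacker than black"
above = 255 - 235    # codes 236..255 are "whiter than white"

print(below / 256 * 100)   # ~6.2% of codes clipped at the dark end
print(above / 256 * 100)   # ~7.8% of codes clipped at the bright end
```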
Red Prince wrote on 8/20/2015, 6:32 PM
"thanks but...WHY?"

For historic reasons. In analog video (TV) the extra values were used for marking where a line begins and things like that (horizontal and vertical sync, closed captions, etc.). When video engineers created their version of a digital standard (as opposed to computer engineers, who look at things differently), apparently they didn't quite realize you could dedicate part of the file's data to all that metadata, where it would never be confused with video data.

Whatever the reason, it makes no sense to computer engineers, but we are stuck with it. At least for the time being.

Note that not all digital video file formats use this limitation. Those created by computer engineers (such as many codecs used in the AVI files) use the full 0-255 range, while those created by video engineers (e.g., the various versions of MPEG) do not.

It’s a mess. You’re not the only one confused/puzzled by it. To add to the confusion, while computer monitors display the full 0-255 range, some digital TV sets can only handle the limited studio range, while others can be switched between the two sets of ranges. Similarly, some cameras shoot in the full 0-255 range, others in the limited studio range and some even somewhere in-between.

At any rate, you should do all your editing in the full 0-255 range, or better yet using the 32-bit floating point, and only export to the so called studio RGB when producing a file in a format that expects it.

He who knows does not speak; he who speaks does not know.
                    — Lao Tze in Tao Te Ching

Can you imagine the silence if everyone only said what he knows?
                    — Karel Čapek (The guy who gave us the word “robot” in R.U.R.)

balazer wrote on 8/20/2015, 6:32 PM
The previous explanations are not quite correct.


When working in Sony Vegas Pro with the pixel format set to 8-bit:

* When Vegas reads Y'CbCr video formats that support an extended range of values (MP4, MTS, MPG, YUV AVI, etc.), Vegas maps the video format's black point (Y'CbCr=16,128,128 in the case of an 8-bit format) to an RGB working space color value of 16,16,16, and the video format's white point (Y'CbCr=235,128,128 for an 8-bit format) to an RGB working space color value of 235,235,235. When rendering to one of those formats, the level mapping is exactly the reverse.

* For other formats, e.g. still images or RGB video, Vegas maps the format's black point to RGB 0,0,0 and the format's white point to RGB 255,255,255.

* Media generators produce a 0-255 range.

* The preview window shows the entire 0-255 working range mapped directly to the display, with no scaling or remapping of levels.

* Video scopes, by default, show the entire 0-255 working range.

* A full-screen preview device can show the entire 0-255 working range, or if you check the "Adjust levels from Studio RGB to Computer RGB" box in preview device preferences, it will scale 16-235 in the RGB working space to 0-255 for the display.


This behavior causes a few complications:

1) The preview window shows a 0-255 range, but standard video files are read into a 16-235 range. To display the 16-235 range correctly, you can add a Levels filter with the Studio RGB to Computer RGB preset. The filter should be disabled before rendering.

2) Still image import, RGB video import, and media generators will be read into an RGB working range of 0-255, which doesn't match the 16-235 RGB working range that Vegas reads standard video files into and that most of Vegas's video renderers expect. You can make those media match by adding a Levels filter with the Computer RGB to Studio RGB preset.

3) Some cameras (Apple, GoPro, some Canon EOS cameras, and many tablets and phones) record H.264 video with a Y' channel range of 0-255 and Cb and Cr channel ranges of 1-255, instead of the Rec.709 standard ranges of 16-235 for the Y' channel and 16-240 for the Cb and Cr channels. These videos will typically have the video_full_range_flag set to 1 in the H.264 stream's VUI. But Vegas does not read the video_full_range_flag. Vegas always interprets the Y'CbCr levels according to Rec.709. Adding a Levels filter to the input with a Computer RGB to Studio RGB preset will approximately map the levels into a 16-235 range to match Vegas's decoding of cameras that use standard Rec.709 ranges and to match the range expected by most of Vegas's renderers. But it's only approximate. Scaling in RGB is not equivalent to reading the correct ranges in Y'CbCr.
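As a luma-only illustration of why the full_range issue matters (a hypothetical sketch; real decoding also involves the chroma channels and the RGB conversion matrix):

```python
def decode_luma_limited(y: int) -> float:
    """Interpret 8-bit Y' assuming Rec.709 limited range (16 = black, 235 = white)."""
    return (y - 16) / (235 - 16)

# A limited-range file behaves as expected:
print(decode_luma_limited(16))    # 0.0 (black)
print(decode_luma_limited(235))   # 1.0 (white)

# A full-range file misread as limited range overshoots at both ends:
print(decode_luma_limited(0))     # about -0.073 (below black)
print(decode_luma_limited(255))   # about 1.091 (above white)
```

A Computer RGB to Studio RGB Levels filter roughly compensates for this in RGB, but, as the post above notes, only approximately, since the chroma ranges (1-255 vs. 16-240) scale differently from the luma range.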


Frankly the easiest thing to do is to change the pixel format to 32-bit floating point (full range). Then most of the complications go away: Vegas maps 0,0,0 to the format's black point, and 1,1,1 to the format's white point, for all formats, and for the preview window, and for full-screen preview. Reading of full_range video files is the only thing that would need to be handled specially.

You can also work in ACES. The input and output color space transformations take care of the level mapping for you.
MikeLV wrote on 8/20/2015, 7:35 PM
Let me ask, what is the deliverable you're trying to produce? Unless it's something so mission critical that black and white cannot be out of range, then my advice is to do what looks good to you on whatever calibrated equipment you have. If you're making DVDs for people, or you're putting videos on youtube, you have zero control over brightness/contrast settings of the viewer's monitor that's watching your video. You could have perfect black and white levels and it could look like junk on their screen because the brightness and contrast are set way too high. Otherwise you'll drive yourself mad trying to figure it all out and no one is going to notice anyways.
Rich Parry wrote on 8/21/2015, 1:11 AM
Mindmatter,

I'm as confused as you regarding brightness levels, so thanks for the question - but I am also confused by your reference to "beamers". What the heck is a beamer?

Rich
Mindmatter wrote on 8/21/2015, 1:38 AM
I just love this forum!

Many thanks to all of you who took the time to help and offer explanations. Still computing though...

Rich, I just realized a "beamer" might be a term used only in Europe for a digital (home theater) projector.

http://store.sony.com/home-theater-projectors/cat-27-catid-all-tv-home-theater-projectors


Rich Parry wrote on 8/21/2015, 1:43 AM
musicvid10,

When you say, "Just prior to YUV rendering (most of it is), apply the Computer RGB -> Studio RGB Levels filter to the output", what exactly do you mean? I typically add Levels or Color Curves to every event to trim the output to 16-235. If there is one place I can go to limit the output on ALL video events on ALL video tracks, I'd like to learn how.

Thanks in advance,
Rich
john_dennis wrote on 8/21/2015, 3:17 AM

"If there is one place I can go to limit the output on ALL video events on ALL video tracks, I'd like to learn how."
Any FX applied to the video output FX chain (the puzzle-piece icon on the Preview window) affects all tracks on the timeline.


My main system:
Motherboard: Asus X99-AII
CPU: Intel i7-6850K
GPU: Sapphire Radeon RX480-8GB
RAM: Corsair Dominator (4 x 4 GB) DDR4 2400
Disk O/S & Programs: Intel SSD 750 (400 GB)
Disk Active Projects: 1TB & 2TB WD BLACK SN750 NVMe Internal PCI Express 3.0 x4 Solid State Drives
Disk Other: WD Ultrastar/Hitachi Hard Drives: WDBBUR0080BNC-WRSN, HGST HUH728080ALE600, 724040ALE640, HDS3020BLA642
Case: LIAN LI PC-90 Black Aluminum ATX Full Tower Case
CPU cooling: Corsair Hydro series H115i
Power supply: SeaSonic SS-750KM3 750W 80 PLUS GOLD Certified Full Modular Active PFC Power Supply
Drive Bay: Kingwin KF-256-BK 2.5" and 3.5" Trayless Hot Swap Rack with USB 3
Sound card: Crystal Sound 3 on motherboard. Recording done on another system.
Primary Monitor: Asus ProArt PA248q (24" 1920 x 1200)
O/S: Windows 10 Pro 190943
Camera: Sony RX10 Model IV

https://www.youtube.com/user/thedennischannel

Warper wrote on 8/21/2015, 3:42 AM
Rich,
Press the puzzle icon on the preview window; it will open the Output FX chain. That's where he adds his Levels filter.
Warper wrote on 8/21/2015, 4:03 AM
"Or is the 16 black just looking as black in sRGB as the 0 superblack in cRGB?"
They are the same black. Also, it is not superblack in computer RGB, it's just black.

"Why should I just cut off the extra range, thus making the footage look bad?"

"So does that mean that my values above 235 and beneath 16 are still somehow encoded in the studioRGB render, ready to be displayed if a player or device says it's capable of doing so?"
They are, but generally they are not ready to be displayed. Some software players allow you to display the extras, but there is no guarantee.

musicvid10 wrote on 8/21/2015, 9:25 AM
"To display the 16-235 range correctly, you can add a Levels filter with the Studio RGB to Computer RGB preset. The filter should be disabled before rendering."
I'm not understanding how doing my preview, editing, and fine grading in 6.84 bits per pixel and 10,503,459 available colors offers an advantage over working in full 8 bits and native 16,777,216 colors. The improved technique we developed in 2011 to address that has the added benefit of being fully compliant (chroma and luminance) in broadcast environments, where the older method balazer described is not.

balazer wrote on 8/21/2015, 12:29 PM
Your math is wrong, but anyway, there is no inherent advantage to working with a 16-235 range, except that's how Vegas is mapping the video into the working space when you are working in an 8-bit project. It's Vegas's choice, and it's arbitrary, but we are stuck with it. Any additional scaling you try to perform by adding more filters for rendering will cause a loss of quality in the final output, due to the limited precision at every stage. Anyone who cares about maximizing quality should not be working in 8 bits.
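The precision loss is easy to demonstrate: an 8-bit round trip through a Computer->Studio compression and back merges neighboring code values (a sketch assuming simple linear scaling with rounding, not Vegas's exact filter math):

```python
def c2s(v: int) -> int:
    """Compress 0-255 into the 16-235 studio range."""
    return round(16 + v * 219 / 255)

def s2c(v: int) -> int:
    """Expand 16-235 back to 0-255, clipping out-of-range results."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

# Count how many distinct values survive the round trip:
survivors = {s2c(c2s(v)) for v in range(256)}
print(len(survivors))  # 220 - the other 36 levels are gone for good
```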

But I don't know what improved technique you are referring to. If you can tell me what you mean, I can comment.
Rich Parry wrote on 8/21/2015, 12:34 PM
Warper et al.,

Thanks for the pointer to the global video output "puzzle icon" to control all video events.

Rich
musicvid10 wrote on 8/21/2015, 12:42 PM
balazer,
See the second post in this thread, show me the correct math, and explain how filtering the output inherently causes more loss than doing it anywhere else in the chain in order to achieve the same net levels.

The rationale is I don't want to see any extra banding during editing and grading -- if it's got to happen, save it for the very last. There are a few here, including astar, Satevis, and wwaag, who have wrapped their heads around this, but it took me a couple of years and hundreds of hours of tests to get it. I did it exactly the same as you for years.
balazer wrote on 8/21/2015, 12:55 PM
The method you describe in the second post would only produce the correct levels for certain combinations of cameras and output rendering formats. Which camera are you using and which format are you rendering to?

For most cameras, that is, the ones that use Rec.709 ranges, and for typical output formats like MP4, no added filters are necessary for the levels to be mapped correctly in the rendered output.

A 16-235 range is equivalent to 7.78 bits per color channel: log(235-16+1)/log(2)
A 16-235 range has 10,648,000 possible colors: (235-16+1)^3
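Those two figures check out; a quick verification of the arithmetic (nothing Vegas-specific here):

```python
import math

codes = 235 - 16 + 1        # 220 usable codes per channel
print(math.log2(codes))     # ~7.78 bits per channel
print(codes ** 3)           # 10648000 representable colors
```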

musicvid10 wrote on 8/21/2015, 1:03 PM
Does "most cameras" include the five billion devices that shoot 0-255 without a full_range VUI flag that most decoders don't understand anyway?

7.78/8*(256)^3 = 16,315,842.56 discrete colors. So your top formula isn't working.
Far different from your color count, assuming we believe your 10,648,000 figure.

Try 220/256*8 for your top formula instead. That math (6.84 bpp) matches your professed color count.
balazer wrote on 8/21/2015, 1:30 PM
The cameras that use Rec.709 ranges include DV, HDV, and AVCHD cameras, Panasonic cameras (mostly; depends on recording format and settings), Sony cameras, and virtually all professional camcorders.

The ones I know of that use Y'=0-255 and Cb and Cr=1-255 are Apple, GoPro, at least some Canon EOS cameras, and a number of cell phones and tablets.

So there's quite a split. You really need to be specific about which camera you are using.


You can't scale bit counts the way you are trying to do. 4 bits doesn't have half as many possible combinations of values as 8 bits.

Edit: Canon cameras; qualification about Rec.709 ranges.
musicvid10 wrote on 8/21/2015, 1:36 PM
Errm, my math works on your terms. While one of us is overthinking, one of us is checking our answers.
Rather than point out a few other possible misperceptions in your posts above, I'll stand by my history of work on this topic, and wish you happy editing!