Expected Behavior? (Another one of those levels threads)

Editor17958 wrote on 1/5/2017, 7:02 PM

I'm trying to wrap my head around video levels, like many before me, and I'm somewhat confused about what the expected behavior from Vegas is. I know from reading and much Googling that Vegas doesn't handle levels conversions the way other editors do, and also, IIRC, that what you feed Vegas is what it shows you.

So with that in mind, if I feed it 16-235 range video (say, a commercial Blu-ray, or some YV12 footage from my capture card that inexplicably is not full range), it's going to render out 16-235 video, and technically it hasn't touched it?

So if my intended target playback device is a full-range computer screen, I pretty much need to be slapping that Studio RGB -> Computer RGB preset onto my video, yes?

It's left me puzzled, and with a secondary question in this case: if Vegas isn't touching it, why do I get different visual results depending on what I render to? I.e., I typically export footage into a lossless AVI, be it Lagarith or Ut Video or whatever, and I always output into the YV12 colorspace. When I do this with Vegas (as opposed to, say, VirtualDub, or another application parsing an AviSynth script), what I end up with is a video that looks washed out, with 16-235 range levels.

However, if I export from Vegas in MAGIX ProRes or Sony AVC, the file looks as one would expect with PC levels applied, although importing it back into Vegas still tells me it has a 16-235 range.

 

Is there a metadata issue here? Some flag that encoders like x264 / ffmpeg (libx264) don't set properly? It's the only thing I can think of in this scenario. When I play back an MKV or MP4 I've muxed straight off a commercial Blu-ray without touching the source files, I get the expected PC range. When I export to a lossless AVI from Vegas I lose that, and no matter how I flag that AVI when I feed it through x264 or ffmpeg, nothing seems to stick as far as levels conversion goes. But if I export to ProRes or Sony AVC (and probably several other internal presets) instead of going the AVI route, it plays back fine.
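For reference, the sort of flagging I mean (a sketch, with made-up file names, going by the x264/ffmpeg docs as I read them): --range pc should only set the H.264 VUI full-range flag without touching pixel values, while the ffmpeg scale line should actually rescale the levels.

    # flag-only: set the VUI full-range flag, no pixel conversion (x264 CLI)
    x264 --range pc --colormatrix bt709 -o flagged.264 input.avs

    # actually expand 16-235 to 0-255 AND flag it (ffmpeg + libx264)
    ffmpeg -i lossless.avi -vf scale=in_range=tv:out_range=pc -c:v libx264 -color_range pc out.mp4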

Diving deeper into this tangent still: is it a playback/rendering issue? I use MPC-HC set up with madVR, although in the past I just used EVR (Custom), and turning color correction on does appear to do a levels conversion. I guess it's possible, but I would expect all playback sources to be broken if it were a wider problem with madVR itself?

 

On the one hand, this bugs me from an intellectual point of view. I *don't* understand what is going on and can't figure it out, and I just have to know what is happening to produce X results under Y conditions. Because if I can pull a commercial 16-235 video off a disc and get proper PC levels when it's played back on a PC, then I should be able to produce such a file myself after editing it.

On the other hand, I feel like, "meh"? My primary target is PC screens, not televisions running in the limited TV range: stuff like YouTube or local PC playback. Should I just slap the Studio RGB -> Computer RGB filter on all my work and call it a day? My only worry is the video ending up horribly dark, or otherwise weird, for people with certain setups.

 

Comments

GJeffrey wrote on 1/5/2017, 7:12 PM

To get proper video levels (when the levels are already sRGB in the Vegas preview), you should make sure that the conversion to YV12 doesn't clamp the luminance any further.

For example, using AviSynth, make sure that the conversion matrix is set to PC.709.
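Something like this, as a minimal sketch (the file name is hypothetical):

    AviSource("vegas_render.avi")    # lossless export from Vegas
    ConvertToYV12(matrix="PC.709")   # full-range Rec.709; plain "Rec709" would squeeze to 16-235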

Editor17958 wrote on 1/5/2017, 7:32 PM

Is this in reference to the import stage, or to handling the lossless AVI after the output stage?

GJeffrey wrote on 1/5/2017, 7:38 PM

It's in reference to how you convert your lossless AVI file to YV12 before encoding with x264.

I typically export footage into a lossless AVI, be it Lagarith or Ut Video or whatever, and I always output into the YV12 colorspace.

To make things easier, output your AVI file in RGB. You will then be able to handle the YV12 conversion yourself.
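For example (again a sketch; adjust the hypothetical file name), the whole levels decision then lives in one script:

    AviSource("render_rgb.avi")      # RGB lossless AVI (Lagarith / Ut Video) from Vegas
    ConvertToYV12(matrix="PC.709")   # do the RGB -> YV12 step yourself, without clamping to 16-235
    # feed the resulting .avs to x264 / ffmpeg as usual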

Red Prince wrote on 1/5/2017, 7:43 PM

Sadly, using the video levels of analog television in digital video is the dumbest thing ever (and I’m referring to the video standards, not calling anyone here dumb). There is absolutely no reason for it. And it is a bad thing.

In analog TV there are, theoretically, no discrete steps from one level to the next, so picking the levels as they did hurt nothing, and they needed the levels outside that range to store some metadata. It was a clever solution for adding color to an originally grayscale signal back in the 1950s.

In digital video, reducing the levels reduces the number of available colors. And there is no reason to mix metadata in with the image data. Unfortunately, the standard was created by engineers whose mindset was that of analog TV instead of digital computers.

I really wish they would finally drop this nonsense. This is not the 1950s. Why are they pushing this inferior, outdated thinking on the 21st century? Especially now that even cinema has gone digital.

Whatever your source format and whatever your delivery format, all editing should be done in full 32-bit mode, with black being 0 and white being 1. This allows for superior editing techniques. For example, instead of the outdated method of darkening an image by just multiplying everything by a value smaller than 1, a much better way is to calculate the square of each value (and of course blend it with the original if that darkens the image too much). The square of 0 is 0, the square of 1 is 1, and the square of every in-between value is less than the original. As a result, the image gets darker overall while preserving the blacks and the whites.

And that is just one example of how it makes perfect sense mathematically to edit in the full 32-bit range regardless of what range the input and output files use. Many mathematical functions keep zero at zero and one at one.
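In AviSynth terms, a rough sketch of the same idea (a hedged example, not Vegas itself; Overlay's "multiply" mode, if I recall its semantics correctly, squares each normalized pixel when a clip is layered over itself):

    AviSource("clip.avi")                       # hypothetical source
    # multiply-by-self = per-pixel square: 0 stays 0, 1 stays 1, mid-tones darken;
    # opacity blends the result halfway back toward the original,
    # and pc_range=true asks for full-range math rather than 16-235
    Overlay(last, last, mode="multiply", opacity=0.5, pc_range=true)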

Last changed by Red Prince on 1/5/2017, 7:45 PM, changed a total of 1 time.

He who knows does not speak; he who speaks does not know.
                    — Lao Tze in Tao Te Ching

Can you imagine the silence if everyone only said what he knows?
                    — Karel Čapek (The guy who gave us the word “robot” in R.U.R.)

Editor17958 wrote on 1/5/2017, 8:11 PM

Thanks for the advice, GJeffrey.

Red Prince, I really agree with everything you've said. I wish they had killed all that stuff with the transition to Blu-ray, at least, but it seems people will do anything in the name of profits (i.e., backwards compatibility, etc.).

It'll be great if I live to see the day when no garbage with dithering, color banding, or any of that stuff is produced anymore.

Musicvid wrote on 1/6/2017, 5:23 AM

Interlacing and TV leveling will be with us until every remaining CRT is gone from the face of the earth.

Blu-ray would not even be here today if it were digital-only, although the technology favors it.

 

Jam_One wrote on 1/6/2017, 8:05 AM

So if my intended target playback device is a full-range computer screen, I pretty much need to be slapping that Studio RGB -> Computer RGB preset onto my video, yes?

Yes. Watch your steps, I mean scopes.

 

why do I get different visual results depending on what I render to?

A good question carries half the answer within it, and so does this one.
Different formats have different "traditions of interpretation". How so? For example, in the lower-priced consumer segment it is most common to expect MJPEG to be decoded with a TV->PC conversion along the way. You may think of it as the software's creators assuming housewives incapable of determining "which socket is for which plug": those users are meant to see a bright, contrasty, pleasing image, not a "dull and discolored" one, you get it?...

...what I end up with is a video that looks washed out, with 16-235 range levels

Looks washed out WHERE? That's a very important point. You watch your rendered product in some... let's call it a player. Players are created by humans with their own fancy thoughts and beliefs, and your player does something to your video upon decoding and rendering. Those are the two crucial places to investigate: both the decoder and the renderer can assume that this or that video format requires its levels range to be expanded. In the case of MPC HomeCinema you can reach both places of interest and 'enforce law and order' with your own hands. Other players won't give you such an opportunity.
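Before blaming the decoder or renderer, it is also worth checking what the file itself claims (an ffprobe one-liner, with a hypothetical file name; an "unknown" result means the player is free to guess):

    ffprobe -v error -select_streams v:0 -show_entries stream=color_range -of default=nw=1 video.mp4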

However, if I export from Vegas in MAGIX ProRes or Sony AVC, the file looks as one would expect with PC levels applied, although importing it back into Vegas still tells me it has a 16-235 range.

So the video is OK, and the player needs fine-tuning.

I personally use the EVR Custom Presenter, as the most versatile one, together with Color Management (relative colorimetry; gamma = 2.35).

wwjd wrote on 1/6/2017, 12:55 PM

Don't worry, HDR is on its way!

Musicvid wrote on 1/6/2017, 9:06 PM

Jam_One,

Go through your device chain, and turn OFF "Dynamic Contrast" or its equivalent setting in your:

Graphics card

Monitor

Players and playback devices

HD teevee

Post back with new results, if any.