Been reading the extensive threads from Nick, Glenn et al (just who IS "Al" anyway..?), can some kind soul gently remind me "why" YT does the clamp?
YT doesn't clamp or clip anything. You can upload 0-255 to YT and download the file as re-encoded by YT and get back the same levels intact.
What happens though is your display driver will most likely convert 16-235 to 0-255 so it is displayed correctly on your monitor. In the process anything outside 16-235 is simply clipped to 0 or 255. There's a gotcha here though: through the control panel of at least the nVidia cards you can change this.
Why all this?
Well, it's all meant to work simply. If you digitised, say, a VHS tape and simply uploaded it to YT, the levels would be 16-235. The display driver detects that video is being displayed in a certain part of the screen when you view something from YT and applies a transform so that black in the video matches black from all the graphics on the rest of the screen. It does the same at the white end, with everything in between set up to match.
et al. = and others. This abbreviation should always be written with a period after al, which is short for alii, just as etc. (et cetera = and so on) is always written with a period, even when used in the middle of a sentence. BTW, et cetera is pronounced phonetically and not ex-setra, as news commentators and politicians do.
Ever noticed that your online videos have more contrast than the ones you uploaded? Are the highlights "blown out" to pure white? Is detail in the shadows getting lost and becoming flat black? This is because for many video formats you need to adjust the levels of your video before you upload it.
May I jump in?
It's not getting 'clamped' as such; think of it as 'expanded'.
And it's not YT or Vimeo, it's any player connected to an RGB display. That includes local player applications like Windows Media Player and AVS, for example. You don't even need to upload anything to see this happen. DVD and Blu-ray players are included.
So, 16 (legal black) gets expanded down to 0, and 235 (legal white) expands up to 255.
Stands to reason, if you have a level of say 10, the expansion pushes it below 0 and it gets clipped back to 0 - you see nothing, so you may think it's being clamped to 0, but really it's been expanded past 0. Obviously you can't go below 0, minus is not allowed, and you can't go above 255 in an 8-bit colour space.
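To put numbers on it, here's a rough Python sketch of that expansion (illustrative only, not actual driver code):

```python
# Rough sketch of the Studio -> Computer RGB expansion described above.
def expand(level):
    """Map studio-levels 16-235 onto full-range 0-255."""
    return (level - 16) * 255.0 / (235 - 16)

print(expand(16))    # 0.0    legal black lands on 0
print(expand(235))   # 255.0  legal white lands on 255
print(expand(10))    # about -7, below 0, so it gets clipped to 0
print(expand(240))   # about 261, above 255, so it gets clipped to 255
```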
Bottom line: ALL output from Vegas or any other NLE needs to be within legal 16-235 for correct playback on YT, Vimeo, DVDs, BRs, local players, the lot. It's the players themselves, not the online service, that do the expansion. Studio -> Computer RGB expansion, as it's known. Any better?
Paul.
Okay, since you can't go below 0 or above 255 and you upload a file with 0-255, then doesn't that mean that nothing will change? It will look like what you uploaded? It will only change if you upload 16-235.
That problem starts with Vegas not displaying your video correctly in the first place.
Say, not knowing better, you try to fix your video; then it looks right in Vegas but it's wrong.
Once you upload it to YT and then watch it back on your PC exactly what Nick describes happens.
Or if you've shot with some of the vDSLRs, it looks right in the preview monitor but comes back at you from YT wrong, with the same effect Nick describes.
As Glenn has said many times, Vegas is fundamentally broken.
I use a secondary display monitor and use Vegas's control for that device so I do see my video correctly. Within the calibration limits of my office PC's monitor what I see back from YT matches what I'd seen on my monitor. I (think) I know what I'm doing but heaven help those who just bought Vegas and expect it to be easy.
I agree with what Glenn has also said many times. We should be banging on SCS's door to get this fixed, but I do worry that it's so late in the day that having it fixed now would create as much confusion for those who've just lived with it.
@Dave That's the point, your 0-255 file will 'play' wrong - with higher contrast than intended.
Because anything below 16 will expand to 0 and anything above 235 will expand to 255. Which is wrong.
The only way to defeat this expansion is to present the levels correctly in the first place, i.e. within the 16-235 limits.
Grazie, YouTube utilizes the Flash Player on your system. That's where the levels mapping takes place. Bob mentioned that you can download a video from YouTube and it will be at the same levels as you sent up.
When a player is given a video, it looks for flags. If it sees nothing, or RGB flags, it should expect and play back 0-255 levels.
When it sees a YUV flag (really 601/709), it expects TV levels (16-235) and decodes to 0-255. Except hardware players will still decode to 16-235 for analog devices and sets.
There are exceptions, of course, but the vast majority of players work this way.
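In rough Python terms, the flag handling described above amounts to something like this (a sketch of the behaviour, not any real player's source):

```python
# Sketch of the flag-driven decode described above (illustrative only).
def decode_for_rgb_display(level, flag):
    if flag in ("601", "709"):   # YUV-flagged: player assumes studio levels
        expanded = (level - 16) * 255.0 / (235 - 16)
        return min(max(round(expanded), 0), 255)   # clip what falls outside
    return level                 # no flag / RGB flag: pass through as 0-255

print(decode_for_rgb_display(16, "709"))   # 0    legal black
print(decode_for_rgb_display(235, "709"))  # 255  legal white
print(decode_for_rgb_display(128, None))   # 128  full-range file, untouched
```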
It's a legacy consideration, but the handling is unlikely to change in our lifetimes. I was very careful in my tutorial to point out that the players expand the levels, but that has been misinterpreted as if YouTube is doing something.
@DaveT2, the players are dumb. They don't know the levels and don't scan for them. All they can see is how the video is flagged. If the levels don't square with the flag, stuff happens.
"Is there an International agreement for those 16<>235 figures"
Yes (ITU-R BT.601/709 define them), and it's 0 IRE for 16 and 100 IRE for 235.
But as I said, all the expansion is taking place in our players only. It's not a Flash / web issue at all. ALL players do this, including but not limited to computer player apps, DVD and BR players.
To quote myself: all output from Vegas or any other NLE should be within 16-235 (0 IRE to 100 IRE) in order to play back correctly on an RGB display.
Legacy? Not really; composite video is still widely used today in lots of applications, and these video levels are still in use there. Probably not for long though.
Vegas isn't broken. Almost any compositing and finishing system (like Nuke or Smoke) works on a 32-bit floating-point base where 0.0 (float) = 0 (8 bit) = 0 (10 bit) = 0 (12 bit) = black and 1.0 (float) = 255 (8 bit) = 1023 (10 bit) = 4095 (12 bit) = white. Also, Grade 1 monitors for compositing, color correction and grading should usually be adjusted to be capable of displaying full-range luminance levels (with a switch for the 0.063-0.922 range).
If you want cRGB to sRGB preview adaptation in Vegas Pro, all you need is to use external monitoring, and for output use a filter. It's as simple as that.
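For the output side, the filter is just the inverse of the players' expansion. A minimal sketch of the maths (not the actual Vegas Levels FX code):

```python
# Sketch of a cRGB -> sRGB (computer -> studio) output levels filter.
def computer_to_studio(level):
    """Compress full-range 0-255 into legal 16-235 before rendering."""
    return round(16 + level * (235 - 16) / 255.0)

print(computer_to_studio(0))     # 16   video black
print(computer_to_studio(255))   # 235  video white
print(computer_to_studio(128))   # 126  mid grey lands near the centre
```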
There is an immense amount of confusion about how exactly Vegas handles video levels. Other NLEs do not have this problem; they just work. Simple as that.
Here is one example of a thread that seems simple at first, then gets crazy with workarounds: black (16) tracks on the timeline, text and graphics with 235 peak whites, the Sony Levels FX that does not always stop your levels going illegal... and more. Add to that multiple camera sources and you're in trouble.
It's true, Vegas is not broken... not in the strictest sense, because it never handled video levels correctly in the first place, back at V1, therefore it cannot be 'broken'!
It's true though: if you know all the workarounds, and it's not just a cRGB to sRGB filter, then yes, you can force Vegas to behave. But my heart goes out to the new guys... It's a complete mess. Been there, done that, etc.
It's not broken, and Vegas Pro actually handles video levels correctly. Maybe not in the way many would expect, but correctly anyway. You expect Vegas Pro to process levels only to adapt the final preview. But the basis of Vegas's level processing is to maintain the maximum quality available from the given levels. Again, most of the high-end systems work the same way. If you need high-quality mattes, corrections and gradings, it's just a no-go to use systems whose processing is based on a 0.063-0.922 range (which equals 16-235 in 8 bit).
I'm well aware there are very different points of view on this topic. But I and many others very much appreciate the way Vegas Pro processes video levels.
You are perfectly entitled to your opinion Marco, there is no issue with that whatsoever. If the video levels in Vegas seem correct to you, then job done.
But, as you are aware, many of us would very much disagree with that opinion.
Feel free to read that thread I posted. And search for others; there are many confused posts from users asking similar questions... Why are my blacks crushed on playback? Why are my whites peaking? What should my waveform monitor options be set to anyway? Indeed this very YT thread is a direct result of the lack of coherence about Vegas video methods and levels. Result -> confusion. And quite a lot of incorrect videos being produced by Vegas users without them even knowing it.
I feel this should change. The subject of incorrect Vegas video levels has been around the block more times than I care to count... Come on.
I doubt you would want SCS to change the Vegas internal video level processing which e.g. would break the whole compositing math. I think the only optimization needed is a couple of clearer options for output (preview and rendering). It's all there but it might be made easier to use.
No one wants or needs to break anything. If implementations are done right, we all win. We can have correct rendering levels and previews without breaking the compositing maths. Granted, it's not that simple. Nevertheless, it should be done.
Yet again, it's up to SCS to listen to users and act.
"I doubt you would want SCS to change the Vegas internal video level processing which e.g. would break the whole compositing math"
The compositing math is already broken.
Black = 0 (0% IRE)
White = 1 (100% IRE)
Mid Grey = 0.5 (50% IRE)
Try doing something as simple as additive compositing in Vegas and you get the wrong answer. Video black plus video black should give video black. It doesn't if you do it in Vegas; you get grey.
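Here's the arithmetic as a quick Python sketch (illustrative values only):

```python
# Additive blend of two "black" pixels under each convention.
full_range_black = 0
studio_black = 16

print(min(full_range_black + full_range_black, 255))  # 0  -> stays black
print(min(studio_black + studio_black, 255))          # 32 -> a visible grey
```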
It gets worse when an alpha channel is involved. Over the years here I've seen users post compositing work done with Vegas where they've used clouds and fog from stock libraries, and there are grey fringes on the clouds and fog that should not be there.
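The fringing follows from the same offset. A minimal sketch of a premultiplied "over" composite, assuming studio-levels sources (the function is illustrative, not Vegas's actual compositor):

```python
# Premultiplied over: out = fg + bg * (1 - alpha).
# Premultiplied maths assumes fully transparent foreground is 0;
# if "black" is stored as 16, that 16 leaks in as a grey fringe.
def over_premult(fg, alpha, bg):
    return fg + bg * (1.0 - alpha)

print(over_premult(0, 0.5, 0))    # 0.0  full-range black: edge stays black
print(over_premult(16, 0.5, 16))  # 24.0 studio black: edge lifts above 16
```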
In an attempt to fix this, when SCS introduced a 32-bit float mode they sort of got it right. The "sort of" part is that Vegas wasn't applying the correct transform back when the values went to the encoder. As a result the encoded output from Vegas was quite different between 8-bit integer and 32-bit float. This added another dimension to the previous level of chaos and confusion.
Just recently I've noticed another problem.
For many years Vegas would let me use Generated Media with RGB values of 16,16,16. That gave me correct video black (0% IRE), and Vegas would display it correctly in the waveform monitor as 0. Now in V11 and V12 the waveform monitor shows it as 1% IRE. Perhaps again they've tried to work around a fundamental problem and broken something else.
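For what it's worth, the expected reading is easy to check against the 16 = 0 IRE, 235 = 100 IRE mapping mentioned earlier (a quick sketch, nothing Vegas-specific):

```python
# IRE reading a studio-levels waveform monitor should report.
def ire(code):
    return (code - 16) / (235 - 16) * 100.0

print(ire(16))   # 0.0   -> Generated Media at 16,16,16 should read 0 IRE
print(ire(235))  # 100.0
```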
I previously said Vegas is "fundamentally broken".
As I see it the Y'CbCr <> RGB conversion is a fundamental component of Vegas, and it is wrong; it is broken. Therefore the system is fundamentally broken. When a system has a fundamental problem, generally the way to fix the system is to fix that problem; otherwise any attempt to fix it by working around it leads to even more problems that cascade through the system.
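For reference, here is what the standard 8-bit studio-swing conversion that "everyone else" uses looks like (the published BT.601 coefficients, with R, G, B normalised to 0.0-1.0):

```python
# Standard BT.601 8-bit studio-swing RGB -> Y'CbCr (published coefficients).
def rgb_to_ycbcr_601(r, g, b):
    y  =  16.0 +  65.481 * r + 128.553 * g +  24.966 * b
    cb = 128.0 -  37.797 * r -  74.203 * g + 112.0   * b
    cr = 128.0 + 112.0   * r -  93.786 * g -  18.214 * b
    return y, cb, cr

print(rgb_to_ycbcr_601(0.0, 0.0, 0.0))  # (16, 128, 128): black at Y' = 16
print(rgb_to_ycbcr_601(1.0, 1.0, 1.0))  # (235, 128, 128): white at Y' = 235
```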
Of course what Vegas or any NLE does internally doesn't matter at all; so long as the user is never exposed to the internals it doesn't matter. There is a possibly good technical argument for why Vegas does that conversion differently to everyone else. That is a much more technical discussion that I really lack the knowledge to dive into. All I can say is that Vegas does have a very good reputation for having the lowest loss when 8-bit video is round-tripped through its pipeline.
The problem though is that the user is exposed to the internals, and the result is that Vegas is by no means a simple application to use and get the correct outcome from.