Does water freeze at 0 or 32? The answer is that it freezes at both: 0 °C or 32 °F. People on this forum constantly spread misinformation because they don't pay attention to the context.
Projectors and TVs generally expect proper video levels. (There are some bizarre cases where this may not be the case, but it's probably not relevant.)
What you need to do is to render proper levels out of Vegas.
In Vegas, some codecs expect 16-235; other codecs expect 0-255. You need to feed each codec the levels that it expects. Then it will produce proper levels. (Unfortunately, Vegas doesn't make it obvious which codecs expect what. If Vegas handled levels conversions for you, you wouldn't need to know.)
2- Huh? Is Adobe or Vegas correct?
A simplified way of looking at things:
Adobe is correct.
Sometimes Vegas is correct, sometimes it isn't. It's a mess.
3- My DSLR records video @ 0-255 (according to Canon)
This may be technically correct, but it's misleading.
Your camera records in Y'CbCr (0-255).
This gets converted to RGB.
Y'CbCr and RGB are not the same!!! Just because the material is 0-255 Y'CbCr to begin with does not mean that it will decode to 0-255 RGB. It really depends on what codec is being used. In Vegas, it may decode to 0-255 RGB or it may decode to 16-235 RGB.
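To illustrate that last point, here is a toy greyscale sketch in Python. The function names are invented for the sketch (nothing in Vegas is called this), and real decoders apply full BT.601/709 matrices to all three channels, but the arithmetic shows why the same Y'CbCr data can land at either 16-235 RGB or 0-255 RGB depending on the decoder:

```python
def decode_direct(y):
    # Decoder A: maps luma straight to RGB with no scaling.
    # Studio-range video (black=16, white=235) comes out as 16-235 RGB.
    return y

def decode_scaled(y):
    # Decoder B: stretches the 16-235 studio range out to 0-255 RGB.
    return max(0, min(255, round((y - 16) * 255 / 219)))

video_white = 235  # legal "video white" luma in studio-range Y'CbCr
print(decode_direct(video_white))  # 235 -> "16-235 RGB" in Vegas terms
print(decode_scaled(video_white))  # 255 -> "0-255 RGB" in Vegas terms
```

Same source data, two different RGB results: that is why "my camera records 0-255" doesn't tell you what you'll see on the Vegas timeline.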
In other NLEs, you don't worry about this stuff because the NLE handles it for you.
Here is one way that you can ensure proper levels.
- ONLY render to the Sony MPEG 4 codec. Using a different codec can change everything.
- Do not use 32-bit floating point.
I could be wrong, but your camera footage ought to come in as 16-235 RGB. (EDIT: I think I spoke too quickly here; this may not be true.) If you render your project out without doing anything to your footage, your levels will be correct.
16 is black. 235 is white.
Your Video Preview window will be wrong. What you see is *not* what you get. There are workarounds.
When you render your project out, your levels will be correct. If you play back that video in something like Quicktime, Windows Media Player, etc. your levels should look correct. (Unless the software screwed up or your video card settings were changed, which is rare.)
Glenn, based on the results of my survey, Canon "video" camera footage will come in as 16-235 but Canon "stills plus video" camera footage (including dSLRs) will come in as 0-255.
Another interesting thing I discovered the other day: the YouTube TV app on my Samsung TV doesn't handle highlights properly. Using my quick-check video, the panels on the right are all black, but the panels on the left show 3 different shades. Result = wishy-washy video. The YouTube app on my Samsung phone, however, does behave itself, and as a result YouTube videos look much better there.
This target, which I created 10 years ago for use with stills, may be of some use: place it in the timeline, see how it looks after rendering, and see how it appears on a TV or projector. Feel free to download and save it for your own use. The profile is Adobe RGB (1998). If you bring it into Photoshop and use the information tool to examine it, you should see the appropriate RGB values for each ring even if you cannot see them differentiated onscreen:
My post asks "Are all US projectors & TVs RGB 16-235?" It seems that the answer is "no, some are 0-255". Apparently the "one size fits all" (16-235) approach does NOT apply.
They are not 16-235 RGB and they are not 0-255 RGB. Generally speaking:
A- Projectors have different inputs. Some of these inputs are for digital signals, others are for analog signals. They expect different formats. The analog formats do not have 1s and 0s so the whole concept of 16-235/0-255 doesn't exist for those formats.
B- Standards define correct levels. There is only one set of correct levels for a given format.
C- Most equipment follows standards.
D- Assuming that C holds true (there are very rare cases where it doesn't), then you simply need to make a video file in Vegas that has proper levels.
In Vegas, many of the codecs expect 16-235 RGB levels. But not all of them. The most common exception is Quicktime- in a Vegas context, most but not all Quicktime codecs expect 0-255 RGB levels.
Vegas' settings can change how the codecs behave (simple solution: don't use 32-bit floating point). Some oddball codecs like Cineform (and perhaps Raylight) have settings to change their levels.
Some things in Vegas decode to 0-255 RGB levels. If you put .jpg and .png files into Vegas, they will decode to 0-255 RGB levels.
Many but not all video codecs will decode to 16-235 RGB levels.
You should convert everything to either 0-255 RGB or 16-235 RGB.
Suppose you want to convert everything to 16-235 RGB.
To do this, apply the computer RGB to studio RGB preset in the color corrector to all your .png, .jpg, and other things that decode to 0-255 RGB levels.
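That preset is just a linear squeeze of the full range into the studio range. A one-line sketch of the arithmetic in Python (the function name is mine, for illustration only):

```python
def computer_to_studio(v):
    # Compress full-range 0-255 into the studio 16-235 range:
    # 0 -> 16 (video black), 255 -> 235 (video white)
    return round(16 + v * 219 / 255)

print(computer_to_studio(0))    # 16
print(computer_to_studio(255))  # 235
```

Applied to a 0-255 RGB still, every pixel ends up inside 16-235, matching the footage that already decodes to studio levels.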
Well, good heavens, GARoss - the process you are using to take acquired footage and change it into a file that you push into the HDMI pipeline has to be fully detailed.
Is your NLE Vegas Pro 13? Are the files on your timeline edited with the 8-bit video setting for the project? What digital intermediates do you use to render intermediate steps for compositing layers? How many generations does your digital intermediate go through before coming out of the NLE as a rendered file for HDMI transport? Is your connection to HDMI transport a Blu-ray optical disc, a USB stick, or something else? More and more info is required to evaluate your proposition completely.
I find the lash-up you describe very weird. Why do you have an Ethernet connection to a set-top optical disc player? What is this pipeline for, and what does it transport? Are you just testing a network connection with a .jpg file? That really has nothing to do with what this thread should be about.
Record your .jpg onto a Blu-ray disc with DVD Architect to get Blu-ray media, then disconnect from the PC and play back the optical disc over HDMI transport to the projector.
A .jpg source file is a whole lot different from video media formats - why are you not using video?
FYI - an Oppo-brand Blu-ray player is problematic as a sole test-playback machine - never ever trust an Oppo result withOUT confirmation from another brand, like Sony or Panasonic, playing the same media disc.
As for Bob's comment below about HDMI between computer and projector - I am under the impression that you do not have, and have not been using, such a connection for transport.
[I]"I use the projector's HDMI input which, I believe, is digital. It appears the projector is 0-255 compatible. All I want is to render video correctly for the projector's capabilities.
My DSLR records 0-255 mov files. It would seem that rendering with Levels/ computer RGB to studio RGB applied would keep the mov files 0-255. Is this correct?"[/I]
Your projector's HDMI inputs almost certainly expect digital signals from 0 to 255, but that is somewhat irrelevant.
Your video is going to be played out to the projector's HDMI inputs by software on a computer or by a DVD player. Those devices / software expect levels in the 16-235 range for video; they will convert the 16-235 to 0-255.
On the other hand, if you view some still images or graphics on the computer and send them to the projector's HDMI inputs, the software you use will not make any changes in levels.
[I]"Those devices / software expect levels in the 16-235 range for video"[/I]
They don't really do that.
They generally expect video files with proper levels. Internally, the data is stored in Y'CbCr format. When you render out of Vegas, the codec used for compression will convert from RGB to Y'CbCr. What you need to do is to make sure that you are providing the RGB levels that the codec is expecting: either 0-255 RGB or 16-235 RGB.
A DVD (or MPEG2 file) doesn't store any RGB values. It stores Y'CbCr values with video compression on top of that. From a Vegas user's perspective, there's no way to look at the Y'CbCr values. You just have to provide the RGB levels that your codec is expecting.
Sometimes 0-255 RGB will produce the right Y'CbCr values, sometimes 16-235 RGB will produce the right Y'CbCr values. It depends on the codec.
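To make that concrete, a toy greyscale sketch in Python (function names are mine, not anything in Vegas; a real codec applies full BT.601/709 matrices to all three channels). Feeding the wrong RGB levels to a codec produces wrong luma:

```python
def encode_expecting_studio(rgb):
    # Codec A assumes RGB is already 16-235; luma passes through unchanged
    return rgb

def encode_expecting_full(rgb):
    # Codec B assumes 0-255 RGB and compresses it into 16-235 luma
    return round(16 + rgb * 219 / 255)

black_studio, black_full = 16, 0
print(encode_expecting_studio(black_studio))  # 16 -> correct video black
print(encode_expecting_full(black_full))      # 16 -> also correct
print(encode_expecting_full(black_studio))    # 30 -> washed-out black (wrong input levels)
```

The last line is the classic "washed-out render" complaint: studio-level RGB fed to a codec that expected full-range RGB.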
I know that and you know that but saying that here only serves to confuse people.
I could also add that in most video card drivers it's configurable, but the defaults get it (sort of) right. I didn't say that because it adds another confusing element to the discussion, given the level it's at.
[I]"Sometimes 0-255 RGB will produce the right Y'CbCr values, sometimes 16-235 RGB will produce the right Y'CbCr values. It depends on the codec."[/I]
Or is it the decoder for the codec or the video card drivers???
Again, this is only serving to baffle people with science. KISS :)
I just feel that false simplifications can cause a lot of confusion and are more trouble than they're worth. You can get into a lot of trouble by building misconceptions on top of misconceptions.
This thread is an example. Suppose you hold the belief that "all US projectors & TVs are RGB 16-235". Then you get confused as to why Photoshop and Premiere work differently than Vegas. Looking at Photoshop and Premiere, you would get the impression that US projectors & TVs are 0-255 RGB. The reality is that neither statement is true.
I make videos that are going to either be played online, downloaded from online or played from an SD DVD.
Someone is going to use a computer online or offline or a TV with a USB drive (HD) or SD DVD or whatever they choose to use.
I need to put out a video that is the best possible 'one size fits all'. It does not actually fit all, but I have to do SOMETHING that will get me as close to that goal as possible.
So with that in mind, I use 16 to 235 to make sure there is no clipping on devices that would clip video outside that range, while still getting the best result on whatever else people use to play the video.
I have to settle for the best 'one size fits all' that I can. While not perfect, I do not get any complaints from those who watch my videos. In fact, I get a lot of compliments.
So what I do works and that is what I will stick with.
[I]"I just feel that false simplifications can cause a lot of confusion and are more trouble than they're worth. You can get into a lot of trouble by building misconceptions on top of misconceptions."[/I]
That's true however you can also lose your audience when the question is simple.
Why conflate the issue of "are all US projectors and TVs RGB 16-235" with what Vegas, Photoshop and Ppro are doing?
Here even the question itself is unduly complex, it should be "Are all projectors and TVs the same?" and the answer is "Yes", for the same reason that an inch is the same everywhere.
"it should be "Are all projectors and TVs the same?" and the answer is "Yes", for the same reason that an inch is the same everywhere."
Analog pipeline is 16-235. Or call it IRE, or whatever one wants.
HDMI will carry full range.
The question is about HARDWARE; not one bit of the rest of this software jabberwocky makes any sense whatsoever, and I thoroughly object to seeing yet another thread being sabotaged by a flood of errata.