VidMus wrote on 9/2/2014, 4:16 PM
To muddy the waters a bit, I have one TV that will show the entire 0 to 255 range and another one that will not. This is from a video rendered in Vegas, since I do not have Adobe.

Maybe Adobe is doing an auto-correction and you do not realize it?
TheHappyFriar wrote on 9/2/2014, 5:19 PM
Every projector I've seen seems to do 0-255, but I've never thrown a chart up there to check.

As far as I know our TV does too.
farss wrote on 9/2/2014, 5:22 PM
[I]"I'm also betting Adobe does nothing to the video, leaving everything to the user to figure out what they need."[/I]

That is not the general consensus at all when it comes to video levels. It's Vegas that does nothing leaving it to the user to figure out what they need. Your own results are consistent with that.

PeterDuke wrote on 9/2/2014, 7:02 PM
"please bare with me"

Sorry, but my mind ran on where it shouldn't. :)
musicvid10 wrote on 9/2/2014, 7:18 PM
All analog inputs (composite, component, s-video) are 16-235.
Digital can carry full range.
GlennChan wrote on 9/3/2014, 2:22 AM
Does water freeze at 0 or 32? The answer is that it freezes at 0° C or 32° F. People on this forum constantly spread misinformation because they don't pay attention to the context.


Projectors and TVs generally expect proper video levels. (There are some bizarre cases where this may not be the case, but it's probably not relevant.)

What you need to do is to render proper levels out of Vegas.

In Vegas, some codecs expect 16-235. Other codecs expect 0-255. You need to feed the codec with the levels that it expects. Then it will produce proper levels. (Unfortunately, it's unclear in Vegas as to which codecs expect what. If Vegas handled levels conversions for you, you wouldn't need to know.)

2- Huh? Is Adobe or Vegas correct?
A simplified way of looking at things:
Adobe is correct.
Sometimes Vegas is correct, sometimes it isn't. It's a mess.

3- My DSLR records video @ 0-255 (according to Canon)
This may technically be correct but misleading.

Your camera records in Y'CbCr (0-255).
This gets converted to RGB.
Y'CbCr and RGB are not the same!!! Just because the material is 0-255 Y'CbCr to begin with does not mean that it will decode to 0-255 RGB. It really depends on what codec is being used. In Vegas, it may decode to 0-255 RGB or it may decode to 16-235 RGB.

In other NLEs, you don't worry about this stuff because the NLE handles it for you.
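Glenn's point can be sketched numerically. This is a minimal Python sketch (grayscale only, hypothetical function names, chroma channels and BT.601/709 luma weighting omitted) of how the same 8-bit Y' sample can land at different RGB values depending on which range the decoder assumes:

```python
# Two ways a decoder can map an 8-bit Y' luma sample to RGB (grayscale case).
# Which mapping a given Vegas codec uses determines whether footage
# "comes in" at 16-235 RGB or 0-255 RGB.

def decode_to_studio_rgb(y):
    """No level expansion: Y' 16 stays at RGB 16, Y' 235 stays at RGB 235."""
    return y

def decode_to_computer_rgb(y):
    """Level expansion: Y' 16-235 is stretched out to RGB 0-255."""
    return round((y - 16) * 255 / 219)

print(decode_to_studio_rgb(235))    # 235 -- "studio RGB" white
print(decode_to_computer_rgb(235))  # 255 -- "computer RGB" white
```

Same pixel, two different RGB results, and neither decoder is "wrong" in isolation; the problem starts when the rest of the chain assumes the other convention.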
GlennChan wrote on 9/3/2014, 2:48 AM
To try to solve the original poster's issue:

Here is one way that you can ensure proper levels.
- ONLY render to the Sony MPEG 4 codec. Using a different codec can change everything.
- Do not use 32-bit floating point.

I could be wrong but your camera footage ought to come in as 16-235 RGB. (EDIT: I think I spoke too quickly here. This may not be true.) When you render your project out without doing anything to your footage, your levels will be correct.

16 is black. 235 is white.

Your Video Preview window will be wrong. What you see is *not* what you get. There are workarounds.

When you render your project out, your levels will be correct. If you play back that video in something like Quicktime, Windows Media Player, etc. your levels should look correct. (Unless the software screwed up or your video card settings were changed, which is rare.)
GlennChan wrote on 9/3/2014, 2:55 AM
All analog inputs (composite, component, s-video) are 16-235.

The problem that I have with your statement is that it will get people into trouble. You are misinforming people.

Whether your video will ultimately end up going into an analog or digital input is not relevant, yet you are suggesting otherwise.
(*Ignoring the 7.5 IRE setup issue, which probably isn't relevant.)
musicvid10 wrote on 9/3/2014, 5:14 AM
Absolutely not.
I was referring only to hardware, and it's the law.
NickHope wrote on 9/3/2014, 5:17 AM
Glenn, based on the results of my survey, Canon "video" camera footage will come in as 16-235 but Canon "stills plus video" camera footage (including dSLRs) will come in as 0-255.

Another interesting thing I discovered the other day was that the YouTube TV app on my Samsung TV doesn't behave properly in terms of highlights. Using my quick check video, the panels on the right are all black, but the panels on the left show 3 different shades. Result = wishy-washy video. The YouTube app on my Samsung phone, however, does behave itself, and as a result YouTube videos look much better.
Peter Riding wrote on 9/3/2014, 10:47 AM
This target which I created 10 years ago for use with stills may be of some use if you place it in the timeline / see how it looks after rendering / see how it appears on a TV or projector etc. Feel free to download and save it for your own use. The profile is adobeRGB1998. If you bring it into Photoshop and use the Information tool to examine it you should see the appropriate RGB values for each ring even if you cannot see them differentiated onscreen:

GlennChan wrote on 9/3/2014, 11:49 AM
My post asks "Are all US projectors & TVs RGB 16-235?" It seems that the answer is "no, some are 0-255". Apparently the "one size fits all" (16-235) approach does NOT apply.

They are not 16-235 RGB and they are not 0-255 RGB. Generally speaking:
A- Projectors have different inputs. Some of these inputs are for digital signals, others are for analog signals. They expect different formats. The analog formats do not have 1s and 0s so the whole concept of 16-235/0-255 doesn't exist for those formats.
B- Standards define correct levels. There is only one set of correct levels for a given format.
C- Most equipment follows standards.
D- Assuming that C holds true (there are very rare cases where it doesn't), then you simply need to make a video file in Vegas that has proper levels.

In Vegas, many of the codecs expect 16-235 RGB levels. But not all of them. The most common exception is Quicktime: in a Vegas context, most but not all Quicktime codecs expect 0-255 RGB levels.

Vegas' settings can change how the codecs behave (simple solution: don't use 32-bit floating point). Some oddball codecs like Cineform (and perhaps Raylight) have settings to change their levels.

Part 2:
Some things in Vegas decode to 0-255 RGB levels. If you put .jpg and .png files into Vegas, they will decode to 0-255 RGB levels.
Many but not all video codecs will decode to 16-235 RGB levels.

You should convert everything to either 0-255 RGB or 16-235 RGB.
Suppose you want to convert everything to 16-235 RGB.
To do this, apply the computer RGB to studio RGB preset in the color corrector to all your .png, .jpg, and other things that decode to 0-255 RGB levels.
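The "computer RGB to studio RGB" conversion is just a linear rescale of each channel. This is a quick sketch of the math (not Vegas code, just the mapping the preset applies):

```python
def computer_to_studio_rgb(v):
    """Compress a computer RGB value (0-255) into studio RGB (16-235)."""
    return round(v * 219 / 255 + 16)

print(computer_to_studio_rgb(0))    # 16  -- black
print(computer_to_studio_rgb(255))  # 235 -- white
print(computer_to_studio_rgb(128))  # 126 -- mid-gray shifts slightly down
```

The reverse preset ("studio RGB to computer RGB") is the inverse rescale: subtract 16, then multiply by 255/219.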
videoITguy wrote on 9/3/2014, 2:07 PM
Well, good heavens - GARoss, the process you are using to take acquired footage and change it into a file that you push into the HDMI pipeline has to be fully detailed.

Is your NLE Vegas Pro 13? Are the files on your timeline edited with an 8-bit project setting? What digital intermediates do you use to render intermediate steps for compositing layers? How many generations does your digital intermediate go through before coming out of the NLE as a rendered file for HDMI transport? Is your connection to the HDMI transport a Blu-ray optical disc, a USB stick, or something else? More info is required to assess your proposition completely.
videoITguy wrote on 9/3/2014, 3:25 PM
I find the lash-up you describe very weird. Why do you have an Ethernet connection to a set-top optical disc player? What is this pipeline for, and what does it transport? Are you just testing a network connection with a .jpg file? That is really not what this thread should be about.

Record your .jpg to a Blu-ray disc with DVD Architect, disconnect from the PC, and use your optical disc to play back over HDMI transport to the projector.

A .jpg source file is a whole lot different from video media formats - why are you not using video?

FYI - an Oppo brand Blu-ray player is problematic as a sole test machine - never rely on an Oppo withOUT confirmation from another brand like Sony or Panasonic playing the same media disc.

As for Bob's comment below about HDMI between computer and projector - I am under the impression that you do not and have not been using such a connect for transport.
farss wrote on 9/3/2014, 3:25 PM
[I]"I use the projector's HDMI which, I believe, is digital. It appears the projector is 0-255 compatible. All I want is to render video correctly for the projector's capabilities.

My DSLR records 0-255 mov files. It would seem that rendering with Levels / computer RGB to studio RGB applied would keep the mov files 0-255. Is this correct?"[/I]

Your projector's HDMI inputs almost certainly expect digital signals from 0 to 255, but that is kind of irrelevant.
Your video is going to be played out to the projector's HDMI inputs by software on a computer or by a DVD player. Those devices / that software expect levels in the 16-235 range for video; they will convert the 16-235 to 0-255.
On the other hand, if you view some still images or graphics on the computer and send them to the projector's HDMI inputs, the software you use will not make any changes to levels.


GlennChan wrote on 9/3/2014, 4:09 PM
Those devices / software expect levels in the 16-235 range for video
They don't really do that.

They generally expect video files with proper levels. Internally, the data is stored in Y'CbCr format. When you render out of Vegas, the codec used for compression will convert from RGB to Y'CbCr. What you need to do is to make sure that you are providing the RGB levels that the codec is expecting: either 0-255 RGB or 16-235 RGB.

A DVD (or MPEG2 file) doesn't store any RGB values. It stores Y'CbCr values with video compression on top of that. From a Vegas user's perspective, there's no way to look at the Y'CbCr values. You just have to provide the RGB levels that your codec is expecting.

Sometimes 0-255 RGB will produce the right Y'CbCr values, sometimes 16-235 RGB will produce the right Y'CbCr values. It depends on the codec.
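The encode side can be sketched the same way. Here is a grayscale-only Python sketch (hypothetical function names, chroma and luma weighting omitted) of why feeding a codec RGB in the range it does not expect shifts the Y' values it writes:

```python
def encode_assuming_studio_rgb(r):
    """A codec expecting studio RGB copies 16-235 straight into Y' 16-235."""
    return r

def encode_assuming_computer_rgb(r):
    """A codec expecting computer RGB compresses 0-255 into Y' 16-235."""
    return round(r * 219 / 255 + 16)

# Studio-RGB black (16) fed to each kind of codec:
print(encode_assuming_studio_rgb(16))    # 16 -- correct video black
print(encode_assuming_computer_rgb(16))  # 30 -- raised black, washed-out look
```

The file format itself only ever sees the resulting Y'CbCr values; whether your source RGB was "right" depends entirely on which of these two conversions the codec performs.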
GlennChan wrote on 9/3/2014, 4:35 PM

Ok, in Vegas you are rendering to Sony AVC mp4. That codec expects black to be at 16 RGB and white to be at 235 RGB.

When Vegas decodes a JPEG file, it will put black at 0 and white at 255 RGB.

So what's happening is that the mpeg4 encoder assumes that black is at 16 RGB, while your JPEG is putting black at 0 RGB. This is why you see clipping.

As you've figured out, applying the computer RGB to studio RGB preset in the Levels FX will get rid of this behaviour. It turns 0 RGB into 16 RGB, and 255 RGB into 235 RGB.
GlennChan wrote on 9/3/2014, 4:45 PM
Your DSLR records video and puts it into a .mov file?
If so, Vegas may be using Quicktime to decode that file.
If so, it will probably come in at 0-255 RGB.

If you render out to the Sony AVC mp4 codec, then apply the "computer RGB to studio RGB" preset. (I could be wrong because I haven't worked with that footage.)

2- Premiere should give you the correct output automatically without any manual intervention. Throw your DSLR footage into it and render out a mp4 file. That's what your Vegas output should look like.

If you mix other formats in your projects, you can repeat this test. If you mix formats in Vegas, you may need to manually apply levels conversions to all clips from one source.
farss wrote on 9/3/2014, 5:18 PM
[I]"They don't really do that."[/I]

I know that and you know that but saying that here only serves to confuse people.
I could also add that in most video card drivers it's configurable, but the defaults get it (sort of) right. I didn't say that because it adds another confusing element to the discussion, given the level that it's at.

[I]"Sometimes 0-255 RGB will produce the right Y'CbCr values, sometimes 16-235 RGB will produce the right Y'CbCr values. It depends on the codec."[/I]

Or is it the decoder for the codec or the video card drivers???

Again this is only serving to baffle people with science, KISS :)

GlennChan wrote on 9/3/2014, 7:46 PM
I just feel that false simplifications can cause a lot of confusion and are more trouble than they're worth. You can get into a lot of trouble by building misconceptions on top of misconceptions.

This thread is an example. Suppose you hold the belief that "all US projectors & TVs are RGB 16-235". Then you get confused as to why Photoshop and Premiere work differently than Vegas. Looking at Photoshop and Premiere, you would get the impression that US projectors & TVs are 0-255 RGB. The reality is that neither statement is true.
musicvid10 wrote on 9/3/2014, 10:56 PM
0-255 should work fine over HDMI.
No need to change your source levels.
You can always throw on a levels chart to check for clipping, though.
Here's one I made especially for Vegas users:

VidMus wrote on 9/4/2014, 12:37 AM

I make videos that are going to either be played online, downloaded from online or played from an SD DVD.

Someone is going to use a computer online or offline or a TV with a USB drive (HD) or SD DVD or whatever they choose to use.

I need to put out a video that is the best 'one size fits all' that I can. It does not actually fit all, but I have to do SOMETHING that will get me as close to that goal as possible.

So with that, I use 16 to 235 to make sure there is no clipping on devices that would clip video outside that range, while still getting the best result on whatever else people use to play the video.

I have to settle for the best 'one size fits all' that I can. While not perfect, I do not get any complaints from those who watch my videos. In fact I get a lot of compliments.

So what I do works and that is what I will stick with.

farss wrote on 9/4/2014, 3:12 AM
[I]"I just feel that false simplifications can cause a lot of confusion and are more trouble than they're worth. You can get into a lot of trouble by building misconceptions on top of misconceptions."[/I]

That's true; however, you can also lose your audience when the question is simple.

Why conflate the issue of "are all US projectors and TVs RGB 16-235" with what Vegas, Photoshop and Ppro are doing?
Here even the question itself is unduly complex. It should be "Are all projectors and TVs the same?" and the answer is "Yes", for the same reason that an inch is the same everywhere.

musicvid10 wrote on 9/4/2014, 8:38 AM
"it should be "Are all projectors and TVs the same?" and the answer is "Yes", for the same reason that an inch is the same everywhere."


Analog pipeline is 16-235. Or call it IRE, or whatever one wants.
HDMI will carry full range.

The question is about HARDWARE; not one bit of the rest of this software jabberwocky makes any sense whatsoever, and I thoroughly object to seeing yet another thread being sabotaged by a flood of errata.

[Edited for decency]