See, no ambiguity, worldwide (it's to keep our analog teevees from exploding!)
Gaross asked a simple HARDWARE question, which deserves a simple answer (Bob's will do nicely).
"Applying Levels / Computer RGB to Studio RGB means 0-255 is recalculated to fit within 16-235 otherwise 1-16 & 235-255 would be crushed."
Because my Sony cameras shoot 16 to 255, I use Sony levels to keep the upper portions of the video from being crushed. This gives me details that would normally be lost and helps prevent white tops on bald heads in poorly lit churches. The quality of the video is much better this way.
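For anyone who wants to see what that recalculation actually does to the numbers, here is a minimal sketch in Python (remap() is just my illustration, not anything built into Vegas):

[code]
# Minimal sketch of the linear remap being described (remap() is my own
# illustration, not anything built into Vegas).

def remap(value, in_low, in_high, out_low, out_high):
    """Linearly map 'value' from [in_low, in_high] onto [out_low, out_high]."""
    scale = (out_high - out_low) / (in_high - in_low)
    return round(out_low + (value - in_low) * scale)

# Computer RGB (0-255) to Studio RGB (16-235): nothing is crushed, just rescaled.
print(remap(0,   0, 255, 16, 235))   # -> 16
print(remap(255, 0, 255, 16, 235))   # -> 235

# Footage that records 16-255 (as described above) squeezed into 16-235,
# so the highlight detail above 235 is preserved instead of being clipped.
print(remap(255, 16, 255, 16, 235))  # -> 235
print(remap(240, 16, 255, 16, 235))  # -> 221
[/code]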
Even so, my Sony cameras have a tendency to overexpose a bit when using auto-exposure. So I always use an AE Shift of -1.0EV to bring the exposure down to where it belongs and make things look more like they actually are instead of unnaturally bright. Even my new HDR-CX900 does this. A dark room should not look like a bright room! Outdoor videos can look awful (areas of grass over 100 IRE, sky not blue) if I do not bring the exposure down a bit. The CX900 has better dynamic range than my PJ710s, so it is even better.
With theater videos, the bright lights and auto-exposure are guaranteed to give over-exposed faces on the performers if I do not use an AE Shift of -1.0EV. And even then, if one is not careful, it can still be over-exposed at times (from uneven lighting). An AGC limit of 0dB with an AE Shift of -1.0EV gives great results most of the time under the bright lights. Note: bright lights, not spotlights.
So I rescue a bit of bright detail and get a much more realistic video in the way of brightness. Note: different video shoots benefit in different ways when one does all of this, so it is all the more important to know one's camera(s) to get the best possible video.
You can bring up the levels of a lightly to moderately under-exposed video, but it can be extremely difficult or impossible to bring down an over-exposed video. So I err on the side of lower exposure.
The most common problem with getting video to look good on a projector is the projector not having enough light output. Beyond that and if one includes the latest LED "video wall" gizmos there's a whole universe of issues to discuss that includes real colour space issues.
"I just feel that false simplifications can cause a lot of confusion and are more trouble than they're worth. You can get into a lot of trouble by building misconceptions on top of misconceptions."
Instead of spreading false simplifications, give people simplifications that are true. There's no reason to make stuff up.
In Vegas, all you really need to know is how to make your rendered files have proper levels. Most of the time, if you convert everything in your project to 16-235 RGB, you will have proper levels.
That's a correct answer in 2 sentences.
---
Are all US projectors & TVs RGB 16-235?
The answer is no. It's easy to prove (in Vegas, render something to Sony DV and render something to Windows Media Video; upload to YouTube; clearly the levels are different). But who cares.
Yes, I have a friend who spent $20K on a home theater/studio space (before the cost of equipment). It is wonderful. Note that your projection distance is quite short in a well-controlled environment.
Bring your equipment into an existing 50' auditorium or sanctuary, project onto whatever scrim or wall is available, and try to cover all the stray light sources. I promise your (and your audience's) experience will be quite different.
We once did a stage production where an inexperienced set designer decided we would project a backdrop from a 12"x18" painting. She was so proud of her friend's artwork, we actually used it during the Overture ;?)
[I]"The answer is no. It's easy to prove (in Vegas render something to Sony DV and render something to Windows Media player; upload to Youtube; clearly the levels are different)."[/I]
This is true, but what has this got to do with video projectors?
DV and WMV are codecs, YouTube is a streaming video service, video projectors are display devices.
All video projectors the world over expect the same signals on their various inputs, be they composite, analogue component, digital component or serial digital. In fact, pretty much all display devices the world over expect the same signals on their inputs for the same relative light output; if there were any significant differences, the industry wouldn't exist.
Now, some display devices today also have a facility to play video files or display digital stills from storage devices such as thumb drives via a USB port. Yes, assuming the display device can play Sony DV and WMV files, the video levels fed to the encoder by whoever created those files have to match what the encoder/decoder expects, or the video will end up displayed incorrectly. But that's a discussion about codecs, not display devices. The display device is irrelevant to a discussion about codecs, and codecs are irrelevant to a discussion about display devices.
What most people want to know is how to make their final product appear correctly.
If somebody believes that "all US projectors and TVs are RGB 16-235", then that person can dig themselves a hole. Clearly, if you change everything in your editing program to 16-235 RGB, you won't always get a desired result.
"The display device is irrelevant to a discussion about codecs and codecs are irrelevant to a discussion about display devices."
What I was trying to point out is that misconceptions about how TVs/projectors work (and also some misconceptions about how cameras work) stem from misconceptions about how Vegas works.
"Yes, assuming the display device can play Sony DV and WMV files then the video levels fed to the encoder by whoever created those files have to match what the encoder / decoder expects or the video will end up displayed incorrectly"
That's exactly the point. If you assume that TVs are 16-235 RGB, you would erroneously assume that making everything 16-235 RGB will give you the correct levels. Which it won't.
It's true that most, though not all, digital formats are standardized around 16-235 Y'CbCr (if the format is 8-bit). But so many people on this forum leave out the Y'CbCr part, which leads some people to erroneously assume that these formats are standardized around 16-235 RGB. That's not the case.
The distinction is important. In Vegas, sometimes 16-235 Y'CbCr will convert to 16-235 RGB and sometimes it will convert to 0-255 RGB. Some/many DSLRs record 0-255 Y'CbCr. Sometimes this will convert to 16-235 RGB and sometimes this will convert to 0-255 RGB.
Knowing the Y'CbCr levels will not help you reliably predict what the resulting levels in Vegas will be. With Vegas, you want to know two things (see the sketch after this list):
A- What video codec is being used.
B- Whether or not the camera records illegal values above white (superwhites).
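To make the Y'CbCr vs. RGB distinction concrete, here is a rough numeric sketch (my own throwaway functions, not Vegas internals; chroma ignored, assuming a neutral gray). It shows how the same 8-bit luma value can land on two different RGB code values depending on which decode gets applied:

[code]
# Rough sketch (my own throwaway functions, not Vegas internals): the same
# 8-bit luma value can land on two different RGB code values depending on
# which decode a program applies. Chroma is ignored (assume neutral gray).

def clamp(v, lo=0, hi=255):
    return max(lo, min(hi, v))

def luma_to_studio_rgb(y):
    # "Studio RGB" style decode: 16-235 luma stays 16-235 RGB, untouched.
    return clamp(y)

def luma_to_computer_rgb(y):
    # "Computer RGB" style decode: 16 -> 0, 235 -> 255, superwhites get clipped.
    return clamp(round((y - 16) * 255 / 219))

for y in (16, 235, 255):   # black, reference white, superwhite
    print(y, "->", luma_to_studio_rgb(y), "or", luma_to_computer_rgb(y))
# 16 -> 16 or 0, 235 -> 235 or 255, 255 (superwhite) -> 255 or 255 (clipped)
[/code]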
[I]"What most people want to know is how to make their final product appear correctly."[/I]
Indeed and for a conversation about projectors the most pressing issue isn't video levels, that's already been done to death here and elsewhere.
I'm still wondering how one can encode a video file out of Vegas so that it'll send the correct format descriptor bits to a projector or TV so there's a chance the aspect ratio will be correct.
"B- Whether or not the camera records illegal values above white (superwhites)."
All of my Sony cameras record illegal values above white (super whites). So I adjust for that with my editing.
"If you assume that TVs are 16-235 RGB, you would erroneously assume that making everything 16-235 RGB will give you the correct levels. Which it won't."
If you know that your target device is going to be a certain TV, projector or whatever it is, then you can shoot for that goal.
My videos go out to who knows what device they will be played on so I choose the best possible 'one size fits all' of 16 to 235.
I get no complaints and I get a lot of compliments.
I spent many, many hours of prayer, along with trial, error and expense I really cannot afford, to get the great results I now get.
So I am not going to waste my time and rack my brain worrying about this. At least not until the industry does something to muddy the whole thing up some more.
"If you assume that TVs are 16-235 RGB, you would erroneously assume that making everything 16-235 RGB will give you the correct levels. Which it won't."
Sorry I wasn't clear: sometimes it will work and sometimes it won't. If you encode to a codec that expects 16-235 RGB levels, then your output will have proper video levels. If you encode to a codec that expects 0-255 RGB levels, then the output will be incorrect.
The 'one size fits all' solution is to render a file with proper video levels. Some people render Quicktime files (or use other editing applications), so this distinction is relevant to them. What works for you may not necessarily work for other situations. You're probably doing things correctly... it's just that Vegas is sometimes situational.
Anyways this discussion is going around in circles.
To muddy the waters a bit, I have one TV that will show the entire 0 to 255 range and another one that will not.
If you have a miscalibrated CRT TV then things might appear that way.
In general:
Some display devices can show illegal values. Some(/most) can't.
Some playback devices will clip illegal values, most won't.
For consistency in product, you can get rid of all illegal values in your output file. This way your audience won't see different things. It makes your video more "one size fits all".
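A minimal sketch of what "getting rid of illegal values" means numerically, assuming 8-bit studio-range output (whether you clip or rescale is a creative choice; the function names are mine):

[code]
# Minimal sketch (my own function names): two ways to remove illegal values
# from 8-bit studio-range video. A real Levels/Broadcast-Colors-style filter
# does this for you; this only shows the arithmetic.

LEGAL_MIN, LEGAL_MAX = 16, 235

def clip_illegal(code):
    """Hard-clip anything outside 16-235 (detail above white is discarded)."""
    return max(LEGAL_MIN, min(LEGAL_MAX, code))

def rescale_superwhites(code, source_max=255):
    """Squeeze 16-source_max into 16-235 so superwhite detail is kept."""
    scale = (LEGAL_MAX - LEGAL_MIN) / (source_max - LEGAL_MIN)
    return round(LEGAL_MIN + (max(code, LEGAL_MIN) - LEGAL_MIN) * scale)

print(clip_illegal(250), rescale_superwhites(250))   # 235 vs. 230
[/code]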
PCs seem to like 0-255.
I think you've drawn the wrong conclusion? Computer monitors, TVs, and projectors all like video signals that follow the relevant standards.
As a Vegas user, all you need to do is render a file with proper levels. You've figured out how to do that for your projector.
Take that same file and play it back in Windows Media Player, Quicktime, etc. Or upload it to YouTube. You will see that the levels will be correct when played back elsewhere.
-It's Vegas that plays back files in a weird, unintuitive way. MPEG2/4 can play back differently in Vegas depending on whether it came in as a properly-encoded AVI or a properly-encoded Quicktime.
-Arguably, Vegas doesn't follow standards. This is why it is unintuitive. I think this is why so much confusion exists.
Adobe quite possibly uses the 'fullrange=on' flag for 0-255 wrapped BT.709, which is by no means universally supported, and just serves to confuse, because no one knows it's there.
I know for certain that Vegas neither writes nor honors the flag.
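If you want to check what a given file is actually flagged as, one way (assuming you have ffmpeg's ffprobe installed; the filename below is just an example) is something like this:

[code]
# Sketch: ask ffprobe (ships with ffmpeg) what range a file is flagged as.
# "pc" means full range (0-255), "tv" means limited/studio range,
# "unknown" means the file carries no usable flag. "render.mp4" is just an
# example filename.
import subprocess

def reported_color_range(path):
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=color_range",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True)
    return out.stdout.strip() or "unknown"

print(reported_color_range("render.mp4"))   # e.g. "tv", "pc" or "unknown"
[/code]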
"To muddy the waters a bit, I have one TV that will show the entire 0 to 255 range and another one that will not.
If you have a miscalibrated CRT TV then things might appear that way."
I should have stated that the TV is an LCD type. It is the only TV I have ever seen that did this. The TV is literally 16x9: if one measures the screen, it is exactly 16 inches horizontally and exactly 9 inches vertically. The only other time I had an LCD display that showed the entire 0 to 255 range was a 7" monitor that was purposely designed that way.
My computer system is in the corner of my living room. All I need to do is burn a test DVD/Blu-ray with a test pattern and then go to the middle of the room and play it on my Blu-ray player. I then test to see what results I get. My gigantic 32" TV (LOL!) is properly calibrated, so I will know that my final results are what they should be. I also make a test burn of each project to be sure I am getting the expected results.
With Vimeo, I can upload a test pattern video and also test for the correct results. I can view my Vimeo videos on my TV via Roku and/or view them on my computer. My main computer monitor makes all videos look terrible because I deliberately mis-adjusted it so it is not too bright. The other monitor is for viewing the videos properly with studio to computer RGB applied. The main monitor has a lot of vertical room and is great for a lot of tracks.
To the OP,
As for projectors, the fun is with the numerous types of connections. SD, SVHS, HDMI and Ethernet, to name a few, all expect something, and IF one renders to a certain format using what IS expected, then one does not need to worry about what the projector uses.
So what is expected? What does *.mpg expect? What does *.mov expect? Know the answers to those and more, and get the results you want. As I do: I know what I am starting with, and I test to make sure that I end up with the same thing. That is why test patterns were created. They help one know that what the final display is showing matches the original source. If anything is different, there is a problem somewhere.
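For what it's worth, a quick level-check pattern is easy to make yourself. Here is a rough sketch that writes a full-range gray ramp as a still image (it assumes Python with numpy and Pillow installed; the filename is just an example):

[code]
# Rough sketch: write a full-range 0-255 horizontal gray ramp as a still image.
# Drop it on the timeline, render, and read it on the waveform/histogram to see
# exactly where 0, 16, 235 and 255 end up after your levels settings.
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 1280, 720
row = np.round(np.linspace(0, 255, WIDTH)).astype(np.uint8)   # left = 0, right = 255
ramp = np.tile(row, (HEIGHT, 1))                              # repeat the row down the frame
Image.fromarray(ramp, mode="L").save("gray_ramp_0-255.png")   # example filename
[/code]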
But I also believe playback components detect video files (perhaps with a flag similar to widescreen DVD) as 16-235. I suggest this but can't say it's so, yet it would seem likely, as Adobe mp4 & wmv files DO play back as 0-255. Why else would one mp4 file be crushed while an Adobe mp4 is not?
Adobe Premiere (like most NLEs) automatically handles the levels conversions between different formats.
Vegas doesn't.
What you're seeing is the improper conversion of a JPEG into MPEG4.
---
Anyways, it looks like you've figured out something that works so congratulations!
"What you're seeing is the improper conversion of a JPEG into MPEG4."
And yet all the other TVs I have display the same video properly. I have two CRTs, one 32" LCD and a number of 7" TVs in addition to the 19" LCD one. What I am seeing is a TV that CAN display video that has levels above 100 IRE AND below 7.5 IRE. That is above 235 and below 16. I did not use MPEG4 for the video.
I think I should try to do a tutorial on how to use the waveform monitor with the included NTSC and PLUGE test patterns. Also how to calibrate a TV with those test patterns.
Anyone working with an NLE without a properly calibrated monitor is asking for problems!
IRE doesn't really exist in digital, any more than the VU audio scale does.
So better to use a correct reference that doesn't create false ceiling space.
I know the temptation to employ a reference one is familiar with; it literally took me a year to wean myself from +4 VU, and the concept of a soft ceiling, which only exists in the analog mind.
"IRE doesn't really exist in digital, any more than VU audio scale.
So better to use a correct reference that doesn't create false ceiling space."
I still use the waveform monitor as a reference for my video levels and it works quite well for me.
I will continue with what works best for me even though the analog world is THANKFULLY mostly gone! I trained and learned in the analog world for many years, I learned to relate analog tools to digital, and when it is all said and done it works quite well for me.
Now what I wish is that SCS would update the waveform monitor to have the option of showing 0 to 255 with markers at 16 and 235 and whatever other numbers people here would like. I know there are those 'other' monitor types but for me, an old school person, it would be most beneficial to have the number options on the waveform monitor. I also WISH they would put a 7.5 IRE marker on the thing. Adobe does, so can SCS.
And by the way, there are still settings on the video cameras for zebra stripes at various 'IRE' levels. So like it or not, IRE is still there and it is not going away anytime soon!
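(For anyone who wants to translate between the two scales in the meantime, the arithmetic is simple enough. A rough sketch, assuming the usual 8-bit studio range, and treating the numbers as approximations rather than a spec quote:)

[code]
# Sketch of the usual analog-to-digital correspondence for 8-bit studio-range
# video: 0 IRE sits at code 16 and 100 IRE at code 235, so one IRE is about
# 219/100 code values. (NTSC's 7.5 IRE setup lands around code 32; treat these
# as approximations, not a spec quote.)

def ire_to_code(ire):
    return round(16 + ire * 219 / 100)

for ire in (0, 7.5, 100):
    print(ire, "IRE ->", ire_to_code(ire))   # 0 -> 16, 7.5 -> 32, 100 -> 235
[/code]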