external monitor vs. TV

Skipdesign wrote on 3/30/2005, 11:50 AM
I see the discussion about the need to use an external monitor when working with Vegas to make sure the output fits onto a TV screen. What is the advantage of using a monitor over hooking up a TV and using that? I use a TV when I shoot so I can see myself and know that I am centered, etc. But is there a reason that I should use a monitor? Buying a small TV seems more cost-effective. Please advise, and thanks.

Comments

craftech wrote on 3/30/2005, 11:53 AM
This is an old battle (do a search). I am of the TV school over the monitor. I use multiple TVs to decide upon color correction values. The monitor is necessary if your work will be broadcast. Otherwise you are likely to end up with washed out looking videos if you use one, particularly if you dub to VHS.

John
rs170a wrote on 3/30/2005, 11:54 AM
The two main advantages a monitor has over a TV set are underscan and a "blue gun only" switch.

Mike
Randy Brown wrote on 3/30/2005, 12:06 PM
I wonder if Skipdesign is talking about a "broadcast display monitor" or a regular CRT or LCD monitor?
Randy
Jsnkc wrote on 3/30/2005, 12:07 PM
My thinking, at least for the stuff that I do, is that people will be watching it on a TV and not a professional monitor, so what they will be watching it on is what I should calibrate for. I know there are many, many different schools of thought on this, so I know my method probably isn't the best.
Skipdesign wrote on 3/30/2005, 12:12 PM
Hi...Skip here...and I don't know the difference. I see monitors advertised in all the trade mags and I just knew that a TV was cheaper. If I am going to run into color, quality, etc. problems, that's what I need to know. If there is really no difference, then that is good knowledge too. I am new to editing. I shoot on a Canon GL2 and chose Vegas 5 after extensive reading. I don't want to edit my video and have the end product come out poor quality. Thanks for all the responses.
FrigidNDEditing wrote on 3/30/2005, 12:22 PM
All I know is that the quality of a cheap monitor is usually better than a TV. I've used both, and I almost don't like the monitors because the quality looks so nice there that the customer will see it on the monitor and be very happy - then they take it home and watch it there, or see it on TV, and think something went wrong, blah blah blah. There's usually no problem with that letdown if they see it on a normal TV first. Just been my experience.

Dave
Yoyodyne wrote on 3/30/2005, 12:31 PM
My take on this is that if you're workin' on the cheap, a decent TV will be OK most of the time. Just output color bars from Vegas and get one of those blue-eye filters (it acts as a poor man's blue-gun-only button) from somewhere. I think you can order them here:

https://secure.brandlocker.com/promos/thxpromo.cfm

This way you can check chroma, etc. Here is some good info (it's for home theater but still applies):

http://www.thx.com/mod/products/dvd/video.html

The biggest problem with cheap TVs is they don't hold PLUGE - the level of black varies depending on the amount of bright or dark information in the picture. This can be a problem; unfortunately, sets that are good at holding PLUGE tend to be more pricey. They also use a coarser phosphor, so resolution just isn't as sharp. You get what you pay for... if you're just looking for "close enough", a cheap set should do OK.
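
If you'd rather roll your own bars to burn to disc, here's a rough sketch in Python (using numpy and Pillow - my choice of tools, nothing Vegas ships with) that builds a plain 75% seven-bar frame. It's not a true SMPTE pattern (no PLUGE strip or reverse bars), just something to eyeball against:

# Rough 75% color bars for a 720x480 DV frame (a sketch, not a real
# SMPTE pattern -- just the seven bars at 75% amplitude).
import numpy as np
from PIL import Image

W, H = 720, 480
LEVEL = 191  # 75% of 255

# Bar order left to right: white, yellow, cyan, green, magenta, red, blue
bars = [(LEVEL, LEVEL, LEVEL), (LEVEL, LEVEL, 0), (0, LEVEL, LEVEL),
        (0, LEVEL, 0), (LEVEL, 0, LEVEL), (LEVEL, 0, 0), (0, 0, LEVEL)]

frame = np.zeros((H, W, 3), dtype=np.uint8)
bar_width = W // len(bars)
for i, color in enumerate(bars):
    frame[:, i * bar_width:(i + 1) * bar_width] = color
frame[:, len(bars) * bar_width:] = bars[-1]  # fill any rounding remainder

Image.fromarray(frame).save("bars_75.png")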

hope this helps
Jay Gladwell wrote on 3/30/2005, 12:31 PM

Dave, that's when you take a few minutes and educate the customer on the difference between what they see in the studio on the monitor and what they see at home. The way they have their home television adjusted is not your fault (or problem, when it comes right down to it).

It is a rank impossibility to correct video for every T.V. set out there. Hence, the need for a standard--NTSC. We correct for the standard and encourage the customer/consumer to do the same. That's all anyone can do!


FrigidNDEditing wrote on 3/30/2005, 12:44 PM
I know, I know - and I do. I just find it annoying when customers get annoyed or upset about it (not all of them, by any means). They get almost an "I thought it would look better than this" attitude when they see it later.

Such is life - I suppose.

Dave
BillyBoy wrote on 3/30/2005, 12:52 PM
Not wishing to open this hot-potato issue again, but neither underscan nor blue gun is really that important an issue.

For the newbies, simply so you understand the issues:

All TVs by design hide a portion of the signal. It's partly the perimeter of the picture tube in CRT-type sets that gets hidden by the mask/shielding/frame surrounding the picture tube; some cabinet designs can also hide a portion of the screen. It is also due to something called overscan, explained later. Both conditions vary from make to make, even from set to set within the same model. Newer LCD and plasma TV sets hide less of the picture.

Vegas, like some other editing software, shows the ENTIRE raw frame. This may include ragged edges and other junk that's hidden by the TV's mask or overscan. You'll frequently see people in this forum asking how to hide this objectionable noise.

A so-called "professional" monitor (there is no such thing aside from marketing BS) shows the raw picture...warts and all. Vegas and DVD Architech both have features that show an aproximiation of what portion of the whole frame will be hidden when viewed on a TV. You may disable or toggle this feature. The part not typically seen by the consumer typically averages roughly 10%.

If the TV is badly calibrated, or is showing its age and the high voltage isn't working to spec, something called blooming can happen, where the entire picture seems to grow, meaning you see less of the whole frame as the brightness is increased. Now that you understand some of what happens, the two terms maybe will make more sense.

Underscan is when the signal is seen in total. It may or may not fill to the edges of the screen. The TV doesn't fiddle with it. Because it's just the raw picture, some ragged edges and/or other junk may appear.

Overscan means the picture is actually shown slightly larger. This ensures that the picture fills to the edges of the screen. The outer edge is cut off, actually hidden.

Looking at an overscanned image may mean, and likely does, that you're not seeing the entire picture. Since ALL televisions overscan to some extent, it's another tempest-in-a-teapot issue. Read that to mean any person dumb enough to put important things in the outer edge of the frame isn't a very good editor. So really, seeing the tiny bit of picture the consumer won't is mostly a non-issue.
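
To put some numbers on that "roughly 10%", here's a quick Python sketch (my own illustration; the 10% action-safe and 20% title-safe margins are the defaults I've seen quoted for Vegas, so double-check yours under Options > Preferences > Video) showing what's left of a 720x480 DV frame after trimming:

# Trim a margin (split evenly between opposite edges) from a 720x480 DV
# frame and report what's left. Margin percentages are illustrative.
def safe_rect(width, height, margin_pct):
    dx = int(width * margin_pct / 100 / 2)   # trim per left/right edge
    dy = int(height * margin_pct / 100 / 2)  # trim per top/bottom edge
    return (dx, dy, width - dx, height - dy)

W, H = 720, 480
for label, pct in (("action safe, ~10% overscan", 10),
                   ("title safe,  ~20% margin  ", 20)):
    left, top, right, bottom = safe_rect(W, H, pct)
    print(f"{label}: keep ({left},{top})-({right},{bottom}), "
          f"{right - left}x{bottom - top} of {W}x{H}")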

A blue-gun-only switch has some limited value during the process of calibrating/converging a TV with a CRT-type picture tube. In older-style TVs there are three 'guns', one each for the primary colors red, green and blue. To get good convergence, all three guns must hit the majority of the screen pixels squarely so they converge. If they don't (and they can't do so perfectly over the entire screen, regardless of what you've heard - more marketing hype), then if you look closely you may at times see halos and other color distortions in places on the screen.

When calibrating, if the monitor has a blue-gun switch then the other two guns can be disabled, making convergence (not actually calibration) easier to do. Since 99% of those reading this don't have a clue how to do proper color convergence of a CRT-type TV, or the equipment to do it with, again the point is mostly moot. A blue-gun switch may be of minor help while making minor tweaks of color hue; you can do nearly as well looking through a gel. Again, without first doing proper color convergence - and yes, that takes some very high-end and expensive lab equipment plus knowledge and experience to do correctly - the blue switch is mostly another "professional" monitor toy, i.e. marketing hype the uninformed gladly swallow.
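
For what it's worth, the blue-only trick Yoyodyne mentioned (or the gel) is easy to fake on a frame grab too. A minimal sketch, assuming Python with Pillow is handy (not anything built into Vegas): it throws away the red and green channels so you can compare bar brightness the way a blue-only switch would:

# Simulate "blue only" viewing of a saved frame grab: keep just the blue
# channel and save it as a grayscale image for easier bar comparison.
from PIL import Image

def blue_only(path_in, path_out):
    img = Image.open(path_in).convert("RGB")
    _, _, blue = img.split()   # discard red and green
    blue.save(path_out)        # blue channel shown as gray levels

blue_only("bars_75.png", "bars_blue_only.png")  # file from the earlier sketch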

As I've always said, and often am misquoted, you should buy the best monitor/external TV (whatever you want to call it) that you can afford. Today's high-end LCD monitors are as good as or better than yesterday's so-called "professional" CRT monitors IF they have a high enough native resolution.



craftech wrote on 3/30/2005, 1:08 PM
You still have to tweak the settings on the monitor after they are "properly" set up or the customer will have a washed-out video. Before I got into video I used to buy recital videos, etc., and wondered why the colors were washed out. It was because they were adjusted using a "professional" monitor.
Buy a setup disc from Discwasher or Avia and see for yourself. Set up a TV "properly". I guarantee you will be lowering the contrast and brightness from the factory default settings. Now adjust your video color, etc. in Vegas using your properly adjusted video monitor until it looks perfect. Then go play it on a TV. It will look washed out.
When will the difference be most noticeable?
In low-light or stage-lit shooting. In daylight or brightly lit shooting there is more latitude, so it won't be as noticeable, but when the light dims or gets funky (as in a dance recital or a stage production) the video will look washed out, especially if you dub to VHS.

John
BillyBoy wrote on 3/30/2005, 1:34 PM
I agree with what John is saying. There are two camps that seem to fight like cats and dogs on the issue. I guess the best answer is there is no carved in stone right way to do it.

Those who only do projects for broadcast TV, or who are more concerned with doing it to "specs" (the hell with how the consumer sees your work), will probably want to do things somewhat differently than the guy only making wedding videos, slide shows, or videos for a smaller audience or just for himself and a handful of friends. If you throw the dice and do it only to some rigid spec without bumping it up a notch, chances are many of your customers or friends will find your work less than ideal when viewed on their badly calibrated TVs.
BillyBoy wrote on 3/30/2005, 1:48 PM
Well duh, by needing to respond as you did, it seems you already did "go for it", John. Just like you went for it, it seems, when some company marketed a "professional" monitor.

Just because we have different opinions doesn't mean mine are flame bait.

We live in a world where marketing frequently influences people. Just last week a major food company announced it's now offering its popular mayonnaise in extra-wide-mouth, flip-top-lid packaging. For sure, it won't make the mayonnaise taste better or last longer, but I bet it will increase sales.
PossibilityX wrote on 3/30/2005, 1:56 PM
Skip, Billy Boy makes an excellent point that beginners sometimes overlook----what you see in the Vegas preview window is NOT what will be seen on TV sets. TVs "cut off" a portion of the video frame.

For this reason I find it necessary to ALWAYS be sure to select the "overlays." These are two "boxes," one inside the other, that are superimposed on the preview window to show you what portion of your video will show up on a TV screen.

To select the overlays, click on the button that looks like a tic-tac-toe grid. There's a drop-down menu. Choose the one called "safe areas" and the two boxes will appear.

If you insert titles in your video, be sure to keep them inside the smaller of the two boxes.

If your project will be shown only on a computer, then the overlays aren't necessary.
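
If you ever script your titles, or just want to double-check placement by the numbers, here's a tiny sketch along the same lines - Python, my own made-up helper, assuming a 720x480 frame and a 20% title-safe margin:

# Check that a title's bounding box stays inside the title-safe region.
# Frame size and margin are assumptions (720x480 DV, 20% title safe).
def inside_title_safe(x, y, w, h, frame_w=720, frame_h=480, margin_pct=20):
    dx = int(frame_w * margin_pct / 100 / 2)
    dy = int(frame_h * margin_pct / 100 / 2)
    return (x >= dx and y >= dy and
            x + w <= frame_w - dx and y + h <= frame_h - dy)

print(inside_title_safe(100, 380, 520, 60))   # False: bottom edge 440 > 432
print(inside_title_safe(100, 100, 400, 200))  # True: fits inside (72,48)-(648,432)
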
trock wrote on 3/30/2005, 1:57 PM
To save rendering time I do a lot of my filtering pre-capture with various hardware devices, so it's vital for me to be able to accurately see what I'm doing while I'm doing it. I personally found TV sets (I tried 4) a bear to calibrate accurately so that what I saw on the TV would translate well to various DVD player/TV set combinations.

I picked up a new Panasonic video monitor on eBay for $50 and after I calibrated it things have been wonderful since. When it looks great on the monitor it looks great on whatever DVD player/TV set combo I test it on and clients are very happy. I also use the same monitor when editing in Vegas for any fine-tuning.

So for me, a video monitor worked out much better than a TV set. YMMV.

Tony
DavidMcKnight wrote on 3/30/2005, 2:01 PM
BB said...

All TV's by design hide a portion of the signal....


In this post you've just said that a pro monitor shows the full signal, and a TV does not. Sounds like more than "marketing BS". And don't try to backpedal, saying you were "misquoted". Those are your words. Glad you finally stepped up!
BillyBoy wrote on 3/30/2005, 2:34 PM
I should know better than to say anything in this kind of thread, knowing the result that always happens. No backpedaling necessary. The hype used to sell "professional" monitors sometimes includes removing any trim a "TV" has and making other cosmetic changes.

Do you think buying a fancy sports car and removing the racing stripes or painting it a specific color makes it go faster?

Just once, it would be interesting for anyone in the "pro" monitor camp to actually specifically detail what makes a pro monitor more professional. I don't mean repeating the often-heard marketing hype; show me some tangible facts and then we'll talk again. Till then...
rs170a wrote on 3/30/2005, 4:31 PM
BB, I have one question for you that I asked in a similar thread recently and never got an answer to.
Have you ever worked on a broadcast-level shoot?
If the answer is no, then please stop slamming those of us who use "professional" monitors.
If the answer is yes, then you know why you use "professional" monitors as opposed to TV sets.
I've done this over the years at a number of different levels ranging from local television to network level and, on every shoot, a "professional" monitor was ALWAYS used, never a consumer TV set.

Mike
BillyBoy wrote on 3/30/2005, 4:47 PM
Yes John, we had this conversation before, and every time we do you are just as anal about it. You apparently accept marketing hype without question. If this were the '60s and someone raised unstable power supplies, black-level drift, and the other things you've apparently talked yourself into accepting from ads or third hand, I might consider it, since solid-state devices were just coming into their own. Tube-type TVs were still common and the norm. This is 2005, John. While the technology of how television works hasn't really changed much since the late 1930s when it was invented, the manufacture of quality, more stable components has improved markedly.

I've made NO contentions. I have pointed out time and again that not everyone needs a so-called "professional" monitor. I go to great lengths to say get the best monitor/TV you can afford for the type of work you do, yet every time the topic comes up you, along with a few others, still want to argue. It's getting old.

This IS the last I will say on the subject in this thread.

Rednroll wrote on 3/30/2005, 5:21 PM
Oh, here we go again!!!

You know what, disagree with BB's points all you want. It seems to be the only information you can post about. You guys are ridiculous and make me personally sick reading your posts. BB posts some good information, and it seems like a pretty intelligent discussion on his part. It seems like a few of you have nothing better to do than attack his information and then personally attack him in your responses. Post some facts for once, or are you not intelligent enough to offer any advice of your own with supporting facts? I sure as hell can't find them in this discussion. I'll tell you, being an audio user trying to learn more on the video side, I've learned more from BB's posts than from any of you criticizing him while proclaiming to do professional work.

John Cline, go on about your tighter tolerances of electronic components B.S. and everything else. Do you design electronic equipment, or are you a video editor? I DO design electronic equipment, and your tighter-tolerance argument carries no weight toward making a professional monitor any better than a non-professional monitor. What truly makes one better than the other is the manufacturing process and ensuring it passes end-of-production-line "tolerance specifications". You can use all 0.1%-tolerance parts and end up with a stack-up tolerance 50% off from the optimal design intent. You can use less expensive 2%-tolerance parts and still end up with better specifications than the 0.1% parts. It all depends on the manufacturing process and what tolerance specifications the manufacturer uses to consider a monitor a pass or a fail before it ever gets put in a box and shipped to you. You would obviously know this if you had any experience in the matter, rather than reading marketing B.S. like I pointed out. I'm an electrical engineer who designs electrical equipment for one of the leading manufacturers of audio electronic equipment in the world; if you want to debate my facts, let's go. I'll be here all week.
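
To make the stack-up point concrete, here's a toy Monte Carlo in Python (my own made-up example - ten cascaded gain stages with resistor-set gains, nothing to do with any actual monitor design). The takeaway is that the spread of the finished chain is what matters, and that's exactly what an end-of-line tolerance spec screens for:

# Toy tolerance stack-up: ten cascaded non-inverting gain stages, each
# with nominal gain 2 set by a resistor pair drawn within +/- tol.
import random

def stage_gain(tol):
    rf = 10_000 * (1 + random.uniform(-tol, tol))
    rg = 10_000 * (1 + random.uniform(-tol, tol))
    return 1 + rf / rg            # nominal gain of 2 when rf == rg

def chain_error_pct(tol, stages=10):
    gain = 1.0
    for _ in range(stages):
        gain *= stage_gain(tol)
    nominal = 2 ** stages
    return abs(gain - nominal) / nominal * 100

for tol in (0.001, 0.02):         # 0.1% vs. 2% parts
    worst = max(chain_error_pct(tol) for _ in range(20_000))
    print(f"{tol * 100:.1f}% parts, 10 stages: worst seen ~{worst:.1f}% gain error")
# Looser parts spread the finished units wider, but it's the pass/fail
# limit the factory tests against that decides what actually ships.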

You know what, I just learned something from BB about a difference between professional monitors and non-professional monitors that I didn't know before, and you guys use it against him. Well, it sure was a lot more than I've learned from any of you doing "professional" work. So go ahead and criticize that BB doesn't do professional work. He's more professional with his information than all of you criticizing him put together. And I'm talking to you, rs170a, John Cline and dmcknight, just in case there's any confusion.
Rednroll wrote on 3/30/2005, 5:39 PM
Yeah, and thanks for the supporting facts backing your statements. What's that I hear again??? You once again don't have any supporting facts???? You might try getting a little education in these matters, and maybe you will be able to pull some out of your back pocket, like I just did.
goshep wrote on 3/30/2005, 7:11 PM
Skip,

There was a time when these forums were a valuable source of information. Those days, it seems, are fading. For what it's worth, I'd go with BillyBoy's advice. When you become experienced enough to know the difference yourself, you can probably justify the cost of a "professional" NTSC monitor if you need it.

craftech wrote on 3/30/2005, 7:55 PM
Rednroll, you're an audio guy... do you mix on little 3" Soundblaster computer speakers? Do you add a bunch of bottom end to your mixes to compensate for those folks that will be listening on a transistor radio? Hmmmm, I didn't think so.
===========================
Is that analogy for real, John? Are you suggesting that 3" computer speakers are the normal listening speakers for home audio, and that they are analogous to the typical television a typical TV viewer watches? The average television owner does not change the factory default picture settings. Those settings are too bright compared to the "correct" settings. That is why I forgo the professional monitor in favor of testing my color corrections on no fewer than FIVE different makes of television before I finalize it. My tapes and DVDs look good on just about anyone's television as a result. On a "professional" monitor, they look TOO DARK. The only problem I can foresee with this is if a customer has a professional monitor sitting in their living room instead of a television. In that case, I'll gladly give them a refund.
John
Spot|DSE wrote on 3/30/2005, 7:58 PM
I can't speak about tolerances and such things when it comes to building a monitor vs a television.
I do know, as does anyone who's compared them side by side, that the image quality of a television shifts tremendously over the course of a day's use, whereas a production monitor hits its stride after about 20 minutes.
-Televisions don't have underscan, which is critical for production monitoring and field monitoring. You also can't see flagging on a television, because it's out of the field of view. You need underscan to see flagging.
-Televisions cannot be calibrated as accurately as a production monitor, they don't have the same controls. (some do, but you might as well buy a production monitor for the cost)
-Televisions rarely have more than 400 lines of horizontal resolution. Broadcast monitors rarely have less than 600. You also get EBU phosphors in a broadcast monitor, you don't in a television.
-EBU phosphors are formulated to provide sharper pictures that aren't as bright.
-You don't get options for SDI input on a television, or at least I've yet to see one, but most broadcast monitors either have SDI or an optional card for SDI.
-Most televisions, but not all, have plastic cabinets. They are not metal like a production monitor. All production monitors (that I'm familiar with) are metal-cased, for the purpose of shielding from interference.
-Most televisions don't allow for looping; all broadcast monitors that I've ever seen do.
-I've never seen a television with blue gun; almost all broadcast monitors have it. Of course, you can always use a piece of Rosco 64 gel.
-There might be a television that allows for external sync; most broadcast monitors either allow this, or have an option to allow it for additional cost.
-Professional monitors have very robust degaussing circuitry. Televisions merely wipe the magnetism from the screen.
-Televisions have a higher convergence error than broadcast monitors, but this is becoming much less of an issue.
-Broadcast monitors all use Type C phosphors or they can't say they're SMPTE standard. Televisions don't. It's expensive, unnecessary, and also not useful for the home/consumer environment, simply because most people prefer brighter screens, and they don't watch TV in standardized 5000-6500K environments. The environment is variable, and therefore the image needs to be brighter.
-Most televisions can't switch between 16:9 and 4:3. Most broadcast monitors can.
-Most televisions can't provide RGB or even Y/C inputs. Most broadcast monitors do.
These are all reasons why you need a broadcast monitor IF your work product is for clients, for airing, or for anywhere else other than your family and friends. It's a standards thing. It's not the same argument as speaker monitors, because those are mostly subject to opinion and room environment, and sometimes even the type of music being mixed/mastered. While we might like the 'brightness' or 'beefiness' of a particular speaker monitor, we don't say "I like the skin tones to be more red" or the "grass to be more green" in a broadcast monitor. We want it to be 100% of what the camera caught, and as close to a 'standard' as the broadcast will allow it to be when it hits the air or cable. There aren't but maybe 10-15 production monitor models worldwide that are acceptable for broadcast production in the mid price range of 5K and less. Colors are measurable, standardized, and fixed. We don't want monitors with personality or opinion. We want them flat, accurate, and consistent whether we're in LA or NYC, or DC, Dallas, or Tokyo. And with HD, this becomes a worldwide standard. ITU709 rules the world there.
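
One small, checkable example of the "standards" point: SD (ITU-R BT.601) and HD (ITU-R BT.709) don't even weight R, G and B the same way when forming luma, so the "same" picture measures differently depending on which standard your monitor is calibrated to. A quick sketch in Python using the published coefficients:

# Luma weighting under the SD and HD standards (published coefficients).
def luma_601(r, g, b):   # ITU-R BT.601 (SD)
    return 0.299 * r + 0.587 * g + 0.114 * b

def luma_709(r, g, b):   # ITU-R BT.709 (HD)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

pixel = (0.75, 0.0, 0.0)  # the 75% red bar from a bars pattern
print(f"601 luma: {luma_601(*pixel):.3f}   709 luma: {luma_709(*pixel):.3f}")
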
So, keep arguing about it all you want, but as I and a few others have said time and again; you owe your client the best, most accurate image possible, because they're paying you for it. And while I've found that the higher end Sony Wega monitors are very, very good, they still ain't close to what a PVM monitor will consistently deliver, 18 hours a day. Just feed a set of color bars into a BVM and a Wega side by side. Look at them every hour. They'll shift.
Does this mean you can't do video without spending a fortune on a monitor? No. Of course you can do video without a high end monitor. But if you have paying clients and your media is going to air or being viewed as "professional," such as corporate media, infomercials, etc, you need an accurate monitor that will provide an image that looks the same on any calibrated broadcast/production monitor anywhere in the country. If it's not a "standard" production or broadcast monitor, then it's "sub-standard." Sub-standard doesn't have to mean "bad" but it also doesn't mean "standard."
I wonder if we'll still be having this same lame (and ridiculous) argument in a year and a half, when LCD's will be the primary monitoring source for HD, as HD starts to really take hold on the world? Nah....when that time comes, we'll probably be arguing over color profiles. (which are better, the free ones, or the ones you buy from Spyder or similar) ((tongue in cheek here))