What Monitor Do YOU Use???

Comments

Chienworks wrote on 10/10/2003, 2:16 PM
I still want that SONY TV I saw at the Creation Festival in Pennsylvania. I'm not quite sure how I'd fit a 27-foot diagonal screen in my 7x9-foot editing room though ...
John_Cline wrote on 10/10/2003, 3:03 PM
BillyBob,

I got exactly the response from you that I was expecting: when someone doesn't agree with some of your insane assertions, you resort to personal insults. How big of you.

John, have you even bothered to try what I did?

Of course I have.

You keep bringing up a NONE ISSUE; image quality.

First of all, it would be "non-issue," not "none issue." Grammatical mistake notwithstanding, that is perhaps the most ignorant thing I have ever read from you, and I have read some pretty ignorant stuff from you. It's ALL about image quality. And, no, it isn't all about "image crispness"; it's about image accuracy. We're talking reference monitors here, and reference monitors usually list what type of phosphors they use, i.e. SMPTE, EBU, etc., and they have high-bandwidth circuitry and well-regulated power supplies. This isn't marketing hype, this is actual high-level electrical and mechanical engineering.

The fact you can't run away from is that a PROPERLY calibrated TV, regardless of what it costs, CAN have proper black and white points and show proper hue, meaning yes, even a Walmart-purchased $99 consumer TV can serve as an external monitor.

Listen up BillyBob; NO, THEY CAN'T. A poorly regulated power supply in a cheap TV will cause the black and white points to drift depending on the overall brightness of the scene. Also, a cheap TV will not have linear response in its RGB circuitry nor its phosphors, so the hue can never actually be correct no matter how much you tweak it. If you understood anything about video monitors, you would know this.

I also see you're starting to back away from your original position. Now you say consumer televisions are GENERALLY not accurate enough.

There are some expensive consumer TV's that can look pretty good. NONE of them are available at Walmart.

Well, maybe if you knew how to calibrate one you could use it like I do and use all that saved money for something else. LOL!

Wow, another cheap shot from BillyBob. Of course I know how to calibrate a monitor using both external and internal tweaks. But all the tweaks in the world aren't going to make a $99 TV that has been built to the lowest possible standards look as good or consistent as a well designed and manufactured video monitor.

BillyBob, some people around here actually tend to believe what you say and that's a problem. It's a problem for you because it has so grossly inflated your ego and a problem for them since they now believe something that is patently ridiculous and then pass this erroneous information to others.

Finally, let me ask you a question, what ARE your credentials?

John
HPV wrote on 10/10/2003, 3:25 PM
I use a Commodore 1084S-D2 monitor. John, how do they compare to the 1702?
Caution to anyone looking at these old Commodore monitors, or any older unit. As they age you'll find that they can get a soft picture and won't have full brightness.
Billyboy, you've stepped into a mess on this one. John knows his stuff. Live and learn.

Craig H.
John_Cline wrote on 10/10/2003, 3:44 PM
Craig,

The Commodore 1084 is pretty much the same as the 1902, which was Commodore's monitor for the C128. The same monitor for the Amiga was called the 1084. The 1702 was built for the C64, which had a 40-character-wide display, and the 1902 and 1084 were made for 80-character displays. Both the 1902 and the 1084 have higher-bandwidth circuitry and a finer dot pitch on the picture tube than the 1702.

Another thing the 1702, 1902 and 1084 had going for them was that they were "converged" rather well from the factory. Convergence is how well the red, blue and green guns line up together on the screen. BillyBoob's $99 Walmart TV probably looks pretty dismal in this respect as well. If you know how to do it and have the right equipment, you can set the convergence on a cheap TV, but why? My time is worth more than that.

You are absolutely correct about their age causing the picture to soften (although this can be tweaked a bit with the focus control which may be accessible through a hole on the back of the monitor.) All TV's brightness diminishes with age as well and there isn't anything that can be done about that. It's the nature of the beast. (However, a good CRT will last a LOT longer than a plasma TV will. Personally, I think plasma TV's look awful, they're expensive, consume a LOT of electricity and don't last very long. But they are thin...)

John
vitalforces wrote on 10/10/2003, 4:11 PM
Like Sunnis and Shiites, I want both JCline and BB to stay in the forum so we can benefit from their collective knowledge. Mr. Cline, less armed force at each intersection, and BB, no more car bombs.

Yeah I know, who the h*** am I? Just somebody who wants to learn from others who passed before me.

For myself, I have no intention of going out and spending a thousand dollars for an NTSC editing monitor. But then I don't edit for customers. I followed BillyBoy's instructions in the past (and others) relating to how to roughly calibrate a standard TV set.

BUT I also remain mindful of Mr. Cline's (and others') caveats concerning the inherent variability of inexpensive TVs over time, e.g. drifting blacks, etc. I have an old $200 Samsung that I use for the present. I also test tapes & DVDs on as many other TVs as my neighbors and relatives have time for. But if I get offered a deal on one of my films from a cable network, or a major festival picks up a submission, I'll go to a post house and have them color correct and legalize the thing on their own equipment.
BillyBoy wrote on 10/10/2003, 4:19 PM
"BillyBob, some people around here actually tend to believe what you say and that's a problem."

A problem for you or them John?

Some people around here may have 30 years experience but haven't grown up yet. Sound like anyone you know personally John? Hint: maybe the guy you see staring back at you every morning from the bathroom mirror.

You actually read the rants you write, John? Let's see...

"...someone doesn't agree with some of your insane assertions, you resort to personal insults." Well John, in a single post you call me BillyBob several times, say I make insane assertions, then you go on to say some people believe what I say and call that a problem, and you've got the nerve to say I engage in personal attacks? Then you resort to looking for typos. LOL!

A side by side test isn't an insane assertion. I also find it interesting that the 'professional' grade monitor I used in the test costs 4 times as much as the ones others are throwing out as professional grade.

To follow your "logic," people using the most expensive monitor must be the most professional, because of course everyone knows those driving the most expensive cars are always the best and safest drivers. Is that how you think, John?

If anyone is making assumptions it's you, John. Just remember, the next time you're staring at your fancy monitor with a 'well regulated power supply' and super-duper screen phosphors and telling yourself oh what great picture quality, that Joe Average doesn't have that kind of TV and likely the one he does have is badly calibrated. So what it means, John, is you're engaging mostly in a healthy ego-stroking exercise to impress yourself and maybe equally clueless clients. Because you see, John, that extra 1% accuracy you may get using an expensive monitor is all for naught, because even you admitted in another thread some time back that there are many broadcast engineers who are supposed to see that what's going out over the air meets standards and, to be kind, I'll just say they let it slide.

So what I'm saying in very simple English, so even you have a chance to understand it John, is that nobody will notice if the accuracy is off 1%, because by the time the TV broadcast reaches the typical TV it could very easily already be off 4%, and we haven't even factored in the typical poorly calibrated TV Joe Average is viewing your work on, John.

So to recap what I said, feel free to use as expensive a monitor as you want. If it impresses you, swell. I'll still tell people that if they aren't using that kind of monitor they're not missing a thing, not in the sense that anyone would notice without putting expensive hardware on it to see.

"Finally, let me ask you a question, what ARE your credentials?"

I let my tutorials answer that. I think they have. ;-)
riredale wrote on 10/10/2003, 4:27 PM
Okay, I'll dip my toe into these toxic waters and then run like hell for cover.

On a quality scale of 0 to 100, I'd have to say that editing without reference to any external color monitor (i.e. completely trusting the preview window on the PC) would give me results of 75. Using a cheap but clean 13" TV from Target for final tweaking (along with the excellent Vegas color wheels) would give me results of 90. Using an expensive pro monitor would give me 95. I do this for fun, but I'm pretty serious about the quality of my work. I use the cheap TV approach. I think my stuff looks damn good.
BillyBoy wrote on 10/10/2003, 4:53 PM
"All TV's brightness diminishes with age as well and there isn't anything that can be done about that. It's the nature of the beast."

An example of your "expertise" John?

While it's true overall picture brightness diminishes with age, there are things you can do to get some more life out of it. Something as simple as resetting the black and white points can restore some brightness to a set starting to show its age. Another option, depending on the age, is to have a shop give it a good check-up, or you can do it yourself if you're game and follow some precautions.

Another common problem is that as the CRT tube ages, the voltage reaching the tube's filaments drops, likely due to a failing or open resistor. If you increase the voltage to the filaments slightly you'll frequently get the brightness back... at the expense of maybe running out the life of what's left of the tube. There are several other things as well, but what the heck do I know, I only built my own color TV. Saying there is nothing you can do is simply ignorance speaking. With TVs as cheap as they are these days, it may not be worth the time or expense messing around.
pb wrote on 10/10/2003, 5:02 PM
BillyBoy is right about the aging issue. We still have some Sony PVM 8020s and 1271s from the 80s. They are well used but can be calibrated using blue-only mode on the little guys and blue gels with the 12" ones. The settings do not hold indefinitely, but given the thousands of hours the old darlings have been powered up, who cares? The only major difference I have seen over the years is that picture resolution within the same price points keeps getting better with each new model. At home I have a PVM-14M2U and a PVM 1341. Allow for inflation over the years and they cost the same, but the 14M2U has higher resolution AND component mode. Uhm, call me pretentious and daft if you like, but I am not comfortable using a consumer TV as a reference monitor. My 14M2U is 16:9 switchable and, as it is about ten years newer than my 1341, holds its settings well. We do, however, test edited product and DVDs on Walmart and Zellers TVs, because that's what the end user will likely have.
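For anyone curious why the blue-only trick pb mentions works at all: it falls out of how the standard 75% colour bars are constructed. Here is a minimal sketch (the 8-bit full-swing RGB values are the usual published 75%-amplitude ones; the variable names are ours, not from any post in this thread):

```python
# 75%-amplitude colour bars, 8-bit full-swing RGB (191 = 75% of 255).
BARS = [
    ("white",   (191, 191, 191)),
    ("yellow",  (191, 191,   0)),
    ("cyan",    (  0, 191, 191)),
    ("green",   (  0, 191,   0)),
    ("magenta", (191,   0, 191)),
    ("red",     (191,   0,   0)),
    ("blue",    (  0,   0, 191)),
]

# In blue-only mode the monitor displays just the B channel, so the bars
# alternate on/off across the screen:
blue = [b for _, (_, _, b) in BARS]
print(blue)  # [191, 0, 191, 0, 191, 0, 191]

# Chroma and hue are set correctly when the four "on" bars (white, cyan,
# magenta, blue) look equally bright; numerically they are identical:
on_bars = [b for b in blue if b]
assert len(set(on_bars)) == 1
```

On a real monitor the comparison is done by eye against the small sub-bars beneath the main bars, but those equal blue values are what make the test possible.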
Spot|DSE wrote on 10/10/2003, 5:18 PM
Billyboy, it's not a question of "I'm more professional than you"; it's simply a matter that I've spent a LOT of money on education, a good camera, good lights, and you CAN'T see all of that on a cheap monitor. No way! If you could, then more people would use them.
You need to see the image on the BEST format, and the WORST format, IMO, to know what range things will be going in. Just recently, I did a session where media originated in Hi8, VHS, SVHS, DV, and Beta SP, all in the same project, from easily a dozen different cameras. Just for giggles, I threw it all up on the monitor a moment ago. On my 99.00 Sanyo, I can't see a whole lotta difference between the images, except for the luminance variance. On my high-end Sony, I see a HUGE difference, and things that I would have corrected going by my cheap monitor turn out, on the high-end monitor, not to need correcting at all.
A true craftsman can build a house with nothing but a hammer and a hand saw. And it will look great. But due to the crude nature of those tools, the house will lack refinements. Good enough for most folks to live in, no doubt. But give the same guy a truck full of Makita (or whatever brand) power tools that include mitre saws, jigsaws, sanders, and whatnot, and he'll build a very fine house with many beautiful finishes.
A 300-line monitor simply CAN'T show the media accurately. It's not about the end viewer owning a monster television, it's about having it look its very best before it hits the air, and Lord knows what happens to it after that.
Again, we monitor on very high end audio monitors, because we can hear EVERYTHING that's in the mix. The end user doesn't have gear remotely close to ours in most cases, but on the other hand, if it's not EQ'd right, then they might hear something weird, out of balance, or whatever that we didn't hear in our room. Or, it might be that it's going to be compressed elsewhere, and compression artifacts will gum it up because again, we might not have heard something.
That's not to say that great mixes of audio (or edits of video) CAN'T be done on cheap monitors; of course they can. But it's like walking a wire without a net. And when reputation is the cost of the fall, I'll take the net or not walk, anytime. Funny, people only remember the BAD work you do, not the good or average work you do. So, I strive for excellence.
Then again, the BET award we won didn't even touch a monitor during the edits, because back then we couldn't monitor on a broadcast monitor. But looking at it now, it NEVER would have left our room by today's standards.
Spot|DSE wrote on 10/10/2003, 5:36 PM
Ouch! C'mon guys, lighten up! You're both right; the only thing I can take exception to is Billy's saying that a 99.00 monitor from Walmart can look as good as, or be as useful as, a 600-line monitor. No way. I do it every day, have both right next to each other every day, looking at them every day. I can take a still image of both monitors sitting side by side; in fact, look close in the Vegas book and you'll see it. Even in a still image, the quality difference is VERY visible. It's not 1%, it's more like 15-20%. When we rent TVs from hotels for VASST, I'm constantly apologizing for the weak quality of the image. And that's AFTER I've spent 15-30 mins calibrating with my USB calibration tools and Vegas.
A cheap monitor can look good, and is probably usable for 50% of the work being done in Vegas. I've done projects that went to broadcast that were monitored on hotel room televisions for purposes of previewing, but I'd never ever attempt to do color correction or anything color-centric on our 99.00 monitor. When we are previewing on both at the same time, which is always, the color correction always looks significantly different on the two. And sometimes decisions have to be made about how fine to do a correction, because doing something "right" doesn't always mean it's going to look great on those low-end sets. So we end up compromising quality. Just like in audio, we sometimes have to compromise the final output simply because of the limits of 44.1/16 bit. But we still record and monitor at 24/96, because in the end, it's better after its quality is reduced. I think I explained that badly... I think you know what I'm trying to say.
But geez! Let's not throw crap on each other. Billy's tutorials are brilliant, and John's posts are likewise. C'mon, guys!
Jesse, I should have just returned your phone call, huh?
JJKizak wrote on 10/10/2003, 7:13 PM
Some things here that no one has addressed: what is the frequency response curve of your ears, and what is the quality of your vision? Also, do you wear glasses, and are they color corrected, and how well? I have seen people adjust TV sets so faces came out pure orange because their glasses were not color corrected. I would bet no two of us posters have the same hearing range. Mine is approximately 40 to 12000hz. How can I possibly evaluate anything above 12000hz? I can't hear anything below 40hz. I use the old CBS test records to "see" what I can't hear.

JJK
craftech wrote on 10/10/2003, 7:24 PM
There are, as BillyBoy said, several factors which can reduce brightness, such as: the digital user brightness and contrast set too low, the internal G2/screen control on the flyback transformer in need of adjustment, or the weak cathode ray tube filament supply which BillyBoy mentioned. Often the drop in filament supply voltage is due to a leaky cap on the PSU output side. The brightness diminishes as it warms up.
With gradual deterioration in brightness in which the CRT has shifted its bias point for brightness there is a control labeled sub-brightness or background level or back level which can bring it up again. YMMV. There is a bias pot which can restore it but which can screw everything up in the process as well.
There is also a CRT brightener which increases the filament voltage by adding a turn of wire on the flyback core, but only a single turn. There are labs which do CRT restoration, but for the most part it is short lived.

John

PS: Maybe this will become the infamous "Monitor Thread". How awesome!
craftech wrote on 10/10/2003, 7:27 PM
no two of us posters have the same hearing range. Mine is approximately 40 to 12000hz.
=======================================
Say What?
jsteehl wrote on 10/10/2003, 7:39 PM
Keep it up guys! It's like a front row free education. I'm learning much here.

Funny, I was in a similar post volley in another forum ... "I can edit just as well on my Studio8 as you would be able to in Vegas, why spend the extra money?" :)

Ok Spot, BillyBoy said pro monitors are just window dressing.

BillyBoy, Spot said TV's are for wannabes.

Nowwwww go!

BillyBoy wrote on 10/10/2003, 9:15 PM
The whole truth probably lies somewhere between the two extreme views. SPOT is a gentleman and can make his points without getting angry or name-calling, and we can discuss the issue. John has had similar outbursts in the past when he disagrees with what I said. At least it makes this thread more lively than some. ;-)

What I'd like to do is ask SPOT specifically what he does in the way of testing and measuring to support his observations: whether it's all eyeballing or he uses test equipment, and if so what hardware/software and what exactly he does. Then I'll elaborate more on what I've done, and maybe we can find some common ground.

The only thing that bugs me is why anyone would think I would suggest you do X, Y, Z if I didn't do it myself and test that it works. My question is: if many of you think what I offered on color correction in the tutorials is useful, and many of you said it was, then why would I blow it by using a method that would distort it? Does that make sense?

GaryAshorn wrote on 10/11/2003, 6:34 AM
Gee, I'll throw in my 2 cents. The reason you use a better, more accurate monitor is to know how close to the middle of the wide range of cheap sets you can be. In other words, if you don't know how accurate your efforts are, then you don't know how far off you will be across the WIDE variation of everyday TV sets that people use. It isn't about what we see on the sets we have in our studio. It is about how accurately I can make a product so that the widest range of possible playback sets can play my video back and have it look acceptable. And using a STABLE, REPEATABLE, CONSTANT and MEASURABLE monitor means you get it the same every time. A lesser monitor hits it one day and is off the next, and does vary with the content you send it. Let me guess, I am the only one who recalibrates my system after rewiring..... Yeah, I thought so.... If you don't know what you have, how do you know what others will see? It's like the guy who thinks the automatic on the camera gets it right, yet doesn't know why his camera doesn't match someone else's......

Gary Ashorn, PE
RexA wrote on 10/11/2003, 2:35 PM
>And that's AFTER I've spent 15-30 mins calibrating with my USB calibration tools and Vegas.

What do you mean by USB calibration tools? Is that some kind of feedback tool to accurately measure color and brightness?

Sounds interesting, or have I completely missed what the USB part is about?
jcg wrote on 10/11/2003, 3:32 PM
This is the original poster back again. This thread has stayed remarkably on course and with a pretty good variety of contributors. I know you haven't all responded just for me, but I nevertheless wanted to say thanks for every contribution.

This reminds me to ask why the forum doesn't allow us to see (and print) an entire thread without having to open one post at a time. Unless I have been missing a feature, this is extremely inconvenient.

Thanks again,

JCG
Chienworks wrote on 10/11/2003, 4:40 PM
jcg, click on Edit Account at the top of the page. Change the Forum View to "Non-Threaded".
MichaelS wrote on 10/12/2003, 10:43 AM
Wow...I've got four of these old Commodore monitors and have always used them for travel or in the field when stuff usually gets broken. Yes...there are thousands of these babies tucked away in attics everywhere. I'd never thought about using an S-video cable. This opens up a whole new level of use for them. Thanks for the nugget!
BillyBoy wrote on 10/12/2003, 1:18 PM
That little USB Calibug looks like an interesting product and should make it easier to get closer than calibrating by hand. Check what I've said all along near the top of my tutorial:

"While the best method is to use a color bar generator and other expensive equipment, that method is beyond the reach of most hobbyists."

The Calibug is about $100, so not that expensive.

I still stand by what I said. I'll make one more attempt to clear up any confusion. SPOT said the difference between videos viewed on his expensive NTSC monitor and the results he viewed on some cheap hotel TVs was more like a 15-20% variance.

Well DUH! I didn't use some beat-up hotel TV for my comparisons and I doubt you will. While I used a "cheap" TV, a Sharp 14-inch consumer TV, it has never been dropped, bumped or subjected to any abuse like its cousins in thousands of hotel rooms all over the planet no doubt are.

Now, WHY do I say you can get good results with a cheap little consumer TV if you use one as your external monitor? One word: CALIBRATION. Doing it by hand can get you close. Using something like the Calibug should allow you to get closer.

But is that really what we're disagreeing about? No, not really. Those that support only using some expensive NTSC monitor like to point out the quality of the picture, and allude to things like the power supply maybe not being as stable, or the picture blooming because the high voltage drifts, causing the white and black points to drift. Sounds more like UNDOCUMENTED claims from those that have bought an expensive monitor and need to justify the expense. True, as I've always said, of course the picture quality will be better on an expensive monitor; of course, it has a superior picture tube! The rest, in my opinion, is mostly BS if you factor out image quality.

Here's why:

Two words: Chrominance and Luminance. The first pertains to hue, the second to brightness.

Now, the REAL difference between an expensive monitor and that "cheap" TV: the picture tube phosphors. A "consumer" TV may have what they like to call enhanced phosphors or other buzzwords to help make for a more pleasing picture. All the major brands, if you read the fine print, claim their phosphors give the "best" picture.

Consumer TVs also frequently have adjustments that allow you to tint the picture so it is shifted either more towards blue or red. An expensive NTSC monitor is phosphor neutral. It produces the most realistic colors, but because we have all been somewhat brainwashed by viewing thousands of hours of commercial television, these more honest and natural colors as seen on an expensive monitor can look "funny". The point is, that's how they are supposed to look, meaning nobody dicked around with the phosphors in some silly attempt to try to make the colors look "better".

Luminance is how bright the overall picture is. With television it is a matter of setting the voltage properly. While all TVs or monitors, regardless of how much they cost, have limiting circuits to prevent the current from getting out of bounds, you as the consumer are also affecting the voltage, probably without knowing that's what you're doing. Again, consumer TVs come out of the box DELIBERATELY not properly calibrated.

Read that last sentence again!

That's right. People have gotten used to viewing television with both the brightness and contrast cranked beyond the point you would get the best overall picture because someone long ago decided television looks "better" that way, meaning brighter and with the increased contrast.

OK, now that you know what's REALLY important, maybe you'll understand what I mean by 'properly calibrated'.

You, me, anybody can dick around till you're blue in the face adjusting this or that and tell ourselves that's how the image is supposed to look. While you have color bars and other charts and test patterns that help, without starting with an 'on-standard' hue and PROPER luminance, your chances of getting it "perfect" without test equipment are about the same as winning the lottery.

I wrote the tutorial I did because I know most of you don't have expensive lab equipment. So I gave you a means to 'get close', and it works, and now I'm about to tell you WHY it works... if you do it correctly.

As I've said in several threads, I'm sort of a 'computer guy', more accurately a long-time electronics enthusiast. Been so since I was seven, which was almost fifty years ago. When I did my testing I did something probably nobody else here did: I rented a state-of-the-art waveform generator/monitor. With that piece of hardware you can generate waveforms (they look similar to what you see on the audio track) which allow you to very precisely adjust what needs to be adjusted to bring the monitor or TV up to spec. No guessing, no dicking around. Either it's set right or it isn't. That means I, and probably only I, started with an even playing field where both the expensive test monitor and my little cheap TV were calibrated as best they could be. And yes, I needed to calibrate both monitors, which may surprise you, not me. In fact, the more expensive NTSC monitor was originally off more than my little cheap TV was!

That accomplished, the two key elements, chrominance and luminance levels, were within spec on both. If so, the remaining differences are due only to the superior resolution of the more expensive picture tube found in the expensive monitor. That's why I say you can get within 1%. Not in picture quality, NOT WHAT I'M TALKING ABOUT, but with chrominance and luminance, you bet you can. And that is the basis for making color/level adjustments properly.

So what it boils down to is: if you can afford one and think you need some expensive monitor, by all means buy one. If you want acceptable results, you can still get very close IF the TV you use lets you adjust what needs to be adjusted. Maybe I should have made that part more clear. Some TVs are easier to calibrate than others.
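The chrominance/luminance split BillyBoy keeps coming back to can be made concrete with the standard-definition luma/chroma matrix. A minimal sketch using the ITU-R BT.601 coefficients (the function name is ours, and full-range 8-bit values are assumed rather than the 16-235 studio range):

```python
def rgb_to_ycbcr(r, g, b):
    """Split an 8-bit RGB pixel into luminance (Y) and chrominance (Cb, Cr)
    using the ITU-R BT.601 coefficients from NTSC/PAL-era video."""
    y  = 0.299 * r + 0.587 * g + 0.114 * b            # brightness
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b  # blue-difference chroma
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b  # red-difference chroma
    return y, cb, cr

# A neutral grey carries no chroma: both colour-difference channels sit at
# the 128 midpoint. Hue errors and brightness errors live in different
# channels, which is why calibration treats them as two separate adjustments.
y, cb, cr = rgb_to_ycbcr(191, 191, 191)
print(round(y), round(cb), round(cr))  # 191 128 128
```

This is only an illustration of the two quantities being argued about, not a claim about how any particular monitor processes its signal internally.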


FadeToBlack wrote on 10/12/2003, 3:04 PM