Comments

Coursedesign wrote on 10/26/2009, 12:02 PM
If this is for broadcast or other delivery on 1080i60, then yes.

It's too easy to screw things up watching interlaced video on a progressive screen.

There are a few professional LCDs that can emulate an interlaced display, about $4,000. Plus the AJA card.
cold-ones wrote on 10/26/2009, 12:48 PM
Good point---for the immediate future, I'll be delivering SD, but figured I'd edit in HD and downconvert to SD DVD. What's a good way to monitor this kind of setup?
Coursedesign wrote on 10/26/2009, 1:59 PM
Do what I do: use a Sony PVM CRT monitor for checking DVDs.

They can be had for a few hundred dollars, but be sure to get one that has been re-capped (had its capacitors replaced) and checked. A blue-gun-only mode is a good feature on some of the models, but I've survived with a Lee Tri-Blue filter (440 nm) instead.

If you're near L.A., I can recommend a local place that refurbishes these commercially.
Coursedesign wrote on 10/26/2009, 3:20 PM
The reason it makes sense to spend as little as possible on interlaced HD monitoring is simple: 1080i is a stopgap until we can get 1080p.

There are no CRTs manufactured anymore, and all subsequent display technologies are inherently progressive.

The only reason 1080i exists is because of the infrastructure in the production and distribution networks.

And, because TV engineers chose to work in TV technologies rather than computer technologies, each upgrade is a very expensive and complex system-wide adventure.

I can't wait for the last TV engineer to be laid to rest (hopefully voluntarily) so we can get rid of the last "TV" technology. Enough already!

At least "video tape" is on the way out in production. Professional video camera tape heads last about 1500 hours, while computer data tape drive heads last 50,000+ hours.

A 120 MB/sec LTO-4 computer tape deck is a few thousand dollars, while an HDCAM tape deck is five figures. Plus maintenance.

And there is more, but I gotta watch my blood pressure :O).
farss wrote on 10/26/2009, 4:05 PM
"Professional video camera tape heads last about 1500 hours"

You must still be using SP. I've never heard of a head wearing out that has ME tape running over it. If I recall correctly, the record for head life is over 6,000 hours on some old DSR-20 VCR. The bearings in the drum motors are a different matter.

Contrary to what many seem to think, going tapeless is not the end of electromechanical gizmos in cameras that will wear out. There are fans and zoom/focus motors in the tapeless cameras as well. A good percentage of the cameras we've pensioned off had very noisy zoom motors.

Bob.
Coursedesign wrote on 10/26/2009, 5:11 PM
Yes, but the motors cost a fraction to repair compared to the video heads.
GlennChan wrote on 10/26/2009, 11:48 PM
I can't wait for the last TV engineer to be laid to rest (hopefully voluntarily) so we can get rid of the last "TV" technology. Enough already!

That would be a long, long while from now. Over-the-air broadcast will stay the way it is for a long time... changing the standard would require everybody getting new set-top boxes / receivers that can handle the new format.

Anyways... slightly more on topic... a lot of the video world is becoming ITified... the world of 'video' and computers is merging in most areas. And as an offshoot of that, if you buy any monitoring solution now, it will become obsolete about as fast as computers do. It would make sense to only buy it when/if you need it. There will be a lot of advances in display technology in the future.

2- For SD, a CRT is the winner hands-down. I would try to get a monitor that hasn't been used much, because they get worn out with use (e.g. like shoes; you can repair them, but that isn't quite as good as new).

LCDs and the like have trouble with non-square pixels and interlacing.

I would disagree with that... :P

"TV" will likely be based off computer interfaces and standards. One day, you will get able to get high-quality TV over the Internet or via some form of IPTV. Display technology already piggybacks off of the computer world (e.g. LCDs) and you'll likely see attempts at covergence in the standards, e.g. next generation TV will be 1080p60... the progressive part meshes better with the computer world.
etc. etc. etc.
cold-ones wrote on 10/27/2009, 7:50 AM
I'm still confused. If I'm shooting in 1080-60i, what is the best way to monitor? Do I set the Vegas timeline to SD and use my Canopus to send video to a monitor? Or should I set the Vegas timeline to 1080-60i, and if so, what method do I use to send this signal to a HD monitor?
TheHappyFriar wrote on 10/27/2009, 8:09 AM
You using HDV? Get a video card with two DVI/HDMI outs (or two vid cards, one with DVI/HDMI) & hook monitor 2 to your HDTV that does 1080. All set. I've done that before & it works. Be sure to adjust your TV/monitor (if it's just for monitoring & you need to buy one, get one w/o a tuner to save a few $$) via color bars & whatnot.
Coursedesign wrote on 10/27/2009, 1:58 PM
I can't wait for the last TV engineer to be laid to rest...

Well, that's just my point. Every time we make a bit of progress, enormous and VERY expensive changes are needed throughout the whole distribution chain.

With the increasing use of "bit buckets" the distribution couldn't care less what's in those bits.

Any TV station that thinks its mission in life is to blast out electromagnetic energy at a certain frequency needs to rethink what it is that their viewers are really looking for.

I totally understand why TV until very recently was based on only TV technology with narrow standards defining what each box could do. The computer gear was not reliable enough, or in some cases not fast enough.

But I'm looking forward to a future where all programming is on demand and comes through an agnostic bit pipe, where a program in 1080p60 can be followed by another program in 480p30, or 684p15.73 for that matter.
Wolfgang S. wrote on 10/27/2009, 2:11 PM
With the brand-new Vegas 9c, you can now use the Blackmagic Design DeckLink HD Extreme, as well as the XENA LHi and Io Express, including HDMI input and output.

If you still wish to run the 32-bit version of Vegas, you could also use the not-officially-supported Blackmagic Intensity Pro.

Beside that: you can use secondary displays from every graphic card, even if the preview is less superior, compared with the AJA and Blackmagic cards.

GlennChan wrote on 10/27/2009, 4:46 PM
But I'm looking forward to a future where all programming is on demand and comes through an agnostic bit pipe, where a program in 1080p60 can be followed by another program in 480p30, or 684p15.73 for that matter.
You'll probably be watching TV on your computer then. I think that's the most sensible approach.
GlennChan wrote on 10/27/2009, 4:47 PM
You're outputting SD, right? Then monitor SD. Ideally, you want to monitor what your target format / deliverables are.
So you want to know what the following would look like:
A- a pristine digital master (some computer file you save when you are done with your project)
B- What your DVD will look like with compression
Previewing over firewire with DV compression is close to A (and B). When you are done with your project, you go ahead and watch the final DVD on a DVD player.
Coursedesign wrote on 10/27/2009, 4:59 PM
you can use secondary displays from every graphic card, even if the preview is less superior

"Less superior" means basically "not likely to be accurate" which will be fine for some but not for others.

For HD, accurate display means using an AJA or BMD card hooked up to a 1080p pro monitor ($4,000 and up).

So most people make do, or they send their work to a post outfit for final grading ("color correction," a deep subject...).


Glenn, I hope you will be on the barricades to speed up the conversion from fixed video formats stored on plastic tape to a format-agnostic bit flow where only the camera and display will need to be updated occasionally...

That way we won't have to specify extra screws for the lid on your casket when the time comes... :O)

TheHappyFriar wrote on 10/27/2009, 8:05 PM
Wolfgang S.: Beside that: you can use secondary displays from every graphic card, even if the preview is less superior, compared with the AJA and Blackmagic cards.

Coursedesign:"Less superior" means basically "not likely to be accurate" which will be fine for some but not for others.

That makes no sense. I'm not up on all the preview hardware, but digital = digital. Just like the output of the same DV tape played in a $250 miniDV camera via firewire is the same as the output of DV from a $3000 Sony DVC VCR via firewire, the output of Vegas via a DVI/HDMI out on a video card should be identical to the DVI/HDMI out on an AJA or BMD card. It's the exact same signal. I could see more options for the AJA/BMD (component, BNC, SDI, color correction, etc.) but digital = digital.
Coursedesign wrote on 10/27/2009, 8:19 PM
There is a lot more to getting an accurate display (which is using digital technology to display an analog image) than passing on bits.

It's amazingly complex to get it right, and it currently can't be done in HD for less than thousands of dollars.
TheHappyFriar wrote on 10/27/2009, 8:49 PM
I can see where the device showing the preview can make a big difference, but we're not talking about the TV right now, just the output to the TV. How can one card output digital better than another card? I thought the point was that it's identical? There's no analog conversion between the hard drive and the display device; that's done at the device.

If there IS a difference, then a $1000 BD player can display a better picture than a $100 one with the exact same hookup, or one firewire card can give better output/input vs. another card.
Coursedesign wrote on 10/28/2009, 9:47 AM
How can one card output digital better than another card? I thought the point was that it's identical?

Vegas stores its digital video bits in RGB format, while your TV (and nearly all TVs and pro video monitors) uses "YUV." That means a conversion from RGB to "YUV" is needed, and it isn't necessarily lossless (even before we get into Computer RGB and Studio RGB).

[See glennchan.info for the reasons I'm using everyday language and referring to it as "YUV" instead of the formal name.]

Computer monitors are RGB, and that is probably the reason Vegas used it for its video representation in the heady days of "multimedia." The other NLEs at the time used "YUV" (and still do) because that's what they had to ingest and that's what they had to output for delivery, so they might as well keep it in that format.
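
For anyone curious what that conversion involves, here is a rough Python sketch (my own illustration, not what Vegas or any card actually does internally), using the standard Rec.709 coefficients and 8-bit studio-range scaling. The point is simply that quantizing to 8 bits on the way over and back already loses a little:

# Rough sketch of the RGB <-> "YUV" (Y'CbCr) round trip, using Rec.709
# coefficients and 8-bit studio-range scaling (Y' 16-235, Cb/Cr 16-240).
# Illustration only -- NOT what Vegas or any capture card does internally.

def clip8(v):
    return max(0, min(255, round(v)))

def rgb_to_ycbcr_709(r, g, b):
    """Full-range 8-bit R'G'B' (0-255) to studio-range 8-bit Y'CbCr."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b   # Rec.709 luma
    cb = (b - y) / 1.8556                      # scaled B' - Y'
    cr = (r - y) / 1.5748                      # scaled R' - Y'
    return (round(16 + 219 * y / 255),
            round(128 + 224 * cb / 255),
            round(128 + 224 * cr / 255))

def ycbcr_709_to_rgb(Y, Cb, Cr):
    """Studio-range 8-bit Y'CbCr back to full-range 8-bit R'G'B'."""
    y = (Y - 16) * 255 / 219
    cb = (Cb - 128) * 255 / 224
    cr = (Cr - 128) * 255 / 224
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return clip8(r), clip8(g), clip8(b)

# Round-trip a coarse sample of the 8-bit RGB cube and count the triples
# that come back changed -- quantization alone makes the trip lossy.
samples = range(0, 256, 17)
changed = sum(1 for r in samples for g in samples for b in samples
              if ycbcr_709_to_rgb(*rgb_to_ycbcr_709(r, g, b)) != (r, g, b))
print(f"{changed} of {len(samples)**3} sampled RGB triples changed in the round trip")

Swap in the Rec.601 coefficients for SD, or add the Computer RGB vs. Studio RGB range shuffle on top, and "digital" stops meaning "identical" pretty quickly.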

If there IS a difference, then a $1000 BD player can display a better picture than a $100 one with the exact same hookup

Both BD players will output YUV, so in that case you are at least looking at the same bits.

But depending on what you are sending the bits to, the screen may scale these bits to 1920x1200, or 1280x720.

And different screens will show the same bits very differently, so it's important to have a chain that you can calibrate throughout, at least if you need to be that picky, for broadcast or "film" delivery.

Hope that clarifies.

If I had a dollar for every time people said, "it's digital so it must be perfect," I'd be typing this from my own island in the Bahamas. I first heard it in the early 1980s, and it has never stopped.

It's very much like when Digital Volt Meters first appeared. One of the professors at my school hooked up a particular DVM to a voltage source. "What's the voltage?" he asked.

Students were quick to read the digital display and answer, "7.53642819 Volts."

Then the professor's lesson began. "Really?"

"Turning the DVM upside down, we see that this DVM is accurate to 1% of full scale. On the 10V range, that means 0.1V, so we know the voltage is 7.5V +- 0.1V" (actually slightly more complicated than that, but let's not dig ourselves too deep).

And then, once we thought we knew everything about measurement, he taught us that "accuracy" and "precision" are totally different concepts (but both very usable).
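
In case that sounds abstract, here is a toy Python sketch (made-up numbers, purely to illustrate the distinction): one simulated meter reads a 7.5 V source with a tiny spread around the wrong value, the other is centered on the truth but noisy.

# Toy illustration of accuracy vs. precision -- all numbers are made up.
import random
import statistics

random.seed(1)
true_voltage = 7.5

# Meter A: precise but not accurate (tiny spread, steady +0.3 V bias)
meter_a = [true_voltage + 0.3 + random.gauss(0, 0.005) for _ in range(20)]

# Meter B: accurate but not precise (no bias, noisy readings)
meter_b = [true_voltage + random.gauss(0, 0.2) for _ in range(20)]

for name, readings in (("A (precise, not accurate)", meter_a),
                       ("B (accurate, not precise)", meter_b)):
    mean = statistics.mean(readings)
    spread = statistics.stdev(readings)
    print(f"Meter {name}: mean {mean:.3f} V, spread {spread:.3f} V, "
          f"bias {mean - true_voltage:+.3f} V")

Meter A is the kind of error you can live with when monitoring: it never moves, so you can learn to compensate for it.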

[I hope John C. doesn't get a heart attack from this, I'm really just trying to provide some useful, helpful info.]
Avanti wrote on 10/28/2009, 11:27 AM
TheHappyFriar, can you use your setup (HDMI to preview) if you're using a two-monitor system to edit with?
Coursedesign wrote on 10/28/2009, 11:52 AM
"accuracy" and "precision" are totally different concepts

For monitoring video (or audio, same concept), if we know that our system has precision (we can ensure it doesn't change from day to day), we can live with a lack of accuracy.

For example, if we know that video graded to look good on our screen ends up looking too dark when sent to the TV station, we brighten the picture a bit, to where it's too bright for us but we expect it to look right to viewers.

Bars and tone are meant to help, but only to a point, and broadcast filters ("clamps") have made a lot of footage look unnecessarily bad.

Likewise when monitoring audio on our $8,000 studio monitors (hypothetically of course) while mixing for car radio listeners: we can use our experience to mix in a way that sounds "wrong" to us but that we know will be OK in the car in most cases.

But to be sure, we have to bring a test mix to the car and listen to it there, because the $8,000 studio monitors are not designed to accurately reproduce this environment.