How To Connect TV Monitor To Vegas?

Jay Gladwell wrote on 4/23/2009, 5:23 AM

In the recent past (up to this morning) I've had a secondary studio monitor connected to Vegas via a DSR-30 deck, as follows:

Preferences > Preview Device > Device > OHCI Compliant IEEE 1394/DV (and then further down) Conform output to the following format > NTSC DV

Under "Device" I have three choices:
1 AJA Video Device (Where did this come from? I don't have an AJA video device!)
2 OHCI Compliant IEEE 1394/DV
3 Windows Secondary Display

So my question is, how do I connect the new HD monitor to the computer for previewing footage from Vegas without using the DSR-30?

Thanks!




Comments

JJKizak wrote on 4/23/2009, 5:55 AM
Depends on your video card. If you select "Windows secondary display," the 1394 output will be cut off for viewing and your video card will run the second monitor, provided the card has at least two outputs. Some cards can drive three outputs simultaneously, but most can run any two, not three. You can drag your Vegas windows to the second monitor, and when you do a preview render you can see the full-size video on the second monitor as long as you drag it over first. I use Nero Showtime for mine and it works fine for this. When you close Nero, your original preview window pops back as long as you engaged the "preview on external monitor" button. If your monitors have several inputs (S, VHS, Component) you can still have them wired to the monitor for other workflows. I would still prefer a third monitor for the Vegas windows.
JJK
Jay Gladwell wrote on 4/23/2009, 6:01 AM

James, I didn't want to use it as a computer monitor, just as a video monitor, i.e., by clicking the "Video preview on external monitor" button.

The video card has an S-Video output, but I guess I need a card with HDMI out, right?


TheHappyFriar wrote on 4/23/2009, 6:52 AM
You can use a DVI to HDMI converter or a VGA to HDMI converter for an HDMI hookup, but some monitors/TVs have VGA, DVI & HDMI inputs, so it may be a non-issue.

But no matter what, it's a computer monitor once it's plugged in to your computer. What you use it for is up to you.

You can get a cheap second video card and put that in your computer so you don't use the multi-monitor hookups on your main card, or to get more outputs.
Jay Gladwell wrote on 4/23/2009, 7:03 AM

"But no matter what, it's a computer monitor once it's plugged in to your computer. What you use it for is up to you."

Thanks for the clarification, Stephen. I wasn't fully aware of that. At this point, I want to make certain I get the best picture possible. I presumed that was using the HDMI connection.


craftech wrote on 4/23/2009, 7:47 AM
Jay,

Is it a TV monitor or a computer monitor? The title says TV monitor, so assuming it is a TV monitor, you may want to make note of the fact that not all connections are allowed by all versions of Windows XP, and especially Vista, because of HDCP compliance.

Some DVI to HDMI converters are rejected, as are some VGA to DVI connections.

If you run into problems like that, the cause isn't always clear, as Microsoft error messages will refer to HDCP without explanation. Just be aware of it: if you get such an error message, try a different interface, connector, or adapter first before you start tearing your hair out and blaming the hardware itself.

John
TheHappyFriar wrote on 4/23/2009, 8:27 AM
Interesting, I never knew there could be error messages. I've hooked my current computer up to four different HDTVs via DVI to HDMI conversion, but I guess if the card is HDCP compliant (all current ones are) and so is the TV (all were), then I wouldn't have issues.

I saw a 1080p 26" monitor at BJ's one time that said it was HDCP compliant (think it was only $299 too). It may be standard now & only an issue for monitors from several years ago.
Jay Gladwell wrote on 4/23/2009, 8:47 AM

John, it's a TV monitor. I'm still using XP Pro.

Actually, I'm still waiting for the UPS truck to arrive, so I haven't tried anything, yet.

Sab wrote on 4/23/2009, 6:39 PM
I have an HD TV hooked up to my laptop using HDMI. The Sony TV is considered a secondary computer monitor. In Vegas, I selected the secondary monitor as my preview monitor. With preview set to Best Full, the image is sharp and beautiful, although I only get a couple of frames per second with HDV footage. Switching to Best Auto gets much better motion, albeit with a blurrier image.

Editing DV with this setup gives the usual poor result of watching SD footage on an HD monitor. In that case, I also have my trusty 8" Sony CRT monitor connected by FireWire to my DSR-20 deck. Having both options available is very nice.

Mike
Jay Gladwell wrote on 4/24/2009, 4:20 AM

Thanks, Mike, for the information. I'll be hanging on to my CRT also.

blink3times wrote on 4/24/2009, 4:33 AM
"The video card has an S-Video output, but I guess I need a card with HDMI out, right?"

If all you have is S-Video then it's not really worth it (S-Video output is pretty bad). Upgrade to something with HDMI (or DVI).
Jay Gladwell wrote on 4/24/2009, 6:30 AM

Upon closer examination, it's NOT an S-video out; it looks very similar, but not quite. The S-video cable will not fit into it.

My card looks like this...


Zelkien69 wrote on 4/24/2009, 6:45 AM
The "almost an S-video" connector is actually a type of cable dongle that should have come with your computer. Typically it will have something along the lines of 3-8 other cables ranging from S-video, Compnent, Composite, and maybe a couple more options. Find the cable and you've found a way to hook it up.

But really, what you said about needing DVI or HDMI is the best-case scenario for getting the proper refresh rate out of your TV.
craftech wrote on 4/24/2009, 7:10 AM
Upon closer examination, it's NOT an S-video out, but looks very similar, but not quite. The S-video cable will not fit into it.
-----
Jay,
It's probably a 7-pin jack like on a laptop. That means it may carry audio as well, or be S-Video and Composite in one connector, so you need an adapter cable (which should have come with it, but did not). If you have a laptop with the same jack on it and the laptop came with the cable, it should work on the video card.
If not, they are available.

If it is 8-pin, it is like the All-In-Wonder, in which case you need a special adapter cable. Monoprice is the cheapest.

But I would use the DVI out on the card for your purpose, and it says that the card is HDCP compatible, so you shouldn't have any problems as long as you have your settings on the TV and computer set properly.

John
Jay Gladwell wrote on 4/24/2009, 8:38 AM

Thanks, John. That really had me puzzled (which doesn't take much).

Will a DVI to HDMI cable work as well as HDMI on both ends? Will the picture be as good?


Nobody wrote on 4/24/2009, 8:49 AM
"Will a DVI to HDMI cable work as well as HDMI on both ends? Will the picture be as good?"

Jay,
Picture quality should be the same either way.
TheHappyFriar wrote on 4/24/2009, 8:51 AM
HDMI = DVI = HDMI. Just a difference in cabling (the plug, and HDMI carries audio; DVI does not).

So it will work, yes.
Jay Gladwell wrote on 4/24/2009, 9:02 AM

Thanks, guys. Much appreciated!

craftech wrote on 4/24/2009, 9:10 AM
Jay,

Most DVI interfaces today are DVI-D. As long as the adapter cable is DVI-D to HDMI you should have no problems.

John
Jay Gladwell wrote on 4/24/2009, 10:57 AM

Thanks, John!

Any ideas as to how "beefy" a card I should get? I presume not all cards would be suitable for a 32" screen. Yes/no...?


craftech wrote on 4/24/2009, 12:21 PM
Any ideas as to how "beefy" a card I should get? I presume not all cards would be suitable for a 32" screen. Yes/no...?
=========

I don't know why you want to buy a different video card. Try that one first. It has the typical dual-link DVI-I connector on the back, which will accept a DVI-D cable. DVI-I is both digital and analog, but the DVI-D (digital only) cable fits it and will transmit the digital portion of the signal.

Any DVI-D to HDMI cable should hook up to your TV.

If you have an HDMI cable you can use many of the DVI-D to HDMI adapters. I believe Walmart has those as well.

If you are unfamiliar with how the different connectors look, see here.

John
Jay Gladwell wrote on 4/24/2009, 12:31 PM

"I don't know why you want to buy a different video card."

My current card doesn't have DVI-D...


craftech wrote on 4/24/2009, 1:08 PM
Jay,

You posted a link to the card and I looked at it. Like most video cards, it has a DVI-I connector, which carries both DVI-D and DVI-A. Just plug a DVI-D cable or adapter right into it and it will transmit the digital (DVI-D) portion of the signal.

John
Jay Gladwell wrote on 4/24/2009, 1:16 PM

Sorry! That was just an example of the S-video connection (what it looked like), not the DVI.

My bad!

craftech wrote on 4/24/2009, 1:28 PM
Sorry! That was just an example of the S-video connection (what it looked like), not the DVI.

My bad!
============
You mean that wasn't your card?

If not, how much do you want to spend? Also, do you have a PCI-e slot on your motherboard? And what is the exact model of your power supply?

John

EDIT: Also, I got out my laptop with the overly bright LCD and looked at the TV Out on that card so I could see the pins. If it is the same as yours, it isn't a 7-pin or 8-pin as I thought. It is an NVIDIA VIVO 9-pin jack, which uses a special cable. VIVO is Video In / Video Out.