2 monitors + 1 output: adding an output monitor

entilza72 wrote on 6/1/2012, 4:46 AM
Folks,

Looking for edit suite rig playback advice. Vegas Pro 11.

I want to add a dedicated 1080p monitor as an output monitor, via HDMI. DVI and VGA (15-pin D-sub) are not an option.

I wish to continue to use my two Windows monitors for my Vegas Pro layout, but add the new dedicated output monitor to the combo.

I am presuming Vegas supports 2 desktop monitors plus an output monitor on HDMI, for 3 in total, and that you can select which monitor acts as the output.

Questions:
For those who are already doing this: how are you sending your signal to the output monitor? Do you use a dedicated card? Can it be done on the cheap?

I have a single ATI card with 2 DVI outputs. Both DVI ports are currently being used for an extended Windows desktop (the 2 desktop monitors). But there's an unused HDMI port which I'm hoping will be able to act independently of the 2 DVIs, giving me what I need.
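
Once the monitor is on site I figure I can check what Windows actually exposes with a quick script (a rough sketch, assuming Python with ctypes on the Windows box; untested on my rig) that enumerates the display outputs and shows which are attached to the desktop:

    # Quick check of which display outputs Windows exposes and which
    # are attached to the desktop. Standard library only (ctypes).
    import ctypes
    from ctypes import wintypes

    class DISPLAY_DEVICE(ctypes.Structure):
        _fields_ = [("cb", wintypes.DWORD),
                    ("DeviceName", wintypes.WCHAR * 32),
                    ("DeviceString", wintypes.WCHAR * 128),
                    ("StateFlags", wintypes.DWORD),
                    ("DeviceID", wintypes.WCHAR * 128),
                    ("DeviceKey", wintypes.WCHAR * 128)]

    ATTACHED = 0x1  # DISPLAY_DEVICE_ATTACHED_TO_DESKTOP

    dev = DISPLAY_DEVICE()
    dev.cb = ctypes.sizeof(dev)
    i = 0
    while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        print("%s | %s | attached: %s" % (dev.DeviceName, dev.DeviceString,
                                          bool(dev.StateFlags & ATTACHED)))
        i += 1

If the HDMI port shows up as a third attached output, I should be in business.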

Another option I'm thinking about is buying a second ATI card of the same kind and linking them. I assume Vegas will be able to take advantage of the second linked GPU? And do we have the option to use the ports on that card too?

Thanks in advance.

Comments

ushere wrote on 6/1/2012, 7:17 AM
just get another cheap 'matched' card with hdmi output.

alternatively, a blackmagic intensity pro.

whatever way you go, make sure the monitor is up to scratch. there's been some discussion here about decent monitors, but i've found my lg ips panel pretty good out of the box. however, i also have a spyder to reassure myself ;-)
mudsmith wrote on 6/1/2012, 11:31 AM
I have a 42-inch LG that I am using for some tasks, and am considering using it as a dedicated output monitor, a la the OP.

So far, I have found that only the "Cinema" setup mode seems to give anything like an accurate, detailed picture. How are you setting yours up to please the sensors on the Spyder?
TheHappyFriar wrote on 6/1/2012, 1:10 PM
As far as I know, Vista/7 require cards to be identical for them to work together. That should be no problem (the HDMI on your card uses one of the two monitor outputs, I'm betting).

Might just be easier to buy a GPU that supports 3 monitors.

Couldn't you also use a DVI-to-HDMI converter/cable if need be? I always thought the only difference between DVI & HDMI was the audio and encryption.
Steve Mann wrote on 6/1/2012, 8:29 PM
"As far as I know, Vista/7 require cards to be identical for them to work together. "

Windows will run as many monitors as you have display adapters. You can mix display adapters, but you run the very real risk of programs calling the wrong driver functions. As a result, it is strongly recommended that you use similar GPUs from the same manufacturer so that Windows only has one driver.
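
If you want to confirm what Windows has loaded, list every display adapter and its driver version. A rough sketch (it just shells out to the wmic tool that ships with Windows 7):

    # List every installed display adapter and its driver version, to
    # verify Windows is loading a single display driver. wmic ships
    # with Windows 7; its table prints straight to the console.
    import subprocess

    subprocess.call(["wmic", "path", "Win32_VideoController",
                     "get", "Name,DriverVersion"])

Two identical cards should come back as two rows with the same name and the same driver version.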


"... the only difference between DVI & HDMI was the audio and encryption."

This is correct.
Entilza wrote on 6/2/2012, 4:01 AM
Thanks for all the feedback everyone.

I forgot to mention I'm running Win 7.

So, please let me know if I haven't got this right, but in summary ...

It is thought that my card with 2 DVIs and 1 HDMI may not be able to independently control all 3. (I have a feeling this may be the case, and I can only test once I get the monitor on site.)

It is also thought that the best way to solve this is to buy a second GPU identical (including manufacturer) to my existing GPU.

It is implied (no one has confirmed yet) that the second GPU's monitor ports can be driven as extra ports by Windows 7 and thus be available to Vegas.

* Can anyone confirm that a second card = a second set of independent video ports in Windows?

* Can anyone else also confirm if Vegas's GPU render function will take advantage of the 2 GPUs? It's kind of a "bonus" if it can, and makes the risk of buying a second card worthwhile.

Thanks again. These are great forums.
Ent.
ushere wrote on 6/2/2012, 4:42 AM
"Can anyone confirm that a second card = a second set of independent video ports in Windows?"

yes

"Can anyone else also confirm if Vegas's GPU render function will take advantage of the 2 GPUs?"

no - vegas can only access one gpu
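
fwiw, the gpu acceleration in 11 is opencl-based, so if you're curious which devices vegas could even pick from, something like this lists them (just a sketch - assumes the third-party pyopencl package is installed):

    # list every opencl platform/device - vegas still only uses one.
    import pyopencl as cl

    for platform in cl.get_platforms():
        for device in platform.get_devices():
            print("%s: %s" % (platform.name, device.name))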
megabit wrote on 6/2/2012, 5:22 AM
" Can anyone else also confirm if Vegas's GPU render function will take advantage of the 2 GPU's? It's kind of a "bonus" if it can, and makes the risk of buying a second card worthwhile"

Well, Vegas can certainly only use a single GPU, but I suspect that when an OFX plug-in is CUDA-enabled itself (like the newest Neat Video), it can use the other GPU, so the render time would benefit even more...
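
You can see how many CUDA devices such a plug-in would have to pick from with a few lines against the driver API (just a sketch; it assumes an NVIDIA driver is installed, since nvcuda.dll ships with the driver):

    # Count the CUDA devices visible to a CUDA-enabled plug-in.
    import ctypes

    cuda = ctypes.windll.nvcuda  # NVIDIA CUDA driver API
    cuda.cuInit(0)               # initialize the driver API
    count = ctypes.c_int()
    cuda.cuDeviceGetCount(ctypes.byref(count))
    print("CUDA devices: %d" % count.value)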

Piotr

entilza72 wrote on 6/2/2012, 6:04 AM
Thanks everyone - some solid information here. I'll post an update when I'm done.

Ent.
Steve Mann wrote on 6/2/2012, 9:26 AM
"It is thought that my card with 2 DVIs and 1 HDMI may not be able to independently control all 3. (I have a feeling this may be case and can only test if I get the monitor on site).

Some GPUs will let you run all three displays on one card - but at a reduced resolution. (They aren't exactly pointing this out in their specs.)
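
An easy way to catch that case is to print the resolution each display is actually running at. A rough sketch (Python with ctypes, standard library only):

    # Print the current size of every monitor on the desktop, so a
    # quietly reduced resolution on a third display is easy to spot.
    import ctypes
    from ctypes import wintypes

    class MONITORINFOEXW(ctypes.Structure):
        _fields_ = [("cbSize", wintypes.DWORD),
                    ("rcMonitor", wintypes.RECT),
                    ("rcWork", wintypes.RECT),
                    ("dwFlags", wintypes.DWORD),
                    ("szDevice", wintypes.WCHAR * 32)]

    MonitorEnumProc = ctypes.WINFUNCTYPE(
        wintypes.BOOL, wintypes.HANDLE, wintypes.HDC,
        ctypes.POINTER(wintypes.RECT), wintypes.LPARAM)

    def show(hmon, hdc, rect, lparam):
        info = MONITORINFOEXW()
        info.cbSize = ctypes.sizeof(info)
        ctypes.windll.user32.GetMonitorInfoW(hmon, ctypes.byref(info))
        r = info.rcMonitor
        print("%s: %dx%d" % (info.szDevice, r.right - r.left, r.bottom - r.top))
        return True  # keep enumerating

    ctypes.windll.user32.EnumDisplayMonitors(None, None, MonitorEnumProc(show), 0)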

Buy two GPU cards. Identical is better, but at least stay with the same make. Mixing ATI/Radeon with nVidia is asking for problems and is easily avoidable at this stage. Using identical display adapters ensures that Windows only needs one copy of the driver software installed.

Also note, SLI or other GPU bridging buys you nothing with Vegas. It's strictly for gaming.

I don't have fewer than three displays on any of my Vegas workstations, and every one of them has two identical display cards.