Monitors + Vid Card Replacement - Go HDMI?

Soniclight wrote on 1/11/2012, 11:08 PM
This thread does center on the HDMI issue, but it also addresses another issue or question I have in this predicament. And as usual, for those who don't know me, I have to preface this by saying I'm a serious amateur, not a pro, on a tight budget, and I don't do professional work in terms of for-broadcast, etc.

While I do work in HD rez in Vegas 10, my stuff is for Net viewing. It's also artsy/dreamy/stylized, so the kind of strict color and other parameters most of you have to take into consideration are less important. So I buy consumer monitors and consumer vid cards.

One of my dual old-style 4:3 ratio 19" LCD monitors, bought back in 2005/6, died yesterday, and my video card is just as ancient yet still works fine (GeForce 6800XT). LCD prices have dropped considerably since then, and even a pauper's budget can get a decent (for this man's purposes) setup.

While my old 6800XT has DVI out, for various practical reasons the connection via extension cables has been VGA, and it has worked just fine. I can't afford cable TV, so I have Netflix and Hulu -- I watch DVDs and stuff online on my system. I don't have a big-screen or HD TV. In fact, my 2000-model CRT TV is stashed in a closet.

Unlike most these days, I'm perfectly happy watching movies on a computer screen (I have a swivel system so I can sit in my easy chair and watch, but that's kind of OT). Simply put, I don't need surround sound or fancy stuff.

So...

Whether I do this in one swoop or buy one monitor now and the other later, I'd like to upgrade to 2 x 22"/23" LCDs, preferably LED-backlit, at +/- USD $130-150 each, and a video card at USD $75 or less. Most generic but acceptable cards come in various configurations -- 1 DVI + 1 VGA, 2 x VGA, or 2 x DVI -- and some also have an HDMI port. I've done some reading online on HDMI and its advantages (and also that DVI is actually on its way out), but I've never seen a card I can afford with 2 HDMI ports.

Since this is an investment and not just a replacement, I might as well make a choice that considers the future a bit.
VGA works fine for me and probably would continue doing so, but HDMI seems to be the wave of the future.

Question 1:

I've read enough online, including searches in this forum, and gotten a bit lost in the true-white vs. RGB issues of LED illumination, etc. I've also taken a trip to my local Fry's to assess monitor differences in real life, so I have a generally good sense of this issue.

But the truth is that at the store, except for the higher-end Samsungs etc., I'd only save a small amount by not going LED-backlit, since companies are trying to get rid of the older models via rebates -- but then again, maybe it's better to go with the industry trend.

Staying with VGA would save me considerable money, since I could use the cables and adapters I already have (VGA>DVI at the video card), and cards with VGA and/or DVI are easy to find -- but I also don't want to end up too retrograde/behind the curve.

Thank you for your input.

Comments

ushere wrote on 1/11/2012, 11:55 PM
a good (cheap) nvidia card, such as a gt440 or better, will give you 2 x dvi out - good enough to easily drive 2 lcds at hd or above resolution.
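
just to back that up with some napkin math (the timing figures below are the standard cea-861 1080p60 totals, not anything from a particular card's spec sheet), here's a little python sketch:

```python
# Back-of-the-envelope check: can single-link DVI drive a 1080p monitor?
# 1080p60 uses a 2200 x 1125 total raster including blanking (CEA-861);
# 165 MHz is the single-link DVI/TMDS specification limit.

H_TOTAL = 2200          # horizontal pixels per line, incl. blanking
V_TOTAL = 1125          # lines per frame, incl. blanking
REFRESH_HZ = 60         # frames per second

SINGLE_LINK_DVI_MHZ = 165.0   # max pixel clock, single-link DVI

pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH_HZ / 1e6
print(f"1080p60 needs a {pixel_clock_mhz:.1f} MHz pixel clock")
print(f"single-link DVI allows {SINGLE_LINK_DVI_MHZ:.0f} MHz -> "
      f"{'OK' if pixel_clock_mhz <= SINGLE_LINK_DVI_MHZ else 'too much'}")
```

that comes out to 148.5 MHz against a 165 MHz limit, and each dvi port has its own link, so two hd lcds are no problem at all.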

there's no real difference between dvi / hdmi other than hdmi carries sound as well. if you're buying monitors they probably don't have speakers; if they do, hdmi will carry audio for them, but in my experience a tin can on a piece of string sounds better than the sort of cheap speakers they use in cheap lcds.

i have both fluro and led monitors - the leds do look brighter, but somewhat more saturated than i like (that's out of the box). i spyder them all to get some calibration ;-)
Soniclight wrote on 1/12/2012, 6:11 AM
Thanks for the reply, ushere - that simplified things for me. I don't need or want on-board speakers anyway, since I use an M-Audio card. As for the Spyder, even SpyderExpress is out of my price range, so hopefully the card and whatever monitors I get will have enough between them to do a decent calibration job.
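
For what it's worth, software-only calibration basically means the video card applies a gamma lookup table to compensate the panel's response. Here's a minimal Python sketch of the idea -- the gamma values are made-up placeholders, and actually loading the table into the card is the driver's/OS's job, not this script's:

```python
# Illustration of the kind of adjustment a video card's gamma LUT makes
# when you calibrate "in software" without a hardware probe.
# This only builds a 256-entry correction table; pushing it to the card
# is normally done by the driver's control panel or an OS call.

TARGET_GAMMA = 2.2      # what you want the display chain to behave like
PANEL_GAMMA = 2.4       # hypothetical guessed/eyeballed panel response

def build_gamma_lut(target=TARGET_GAMMA, panel=PANEL_GAMMA, size=256):
    """Return a lookup table mapping 8-bit input to corrected 8-bit output."""
    lut = []
    for i in range(size):
        x = i / (size - 1)                 # normalize to 0..1
        y = x ** (target / panel)          # compensate the panel's response
        lut.append(round(y * (size - 1)))  # back to 0..255
    return lut

lut = build_gamma_lut()
print(lut[:8], "...", lut[-4:])   # mid-tones get lifted slightly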
megabit wrote on 1/12/2012, 6:37 AM
Even though most modern cards offer HDMI alongside DVI, there is no special need for it unless you have your system connected to a home theater amplifier via HDMI (for sound).

Otherwise - and this is an interesting one - on the nVidia forums they advise actually _disabling_ the nVidia HD audio drivers, in order to obtain better stability of the graphics/video system...

Piotr

Soniclight wrote on 1/13/2012, 1:58 AM
Thanks for response, Piotr.

I think I've pretty much made up my mind on both the monitors and the card. One advantage over my current system is that the card I've chosen has 2x DVI out and the monitors include HDMI in, so all I need is a couple of long DVI>HDMI cables and I'll finally be digital after all these years. Not that the difference from VGA will even be that noticeable, if at all, but I might as well do it.

I'm at this point planning to get:

2x Acer 23" S-Series Monitors -"S" is for ultra-slim = lightweight as I prefer.
EVGA nVidia GT 430 - just a hair below the 440 suggested by ushere and others.

In terms of where I'm planning on getting these, I've looked at Fry's, Newegg, and B&H, but have once again settled on TigerDirect due to price with no s/h or taxes. It may not be "nerd-sexy" to use TigerDirect, but I've had no bad experiences with them at all. I will, however, get my 10 ft DVI>HDMI cables from Amazon, for they cost way, way less yet are brand-name (Case Logic).
ritsmer wrote on 1/13/2012, 2:30 AM
As for my 2 cents: I always have difficulties running any monitor or TV set via HDMI because of more or less noticeable overscan.

Via DVI I have the impression that 1 pixel from my media corresponds to exactly one pixel on the monitor - but as soon as HDMI is involved in the chain, this 1:1 pixel mapping seems to be lost, resulting in less sharpness, no matter how much I try to adjust for it.
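
As a back-of-the-envelope illustration of what that loss looks like (the 5% overscan figure below is just an assumed example; actual sets vary):

```python
# Rough illustration of why HDMI overscan kills 1:1 pixel mapping:
# the display scales the 1920x1080 signal up by the overscan
# percentage and crops the edges, so every source pixel ends up
# spread across fractional panel pixels (i.e. resampled = softer).

SRC_W, SRC_H = 1920, 1080
OVERSCAN = 0.05          # an assumed 5% overscan, for illustration only

scaled_w = SRC_W * (1 + OVERSCAN)
scaled_h = SRC_H * (1 + OVERSCAN)
print(f"panel shows the middle {SRC_W}x{SRC_H} of a "
      f"{scaled_w:.0f}x{scaled_h:.0f} rescale")
print(f"each source pixel covers {(1 + OVERSCAN):.2f} panel pixels "
      f"horizontally -> no longer 1:1")

cropped = (scaled_w - SRC_W, scaled_h - SRC_H)
print(f"{cropped[0]:.0f} columns and {cropped[1]:.0f} rows of the "
      f"picture are pushed off-screen")
```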

So: when you unpack, be careful with the boxes - and test everything before the return time limit is reached.
ushere wrote on 1/13/2012, 2:39 AM
@ piotr - interesting. how do you disable hd audio drivers? control panel > nvidia hd audio driver? anything else?
megabit wrote on 1/13/2012, 2:58 AM
I have them permanently disabled in Device Manager.
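
For anyone who'd rather script the lookup, here is a minimal sketch that only *finds* the NVIDIA HD Audio entries, so you know what to right-click > Disable in Device Manager as described above; it assumes the third-party Python "wmi" package (pip install wmi) and deliberately leaves the actual disabling to Device Manager (or "pnputil /disable-device <id>" on newer Windows):

```python
# Locate NVIDIA HD Audio devices via WMI so they can be disabled
# by hand in Device Manager. Windows only; requires the third-party
# "wmi" package (pip install wmi).

import wmi

c = wmi.WMI()
for dev in c.Win32_PnPEntity():
    name = dev.Name or ""
    if "NVIDIA" in name and "Audio" in name:
        # DeviceID is the instance path shown on the Details tab in
        # Device Manager; on newer Windows it can also be passed to
        # "pnputil /disable-device <id>" instead of clicking through.
        print(name, "->", dev.DeviceID)
```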

Piotr

Soniclight wrote on 1/13/2012, 5:59 AM
".... but as soon as HDMI is involved in the chain this 1:1 pixel connection seems to be lost - resulting in less sharpness - no matter how much I try to adjust for it.."

Hmmm. OK, good point. That sounds like it would be no better than staying with VGA, so it kind of cancels things out. So I'm looking for monitors that have DVI -- the Acer ones oddly only have VGA and HDMI, no DVI. Which may mean going down to 2x 22"/21.5" - which is fine.