I have a pair of ATI 64MB video cards in a 450MHz PIII. Oddly enough, despite first booting on the AGP card, as soon as the PCI card was installed, 'it' became the primary display device. Anyway, here's the dilemma. Its video performance is immensely faster than that of the Vegas workstation (with 2.2 GHz CPUs) . . . as measured by the time-honored (around here anyways - with a stopwatch, no less) benchmark: "How long does it take for the cards to cascade upon winning a game of Solitaire?"
Anyway, I wonder if (as a practical matter), 2 separate video cards running 2 monitors typically 'feels better' than a single card running 2 monitors - or is this symptomatic of an issue I haven't addressed?
Survey: if you run two monitors, do you use 1 card, or two cards?