I recently posted something where the central question got pretty much overlooked, but the more I think about it, the more I feel like I either need clarity or I'm really missing something.
We talk about hardware specs a lot in this industry, but I'm a bit confused about what is actually useful and when.
I run five monitors. When I set up the fifth one, I realized my card (a GTX 1060) only supports four outputs, so I connected #5 to my onboard graphics. After some stability issues arose, it occurred to me that the cause might be my preview "tab" being displayed on that fifth monitor, while my timeline sits on a monitor fed by the 1060. So I threw some 4K footage on my timeline, put some heavy FX on it, and moved the preview window around to different monitors, and I didn't really notice any change in playback frame rate (it was, expectedly, sluggish everywhere).
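In case the wiring matters for the answer: here's a rough, untested sketch of how one could confirm which adapter is actually driving which monitor on Windows, using the Win32 EnumDisplayDevices call through Python's ctypes. The structure fields and the 0x1 "attached to desktop" flag come straight from the Win32 API; the rest is just my guess at a quick check.

```python
import ctypes
from ctypes import wintypes

# DISPLAY_DEVICEW structure, as defined in winuser.h
class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED = 0x1  # adapter/monitor is part of the desktop
user32 = ctypes.windll.user32

i = 0
while True:
    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(dev)
    # First pass (lpDevice=None) enumerates adapters
    if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        break
    print(f"{dev.DeviceName}: {dev.DeviceString}")  # e.g. "NVIDIA GeForce GTX 1060"
    j = 0
    while True:
        mon = DISPLAY_DEVICEW()
        mon.cb = ctypes.sizeof(mon)
        # Second pass (lpDevice=adapter name) enumerates its monitors
        if not user32.EnumDisplayDevicesW(dev.DeviceName, j, ctypes.byref(mon), 0):
            break
        if mon.StateFlags & DISPLAY_DEVICE_ATTACHED:
            print(f"  -> {mon.DeviceString}")
        j += 1
    i += 1
```

Running that should print each adapter (the 1060, the onboard graphics) with the active monitors hanging off it, which at least pins down who owns monitor #5.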
But it got me thinking about how separate or connected everything really is. How does using more than one graphics source affect playback performance and rendering? How much does it matter what is "on" each monitor and which card that monitor is connected to? I will likely put my 1060 into my new computer just for the fifth monitor, but if I decide to get a Radeon card, is mixing vendors going to cause problems, like mixing brands of RAM, or does it not really matter? Does having your timeline displayed by one card and your preview window displayed by another help or hurt, or does it make no difference?
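For what it's worth, one rough way I could imagine probing this is to poll the 1060's load while dragging the preview window between monitors: if its utilization barely changes when the preview sits on the onboard-graphics monitor, then presumably the 1060 is still doing the rendering and only the finished frames are crossing over. This is just a sketch assuming the NVIDIA driver's nvidia-smi tool is on the PATH (the query fields below are standard nvidia-smi options); I haven't verified this exact loop.

```python
import subprocess
import time

# Print the 1060's GPU and memory utilization once a second.
# Watch the numbers while moving the preview window around.
while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,utilization.memory",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True,
    ).stdout.strip()
    print(f"GPU util %, mem util %: {out}")
    time.sleep(1)
```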
I'm not posting this to debate performance issues with an older-generation card running four monitors, etc. My question is really about how different video sources fit into the overall scheme of things when more than one is involved.