Where/how do video specs actually come into play?

Jessariah67 wrote on 1/3/2021, 8:19 PM

I recently posted something where the central question was pretty much overlooked, but the more I'm thinking about it, the more I feel like I either need clarity or I'm really missing something.

We talk about hardware specs a lot in this industry, but I'm a bit confused about what is actually useful, and when.

I run five monitors. When I set up my fifth one, I realized my card (GTX 1060) only supported four, so I connected #5 to my onboard graphics. After some stability issues arose, the thought occurred to me that it might be because my preview "tab" is displayed on that fifth monitor. But my timeline is on a monitor that is fed by my 1060. I actually threw some 4K footage on my timeline, put some heavy FX on it, and moved the preview window around to different monitors, and didn't really notice any change in playback frame rate (it was - expectedly - sluggish everywhere).

But it got me thinking about how separate/together everything is, in reality. How does making use of more than one graphics source affect playback performance and rendering? How much does it matter what is "on" each monitor and where that monitor is connected? I will likely put my 1060 into my new computer for my fifth monitor - but if I decide to get a Radeon card, is that going to mess things up - like mixing brands of RAM - or doesn't it really matter? Does having your timeline displayed by one card and your preview window displayed by another help or hurt - or does it make no difference?

I'm not posting this to debate performance issues with an older-generation card running four monitors, etc. My question is really about how different graphics sources fit into the overall scheme of things when more than one is involved.

Comments

Reyfox wrote on 1/4/2021, 8:57 AM

I can't help you on this one. But I did have an AMD HD 5870 that supported six monitors via six Mini DisplayPort outputs on the graphics card backplate. It's called Eyefinity.

Newbie😁

Vegas Pro 22 (VP18-21 also installed)

Win 11 Pro always updated

AMD Ryzen 9 5950X 16 cores / 32 threads

32GB DDR4 3200

Sapphire RX6700XT 12GB Driver: 25.3.2

Gigabyte X570 Elite Motherboard

Panasonic G9, G7, FZ300

Jessariah67 wrote on 1/4/2021, 8:17 PM

Thanks, @Reyfox. I'll take a look at that, though I'm still wondering about my initial question.

RogerS wrote on 1/4/2021, 9:00 PM

I doubt anyone else here has direct experience with such a setup. You might try forums for hardware enthusiasts or GPU experts and get their thoughts on how processing works in Windows with multiple GPUs and monitors, and on the potential pitfalls.

Reyfox wrote on 1/5/2021, 4:19 AM

Also, I don't know if current GPUs still support it. Wikipedia says they do, but it is up to the AIB partners to implement it on their graphics cards. I haven't seen or heard of any current AMD card (which is what you would want) that has six Mini DisplayPorts, or a comparable combination of ports, on it.


walter-i. wrote on 1/5/2021, 4:42 AM

Just a thought:

I would rethink the workflow altogether and reduce the number of screens if possible. I imagine it to be extremely difficult to place and assemble all of them in a reasonably ergonomic way.

@NickHope once presented his system with a - I think 42" 4k TV screen - which I found very interesting. Unfortunately, I can no longer find the article.

 

Dexcon wrote on 1/5/2021, 5:09 AM

@walter-i.  ... I think that may have been in Q1 last year, because I had the same problem as NickHope at the time, just before my 4K monitor failed and I had to replace it (hence identifying the time frame). I can't find the post/comments either, but it was related to short black-outs/flickering on the monitor. I couldn't recall the possible cause/reason at first - it seemed to only affect HDMI computer/monitor connections - but it's coming back to me as I write: it was from installing AMD's Enterprise driver, which effectively killed the HDMI connection. If I remember correctly, NickHope used an HDMI connection to the 42" monitor, which didn't have a DVI connection. The Enterprise driver didn't affect the DVI connection, only HDMI.

Cameras: Sony FDR-AX100E; GoPro Hero 11 Black Creator Edition

Installed: Vegas Pro 15, 16, 17, 18, 19, 20, 21 & 22, HitFilm Pro 2021.3, DaVinci Resolve Studio 19.0.3, BCC 2025, Mocha Pro 2025.0, NBFX TotalFX 7, Neat NR, DVD Architect 6.0, MAGIX Travel Maps, Sound Forge Pro 16, SpectraLayers Pro 11, iZotope RX11 Advanced and many other iZ plugins, Vegasaur 4.0

Windows 11

Dell Alienware Aurora 11:

10th Gen Intel i9 10900KF - 10 cores (20 threads) - 3.7 to 5.3 GHz

NVIDIA GeForce RTX 2080 SUPER 8GB GDDR6 - liquid cooled

64GB RAM - Dual Channel HyperX FURY DDR4 XMP at 3200MHz

C drive: 2TB Samsung 990 PCIe 4.0 NVMe M.2 PCIe SSD

D: drive: 4TB Samsung 870 SATA SSD (used for media for editing current projects)

E: drive: 2TB Samsung 870 SATA SSD

F: drive: 6TB WD 7200 rpm Black HDD 3.5"

Dell Ultrasharp 32" 4K Color Calibrated Monitor

 

LAPTOP:

Dell Inspiron 5310 EVO 13.3"

i5-11320H CPU

C Drive: 1TB Corsair Gen4 NVMe M.2 2230 SSD (upgraded from the original 500 GB SSD)

Monitor is 2560 x 1600 @ 60 Hz

Jessariah67 wrote on 1/5/2021, 8:03 AM

I haven't seen any flickering issues, and my setup is actually relatively compact.

Again, I'm not having "monitor issues" - I'm just wondering how Vegas inherently handles more than one GPU source, and if there is anything that either hinders it or makes it more efficient.

RogerS wrote on 1/5/2021, 8:30 AM

It's not just a Vegas question; it's also about how Windows, the cards, and their drivers divide up the work between cards and monitors.

Some things that do work in Vegas are having one GPU do decoding (e.g. an Intel integrated one) while another does timeline and effects acceleration. I don't think you can have two GPUs both doing the same job in Vegas. Just drawing a screen shouldn't be hardware-intensive, but would there be lag or latency from having one GPU send data to another to send to a screen? No clue.
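
For what it's worth, if you want to see exactly which monitor Windows has attached to which adapter, a rough sketch like the one below will list the mapping. It just calls the standard Win32 EnumDisplayDevices API through Python's ctypes; nothing Vegas-specific, and the output format is only my guess at what's useful.

import ctypes
from ctypes import wintypes

# DISPLAY_DEVICEW structure from the Win32 API (wingdi.h).
class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

user32 = ctypes.windll.user32

def list_adapters_and_monitors():
    """Print each graphics adapter Windows knows about and the monitors attached to it."""
    i = 0
    while True:
        adapter = DISPLAY_DEVICEW()
        adapter.cb = ctypes.sizeof(adapter)
        # First level of enumeration: adapters (e.g. the GTX 1060 and the onboard GPU).
        if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(adapter), 0):
            break
        print(f"Adapter: {adapter.DeviceString}  ({adapter.DeviceName})")
        j = 0
        while True:
            monitor = DISPLAY_DEVICEW()
            monitor.cb = ctypes.sizeof(monitor)
            # Second level: monitors hanging off that adapter.
            if not user32.EnumDisplayDevicesW(adapter.DeviceName, j, ctypes.byref(monitor), 0):
                break
            print(f"    Monitor: {monitor.DeviceString}  ({monitor.DeviceName})")
            j += 1
        i += 1

if __name__ == "__main__":
    list_adapters_and_monitors()

On a mixed discrete/onboard setup like yours you'd expect two adapter entries, which at least confirms which physical outputs are being driven by which device.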

NickHope wrote on 1/5/2021, 9:15 AM

walter-i. wrote:

Just a thought: I would rethink the workflow altogether and reduce the number of screens if possible. I imagine it to be extremely difficult to place and assemble all of them in a reasonably ergonomic way. @NickHope once presented his system with a - I think 42" 4k TV screen - which I found very interesting. Unfortunately, I can no longer find the article.

Hi @walter-i., the comment is here. I'm still happily running the 4K LG TV with my old ASUS monitor next to it.

@Dexcon My stability issues were documented in this thread. It wasn't flickering; my graphics basically just cut out. It always happened late in the evening, which leads me to speculate that some buffer or cache might be filling up. I tried many different drivers for my RX Vega 64 but would still get it happening occasionally. A couple of weeks ago I switched from an HDMI cable to a DisplayPort cable for my secondary monitor, while keeping HDMI for the LG TV. No crashes since, but it's definitely too early to say that it's fixed.

@Jessariah67 I have mixed AMD and NVIDIA cards in the past, but one of them was only for doing GPU acceleration/GPU rendering, and not for display. There are so many variables (OS version, driver versions, VEGAS settings, exact GPU models etc.) that it's difficult to answer your questions. In theory it should be OK to mix, but it's really a case of trying it.

Jessariah67 wrote on 1/5/2021, 10:53 AM

Thanks, everyone. I guess we'll see. My new system is likely going to be a Ryzen 5900 and either a Radeon 6800 or an RTX 3080 (whichever I can snag first). I plan to plug two monitors into the 1060 and three into the new one. If the "newer" cards stay scarce for too long, I might get a lesser card and go with the 5950 for now - the idea being that upgrading the card will be less of a task than upgrading the processor down the road...

Howard-Vigorita wrote on 1/5/2021, 1:47 PM

I think a 6800 XT might yield better Vegas performance (a guess based on established little-Navi performance), but Nvidia seems more widely supported if you also use other NLEs or software.

Regarding AMD CPUs: they seem to have better price/performance specs, but their general lack of iGPU availability is a real hindrance to overall editing performance. It cuts you off from load-splitting with an iGPU that has a back-door private bus to the CPU. The only AMD availability like that I've seen so far is limited to OEMs, like the Lenovo 4700G desktop, if you can manage to squeeze an additional high-end PCIe GPU in there and upgrade the power supply. Though a 5950X might even the odds a bit with a PCIe 4 GPU/motherboard combo.

Not sure what will happen in a multi-monitor setup if you throw a second PCIe GPU in. That often cuts the PCIe lanes in half on many newer motherboards, but I have seen a Vegas performance gain in one of my older motherboards with a 5700 XT/1660 Ti combo in it... though I only use one monitor, with the Nvidia card assigned to decoding. Looking forward to hearing how you make out.
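
One way to sanity-check how the work actually splits once there are two cards in the box is to watch per-GPU utilization while the timeline plays, with the preview window parked on different monitors. A rough sketch along those lines is below; it just shells out to nvidia-smi, so it assumes an NVIDIA card with that tool on the PATH (an AMD card would need its own tooling, and "nvidia-smi dmon" shows separate encode/decode columns if you want those).

import subprocess
import time

# Poll overall GPU and memory utilization once a second while VEGAS plays back.
# Assumes an NVIDIA card and nvidia-smi on the PATH; for per-engine encode/decode
# load, "nvidia-smi dmon" prints separate enc/dec columns instead.
QUERY = [
    "nvidia-smi",
    "--query-gpu=index,name,utilization.gpu,utilization.memory",
    "--format=csv,noheader",
]

def poll(interval_s=1.0, samples=30):
    for _ in range(samples):
        result = subprocess.run(QUERY, capture_output=True, text=True, check=True)
        print(result.stdout.strip())
        time.sleep(interval_s)

if __name__ == "__main__":
    poll()

If the numbers barely move when you drag the preview window from one card's monitor to another's, that would suggest the display assignment itself isn't where the playback cost is.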