Is my odd dual monitor setup a problem for VEGAS?

Mohammed_Anis wrote on 8/30/2020, 3:11 AM

I'm having a relatively smooth experience with my setup, even with 4K edits, despite the fact that my CPU is two generations below the recommended spec.

The one thing that I worry about sometimes is that I have two active GPUs in place. One is the integrated Intel GPU output through my motherboard, and the other is my NVIDIA 1080 Ti.

On my 1080 Ti, I have a regular IPS monitor connected via an HDMI cable. The card's other ports are DisplayPort 1.4, so I can't hook up another HDMI cable to it, though there is a dual-link DVI-D port.

On the other GPU, since my secondary monitor is a regular VGA display, I have it connected through a VGA-to-HDMI converter, since I happened to already own one.
In some tech threads, I read that separate display devices can potentially limit performance, but I couldn't find a thread specific to my odd setup.

I know that ultimately it's best to have both displays on one card and be done with it. But is there a disadvantage to my current setup?

Thanks.

Comments

fr0sty wrote on 8/30/2020, 3:21 AM

In some cases, having a monitor plugged into your onboard display output can enable you to use Quick Sync on your CPU (if you have an Intel chip that features it). That lets your GPU handle some tasks, like rendering, while Quick Sync decodes the video. In some limited cases, this can provide a performance boost by spreading the load across multiple devices (assuming the device you're giving the work to isn't far weaker than the other GPU — test it both ways).

However, if it turns out that it isn't helping, you can buy DP to HDMI adapters for really cheap on Amazon.

Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64GB 3000mhz DDR4

Geforce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

Mohammed_Anis wrote on 8/30/2020, 3:32 AM

@fr0sty

This is interesting! Yes, QSV works, though I didn't know it was enabled by the onboard GPU. I thought it was exclusively a CPU function. I have an i7-6700K, which is a Skylake CPU.

Can you please suggest how I can conduct these tests?

For your reference, this is the data I'm copying from my DXDIAG on the Intel GPU:
__________________________________________________________________________________________
 Card name: Intel(R) HD Graphics 530
        Manufacturer: Intel Corporation
           Chip type: Intel(R) HD Graphics Family
            DAC type: Internal
         Device Type: Full Device (POST)
          Device Key: Enum\PCI\VEN_8086&DEV_1912&SUBSYS_86941043&REV_06
       Device Status: 0180200A [DN_DRIVER_LOADED|DN_STARTED|DN_DISABLEABLE|DN_NT_ENUMERATOR|DN_NT_DRIVER] 
 Device Problem Code: No Problem
 Driver Problem Code: Unknown
      Display Memory: 16446 MB
    Dedicated Memory: 128 MB
       Shared Memory: 16318 MB
        Current Mode: 1440 x 900 (32 bit) (59Hz)
  HDR Support: Not Supported
    Display Topology: Extend
 Display Color Space: DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709
     Color Primaries: Red(0.639648,0.333984), Green(0.286133,0.598633), Blue(0.154297,0.077148), White Point(0.313477,0.329102)
   Display Luminance: Min Luminance = 0.500000, Max Luminance = 270.000000, MaxFullFrameLuminance = 270.000000
        Monitor Name: Generic PnP Monitor
       Monitor Model: BenQ E900W
          Monitor Id: BNQ7905
         Native Mode: 1440 x 900(p) (60.016Hz)

__________________________________________________________________________________________

And this is from my 1080ti:


           Card name: NVIDIA GeForce GTX 1080 Ti
        Manufacturer: NVIDIA
           Chip type: GeForce GTX 1080 Ti
            DAC type: Integrated RAMDAC
         Device Type: Full Device
          Device Key: Enum\PCI\VEN_10DE&DEV_1B06&SUBSYS_120F10DE&REV_A1
       Device Status: 0180200A [DN_DRIVER_LOADED|DN_STARTED|DN_DISABLEABLE|DN_NT_ENUMERATOR|DN_NT_DRIVER] 
 Device Problem Code: No Problem
 Driver Problem Code: Unknown
      Display Memory: 27445 MB
    Dedicated Memory: 11127 MB
       Shared Memory: 16318 MB
        Current Mode: 1920 x 1080 (32 bit) (60Hz)
         HDR Support: Not Supported
    Display Topology: Extend
 Display Color Space: DXGI_COLOR_SPACE_RGB_FULL_G22_NONE_P709
     Color Primaries: Red(0.637695,0.333984), Green(0.308594,0.626953), Blue(0.153320,0.073242), White Point(0.313477,0.329102)
   Display Luminance: Min Luminance = 0.500000, Max Luminance = 270.000000, MaxFullFrameLuminance = 270.000000
        Monitor Name: Generic PnP Monitor
       Monitor Model: 22MP55
          Monitor Id: GSM5A26
 

Last changed by Mohammed_Anis on 8/30/2020, 3:32 AM, changed a total of 1 times.

"I'm a part of all that I've met." Alfred Lord Tennyson

Youtube Channel: https://www.youtube.com/c/VEGASCREATIVEACADEMY


Card name: AMD Radeon RX 6800 XT
Processor: AMD Ryzen 9 5900X 12-Core Processor (24 CPUs), ~3.7GHz
Memory: 32768MB RAM
Monitor Id: PHLC18F
Native Mode: 3840 x 2160(p) (59.997Hz)
Storage Devices: 2 SSDs, one large HDD. VEGAS is installed on an SSD.

 

fr0sty wrote on 8/30/2020, 4:53 AM

The Intel chip has a GPU built into it; your motherboard's VGA port is simply wired to it (assuming there isn't a secondary GPU built into the motherboard). To do the test, go into the File I/O tab of Preferences and set decoding to QSV, then set it to the 1080 and see which plays back faster. You can do the same for rendering, but I'm pretty sure the 1080 will win that battle. Ditto for timeline acceleration under the Video tab of Preferences: try different combinations to see if you can get any performance gains out of Intel QSV. If not, I'd buy a DP-to-HDMI adapter and be done with the Intel stuff.
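If you want a rough sanity check of raw decode speed outside VEGAS, you can also compare the two decoders with ffmpeg. This is just a sketch, not the official test: it assumes your ffmpeg build includes QSV and NVDEC support, and `clip.mp4` is a stand-in for one of your own 4K clips.

```shell
# Decode on the Intel iGPU via Quick Sync; -f null - discards output
# so only decode speed is measured, and -benchmark prints the timings:
ffmpeg -hwaccel qsv -i clip.mp4 -benchmark -f null -

# Same clip decoded on the 1080 Ti via NVDEC for comparison:
ffmpeg -hwaccel cuda -i clip.mp4 -benchmark -f null -
```

Compare the reported real time (or the `speed=` figure) between the two runs; it won't match VEGAS playback exactly, but it shows whether the HD 530 can keep up at all.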
