Years later: GTX 590 Compatibility (Vegas 13 & 14)

CogDiv wrote on 3/8/2017, 8:17 AM

So, a GTX 590 card works equivalently to a GTX 580 in Vegas 13 & 14, particularly for MainConcept encoding, right?

Please share all the details you can of your experiences that may not already be posted here, and experiences with striving to run the latest drivers and maintaining Vegas productivity.

Hard for me to invest in a used card and not get the most I can for the money . . .

(EVGA SuperNova 1300 G2, so not too worried about power requirements, though otherwise that would be a concern)

Comments

OldSmoke wrote on 3/8/2017, 9:13 AM

The GTX590 makes no sense at all; it's worse than having two GTX580s, which is what I had after the 2x GTX570 in the video. Vegas does not see the second GPU of the GTX590 at all. Vegas can, however, utilize two GTX580s, although that is not officially supported; the two cards would have to be NOT in SLI mode to get the most out of them. If you want the best of both worlds, get a good R9 card and add a GTX580 alongside it. Keep in mind that two cards only make sense if you have a motherboard that can support 2x PCIe x16 mode, which requires a socket 2011 or the older socket 1366 platform.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

CogDiv wrote on 3/8/2017, 9:19 AM

I have an XFX R9 290A now. So, the issue with Vegas seeing the two 580 GPUs on the 590 is the fact that SLI is forced on the 590, and Vegas is not compatible with SLI mode?

The reason I've considered a 590 is that it's cheaper than two 580s (at least the one I found; not all are), but I think I see your point regarding SLI mode.

The next question then is how much does 8x vs 16x PCI-E bandwidth mode really affect the acceleration for Vegas MainConcept encoding?

OldSmoke wrote on 3/8/2017, 9:42 AM

So, the issue with Vegas seeing the two 580 GPUs on the 590 is the fact that SLI is forced on the 590, and Vegas is not compatible with SLI mode?

I don't think that's how the GTX590 works, but Vegas will treat the GTX590 as a single GPU; there were reports about that a long time ago. Since you have an R9 290, just plug in a GTX580, provided your motherboard can support it. I currently use an R9 Fury X together with a GTX580. There are GTX580s around with 3GB onboard; that would be a much better choice than a GTX590 IMHO. I use the R9 for timeline performance, and the GTX580 for MC AVC renders using CUDA. The amazing part is that during rendering Vegas will use the OpenCL capabilities of the R9 for anything that is GPU accelerated in Vegas (FX, transitions and so on), while the GTX580 is only used for the final encoding by the MC AVC encoder.

Last changed by OldSmoke on 3/8/2017, 9:44 AM, changed a total of 1 times.


CogDiv wrote on 3/8/2017, 9:53 AM

I'll be experimenting with two 580s w/3GB soon, in addition to the current R9 290.

My i7-5820K's 28 PCI-E lane limit will be holding me back, but comparing a single 580 3GB at x8 to two 580s at x4 would be interesting. I'll probably never attain even x8 mode for the single 580 unless I run the R9 290 at x8 as well, due to the M.2 SSD and the Intel RAID (3x 3TB WD Red) the system is currently supporting, as well as the onboard devices (USB3, audio, etc.).
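Purely for illustration, here is a toy sketch of that lane budget. The device link widths are assumptions for the sake of the example; the actual M.2 and RAID lane routing on X99 boards varies by motherboard:

```python
# Hypothetical lane budget for a 28-lane CPU like the i7-5820K:
# check which combinations of link widths fit the budget.
CPU_LANES = 28

def fits(*link_widths):
    """True if the listed PCIe link widths fit within the CPU's lane budget."""
    return sum(link_widths) <= CPU_LANES

print(fits(16, 8, 4))  # x16 GPU + x8 GPU + x4 M.2 -> True (exactly 28)
print(fits(16, 16))    # two x16 GPUs -> False (needs 32 lanes)
```

With 28 CPU lanes, x16 + x8 + x4 fits exactly, which is why running either GPU below x16 is unavoidable once an x4 NVMe drive hangs off the CPU lanes.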

I'll have to upgrade the processor next, for sure.

CogDiv wrote on 3/8/2017, 10:12 AM

The i7-5820K has 28 PCI-E Gen 3 lanes. The 580s are PCI-E Gen 2 cards. I wonder if there is some sort of lane bandwidth provisioning that will take place, or are the lanes more restricted due to addressing?

OldSmoke wrote on 3/8/2017, 10:38 AM

28 PCIe lanes won't give you full bandwidth; you'll have both cards running at x8/x8, or one at x16 and the other at x8. That is the limitation of the 5820K. Running 2x PCIe x16 requires 32 lanes. That's the main reason I switched to socket 2011 to get 40 lanes, and even that isn't enough for what I would like to do.
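For a rough sense of what those link widths mean, the theoretical per-direction throughput follows from the PCIe spec's line rates: Gen 2 runs 5 GT/s per lane with 8b/10b encoding, Gen 3 runs 8 GT/s with 128b/130b encoding. A Gen 2 card like the GTX580 in a Gen 3 slot trains down to Gen 2 speed, so lane count is the variable that matters. A small sketch:

```python
# Theoretical per-direction PCIe bandwidth from line rate and encoding overhead.
GEN_RATES = {
    2: (5.0, 8 / 10),      # (GT/s per lane, encoding efficiency: 8b/10b)
    3: (8.0, 128 / 130),   # 128b/130b encoding
}

def pcie_gbps(gen, lanes):
    """Usable GB/s in one direction for a given PCIe generation and lane count."""
    gt_per_s, efficiency = GEN_RATES[gen]
    return gt_per_s * efficiency * lanes / 8  # bits -> bytes

print(round(pcie_gbps(2, 16), 1))  # x16 Gen 2: 8.0 GB/s
print(round(pcie_gbps(2, 8), 1))   # x8  Gen 2: 4.0 GB/s
print(round(pcie_gbps(3, 8), 2))   # x8  Gen 3: 7.88 GB/s
```

So a Gen 2 card dropping from x16 to x8 halves its link to about 4 GB/s each way; whether that actually slows a render depends on how much frame data the encoder moves over the bus.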

Last changed by OldSmoke on 3/8/2017, 10:39 AM, changed a total of 1 times.


CogDiv wrote on 3/8/2017, 10:42 AM

Vegas can however utilize two GTX580 although not officially supported. The two cards would have to be NOT in SLI mode to get the most.

To clarify, CUDA acceleration of the encoding utilized only one card, right? The second GTX 580 was for OpenCL support, which is why you recommend a Radeon as the primary, right? Or, did the encoder use both CUDA cards when they were not in SLI mode?

OldSmoke wrote on 3/8/2017, 10:46 AM

Vegas can however utilize two GTX580 although not officially supported. The two cards would have to be NOT in SLI mode to get the most.

To clarify, CUDA acceleration of the encoding utilized only one card, right? The second GTX 580 was for OpenCL support, which is why you recommend a Radeon as the primary, right? Or, did the encoder use both CUDA cards when they were not in SLI mode?

That is how I would see it by looking at the load on the two cards. I would think that one was used for OpenCL, the one selected under preferences, and that the other was used for CUDA only. The performance increase from one GTX580 to two isn't that big, about 10-15% at best. Yes, the R9 would do well with the OpenCL portion, much better than any current Nvidia card.


CogDiv wrote on 3/8/2017, 11:08 AM

A wrapper that took the CUDA commands and distributed them among the CUDA cards would be nice.

The MainConcept encoder maxes out the number of encoding threads your processor supports (12 for the 5820K). When using CUDA acceleration it would be interesting to see whether the "number of rendering threads" setting actually affects the number of threads behind the DLL calls, observable with Microsoft Sysinternals' Process Explorer (App Properties->Threads). For the Radeon you can see this setting does affect the number of atiumd64.dll threads, in addition to amdocl64.dll, but those additional threads are not utilized.
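As a toy illustration of that kind of observation (this is not Vegas internals, just a generic "render threads" setting spawning that many workers, the sort of per-process thread count Process Explorer would show):

```python
# Toy sketch: a "render threads" setting that spawns that many worker
# threads, observable via the process's thread count.
import threading
from concurrent.futures import ThreadPoolExecutor

def thread_count_with_pool(render_threads):
    """Return (before, during) active-thread counts for a busy worker pool."""
    before = threading.active_count()
    # Barrier parks every worker so all of them are alive simultaneously.
    barrier = threading.Barrier(render_threads + 1)
    with ThreadPoolExecutor(max_workers=render_threads) as pool:
        for _ in range(render_threads):
            pool.submit(barrier.wait)
        barrier.wait()  # all workers have started and reached the barrier
        during = threading.active_count()  # pool threads are still alive here
    return before, during

before, during = thread_count_with_pool(4)
print(during - before)  # 4 extra threads while the pool is busy
```

In the real case, whether bumping the setting spawns more atiumd64.dll or CUDA worker threads, and whether those threads actually get scheduled, are separate questions; Process Explorer answers the first, the per-thread CPU column the second.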

Anyone else interested in learning to code with the CUDA code library?