I'm looking for a card for fast rendering

Sebastian Reg wrote on 3/3/2012, 4:24 PM
Hi,
I'm looking for a card to use with Sony Vegas Pro 11, something like BlueFish. The most important thing is to increase rendering speed. I usually make videos consisting of HD files (MXF format) and use Neat Video and Magic Bullet Looks. Right now, rendering a 2-hour film with HD clips and the Neat Video and Magic Bullet Looks plugins takes about 2-3 days (PC configuration: 2x 6-core Intel Xeon 2.66 GHz, 8 GB RAM, Radeon 5770 1 GB). BlueFish is OK, but as far as I know it works with Edius only. I'd be grateful for any help or advice.
Ann

Comments

farss wrote on 3/3/2012, 5:01 PM
There simply is nothing that's going to help you.
Vegas has been, and no doubt always will be, hardware agnostic. It has never supported even the most basic functionality available on cards from BMD etc., as it uses the ancient VFW interface. The two plugins you're using require massive amounts of computation. To some extent they can be accelerated through the GPU, but Vegas does not support that kind of GPU acceleration as it's specific to nVidia hardware.

If certain tasks/FXs form the core of your work and time to air is vital to you, then you need to look elsewhere; using any NLE at all may not be the best path to go down. Realtime "linear" video hardware might serve you better, if you've got deep pockets.

Bob.
JohnnyRoy wrote on 3/3/2012, 6:14 PM
I believe that both Neat Video and Magic Bullet are CUDA-enabled, and unfortunately you have an ATI graphics card, which is not supported by those plug-ins. One thing you could do is replace your ATI card with the fastest NVIDIA card you can afford. It's not a panacea, but it can help with those particular plug-ins that require CUDA.

~jr
rmack350 wrote on 3/3/2012, 9:21 PM
> "There simply is nothing that's going to help you."

I don't know what the state of MB is these days but it used to be that it was GPU accelerated independently of Vegas, and when it could use a GPU it was a heck of a lot faster.

I'd search this and other Vegas forums to get a sense of what a fast GPU can do for these two plugins before spending money on an nVidia card, but I'm hopeful that if those two plugins can be accelerated, your whole render might speed up.

Rob
farss wrote on 3/3/2012, 11:09 PM
I hadn't really thought of that, but now that you and JR have raised it, I'm curious as to how well that works alongside Vegas's GPU acceleration.

Bob.
Grazie wrote on 3/4/2012, 12:36 AM
Yes Bob. I've raised this aspect of managing GPU<>CPU several times.

I have both an nVidia CUDA-enabled GPU and Magic Bullet. I've noticed excellent render times, but have no testing process to ascertain the veracity of my statement. What I will say is that I'm no longer so concerned about using MBLs.

G

farss wrote on 3/4/2012, 2:05 AM
"Yes Bob. I've raised this aspect of managing GPU<>CPU several times."

My concern is about GPU <> GPU. Vegas itself uses OpenGL, whereas the FXs being discussed use CUDA. OpenGL and CUDA are similar but use different code interfaces to give software access to the same hardware. Given that the hardware provides a large number of cores which process requests asynchronously, the potential for things to go wrong is considerable.

To look at this another way, Adobe uses CUDA, and a number of 3rd party plugins that run under Ppro and AE also use CUDA. In that environment, only CUDA has to play nicely. In Vegas, CUDA has to play nicely and OpenGL has to play nicely, PLUS CUDA has to play nicely with OpenGL. What SCS has done, in the interests of being hardware agnostic, is double the potential for things to go wrong, and in an environment that is very difficult to debug.

Bob.
TeetimeNC wrote on 3/4/2012, 5:25 AM
I know that outside the video editing world some folks use multiple video cards (not chained together). I wonder if it is possible to have two cards installed and use one GPU for Vegas and the other for MB?

/jerry

John_Cline wrote on 3/4/2012, 6:06 AM
I have two identical nVidia cards installed and it does appear that both of them get used when rendering in Vegas using MB and/or NeatVideo.
megabit wrote on 3/4/2012, 8:05 AM
John, this is excellent news for me!

I must say that when building my machine recently, I was tempted to buy two GTX 580 cards rather than a single one (still several times cheaper than, say, the Quadro 6000, with the same total amount of memory - 6 GB - and much faster). But - knowing from my experience with Autodesk Moldflow that it can only "see" and use a single GPU - I assumed (correctly) it would be the same with VP11, and that the second card would just be a waste of money...

But if you know for sure that NeatVideo and Vegas can use both your GPUs (each of them utilizing a single card, of course), I must reconsider adding another GTX 580/3GB to my system. Not only will my renders containing NeatVideo 3 FX benefit, but - since I often run several Moldflow analyses simultaneously - I think two of them will be accelerated (currently only one can be)...

Piotr

JohnnyRoy wrote on 3/5/2012, 1:32 PM
> "I'm curious as to how well that works alongside Vegas's GPU acceleration."

I believe the way it works is that the plug-ins are on their own to do as they like. Some of the Sony plug-ins are GPU accelerated while others are not. I assume these use OpenCL. The BorisFX plug-ins use OpenGL. MB uses CUDA. It doesn't matter. Once the plug-in is in its own code, it can do whatever it likes for the time it gets from Vegas to "do its thing".
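Just to illustrate what "on their own" means (a rough sketch, nothing Vegas-specific, and assuming the pycuda Python bindings are installed): a CUDA-based plug-in can initialize the CUDA driver and pick a device entirely by itself, no matter which API the host happens to use:

    # Rough sketch: how a CUDA-based plug-in could enumerate devices on its own,
    # independent of whatever API the host application uses (assumes pycuda).
    import pycuda.driver as cuda

    cuda.init()
    for i in range(cuda.Device.count()):
        dev = cuda.Device(i)
        mb = dev.total_memory() / (1024 * 1024)   # total device memory in MB
        print("CUDA device %d: %s (%.0f MB)" % (i, dev.name(), mb))

The host just hands the plug-in a frame and some time to work; what the plug-in does with the GPU in that window is its own business.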

~jr
MUTTLEY wrote on 3/5/2012, 1:58 PM

Grazie: "I have both an nVidia CUDA enabled GPU and Magic Bullet. I've noticed excellent render times, but have no testing process to ascertain the voracity of my statement."

Ditto here. I've been a long-time user and fan of Magic Bullet, enough so that with V10 I couldn't use the 64-bit version of Vegas, as the previous version of MB was incompatible. With V11 I went to 64-bit and the new MB, and eventually got the GTX 580. The difference to me seems night and day, just no comparison in render times, though as with Grazie, I have not done any specific testing.

- Ray
Underground Planet
rmack350 wrote on 3/5/2012, 5:15 PM
My understanding of what Vegas is doing is that it uses OpenCL for its GPU-accelerated OFX filters. OpenCL can access the GPU on both NVIDIA and ATI cards, and in the case of NVIDIA GPUs, OpenCL is used to access CUDA.

How's that for alphabet soup?

Last time I looked through the OFX filters in VP11, I thought that all of them were labeled as GPU accelerated. (I was looking because all of them were crashing for me.)

OpenCL and OpenGL are two different things, with OpenGL accessing graphics-specific functions. Most of what I've seen OpenGL used for has been 3D manipulation in games and modeling, but maybe there are other uses.

So we've got Vegas' OFX filters using OpenCL and various third party tools independently using whatever they were built to use - CUDA directly, or OpenCL, or OpenGL.
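If anyone wants to see that alphabet soup on their own machine, OpenCL itself can list what it finds. A minimal sketch, assuming the pyopencl Python package and a vendor OpenCL driver are installed (on NVIDIA hardware the platform typically shows up as "NVIDIA CUDA", which is the sense in which OpenCL is used to access CUDA):

    # Minimal sketch: enumerate every OpenCL platform and device the drivers expose.
    import pyopencl as cl

    for platform in cl.get_platforms():
        print("Platform:", platform.name)
        for device in platform.get_devices():
            print("  Device:", device.name, "-", cl.device_type.to_string(device.type))

On a mixed system you'd typically see one platform per vendor driver, each listing its own GPUs.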

But I could have all of this muddled up.

Anyway, it sounds like Grazie and Muttley are both saying that the OP would benefit from at least one good NVIDIA GPU.

Rob

Sebastian Reg wrote on 3/7/2012, 6:08 AM
First of all many thanks for getting involved in this topic.
I'd like to know, in your opinion, what would work better - one nVidia GTX 590 card or two GTX 580 cards?
Which solution would be more efficient for rendering Blu-ray videos with the MB and Neat Video plugins?
Steve Mann wrote on 3/7/2012, 9:25 AM
I have two GTS 520s in my system (for four monitors), and Vegas only uses one card for CUDA processing.
rmack350 wrote on 3/7/2012, 9:37 AM
Steve,
Are you running MB and/or Neat Video when you check the GPU usage? One GPU is all I'd expect, but since these don't seem to be part of Vegas' GPU acceleration, it'd be good to know the results with those two specifically.

<edit>John Cline says earlier in the thread that both his NVIDIA GPUs are active when using MB and Neat.</edit>
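For anyone who wants to check this themselves: recent NVIDIA drivers ship with the nvidia-smi tool, and polling it during a render shows which cards are actually busy. A rough sketch in Python (assuming nvidia-smi is on the PATH; query field names can vary slightly between driver versions):

    # Rough sketch: sample per-GPU utilization once a second while a render runs.
    import subprocess, time

    for _ in range(10):
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=index,name,utilization.gpu,memory.used",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        print(out)
        time.sleep(1)

If only one card ever shows load with MB or Neat Video running, that would answer the question.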

Rob
JasonATL wrote on 3/7/2012, 11:29 AM
Like others, I use MB Looks and have an nVidia card (GTX 560 Ti). Certain MB Looks effects will preview on the timeline at full rate, with the GPU clearly doing some heavy lifting. Same with renders. Just wanted to add my experience that Looks does benefit significantly from the GPU in nVidia cards.