Comments

Grazie wrote on 6/29/2009, 11:46 PM
Thanks Pixel. I don't see VP9 mentioned? Maybe this will bridge the gap to VP9, though?

OOOOooo..... I like the sound of this: "Takes full advantage of graphics card’s GPU for lightning fast rendering". And that would be from within Vegas? Maybe SCS have figured out a way to allow 3rd-party plug-in developers to access the Vegas engine without getting spun around themselves? Interesting. Just to recap: "Takes full advantage of graphics card’s GPU . . "

Grazie
rmack350 wrote on 6/30/2009, 12:01 AM
The AAV ColorLab plugin also uses the GPU. Gaussian Blur is pretty fast with this filter.

It's definitely possible to have GPU-accelerated filters in Vegas, and if the filter works, it's unlikely you'd go back to an unaccelerated version.
GlennChan wrote on 6/30/2009, 11:37 AM
I believe that once the filter has the data, it can do whatever it wants with it.

2- If Vegas supported the GPU, then with multiple filters you wouldn't have data constantly shuttling back and forth between CPU/RAM <--> GPU.
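A rough sketch of Glenn's point, in plain Python (nothing here is real Vegas or GPU code; the transfer counts are just a model of bus traffic):

```python
# Conceptual sketch: why chaining filters on the GPU beats a round-trip
# per filter. "Transfers" stand in for CPU/RAM <--> GPU copies over the bus.

def naive_pipeline(frame, filters):
    """Each filter uploads to the GPU and downloads the result:
    two transfers per filter."""
    transfers = 0
    for f in filters:
        transfers += 1          # upload frame to GPU
        frame = f(frame)        # filter runs on GPU
        transfers += 1          # download result back to CPU
    return frame, transfers

def chained_pipeline(frame, filters):
    """Host app keeps the frame in GPU memory for the whole chain:
    two transfers total, regardless of filter count."""
    transfers = 1               # single upload
    for f in filters:
        frame = f(frame)        # data stays on the GPU between filters
    transfers += 1              # single download
    return frame, transfers

blur = lambda px: [p // 2 for p in px]   # toy stand-ins for GPU filters
gain = lambda px: [p * 2 for p in px]

_, naive = naive_pipeline([10, 20, 30], [blur, gain])
_, chained = chained_pipeline([10, 20, 30], [blur, gain])
print(naive, chained)   # 4 vs 2 transfers for a two-filter chain
```

The gap widens linearly with the number of filters, which is why a host-level GPU pipeline matters more than any single accelerated plugin.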
MilesCrew wrote on 6/30/2009, 1:21 PM
I'm just still confused as to why this can't happen with just straight video and no plugins. The lack of GPU-assisted rendering is still a mystery to me. Video cards are screaming fast these days, and are way easier/cheaper to upgrade over time than processors.
TheHappyFriar wrote on 6/30/2009, 2:00 PM
GPU FX don't meddle with the video; they're handled between the video & the output. To accelerate the video itself, each frame would need to be sent to the GPU.

It's been done before, but it normally requires special hardware & software setups. Vegas is basically anti-that.

One thing I don't like about the Nvidia stuff is that it's normally Nvidia-only. Nvidia is supposed to be a company that supports industry standards with its acceleration (OGL/DX), but all the plugins/special programs only work with their cards. So they're no different than using hardware acceleration on Premiere, Avid, etc.
rmack350 wrote on 6/30/2009, 2:23 PM
GPU FX don't meddle with the video, they're handled between the video & the output.

That's not entirely true, but you're still on the right track. The problem here is hardware specificity. Vegas could use the GPU, but which GPU? All of them? It could get to be a bit much to maintain, since there are integrated chips, discrete GPUs, multiple manufacturers, constantly changing drivers, etc. Keeping up with this hardware would be a liability for SCS.

On the other hand, if Vegas could pass data to a third party codec or filter that uses the GPU then it's really not SCS's problem if the filter provider doesn't keep up with current hardware and drivers.

In fact, this sort of arrangement has been possible for a while. Magic Bullet uses the GPU, as do the filters mentioned here.

This all became a lot more practical with the move from AGP to PCIe x16 cards. AGP had a lot of bandwidth going out to the GPU but not much coming back to the computer. PCIe offers the same high throughput in both directions, and this makes it much more feasible for a GPU to do some crunching and then return the results to the CPU rather than just passing them on to the display.

Rob Mack
jabloomf1230 wrote on 6/30/2009, 2:46 PM
The net result of all this with Vegas Pro (it does work fine with 32-bit V9, BTW) is that the filter won't slow down the preview very much, if at all, compared to not having the FX applied. Before you bother with the download, make sure that your Nvidia GPU and driver version meet the minimum requirements. ProDAD has a small test program on their website that will tell you whether they do.

Basically, GPU accelerated filters are a good thing, because as you pile more and more conventional FX onto the timeline, your preview gets slower and slower, as the CPU(s) can't keep up with all the computation.


EDIT: I noticed that I said "nVidia". I don't have an ATI card, but I think that ProDAD's add-ons also work with ATI hardware.
TheHappyFriar wrote on 6/30/2009, 3:01 PM
All GPUs support OGL & DX, so there doesn't NEED to be hardware specificity. It's just purposely done that way. :/ There's no reason a game can apply post-processing effects to something in OGL/DX, REGARDLESS of who makes the card (as long as the card supports specific versions of the API), but video can't. I mean real post-processing effects: not something coded specifically for that game, but using the post-processing capabilities of the GPU. Both OGL & DX support post-processing in one way or another. It's used all the time (water effects, for example), but nobody has made a plugin that lets you download & use scripts written for your own custom FX. The only reason would be that companies don't want to. This isn't something new: older ATI cards could apply post-process FX to anything that went to the card, but they stopped that a few years ago.

Yeah, there are issues (anything behind the effect gets affected, even if it isn't meant to be), but that doesn't mean they can't be fixed.

I wouldn't say PCIe has anything to do with this: with AGP GPUs, a game could write HD TGA sequences to your hard drive pretty fast (well, any GPU really; it all depended on how it was coded). A lot faster than what we're rendering at in some circumstances. If you wanted a GPU to be used as a CPU, then you'd need more input/output than AGP, but all we want it to do is process our effects, which would in effect make the GPU very similar to a frameserver.
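The post-processing idea in the posts above — one small per-pixel function applied to whatever frame reaches the output, regardless of what drew it — can be sketched in plain Python. This is only a stand-in for what a GLSL/HLSL fragment shader would do in parallel on the GPU; none of it is real OGL/DX code:

```python
# A fragment-shader-style post-process: the same tiny function runs
# independently on every pixel of the finished frame. On a GPU these
# run in parallel; the serial loop here just does the same math on the CPU.

def desaturate(rgb):
    """Toy 'shader': replace each pixel with its luma (grayscale),
    using the Rec. 601 weights."""
    r, g, b = rgb
    y = int(0.299 * r + 0.587 * g + 0.114 * b)
    return (y, y, y)

def post_process(frame, shader):
    """Apply the shader to every pixel, like a full-screen quad pass."""
    return [[shader(px) for px in row] for row in frame]

frame = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (0, 0, 0)]]
print(post_process(frame, desaturate))  # pure red becomes (76, 76, 76)
```

Because the shader only sees one pixel at a time, it doesn't care whether the frame came from a game engine or a video decoder — which is exactly the "works on anything sent to the card" behavior described above.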
farss wrote on 6/30/2009, 3:12 PM
What happens with a game and what happens with video are quite different I believe.
With a video game, vectors are passed to the GPU to be rasterised. The frame is then in the GPU's local RAM. Applying an FX at that point doesn't mean moving a lot of data over an external bus and then moving it back.

Bob.
jabloomf1230 wrote on 6/30/2009, 3:42 PM
That's correct. Let me restate it. In a video game, the CPU draws and moves the equivalent of a wireframe of 3D objects, and then the GPU "paints" the wireframe. This is why just buying a fast video card and mating it with a slow CPU doesn't help to speed up in-game frame rates. A good series of articles on this topic, which I forgot to add originally:

http://www.digital-daily.com/video/processor_dependency

But with 2D video GPU acceleration, the GPUs on the video card act like "dumb" CPUs. They have a limited instruction set compared to an x86 CPU, but even with those limitations, it's possible to use them as surrogate CPUs for certain calculations (like FX), in addition to rendering the video to the output device. It's somewhat akin to a hybrid car.
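The "surrogate CPU" idea: each of the GPU's many simple cores runs the same small kernel on its own slice of the data. A sketch of that shape in plain Python (the kernel function stands in for what every GPU thread would execute; no real GPU API is involved):

```python
# Every "GPU thread" runs the same limited kernel on one output sample:
# no global state, just local reads and arithmetic. That's the kind of
# work a GPU's many simple cores handle well.

def blur_kernel(samples, i):
    """One thread's worth of work: a 3-tap box blur at index i,
    clamping at the edges."""
    lo = max(i - 1, 0)
    hi = min(i + 1, len(samples) - 1)
    window = samples[lo:hi + 1]
    return sum(window) // len(window)

def gpu_style_map(samples, kernel):
    """Launch one 'thread' per sample. No thread depends on another's
    result, so a GPU could run them all at once."""
    return [kernel(samples, i) for i in range(len(samples))]

print(gpu_style_map([0, 0, 90, 0, 0], blur_kernel))  # → [0, 30, 30, 30, 0]
```

The key property is that the threads are independent, which is why FX like blurs map so naturally onto a GPU while general branchy code does not.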
rmack350 wrote on 6/30/2009, 5:20 PM
We're probably straying from the topic a little. The good news is that more filters and codecs are starting to appear that can use the GPU. A GPU definitely can accept a video stream, process it, and give it back to the CPU. A bit like an extruder or sausage mill. A GPU is especially well suited to tasks where the data streams in, gets processed, and then streams back out.

There are a few ways to use the GPU, and each has its strengths. DirectX is one, OpenGL is another, and CUDA and Stream (or whatever ATI is calling it now) are still others. The last I read about OpenGL, it's designed to be a common standard, but since ATI and NVIDIA have their own proprietary systems, OpenGL is competing for development resources. And of course both companies would like to bend OpenGL a bit to their own benefit. The short of it is that OpenGL ain't all it's cracked up to be yet. And things like CUDA are supposed to be general purpose: you can use CUDA for audio, or even financial calculations (for example); OpenGL doesn't do that.

Regardless, there are several ways to skin the GPU cat, depending on how you want to do it, and I think we'll be looking at good things down the road as more filters and codecs become available, even if SCS doesn't take the lead with this.

Rob
jabloomf1230 wrote on 6/30/2009, 5:33 PM
OpenCL is the new standard.
rmack350 wrote on 6/30/2009, 8:15 PM
Ah. Got it. OpenCL sounds like a move in the right direction to get things working on a variety of processors. The fact that Apple intends to use it in Snow Leopard is encouraging since that's a very real thing in the very near future. I wouldn't start counting those chickens before they hatch but it seems like they really will hatch.
Mahesh wrote on 7/1/2009, 1:10 AM
I got the video enhancer package vReveal. It did not support the Nvidia card in my present edit PC. The Nvidia card in my new PC is supported, and there is considerable improvement. Motion DSP provides a list of compatible graphics cards.
jabloomf1230 wrote on 7/1/2009, 3:43 PM
I didn't mean to correct you, since I thought that you were saying "OpenGL" in the latter part of your message and it was a typo. The Red and Green teams are already wandering away from the OpenCL standard.
rmack350 wrote on 7/1/2009, 5:39 PM
Happy to be corrected, since I was indeed saying OpenGL. A lot of gaming factoids creep into discussions like this, and they're only partially relevant. OpenGL seems largely about games and 3D modeling; OpenCL seems more like CUDA in that it's general purpose. It looks like OpenCL can also work nicely with OpenGL.

The teams may be wandering off but if Apple is pushing it and the two companies want to sell to Apple customers then hopefully they'll give it a little attention. They can probably afford to focus on OpenCL and their own proprietary systems at the same time.

One thing about OpenCL is that it seems intended for all sorts of processing units, including the ones used in cellphones (think iPhone here, since Apple is such a major player in this). What you could find is a critical mass of little applets for the iPhone that could also use a GPU in a laptop or desktop PC. Under that scenario you might see wider adoption and better support from Nvidia, AMD, and Intel.

How could it relate to Vegas? Well, I really think the issue for SCS and Vegas has been that they've always wanted to make sure Vegas worked on as broad a set of hardware as possible, and they've always seemed to rely on the most common tools available within Windows. I don't think SCS will ever utilize a GPU unless the functionality is built into Windows. That probably rules out OpenCL for Vegas itself. On the other hand, third parties could use it as a common way to do their processing on the GPU and then produce filters and codecs that are easy to port to several applications.

The view from orbit is still that there are more tools that use the GPU available for Vegas, and there'll probably be even more soon.

Rob
tumbleweed7 wrote on 7/3/2009, 10:12 AM

So...... has anybody gotten these plugins to work in Vegas?

I haven't ... Vegas just freezes up
jabloomf1230 wrote on 7/3/2009, 10:43 AM
They work fine for me with Vegas 9 Pro 32 bit, running under Vista x64.
Jøran Toresen wrote on 7/3/2009, 2:10 PM
Rob wrote: "The view from orbit is still that there are more tools that use the GPU available for Vegas, and there'll probably be even more soon."

Is this a pure guess or an informed guess?

Jøran Toresen
rmack350 wrote on 7/3/2009, 2:31 PM
Off the top of my head, I think there was just one tool that used the GPU in Vegas (Magic Bullet). Now I can think of three. That part's not a guess. The statement that there'd be more soon is purely a guess, but I think it's a good bet, as long as you don't define "soon" too clearly.

Rob