Will Sony Vegas EVER Use The Power Of GPUs?

Comments

jabloomf1230 wrote on 8/21/2009, 8:10 PM
LOL, a search of this thread would find a post about the Divide Frame GPU decoder. Here's my take:

1) It does work with Vegas 9 32-bit. You can get RT AVCHD previews on Preview/Half with a modest dual-core PC with a CUDA-enabled GPU. On a quad-core PC, you can get RT previews on Good/Full or even Best/Full if you're lucky. On a high-end Core i7 PC, you probably don't need it.

2) It fumbles and stutters with transitions, but that problem is supposedly going to be fixed with the next update or so.

3) It is only an AVCHD decoder for speeding up previews. It only slightly speeds up AVCHD encoding, since each frame must still be decoded before it can be re-encoded.

4) It doesn't help preview speeds with anything except H.264-based files, like, for example, those produced by the Canon 5D2 DSLR or the spectrum of consumer camcorders.

5) When installed, it disables MPEG-2 and QT encoding, but you can disable the GPU decoder temporarily if you need to do those encodes. MOV files may also not show up in the Import Media dialog box, but you can still see them if you select all files (*.*).

6) You need a CUDA-compatible nVidia GPU and the most recent CUDA-enabled video driver to get the full impact of the acceleration.
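
On point 6: I have no idea how Divide Frame actually tests for the card, but for anyone curious, a generic capability check against the CUDA runtime API looks something like this (just a sketch of the idea, not their code):

```
// Generic sketch: how an app might probe for a usable CUDA device
// before enabling GPU decoding. Build with nvcc, or g++ linked
// against the CUDA runtime (-lcudart).
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable GPU found; would fall back to CPU decode.\n");
        return 1;
    }
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);
    std::printf("Found %s, compute capability %d.%d, %zu MB VRAM\n",
                prop.name, prop.major, prop.minor,
                prop.totalGlobalMem / (1024 * 1024));
    return 0;
}
```

If that check fails, or the driver predates the CUDA runtime it was built against, you'd expect exactly the symptoms people report: the decoder silently never engages.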

chris_p wrote on 10/31/2009, 11:49 PM
Unfortunately this currently supports only the 32-bit version of Vegas :(
chris_p wrote on 10/31/2009, 11:52 PM
Yes, Vegas programmers,

Please take this to heart and start supporting GPUs; it will make our experience that much better and your software at least 10x more valuable.
MPM wrote on 11/1/2009, 10:08 AM
Totally FWIW, & as always, in case [& hoping] it helps...

Writing code to take advantage of a GPU is daunting, especially when you're already trying to multi-thread across multiple cores/CPUs at the same time. Last I read, the game developers were still learning, figuring it out. Most of the CUDA/ATI Stream-specific code I've seen so far is very narrow in function, i.e. smaller apps or plugins etc.

Unfortunately, at least with ATI's Stream tech, the examples so far haven't come close to the level of coding skill/expertise you see in Vegas -- you generally get speed but not quality. Not because the GPU is incapable, but because the SDKs [especially <sigh> ATI's] & the overall time & skills invested were lacking. This is understandable to a large extent considering the tech is still new/evolving, whole sets of code have to be written (one per manufacturer), & not everyone using something like Vegas has the latest graphics cards to work with this stuff anyway.

That said, there are calculations a GPU is much better at than any CPU, & when you have that particular sort of number crunching going on in an app, moving that portion over to the GPU can bring tremendous speed improvements. There's also another facet of coding video apps where the graphics card can matter, & that's through using DirectX [Avery Lee talks about it from time to time on his site].
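
To give a flavor of "that particular sort of number crunching", here's a toy CUDA kernel -- completely made up, not from Vegas or any plugin -- that brightens a frame with one thread per pixel, the embarrassingly parallel pattern GPUs chew through:

```
// Toy sketch: per-pixel brighten, one CUDA thread per pixel.
// Thousands of threads run concurrently, which is why this kind of
// arithmetic maps so well onto a GPU.
#include <cuda_runtime.h>

__global__ void brighten(unsigned char* pixels, int n, int amount) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        int v = pixels[i] + amount;
        pixels[i] = v > 255 ? 255 : v;   // clamp to 8-bit range
    }
}

void brighten_frame_on_gpu(unsigned char* host_pixels, int n, int amount) {
    unsigned char* dev_pixels = nullptr;
    cudaMalloc(&dev_pixels, n);
    cudaMemcpy(dev_pixels, host_pixels, n, cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks = (n + threads - 1) / threads;   // cover all n pixels
    brighten<<<blocks, threads>>>(dev_pixels, n, amount);

    cudaMemcpy(host_pixels, dev_pixels, n, cudaMemcpyDeviceToHost);
    cudaFree(dev_pixels);
}
```

The catch, as I said above, is that you'd have to write & maintain a version of that for each manufacturer's SDK, & the quality of the result depends entirely on the skill that goes into it.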

Personally I see a big improvement in Vegas using an ATI 4870, compared to systems -- including one that's almost identical -- using an ATI 2600 Pro. Encoding DVD-spec MPEG-2, for example, went from a bit better than realtime to ~1/3 realtime. I get this in both XP Pro SP3 32-bit & 7 64-bit, so the amount of RAM seems irrelevant. GPU-Z shows GPU involvement. ATI Stream encoding speed is impressive using A's Video Converter or ATI's very basic CCC encoder -- going small for a handheld (where I use it, since quality isn't 100%) I've gotten anywhere from well over 500 fps down to slow-a** WMV encoding.
jabloomf1230 wrote on 11/1/2009, 12:17 PM
Keep in mind that when encoding, the original files on the timeline have to be decoded (and previewed, if you also preview while encoding). None of Vegas' built-in codecs presently utilize GPUs for encoding, so the only possibility for "GPU involvement" (as you put it) is in either decoding and/or previewing. Vegas will also use third-party VFW codecs, but I don't believe there are any that utilize either CUDA or ATI Stream, if that is even possible.
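
In rough terms (a hand-waved sketch, definitely not Vegas' actual internals), the render loop looks something like this, which is why accelerating decode only shrinks one slice of the total render time:

```
// Hand-waved sketch (NOT Vegas' actual internals): every timeline frame
// must be decoded, composited, then re-encoded. In Vegas 9 only the
// decode stage could plausibly be GPU-accelerated, so encode time
// barely moves. Stubs stand in for real codec work.
#include <cstdio>

struct Frame { int index; };

Frame decode_next_frame(int i)        { return Frame{i}; }      // GPU-capable stage
Frame apply_fx_and_composite(Frame f) { return f; }             // CPU in Vegas 9
void  encode_frame(const Frame& f)    { (void)f; /* CPU encoder */ }

int main() {
    const int frame_count = 100;
    for (int i = 0; i < frame_count; ++i) {
        Frame f = decode_next_frame(i);   // the only step a CUDA decoder helps
        f = apply_fx_and_composite(f);    // still CPU-bound
        encode_frame(f);                  // still CPU-bound
    }
    std::printf("Rendered %d frames.\n", frame_count);
    return 0;
}
```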

Also, here's an update on the Divide Frame AVCHD decoder. There is a new version, 1.06. Unfortunately, their forum website is now down and I'm wondering whether the company is gone as well. Maybe it's just an extended vacation for this one-person operation.
Laurence wrote on 11/1/2009, 4:58 PM
I can't seem to make Divide Frame's software work at all on my PC. It has a pretty decent 512 MB nVidia graphics card, which should be supported, but I don't get either the unregistered watermark or any improved performance.
MPM wrote on 11/2/2009, 11:49 AM
>"the original files on the timeline have to be decoded and previewed
>(If you also preview while encoding). None of Vegas' built in codecs presently
> utilize GPUs for encoding, so the only possibility for "GPU involvement"
>" (as you put it) is in either decoding and/or previewing. Vegas will use 3rd
>party VFW codecs also, but I don't believe there are any that utilize either CUDA
>or ATI Stream, if that is even possible."

Wellll... I don't know how/why, & don't make any claims other than what I experience, & I have tried not to interpret anything as meaning this or that... Encoding in Vegas is MUCH faster with the 4870 installed. It could be that my case is infested with magical mites -- I DO need to clean the innards, so I might find out. ;-)

Source can be avi [several flavors], mpg2, vfapi [DGIndex -> AviSynth script -> VFAPI]. I do have all the ATI encoders installed, but not their separate CMS. I have PowerDVD installed with the ATI accel turned on. I usually have the DS EVR renderer set as default in XP, matching 7. In 7 the MS decoder is default -- in XP the Vegas decoder is preferred. No codec packs, no ffdshow. Performance on a dozen or so other encoders remains unchanged.

I went through 3 m/boards with AMD chipsets in a year :-( each time buying a new one while the RMA BS was sorted out [MOSFETs fried on a Gigabyte, & an ABit ate batteries -- the 3rd one, a Foxconn, has been fine, knock wood]. All the boards were as identical as possible so I wouldn't have to worry about reinstalling everything (I later used the RMA replacements in two PCs). 7's benchmarks were identical with each of the 3 boards. I stuck the 4870 in here after picking it up on sale, so I could use the 2600 Pro with one of the RMA boards.

GPU-Z shows roughly the same involvement encoding MPEG-2 in Vegas as it does running A's converter or the one built into ATI's CCC. Other encoding is also sped up, but as I do much more MPEG-2 encoding in Vegas, that's the one I mentioned. The only other new parts were the sound card & a bigger power supply for the higher demands of the new graphics card -- neither would affect performance. Preview during encoding is normally off, external monitor toggled off, etc.

Now, why encoding is sped up I don't know. Why the GPU is being used I don't know. I do know that Nero & Roxio are both advertising CUDA & Stream use for video, & CyberLink uses Stream. It would not be unexpected if a major encoding tech company like MainConcept had it too -- I don't know who makes the latest Nero & Roxio encoders, or what code they've licensed.
Quryous wrote on 11/2/2009, 12:52 PM
The newest version as of this post is 1.06.
TimTyler wrote on 11/2/2009, 1:45 PM
I'm not an app developer, but I'd be willing to bet that Vegas' core operating philosophy is the reason we'll never see GPU processing in Vegas.

Vegas, unlike any other NLE that I am aware of, will accept ANY combination of ANY raw media at ANY framerate and ANY resolution on a single timeline without so much as a hiccup.

Want to mix 24p DV with Red RAW files and some DVCPRO-HD 30p, throw in some 4:3 60i SD, and add a sprinkling of mega-resolution PNGs, TIFFs and layered PSDs? No problem. Add some HDV GOPs. No problem.

Oh -- and it does all that without any file transcoding.

Render it all out at ANY frame rate and resolution combo. No problem.
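
Put another way (and this is a made-up illustration, not real code -- remember, I'm not a developer): every source on the timeline can demand its own decode path plus an on-the-fly rescale and frame-rate conform, and a GPU pipeline would have to cover every one of those paths, not just one fixed format:

```
// Made-up illustration (not Vegas source): one timeline, wildly mixed
// media, every clip conformed on the fly to the project format with no
// up-front transcode. A fixed-format GPU pipeline would have to handle
// each of these decode/rescale/retime paths.
#include <cstdio>
#include <vector>

struct Clip {
    const char* codec;   // "DV", "Red RAW", "HDV (GOP)", "PNG still", ...
    int width, height;
    double fps;          // 0 for stills
};

int main() {
    std::vector<Clip> timeline = {
        {"DV 24p",      720,  480,  23.976},
        {"Red RAW",     4096, 2304, 24.0},
        {"DVCPRO-HD",   1280, 1080, 29.97},
        {"SD 60i 4:3",  720,  480,  29.97},
        {"PNG still",   5000, 3000, 0.0},
        {"HDV (GOP)",   1440, 1080, 29.97},
    };

    const int out_w = 1920, out_h = 1080;
    const double out_fps = 29.97;

    for (const Clip& c : timeline) {
        // Each source needs its own decode, rescale and retime, per frame.
        std::printf("%-12s %4dx%-4d @ %6.3f -> conform to %dx%d @ %.2f\n",
                    c.codec, c.width, c.height, c.fps, out_w, out_h, out_fps);
    }
    return 0;
}
```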