Sony rep says Vegas Pro 9.0c uses CUDA GPUs

BrianJK wrote on 1/15/2010, 6:49 AM
During a Sony NXCAM launch presentation yesterday, the Sony rep stated that Vegas Pro 9.0c render times would be improved if the computer had a CUDA GPU graphics card.

Being a natural skeptic, I raised my hand for clarification and he repeated himself.

I didn't think that Vegas was GPU-aware. Am I the only one who missed the memo?

Thanks

Comments

rmack350 wrote on 1/15/2010, 7:17 AM
Everyone missed this memo. Was this guy from SCS or was he from some other division of Sony?

I hate to say it, but I think the guy was making things up. If this were true you'd definitely have heard about it from SCS, and people here would have debated it in a 100-post thread by now.

Rob Mack
TheHappyFriar wrote on 1/15/2010, 7:21 AM
That guy doesn't know what he's talking about. Should be fired... he doesn't even know the product he's trying to push.
Tom Pauncz wrote on 1/15/2010, 8:00 AM
He was not from SCS; rather, Sony Canada in Acquisition Systems. Very knowledgeable about all their acquisition gear, from prosumer to high-end pro and broadcast systems.

Have to admit, I too, was a little sceptical ...
Tom
megabit wrote on 1/15/2010, 8:02 AM
I'm now testing a workstation with the FX4800 Quadro graphics card and the Tesla C1060 card.

I must say that of the CUDA-enabled NLEs I tested, the only one where it really makes a HUGE difference is Cyberlink PowerDirector 8:

- a 1:19 HDV clip, transcoded to 1920x1080 H.264, takes 38 secs with CUDA versus 2 min 42 secs without it (on the i7 quad at 2.88 GHz).

Avid Media Composer shows hardly any difference, and TMPGEnc (same transcoding) only shows a minor one (some 10%).

And yes, out of curiosity I did install and try VP 9.0; of course, no difference at all (as expected).
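For what it's worth, the speedup those PowerDirector figures imply is easy to check (a quick back-of-the-envelope sketch using the timings quoted above):

```python
# Speedup implied by the quoted PowerDirector 8 timings:
# 1:19 HDV clip -> 1920x1080 H.264, CUDA off vs. on.
cpu_seconds = 2 * 60 + 42   # 2 min 42 sec without CUDA
gpu_seconds = 38            # 38 sec with CUDA

speedup = cpu_seconds / gpu_seconds
print(f"{speedup:.2f}x faster with CUDA")  # roughly 4.3x
```

So the GPU path is a bit over a 4x improvement on that particular transcode.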

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

BrianJK wrote on 1/15/2010, 8:24 AM
I thought it was too good to be true.

Tom - It sounds like we were in the same presentation. Was that you lurking near the donuts & coffee at the back of the room ? Too bad we missed the opportunity to meet in person. Maybe next time.

Brian
PerroneFord wrote on 1/15/2010, 8:25 AM
I don't think the Avid MC takes advantage of it for rendering... but you shouldn't need to render much in the way of effects on the timeline to still get realtime playback. At least that is how it is for me with my Quadro FX4800.

And I have to say that is MOST welcome!
rmack350 wrote on 1/15/2010, 8:33 AM
Well, let's temper this a bit. There are several possibilities:

A: The rep was WAY wrong, or worse
B: He was confused and thinking of a future revision of Vegas (Or thinking of PP-CS5)
C: He was thinking of a CUDA-enabled codec that ships with the software for the camera.

This is all total conjecture and given the lack of any evidence I'd still go with (A).

Rob
megabit wrote on 1/15/2010, 8:56 AM
" don't think the Avid MC takes advantage for rendering... but you shouldn't need to render much in the way of effects on the timeline and still get realtime playback. At least that is how it is for me with my Quadro FX4800"

True. Difficult to assess on this particular machine I'm testing, as the CPU is quite powerful as well.

In the other apps I'm testing (including my main CAE number-cruncher), one can explicitly enable/disable the GPU, and when it's enabled they all notify you about it - so comparing is made easier.

Anyway, I'd be grateful to SCS for some official clarification...

PerroneFord wrote on 1/15/2010, 9:23 AM
I spent some time talking with SCS about this a couple months ago. Vegas does not leverage the GPU at all at this point.
TheHappyFriar wrote on 1/15/2010, 9:24 AM
Well, let's temper this a bit. there are several possibilities:

Either way, someone whose job it is to sell stuff to "pros" got something completely wrong, something that could have been corrected by just LOOKING at the website for the software. There are enough people here having issues with features that ARE officially supported; we don't need hundreds of new people spouting off about how Sony lied to them because the software doesn't support what they were specifically told it does. BAD salesperson regardless.
LReavis wrote on 1/15/2010, 11:30 AM
"the only one that really makes HUGE difference is the Cyberlink PowerDirector 8:"

megabit - I presume that such an increase in rendering time is codec specific, right? In other words, if I wanted to render with, say, Cineform instead of H.264, I wouldn't see this GPU acceleration . . . true?
megabit wrote on 1/15/2010, 1:04 PM
Yep - the GPU acceleration is only available when encoding to H.264 (and some form of MP4, I'm not sure which).

farss wrote on 1/15/2010, 1:41 PM
"the Sony rep stated"

Stop reading beyond that.
I'll be spending most of this weekend urgently hacking my code because "The rep told me...".

Bob.
Tom Pauncz wrote on 1/15/2010, 2:04 PM
Brian,
Got me. It was I at the TimBits.
Also the one who got him set up with Vegas when he wanted to demo his clips.
I usually try to make it to the Vistek events - especially the SONY ones.
Till next time,
Tom

PS. Where are you located? Perhaps we can grab a coffee sometime??
jabloomf1230 wrote on 1/15/2010, 2:25 PM
"TMPGenc (same transcoding) only shows minor one (some 10%)"

This goes to show you something about CPUs and GPUs working in tandem. TMPGEnc Xpress 4 uses CUDA, but only for specific filters and CUDA-aware codecs. You can see this for yourself, since TMPGEnc displays %CPU vs. %GPU while rendering output files (a handy option). Also, Badaboom, the rather modestly priced H.264 encoder, really does take advantage of CUDA, and you can see that result in the increased speed of the renders. Unfortunately, there's no way of frameserving from Vegas to Badaboom that I know of.

Lastly, I seriously doubt that Vegas Pro 9 uses CUDA, but various 3rd-party plug-ins designed for Vegas do use CUDA, so it is possible. Maybe that's what the rep was thinking of. Maybe Vegas Pro 10 will use CUDA. But even if it does, my guess is that SCS will do with Vegas what Adobe is doing with CS5 and only support a very limited subset of CUDA-aware video cards.
rmack350 wrote on 1/15/2010, 4:49 PM
True.
megabit wrote on 1/16/2010, 1:01 AM
"You can see this for yourself, since TMPGenc displays %CPU vs. %GPU while rendering output files (a handy option)."

Yes, and while rendering, it was suggesting that by far MOST of the job was handled by the GPU - and yet the overall rendering time was only around 10% shorter than with exactly the same job and CUDA disabled...
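A "mostly GPU" progress display alongside only a ~10% overall gain isn't necessarily contradictory: if only part of the pipeline is actually accelerated, Amdahl's law caps the total speedup. A rough illustration (the fractions and speedup factors below are made-up assumptions for the sketch, not measured figures from TMPGEnc):

```python
# Amdahl's law: overall speedup when a fraction p of the work is
# accelerated by a factor s, while the rest runs at original speed.
def amdahl(p, s):
    return 1.0 / ((1.0 - p) + p / s)

# Even if half the work runs 10x faster on the GPU, the
# unaccelerated half dominates the total time:
print(amdahl(0.50, 10.0))    # ~1.82x overall

# To end up with only ~10% total improvement, the accelerated
# fraction must be small, no matter how fast the GPU part is:
print(amdahl(0.10, 1000.0))  # ~1.11x overall
```

In other words, a utilization meter can read "mostly GPU" while the wall-clock time is still dominated by whatever the CPU has to do serially.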
