Does Vegas v11 support your nVidia card?

John_Cline wrote on 10/19/2011, 4:00 PM
I suppose the question might be "does your video card support Vegas?" Either way, for full GPU acceleration, Vegas requires either an ATI (AMD) or an nVidia card. For nVidia cards, you need one that supports "Compute Capability 2.0" or higher and the latest drivers. Here is a chart on the nVidia site which lists each card's Compute Capability:

http://developer.nvidia.com/cuda-gpus

If your card is 2.0 or higher, the relative performance gain will be determined by essentially two things: the number of CUDA cores (sometimes called "shader cores") and the clock speed of the card. If you want more technical information on your card, here is a handy OpenGL, OpenCL, CUDA Graphics Card and GPU Information Utility:

http://www.ozone3d.net/gpu_caps_viewer/
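
If you would rather query the card directly than dig through the chart, here is a minimal sketch using the CUDA runtime API (an illustration, not an official tool; it assumes the CUDA toolkit is installed and builds with "nvcc deviceinfo.cu -o deviceinfo"):

// deviceinfo.cu -- print the fields the chart and GPU Caps Viewer report.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA-capable device found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp p;
        cudaGetDeviceProperties(&p, i);
        // clockRate is reported in kHz
        printf("%s: Compute Capability %d.%d, %d multiprocessors, %.0f MHz\n",
               p.name, p.major, p.minor,
               p.multiProcessorCount, p.clockRate / 1000.0);
        printf("  %s the Vegas 11 requirement (2.0 or higher)\n",
               p.major >= 2 ? "Meets" : "Does NOT meet");
    }
    return 0;
}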

Comments

ushere wrote on 10/19/2011, 5:53 PM
thanks john, most informative and useful.....
johnmeyer wrote on 10/19/2011, 6:20 PM
Perhaps someone can come up with a clever way to create a benchmark, for both rendering and preview speed, using a much smaller download package than the 2GB monster supplied by Sony. A simple solution might be to use a VEG with generated media and keyframes, which each of us would then use to generate the media for the preview test. The new text generator can create some very nice animations, for instance.

Once created, we could each post our results.
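
For the results to be comparable across machines, one simple convention would be to report render speed as a multiple of realtime, i.e. timeline length divided by render time. A minimal sketch of that arithmetic (the helper and the numbers in it are made up for illustration):

// benchscore.cpp -- hypothetical normalization helper for posted results.
#include <cstdio>

// Render speed as a multiple of realtime: above 1.0 is faster than realtime.
static double realtime_factor(double timeline_sec, double render_sec) {
    return timeline_sec / render_sec;
}

int main() {
    const double len = 60.0;  // a hypothetical 60-second benchmark project
    printf("CPU only: %.2fx realtime\n", realtime_factor(len, 150.0));  // 0.40x
    printf("GPU on:   %.2fx realtime\n", realtime_factor(len, 40.0));   // 1.50x
    printf("Speedup:  %.2fx\n", 150.0 / 40.0);                          // 3.75x
    return 0;
}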

In answer to the OP, no, V11 does not support my card (GeForce 9800GT), but it is listed as 1.0, so it was no surprise when I downloaded the V11-64 bit demo and ran it under Vista.

Performance seems to be the main thing about this release, but it looks like it will take $500-800 in hardware costs to get the benefit, even though my card is only two years old and was pretty high-end at the time.

I have found timeline playback in V11-64 bit under Vista to have more glitches when playing back some m2t HDV files from my FX1. I got black frames at every single scene transition. These m2t files were captured using the Vegas capture with scene detection enabled, something I don't usually do because it always creates "issues." However, I just tried playing the same files in V10 under Vista, and then V10, V8, & V7 under WinXP, and while there were slight glitches (because the Vegas capture created files that have dup frames), none of the other versions gave me those dreaded black flash frames. I didn't try rendering to see if the black shows up in the render.

Given that these files are a little screwed up to begin with (thanks to Vegas' lousy HDV capture), I'm not going to give too many demerits to V11. I only want to point out that it isn't as "robust" in handling files that have problems, or at least not these particular problem files.

john_dennis wrote on 10/19/2011, 6:48 PM
"...but it looks like it will be $500-800 in hardware costs to be able to get the benefit."

I don't think the number would be nearly that high to get some of the benefit. With the GTS450 you can start getting some of the benefit for around $100.

From Dave McKnight's review:

"I get between 2 and 6 frames per second (FPS) playback without GPU assist and 45-60 fps playback with GPU assist. Not bad, and I'm pretty impressed."

Of course, with video cards you can spend a lot of money if you have a mind to do it.
johnmeyer wrote on 10/19/2011, 7:28 PM
"I don't think the number would be nearly that high to get some of the benefit. With the GTS450 you can start getting some of the benefit for around $100."

Yes, I am sure you're correct. However, for me -- and I suspect a lot of others who are furiously posting here in all sorts of parallel GPU threads -- the big question is:

what benefit in preview and render speeds will each card provide?

I'm a big believer in getting almost the top-end CPU when buying a computer (although I actually did get the top one last time) because you usually pay a significant premium for that last ounce of performance. However, is the same thing true with GPU-assisted preview and rendering?

What I'd love to see (and if Sony had a real product manager, we'd already have this) is something that looks like this:

GTX 590 Preview - 94x; render 26x
GTX 580 Preview - 75x; render 31x
GTX 570 Preview - 68x; render 30x

(those happen to be some specs from nVidia's comparison page, and I don't know if they have any meaning at all for answering the questions posed above).

Also, would it make sense to mortgage the house and build my own unlicensed nuclear power facility in order to add a Tesla card?

It is always difficult making buying decisions about hard drives, motherboards, processors, etc., but I have found most of these decisions to be pretty easy because you don't get quantum differences within a given class of product: one 7200 rpm 12ms drive is about the same as another; one class of motherboard performs, for most things, pretty much like another; and while CPU benchmarks certainly vary, once you have the clock speed and number of cores, you pretty much know what to expect.

With this GPU thing, however, I don't even know what specs to look at. Specifically:

Which specs will affect preview performance?
Which specs will affect rendering performance?

Put another way, for Vegas, which of these specs matter, and if I double any one of them, how will that translate to preview and rendering performance (a rough sketch of how some of them combine follows the list):

Number of CUDA cores
Graphics Clock
Processor Clock
Memory Clock
Total Memory
Memory Bandwidth
Gigaflops (Single Precision) (only appears on some cards)
Gigaflops (Double Precision) (only appears on some cards)
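
For what it's worth, the headline single-precision Gigaflops number is essentially derived from two of the specs above: roughly CUDA cores x processor clock x 2 (a fused multiply-add counts as two flops), so doubling either one doubles the theoretical figure. Whether Vegas preview or render speed actually scales with it is exactly the open question. A rough sketch, assuming a Fermi-era (Compute Capability 2.x) card:

// peakflops.cu -- back-of-envelope estimate; build with nvcc.
#include <cstdio>
#include <cuda_runtime.h>

// Cores per multiprocessor depend on the architecture; this table only
// covers the Compute Capability 2.x (Fermi) parts discussed in this thread.
static int coresPerSM(int major, int minor) {
    if (major == 2 && minor == 0) return 32;
    if (major == 2 && minor == 1) return 48;
    return 0;  // unknown here; extend for other architectures
}

int main() {
    cudaDeviceProp p;
    if (cudaGetDeviceProperties(&p, 0) != cudaSuccess) return 1;
    int cores = p.multiProcessorCount * coresPerSM(p.major, p.minor);
    double ghz = p.clockRate / 1.0e6;  // clockRate is reported in kHz
    printf("%s: %d CUDA cores @ %.2f GHz -> ~%.0f Gigaflops single precision\n",
           p.name, cores, ghz, 2.0 * cores * ghz);
    return 0;
}

Note that memory clock, total memory and memory bandwidth sit on a separate axis and are not captured by that formula at all.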

Until Sony or some other reliable source provides some answers, we are all very much flying in the dark.

Sebaz wrote on 10/19/2011, 10:28 PM
I can tell you that with a GT 430, even though it's supposed to be 2.1, it does worse than with just the CPU, playback-wise. I applied Levels to an event and it plays at full fps in Best and Full with the CPU alone, but it goes down to 24 fps or less with the GPU enabled. Not a big surprise considering that I paid $70 for this card.
john_dennis wrote on 10/20/2011, 12:51 AM
Here are my render results for the $120 ($60 in my case) NVIDIA GTS 450 from EVGA (driver 280.26). Intel Q9450 on a P45 chipset, 6GB DDR2.

Source video: 3 min. 59 sec. of MPEG-2 at ~14 Mbps, 1280x720-59.94p
Rendered to Sony AVC 1280x720-59.94p, 10 Mbps High Profile with a custom template


Sony Vegas Pro 11

CUDA = On - 8 Min. 32 Sec.
CUDA = Off - 9 Min. 42 Sec.

Sony Vegas Pro 10

CUDA = On - 15 Min. 16 Sec.
CUDA = Off - 10 Min. 09 Sec.

Then I rendered the same video with MainConcept AVC using the Internet HD 720p template:

Sony Vegas Pro 11

CUDA = On - 6 Min. 23 Sec.
CUDA = Off - 14 Min. 46 Sec.

I too would like to have a more standard test that everyone could use.
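
Making the ratios explicit (this is just arithmetic on the Vegas Pro 11 times posted above):

// speedup.cpp -- CUDA-off render time divided by CUDA-on render time.
#include <cstdio>

static double secs(int min, int sec) { return 60.0 * min + sec; }

int main() {
    printf("Sony AVC:        %.2fx\n", secs(9, 42) / secs(8, 32));   // ~1.14x
    printf("MainConcept AVC: %.2fx\n", secs(14, 46) / secs(6, 23));  // ~2.31x
    return 0;
}

So on this machine CUDA buys roughly 2.3x with the MainConcept encoder versus only about 1.14x with Sony AVC.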
Marton wrote on 10/20/2011, 1:05 AM
Sebaz: that's very interesting. Do you have the latest nVidia drivers installed? Because someone else here said that even the GT430 with DDR3 greatly increases performance, and I almost bought this card. I like the passive cooling solution, and every better card has active cooling. So please give the GT430 another chance!
Marton wrote on 10/20/2011, 1:07 AM
John, it seems like MainConcept's GPU acceleration is far superior to Sony's? Can you compare some 3D MVC 1080p rendering too?
ritsmer wrote on 10/20/2011, 1:13 AM
Finding the right CUDA card is not only a question of speed.
Power consumption is also a factor, and before deciding, check whether your PC's power supply is strong enough.
Too big a GPU card will cause the PC to crash - and that will surely generate the usual whining around here - blaming Vegas, of course...

Found this table showing many Nvidia GPUs at a glance:
http://www.nvidia.co.uk/object/graphics_cards_buy_now_uk.html
See power usage etc. under Show Specs.
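
As a rough sanity check before buying, you can add up the CPU and GPU TDPs from the spec pages plus an allowance for the rest of the system, and keep the total comfortably under the supply's rating. A sketch with illustrative numbers (the wattages below are examples, not measurements):

// psucheck.cpp -- rough headroom check; substitute your own numbers.
#include <cstdio>

int main() {
    const int psu_watts = 450;  // the supply's rated output
    const int cpu_tdp   = 95;   // e.g. a typical quad-core of the era
    const int gpu_tdp   = 160;  // e.g. roughly a GTX 460's board power
    const int rest      = 100;  // drives, fans, board, RAM: generous allowance
    const int load      = cpu_tdp + gpu_tdp + rest;
    // Staying under ~80% of the rating leaves margin for ageing and spikes.
    printf("Estimated load %dW on a %dW supply: %s\n", load, psu_watts,
           load <= psu_watts * 8 / 10 ? "probably OK" : "too tight");
    return 0;
}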
ingeborgdot wrote on 10/20/2011, 5:31 AM
I am looking at either an Nvidia OC 460 or a 560. Does anyone care to comment and give some advice on whether it would matter? The specs are almost identical, but the 460 is 60 bucks cheaper.
farss wrote on 10/20/2011, 6:27 AM
"Put another way, for Vegas, which of these specs matter, and if I double any one of them, how will that translate to preview and rendering performance:"

Increasing any of them is going to help. Plus, in general, you don't get the option of increasing just one: you spend more and almost all of those features improve together.

You might also need to factor in how many monitors you are going to connect and at what resolution as that alone would use up more memory.

I did find a guide for running CS 5.5 which basically says more cores, GDDR5 RAM and at LEAST 1GB of memory.

My problem is I'm also looking towards 3D, and that seems to mean I need a Quadro card, probably a 2000 or 4000. I was going to buy one initially for the 10-bit video capability and then thought maybe I could save some dollars (not like I'll be able to afford a 10-bit monitor in the near future), and then the issues everyone is having with 3D reared their head. So now I'm back to the Quadro-class cards.

Bob.
JJKizak wrote on 10/20/2011, 7:10 AM
I spent $350.00 on the 8800 GTX and thought it would be a forever thing. Now I have to spend $1200.00 again for a 5000, so I might as well get a new motherboard, 32 gig of memory and Win 7. Like Jethro says on The Beverly Hillbillies, "I doh know."
What do I do with the old stuff?
JJK
TheRhino wrote on 10/20/2011, 7:29 AM
When you guys are providing benchmarks for various video cards, please also provide your CPU data so that we don't have to look it up in your profile or hope that your profile is up to date... Some guys are saying that their GPU is 1.5X or 2X as fast as their CPU, but their CPUs are only 4-core, when most of us who make a full-time living at this are running 6-core or even dual processors...

In our case, it was worthwhile to upgrade to a 6-core I7 when they were first released, because for $1000 we were guaranteed 2X the rendering speed. However, we are not going to spend $500-800 per workstation for only marginal improvements over our already speedy CPU.

I do appreciate all of the benchmark data coming in for v. 11. I haven't been this excited about a hardware upgrade since the 6-core I7's were released.

Marton wrote on 10/20/2011, 8:36 AM
Now that Vegas has GPU support, it would be very good to have a homepage like http://ppbm5.com
Maybe vpbm11.com :)
john_dennis wrote on 10/20/2011, 8:44 AM
@Marton

"Can you compare some 3D MVC 1080p rendering too?"

I'm not currently doing any 3D. If some standard benchmark surfaces with 3D in it, I'll run it and post.

@TheRhino

I updated my post with CPU, motherboard chipset and memory specs.
Marton wrote on 10/20/2011, 9:13 AM
OK, but you could try some short (10 sec) "fake" 3D footage, rendering to MVC. By "fake" I mean you select a 10-sec clip for the left view and the same clip for the right view (or maybe slid a little to a different time).
With my 3.3GHz Core 2 Duo PC, rendering to MVC takes about 10x the length of the footage. I'm just interested in how much a GPU can help. If it gets that down to about 2-3x, then it's definitely worth it for me.
regards
WillemT wrote on 10/20/2011, 9:29 AM
Just a quick test to give some idea of the gains with what appears to be a lower-end graphics card, the GTX 460. This is on a low-end Q6600 clocked at a stock 2.4GHz with 6GB RAM (don't ask). Dynamic RAM Preview set to 2048MB.

I placed a 1-minute GoPro mp4 clip, 1280 x 720 @ 25fps (progressive), on the timeline. Full daylight lighting, so probably reasonably low noise. With the GTX 460 enabled in preferences, I added effects until the playback frame rate dropped to just below 25fps (to ensure I would see the full GPU effect). The added FX were a Gaussian Blur, Color Corrector (Primary) and Lens Flare. The playback rate ended up at about 21fps. Disabling the GPU, which required a close and re-open of Vegas (a bit of a pain when testing), dropped the playback rate to about 7fps - giving what seems to be about a 3x gain.

Next I checked rendering, first to Sony AVC. I used the "Full 1280x720-25p @ 15 Mbps" template unchanged. Testing was with the GPU enabled/disabled in the preferences, along with selecting CPU or GPU as the Encode Mode in the Customize window. Results were as follows (the CPU and GPU usage is listed as well):
GPU enabled with Encode Mode GPU: Render time=1:36min CPU=75% GPU=50% file size=69 887KB
GPU enabled with Encode Mode CPU: Render time=1:40min CPU=80% GPU=40% file size=68 495KB
GPU disabled with Encode Mode GPU: Render time=3:11min CPU=99% GPU=4% file size=69 751KB
GPU disabled with Encode Mode CPU: Render time=3:04min CPU=99% GPU=4% file size=68 321KB

Next I repeated the above using MainConcept AVC. I created a template for 1280 x 720 25p with 15Mbps max and 10Mbps average, with Best selected (Single Pass).
GPU enabled with Encode Mode GPU: Render time=1:21min CPU=75% GPU=66% file size=74 464KB
GPU enabled with Encode Mode CPU: Render time=3:08min CPU=92% GPU=22% file size=62 444KB
GPU disabled with Encode Mode GPU: Render time=3:20min CPU=99% GPU=9% file size=74 455KB
GPU disabled with Encode Mode CPU: Render time=4:52min CPU=99% GPU=8% file size=61 927KB

Just for the hell of it I then rendered to Sony MXF. 1280 x 720 25p at 35Mbps and Best. No Encode Mode option here.
GPU enabled: Render time=1:33min CPU=70% GPU=52% file size=136 398KB
GPU disabled: Render time=3:13min CPU=100% GPU=3% file size=136 276KB

I then decided to check with the effects disabled to see what the actual render times are.
GPU enabled: Render time=2:04min CPU=99% GPU=3% file size=195 592KB
GPU disabled: Render time=2:09min CPU=99% GPU=2% file size=195 588KB

Rather strange, so I enabled one fx (Gaussian Blur) and rendered again:
GPU enabled: Render time=1:14min CPU=90% GPU=20% file size=129 332KB

With no effects it seems as if Vegas does not use the GPU enhancement for the mxf render, and it produces much larger files. I rechecked this last set - very strange.

In all cases other than mxf, it appears that the GPU-rendered files are larger.

Willem
Sebaz wrote on 10/20/2011, 9:45 AM
@ Marton

Yes, the latest drivers. The GT430 is from my HTPC; the card in my editing computer is a Radeon HD 4850, which Vegas doesn't support, so I uninstalled the ATI drivers, replaced the Radeon with the GT 430 and installed the latest drivers available.

If you look at the Sony page on GPU acceleration, you can see that the Radeon HD 6870 has better playback acceleration than the GTX 570, but the latter is a $320 card while the Radeon is a $180 card. If I had to buy a card just to edit in Vegas, I would go with the 6870. However, if you also do any editing in Premiere then you have to stick with an Nvidia card, but Premiere is a mediocre NLE anyway, so I would stay with Vegas and a nice Radeon.
Marton wrote on 10/20/2011, 10:20 AM
@Sebaz:

Then why did somebody here say that even a GT430 is OK for acceleration? I want to stay with nVidia because I work mainly in 3D with 3D Vision, and games and YouTube also support this better with Vision.

Oh, and a correction to my post: where I said
"I like the passive cooling"

I meant to say:
"I WANT a passive card" :-)
Munster1 wrote on 10/20/2011, 10:54 AM
@Marton:

I think you'll have to look around for a bit more evidence before deciding.

I don't have Vegas 11, but I do have a GT 430. In Premiere it provides a significant boost to performance with a number of accelerated effects on the timeline. I would find it hard to believe that the card is SO inferior to the GTS 450 unless there is some other issue affecting performance (or the CPU it's paired with is a monster by comparison).
Sebaz wrote on 10/20/2011, 11:19 AM
@Marton

I don't know about others, I can only tell you what I experienced. I tried this again, and it gives me the same results I stated above.
Munster1 wrote on 10/20/2011, 12:00 PM
@Sebaz:

With your overclocked hexcore I could well believe the GT430 might be a bit slower on one event with just one effect applied than just using the CPU, but it should still be able to manage realtime performance.
Sebaz wrote on 10/20/2011, 12:11 PM
It's not overclocked anymore. I had it at 3.8 working perfectly, passing all the Prime95 tests and similar, but one day I rebooted and it wouldn't turn on again, forcing me to open it and reset the CMOS. After that, I left all the clock values at default.

But it's no surprise that the GT430 doesn't perform that well; it's a very cheap, basic card. I only bought it for my HTPC because I read it was the best card to get rid of the Media Center 29/59 Hz issue, and it does, so I'm happy with it.

Now if I ever decide to go back to editing in Vegas, I will buy a nice Radeon HD 6870.
Munster1 wrote on 10/20/2011, 12:23 PM
It makes me wonder, then, what Vegas 11 is doing under the hood, or whether there is a much bigger overhead in using OpenCL instead of CUDA on nVidia cards.