I found the best graphics card for Vegas Pro 12.0

JohnnyRoy wrote on 4/2/2014, 6:14 AM
I know that Vegas Pro 13.0 is around the corner and maybe SCS will upgrade their GPU support to use the newer Kepler and ATI architectures, but for the time being, the question that is still on everyone's mind is, "What's the best GPU for Vegas Pro 12.0?" I think I may have found an answer.

I recently purchased a 2008 Mac Pro 2.8GHz 8-Core Xeon with 16GB memory, 128GB SSD boot, 2TB RAID 0, and ATI Radeon HD 5870 on eBay for only $740! I configured it to dual-boot with Boot Camp and Windows 7 Pro 64-bit, and I was really pleased with the performance I was getting in Vegas Pro 12.0. So much so that I thought I would run some benchmarks against my Intel Core i7-3930K Sandy Bridge-E 3.2GHz with 16GB memory, 256GB SSD, and Quadro 4000. I was shocked by what I found. (I also posted this on the Creative COW, so apologies to those of you who read both forums. This is a summary of several of my COW posts.)

Here is my little render test: rendertestjr.veg (Vegas Pro 12.0 only)

The purpose of this test is to measure the difference GPU acceleration makes for both timeline playback and rendering. The project is 15 seconds long. It has two tracks. The lower track contains Generated Media NTSC Color Bars that rotates 360 degrees in 15 seconds. The upper track has Generated Media Noise Texture with the Progress parameter animated so that it moves. I added Sony Bump Map and Sony Glow (both GPU-accelerated FX) to the noise texture, and the composite level of the event was dropped to 60% so that the rotating color bars show through. Compositing random movement ensures that every frame needs to be rendered during the test. This project requires Vegas Pro 12.0 to open.

In order to test on your computer, be sure to do the following:

(1) Set your RAM Preview to 200MB (the default) so that frame caching doesn't skew the results.
(2) Set your preview window to Best(Full) and resize to 900x506 for measuring playback fps.
(3) Don't forget to restart Vegas Pro between turning the GPU on and off.
(4) For Sony AVC, use the Internet 1920x1080-30p template (turn GPU on/off as appropriate).
(5) For MainConcept AVC, use the Internet HD 1080p template (turn GPU on/off as appropriate).

That’s it for the ground rules. Now for the results…

ATI Radeon HD 5870

With GPU turned OFF, that project played back at 0.5 fps on my 8-core Mac Pro with Bootcamp (btw, even on my new 6-core/12-thread Intel Core i7-3930K it only played at 0.7 fps with no GPU). With GPU turned ON it plays back at 29.97 fps. So the GPU is giving me a 60x improvement in playback (from 1/2 frame per second to 30 frames per second). That's pretty amazing! :-)

Here are the timings that I recorded for the Radeon:

-------------------------------------------------------
ATI Radeon HD 5870
-------------------------------------------------------
Timeline GPU Acceleration OFF (Playback 0.5fps)
-------------------------------------------------------
Sony AVC Internet 1920x1080-30p (CPU Only) . . . 1:34
Sony AVC Internet 1920x1080-30p (OpenCL) . . . . 1:30 (0x)
MainConcept AVC Internet HD 1080p (CPU Only) . . . 2:40
MainConcept AVC Internet HD 1080p (OpenCL) . . . . 1:23 (2x)
-------------------------------------------------------

-------------------------------------------------------
Timeline GPU Acceleration ON (Playback 29.97fps)
-------------------------------------------------------
Sony AVC Internet 1920x1080-30p (CPU Only) . . . 0:31
Sony AVC Internet 1920x1080-30p (OpenCL) . . . . 0:29 (0x)
MainConcept AVC Internet HD 1080p (CPU Only) . . . 0:54
MainConcept AVC Internet HD 1080p (OpenCL) . . . . 0:15 (3.6x)
-------------------------------------------------------

Apparently, Sony AVC does not take advantage of OpenCL with this card. There was no appreciable improvement (just a few seconds) between CPU Only and Use OpenCL. Maybe a longer test project (like one minute) would show a difference, but the GPU Load meter did not show the Sony encoder using the GPU at all (maybe one or two spikes).

MainConcept was another story entirely. The difference with the MainConcept AVC GPU encoder is incredible. MainConcept AVC saw a 3.6x improvement for GPU rendering, and roughly 10x overall between no GPU for timeline or render (2:40) and only 0:15 with both turned on! On the MainConcept AVC render my CPU was about 29% and my GPU about 71% utilized. That is closer to what I expected.
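For anyone checking my math, the speedup factors in the tables are just the baseline time divided by the accelerated time. Here's a quick sketch (plain Python of my own, nothing to do with Vegas itself):

```python
def to_seconds(t):
    """Convert an m:ss timing string (as used in the tables) to seconds."""
    m, s = t.split(":")
    return int(m) * 60 + int(s)

def speedup(baseline, accelerated):
    """How many times faster the accelerated render is than the baseline."""
    return to_seconds(baseline) / to_seconds(accelerated)

# MainConcept AVC on the Radeon HD 5870:
print(round(speedup("0:54", "0:15"), 1))  # render-only GPU gain: 3.6
print(round(speedup("2:40", "0:15"), 1))  # everything OFF vs everything ON: 10.7
```

So the "roughly 10x overall" is actually closer to 10.7x if you do the division exactly.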

At no time did the GPU slow things down. It's important to note that the CPU and GPU are pretty evenly matched. They are both circa 2008/2009, so they complement each other well, although I'm not sure how much having an evenly matched CPU/GPU combination adds to the performance boost. I did use the latest Catalyst drivers from AMD and not the older Apple drivers from Boot Camp (which didn't recognize the GPU with Vegas Pro).

NVIDIA Quadro 4000

Then I set out to do the same tests with my Quadro 4000. Remember, the Quadro 4000 is in my fairly new Intel Core i7-3930K Sandy Bridge-E 3.2GHz (6 cores/12 threads), so the CPU Only times are better than on my 2008 Mac Pro's 2 x 2.8GHz Quad-Core Xeon E5462 (8 cores/8 threads). But much to my surprise, the GPU renders were actually slower on the NVIDIA Quadro 4000 than on the ATI Radeon HD 5870!

Here are the times for the Quadro 4000:

-------------------------------------------------------
NVIDIA Quadro 4000
-------------------------------------------------------
Timeline GPU Acceleration OFF (Playback 0.7fps)
-------------------------------------------------------
Sony AVC Internet 1920x1080-30p (CPU Only) . . . 1:10
Sony AVC Internet 1920x1080-30p (CUDA) . . . . . 1:08 (0x)
MainConcept AVC Internet HD 1080p (CPU Only) . . . 1:17
MainConcept AVC Internet HD 1080p (CUDA) . . . . . 1:07 (0.1x)
-------------------------------------------------------

-------------------------------------------------------
Timeline GPU Acceleration ON (Playback 2.0fps)
-------------------------------------------------------
Sony AVC Internet 1920x1080-30p (CPU Only) . . . 0:27
Sony AVC Internet 1920x1080-30p (CUDA) . . . . . 0:36 (0x)
MainConcept AVC Internet HD 1080p (CPU Only) . . . 0:40
MainConcept AVC Internet HD 1080p (CUDA) . . . . . 0:33 (0.2x)
-------------------------------------------------------

I'm not even sure what's going on here. First, I was shocked that the Quadro 4000 could not play the timeline back better than 2 fps with GPU ON, when the Radeon HD was a solid 29.97 fps at Best(Full) all day long. This is why the render times are so poor. My Radeon HD 5870 rendered MainConcept AVC with GPU twice as fast as the Quadro 4000 (0:15 vs 0:33)! I'm beginning to think that I should be using my 2008 Mac Pro as my primary Vegas Pro editing workstation, because the Radeon performs better than the Quadro with Vegas Pro 12.0.

Obviously this was an artificial test. Generated Media is uncompressed in Vegas Pro so I eliminated any lag due to decoding video on the timeline. But it did isolate just the timeline GPU acceleration and rendering GPU acceleration in a project that doesn't play back smoothly without GPU on any PC.

Real-world Tests

So how about real-world rendering? I thought I would test this with Sony's own "Red Car" test that is posted on their GPU Acceleration page. So I downloaded the 2GB project and ran it on both computers. I should mention that Sony did their measurements with the preview set to Best(Full) at 900x506 as well (which is why I used that in my previous tests), but they used different render templates, so please read the Sony instructions and use the same render templates as below.

Here’s what I measured with the “Red Car” project:

------------------------------------------------
SONY “Red Car” Project
------------------------------------------------
NVIDIA Quadro 4000 w/Core i7-3930K 3.2Ghz 6c/12t
------------------------------------------------
MainConcept AVC Internet 1080-30p . . . . . 1:34 (94 seconds)
XDCAM EX HD 1920x1080-60i 35Mbps . . . . . 1:42 (102 seconds)

------------------------------------------------
ATI Radeon HD 5870 w/Xeon 2.8Ghz 8c/8t
------------------------------------------------
MainConcept AVC Internet 1080-30p . . . . . 0:57 (57 seconds)
XDCAM EX HD 1920x1080-60i 35Mbps . . . . . 1:15 (75 seconds)

What amazed me is that Sony's "Red Car" project plays back at the full 29.97 fps at Best(Full) all the way through using the Radeon HD 5870, but the Quadro 4000 could not maintain 29.97 fps throughout the timeline like the Radeon HD 5870 did. Again, quite shocking to me because the Radeon is in a slower computer, but I guess it says more about Sony making better use of OpenCL with ATI cards than CUDA with NVIDIA cards for timeline GPU acceleration.

Conclusions

Observation 1: GPU acceleration absolutely works! I got a 10x improvement with timeline GPU on + MainConcept AVC

Observation 2: It appears that Sony makes better use of OpenCL than CUDA, therefore ATI cards give better GPU performance than NVIDIA

Observation 3: The ATI Radeon HD 5870/6970 is about the fastest GPU that meets Sony's requirement for older cards (newer cards are actually slower with Vegas Pro 12.0)

From these tests I have concluded that the ATI Radeon HD 5870 is the sweet spot for GPU acceleration in Vegas Pro. Of course, the 5870 is no longer available, but the Radeon HD 6970 is a newer model of this card and still for sale. Nothing more powerful (i.e., 7xxx series+) will work as well.

Now I realize that the Quadro 4000 isn't the fastest card from the Fermi lineup. I'd be interested in hearing what timings others are getting with their GeForce cards. Maybe there is one that's faster than the Radeon HD 5870. I'm sure everyone would like to know.

I know we've been "benchmarked out" on this forum, but I thought that having a benchmark we can run in Vegas Pro 12.0, and then again in Vegas Pro 13.0 when it is released, will tell us whether Vegas Pro 13.0 has improved anything in the GPU department.

Bottom line for me: I got a whole 8-core/16GB Mac Pro with a Radeon HD 5870 for less than my $800 Quadro 4000 alone, and it performs better than my current computer in real-world playback and render tests with Vegas Pro 12.0. I don't think I would ever waste my money on a Quadro card again.

~jr

Comments

Grazie wrote on 4/2/2014, 6:49 AM
John, that has to be some of the sanest analysis of this CPU<>GPU tangle to date. Thanks for your time, and for posting it here in such a straightforward and accessible way.

"I don't think I would ever waste my money on a Quadro card again."

Breathe out and relax . . . .

Grazie

paul_w wrote on 4/2/2014, 8:00 AM
Johnny, I remember when v11 came out and a few of us did exactly these tests back then. I personally spent days producing test tables and comparing results from GPU on / GPU off combinations. It's a time-consuming task, so thanks for taking the time with this. And yes, maybe it is becoming tiresome for some.

Some of the points raised last time during v11:

Dynamic RAM preview settings play a big part in the render speed results, and these settings are also affected by content type. Graphics, for example, render out with different results compared to video media on the TL.
The CPU Threads setting also has a big effect on GPU speed.
Preview frame rate performance is not the same as render speed performance and should not be assumed to be. A GPU card giving great render speeds may not actually yield a faster preview frame rate, for example.
Machine stability and incorrect rendering were noted on some cards that were fast, but not on other cards. Stability and consistency in rendering obviously take priority over speed.
Other hardware was also a factor, mainly CPU speed, in particular when a user's CPU was faster than or matched their GPU's capability. Turning GPU off helped in those cases.
In short: it's complicated!

I remember going round in circles for weeks debating just how to get that magic card and settings.. The answer in the end was there is no magic card or settings, only "whatever works best for you". Because there are too many variables.

One other thing of note: we all wanted to know which card we should use and asked SCS for guidance. Of course they cannot answer this because of their ridiculous company policy of not mentioning or favoring other manufacturers' products (if I got that right). However, looking at their own 'red car' advert render test, their notes do include references to a few GPU cards as used in 'their' tests. This to me is the answer: use the cards they do. Now I can't remember the exact cards offhand; it was 2 years ago and I do not have the project on this machine. But I know the GTX 570 was one of them, and that's why I bought one. I believe the 4000 was also tested, along with an ATI card.
Lastly, as I mentioned above, preview frame rates and render rates are not necessarily linked for a single card. It was generally found that NVIDIA cards were better for render speeds and ATI cards better at screen previews. Things may have changed by now; that was v11.

Hope this contributes.
Paul.

videoITguy wrote on 4/2/2014, 9:51 AM
Johnny should receive high praise for a well done test process and investigation.

The interesting thing is that it actually holds no surprise for those of us who have been studying the forums ever since the buggy first release of VegasPro12 over a year ago.

Many forum members have pointed out in an assortment of posts under different headings that we could have well expected (and I think I assumed) the conclusions he has reached.

The reasoning for this is quite sound, as the assorted VegasPro 12 builds over the year have not drastically changed anything in the underlying core code. Several Sony engineers have alluded to some tweaks that might tune the process but would still lead us to Johnny's findings as a whole.

I think this will be a valid premise for understanding how to deal with VegasPro13.
Barry W. Hull wrote on 4/2/2014, 10:15 AM
Thanks a lot JR. You just rendered my high dollar NVIDIA Quadro K5000 a fancy door stop. Maybe VegasPro 13 will make it all worthwhile. Fingers crossed.

Good advice Grazie, I'm breathing and relaxing.
larry-peter wrote on 4/2/2014, 10:44 AM
Maybe it's time for me to consider moving to ATI cards in the Vegas systems. From what I recall, even back when SCS first published its list of acceleration-capable GPUs, ATI was showing better performance/price.

I won't boat-anchor my Quadro card yet. My 5 year-old system with a lowly (in the Quadro line) FX1800 w/ 306.97 driver (system 1 in my specs) outperforms my newer system 2 with 560ti in timeline playback, GPU rendering and Vegas stability.
NormanPCN wrote on 4/2/2014, 11:29 AM
Thanks JR for the detailed report.

Sony AVC only partially uses the GPU for acceleration, so it only gets small speedups. Motion estimation is, I think, all that is implemented on the GPU.

The Mainconcept AVC OpenCL and CUDA encoders are completely separate from the CPU encoder, and they run mostly on the GPU.

For those considering AMD. The 7xxx series and newer cards are NOT supported by the Mainconcept AVC OpenCL encoder. It just falls back to CPU use. I found out the hard way, and sad for me. Boo Hoo. Old 5850 to a new 7950. Maybe MC AVC OpenCL will be updated in Vegas 13.

Sony AVC, just like Vegas, uses the newer AMD cards just fine.
johnmeyer wrote on 4/2/2014, 1:34 PM
JohnnyRoy, you are my hero! Actually, you were already that before this post, but I found this very, very useful. When Vegas 13 comes out, I'm going to try one more time to upgrade, and I'll be referring back to this post (and whatever it spawns) many, many times.
ushere wrote on 4/2/2014, 5:51 PM
thanks jr - great in depth analysis. will wait for 13, but will bear in mind your thoughts when considering a new video card.

set wrote on 4/2/2014, 7:16 PM
JohnnyRoy, thanks for your test... I was curious to take measurements on my spec #1, an ATI Radeon HD 5750, and compare them with your 5870:


-------------------------------------------------------|-----------
My system specs'#1 ATI Radeon HD 5750| JR's 5870
-------------------------------------------------------|-----------
Timeline GPU Acceleration OFF Playback (fps) 0.4| 0.5
-------------------------------------------------------|-----------
Sony AVC Internet 1920x1080-30p (CPU Only) . . . 1:38| 1:34
Sony AVC Internet 1920x1080-30p (OpenCL) . . . . 1:36| 1:30
MainConcept AVC Internet HD 1080p (CPU Only) . . . 2:21| 2:40 *
MainConcept AVC Internet HD 1080p (OpenCL) . . . . 1:28| 1:23
-------------------------------------------------------|-----------

-------------------------------------------------------|-----------
Timeline GPU Acceleration ON Playback (fps) 14 |29.97
-------------------------------------------------------|-----------
Sony AVC Internet 1920x1080-30p (CPU Only) . . . 0:22| 0:31 #
Sony AVC Internet 1920x1080-30p (OpenCL) . . . . 0:28| 0:29
MainConcept AVC Internet HD 1080p (CPU Only) . . . 1:03| 0:54
MainConcept AVC Internet HD 1080p (OpenCL) . . . . 0:22| 0:15
-------------------------------------------------------|-----------


As detected via http://addgadgets.com/gpu_meter/ and http://addgadgets.com/all_cpu_meter/ Win7 gadgets:

During Mainconcept rendering:
While rendering with the timeline GPU setting ON and Mainconcept GPU ON, the GPU usage counted 90%.
Timeline GPU, Mainconcept CPU only: the CPU usage shows 85-99%, with GPU usage varying around 1-10%.
Timeline GPU, but Mainconcept GPU on, isn't detectable from the GPU meter gadget.

Something interesting with my Sony AVC rendering:
Timeline CPU, render CPU: a variable 1% GPU use.
Timeline GPU, render GPU: shows 62% GPU usage, 28 seconds rendering.
Timeline GPU, render CPU: shows 54% GPU usage, BUT finished 6 seconds faster!

Set
NormanPCN wrote on 4/2/2014, 7:42 PM
Sony AVC has the following options in "encode mode" at the bottom of the dialog.

Automatic
Render using CPU only
Render using GPU if available
Intel quicksync(speed)
Intel quicksync (quality)

Automatic is like the render using GPU if available option.

The Quicksync options use the Intel Quicksync hardware encoder, and this is strictly different from the Sony AVC encoder. Sony just chose to put Intel in their templates.
set wrote on 4/2/2014, 8:34 PM
Updated above..., with an interesting result for the Sony AVC rendering...

But in the Red Car test, JR's is far faster :)
Mainconcept requires 2:20
XDCAM requires 5:02
TheHappyFriar wrote on 4/2/2014, 10:40 PM
AMD Phenom 9600
6gb RAM
Windows 8.1
AMD 7850 2gb RAM

-------------------------------------------------------
ATI Radeon HD 7850
-------------------------------------------------------
Timeline GPU Acceleration OFF (Playback 0.081-0.153fps)
-------------------------------------------------------
Sony AVC Internet 1920x1080-30p (CPU Only) . . . 9:59
Sony AVC Internet 1920x1080-30p (OpenCL) . . . . 10:02
MainConcept AVC Internet HD 1080p (CPU Only) . . . 11:34
MainConcept AVC Internet HD 1080p (OpenCL) . . . . gave up (~12 minutes)
-------------------------------------------------------

-------------------------------------------------------
Timeline GPU Acceleration ON (Playback ~21fps)
-------------------------------------------------------
Sony AVC Internet 1920x1080-30p (CPU Only) . . . 1:06
Sony AVC Internet 1920x1080-30p (OpenCL) . . . . 1:02
MainConcept AVC Internet HD 1080p (CPU Only) . . . 2:37
MainConcept AVC Internet HD 1080p (OpenCL) . . . . 2:45
-------------------------------------------------------


Obviously my CPU is a big bottleneck here, but with GPU Accel on & using Sony AVC, I am getting times ~30 seconds slower than you guys with i7's. My CPU wasn't even the fastest when I bought it at the start of '08. In fact, besides the GPU, my RAM is slower (400MHz), my CPU is slower, and odds are my drives are slower. For preview, this is a major increase in speed. For rendering though, I normally make DVDs and I normally render overnight/when out of the house, so the AVC speed isn't that important to me.
JohnnyRoy wrote on 4/3/2014, 5:06 PM
Thanks for all the kind comments. I don't think I qualify as John Meyer's "hero" when after all, he is one of my heroes for his visionary work at Ventura Software back in the day (but I appreciate the compliment John). ;-) After several nights of benchmarking I just had to publish the results somewhere and I'm glad that the community is finding it useful. I'm also glad that others are publishing their results. That's what I was hoping for.

I think TheHappyFriar has proved the theory that the AMD 7xxx series cards start to break down with Vegas Pro because of their newer architecture. The 6xxx and 5xxx are really the latest you can use with Vegas Pro 12.0. Let's hope that Vegas Pro 13 improves on this; once it's released, I'd be interested to know how those cards perform, because I can't benchmark this myself: both my Quadro 4000 and Radeon HD 5870 are from the older architecture.

Keep up the benchmarks. More data points will help everyone understand better. Thanks!

~jr
TheHappyFriar wrote on 4/3/2014, 9:19 PM
I wouldn't say the 7xxx or R series do any worse. My machine is almost a decade older than the ones in the other couple of benchmark posts. More benchmarks are needed.
NormanPCN wrote on 4/3/2014, 9:36 PM
I wouldn't say the 7xxx or R series do any worse.

I would say worse for Mainconcept AVC, but better for everything else.

@Hulk did something a while back and collected a bunch of data.
http://www.sonycreativesoftware.com/forums/ShowMessage.asp?Forum=4&MessageID=874761

A lot of AMD GCN cards near the top, but not with regards to MC AVC OpenCL.
There is one 6870 setup in that table.

TheHappyFriar wrote on 4/3/2014, 10:16 PM
Yeah, not sure why that is. It also makes no sense that enabling GPU in Vegas for editing/preview affects the separate render processes.
NormanPCN wrote on 4/3/2014, 10:28 PM
It makes perfect sense. The MC AVC OpenCL encoder does not support AMD GCN chips (Graphics Core Next).

Both playback and "render as" (aka encode) use the exact same video compositing and effects engine. This is what renders the unified video stream. Playback just sends this to the display device(s).

The encoder may or may not use the GPU for any acceleration. It does not need to know or care what happens before it. All the encoder knows is that it is given one frame after another from the video stream, and it writes a file to disk.

So no matter what you do you will always get GPU acceleration from the video engine. You may, or may not, get additional accel from the file encoder ("render as").
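To put the same point another way, here's a toy Python sketch (purely illustrative — these names are made up and bear no resemblance to the real Vegas internals) of how the two GPU switches are independent stages in a pipeline:

```python
def composite(frame, timeline_gpu):
    """Video engine: compositing + FX, with or without GPU acceleration."""
    return ("gpu-composited" if timeline_gpu else "cpu-composited", frame)

def encode(frame, encoder_gpu):
    """File encoder: may have its own GPU path (e.g. MC AVC OpenCL),
    which can silently fall back to CPU on unsupported chips."""
    return ("gpu-encoded" if encoder_gpu else "cpu-encoded", frame)

def render_project(frames, timeline_gpu, encoder_gpu):
    """The encoder never knows how its input frames were produced."""
    return [encode(composite(f, timeline_gpu), encoder_gpu) for f in frames]
```

Playback is just the same composite() stage with the output sent to the display instead of to encode().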
john_dennis wrote on 4/3/2014, 11:04 PM
No time to analyze now, but I will post Intel i7-3770K with only embedded graphics.


-------------------------------------------------------|-----------
My system specs'#2 i7-3770k / HD4000 | JR's 5870
-------------------------------------------------------|-----------
Timeline GPU Acceleration OFF Playback (fps) ~0.5 | 0.5
-------------------------------------------------------|-----------
Sony AVC Internet 1920x1080-30p (CPU Only) . . . 1:12| 1:34
Sony AVC Internet 1920x1080-30p (OpenCL) . . . . N/A | 1:30
Sony AVC Internet 1920x1080-30p (QuickSync-Q). . . 2:38| N/A
MainConcept AVC Internet HD 1080p (CPU Only) . . . 1:33| 2:40
MainConcept AVC Internet HD 1080p (OpenCL) . . . . 1:34| 1:23
-------------------------------------------------------|-----------

-------------------------------------------------------|-----------
Timeline GPU Acceleration ON Playback (fps) ~ 1.1|29.97
-------------------------------------------------------|-----------
Sony AVC Internet 1920x1080-30p (CPU Only) . . . 0:55| 0:31
Sony AVC Internet 1920x1080-30p (OpenCL) . . . . N/A | 0:29
Sony AVC Internet 1920x1080-30p (QuickSync-Q). . . 2:44| N/A
MainConcept AVC Internet HD 1080p (CPU Only) . . . 1:09| 0:54
MainConcept AVC Internet HD 1080p (OpenCL) . . . . 1:09| 0:15
-------------------------------------------------------|-----------
Sunflux wrote on 4/4/2014, 4:03 AM
Here's my benchmarks for reference. 3-year old system with an Intel i7-990X (6-core), 12gb memory, and an ATI 6870 1gb.

-------------------------------------------------------
ATI Radeon HD 6870
-------------------------------------------------------
Timeline GPU Acceleration OFF (Playback 0.5fps)
-------------------------------------------------------
Sony AVC Internet 1920x1080-30p (CPU Only) . . . 1:08
Sony AVC Internet 1920x1080-30p (OpenCL) . . . . 1:08 (0)
MainConcept AVC Internet HD 1080p (CPU Only) . . . 1:25
MainConcept AVC Internet HD 1080p (OpenCL) . . . . 1:01 (1.4x)
-------------------------------------------------------

-------------------------------------------------------
Timeline GPU Acceleration ON (Playback 28-29.97fps)
-------------------------------------------------------
Sony AVC Internet 1920x1080-30p (CPU Only) . . . 0:19
Sony AVC Internet 1920x1080-30p (OpenCL) . . . . 0:20 (-0.1x)
MainConcept AVC Internet HD 1080p (CPU Only) . . . 0:36
MainConcept AVC Internet HD 1080p (OpenCL) . . . . 0:13 (2.8x)
-------------------------------------------------------
TheHappyFriar wrote on 4/4/2014, 7:01 AM
Any reason why, when timeline accel is off, the encoder seems to ignore the GPU setting and doesn't use it? When timeline accel is on, the GPU gets used, but by the timeline, not the specific encoders. It's the same across the board. I even tried turning off the preview completely, thinking maybe it was using the CPU for preview and slowing down the render, but it didn't change by a significant amount.

That's confusing me.
Sunflux wrote on 4/4/2014, 7:10 AM
No, it can use it even then - I saw a significant drop in the MainConcept encoder test on mine.

I think the difference is that when timeline acceleration is off, Vegas does not use the GPU when preparing the original master frames to hand to the encoder, but the encoder can still use it when compressing. However, when timeline acceleration is on, both processes can use it.

Looking at your test results, my best guess is that none of the encoders are making use of your GPU at any time, or at least not properly, but Vegas itself definitely is.
NormanPCN wrote on 4/4/2014, 11:43 AM
I have seen performance differences with the video prefs GPU option OFF and then switching the encoder GPU option ON/OFF. For a complete test there are four permutations one has to test, but in the real world we only care about two permutations, GPU all ON and GPU all OFF.

Back when I had a 5850 card I saw this with MC AVC OpenCL and Sony AVC. Now with my AMD 7950 card, only with Sony AVC due to the reasons previously stated. Sony AVC performance differences are small since it does so little with GPU. Averaging multiple runs is really important with the small diffs you get with Sony AVC.

Most of my tests are done with my own projects, before I DL'd the Sony demo project. That demo project is super heavy on GPU effects. Sony wanted a max demo for GPU.

Cliff Etzel wrote on 4/14/2014, 9:58 AM
I find it interesting that the issue of GPU performance has become such a hot-button topic for Vegas users. TBH, I had all but given up on ever using Vegas Pro again due to the lackluster performance in the editing process for me, even though my system specs seem pretty decent, albeit a generation back.

The issue of NVIDIA has me beginning to truly wonder now, since I've never used anything but NVIDIA for my graphics cards. Since I made the move to Adobe PPro a couple of years ago, I've begrudgingly used it, as it was able to use my graphics card for the most part due to being coded for CUDA, but I find the editing experience laborious instead of a joy like it was when I edited in Vegas. Now I'm revisiting my primary NLE choice, since I refuse to upgrade to the subscription-based model that Adobe (and soon AVID) forces on its users.

Enter the revisit of Vegas Pro. Since I don't collaborate with others, I'm either considering Vegas or the idea of moving to the Mac platform and FCPX (gasp!).

I'm 100% Canon DSLR video footage using the Technicolor Cine Profile with Magic Lantern in addition to stills based work.

JR - I know you said you purchased a used Mac Pro tower, and although I have worked on the platform, I don't like the interface, opting instead for Win 7 Pro for all my computers (my third is running Windows Home Server 2011 for file/media sharing on my internal network).

JohnnyRoy - is there really that big a performance difference in your testing that points to AMD being a better solution for GPU/performance as a general rule for Vegas Pro 12?

Cliff
mdindestin wrote on 4/14/2014, 12:08 PM
Sorry, a little off topic. Incidentally, just like Cliff, we're using all Canon DSLRs with Magic Lantern and Cinestyle.

I'm wondering if that particular Mac Pro will run Mavericks without issues?

My daughter is strictly a Mac gal and has gotten really good with iMovie. I'd like to give her the capability to use FCP X while I maintain the ability to continue to use Vegas.