Comments

p@mast3rs wrote on 12/26/2005, 4:36 PM
Any graphics card should be fine, but remember: Vegas does not use the graphics card for any type of rendering. On one of my desktop systems I have a cheapo $30 graphics card that performs the same as the higher-priced ones inside Vegas.

It really depends on whether you plan on doing any gaming on this new system.
Harold Brown wrote on 12/26/2005, 4:45 PM
Some software takes advantage of the graphics card's GPU. Do you plan on buying any of those? You never know, Vegas 7 might as well. You should buy the best card that you can afford.
p@mast3rs wrote on 12/26/2005, 5:06 PM
" You never know, Vegas 7 might as well." As cool as that would be, I just dont see it happening. Plus, its kind of nice not being tied to hardware (ala Avid).
Harold Brown wrote on 12/26/2005, 5:30 PM
I agree, pmasters, but if you could turn it on or off, that would be great. In the meantime I have to save my pennies for a dual core!
GlennChan wrote on 12/26/2005, 6:25 PM
The video card doesn't really make a difference. Go for a dual-monitor video card if you have two monitors (~$50 or less). ATI, Nvidia, or Matrox are all fine (there are only old Matrox cards near the $50 price point).

2- I think topics like these have been covered many times. You can do a search and bring up lots of old threads.
kirkilj wrote on 12/26/2005, 8:10 PM
The ideal rendering architecture for me would be one that did NOT require hardware assist, but one that could take clear advantage of it if the optional hardware were present.

Magic Bullet's v2 Editor claims to do just that. I'm in the middle of a 40-hour software-only render on a P4 2.6GHz machine, but they (Red Giant) claim to deliver real-time performance (25 to 30fps) with an nVidia 7800GTX card. More modest acceleration is available with lesser cards such as the 6800 series. I've been researching nVidia's SLI dual-card capability as well, which provides about 1.7x the render speed of a single-card config for rendering operations. Magic Bullet's filters require a ridiculous amount of pixel processing, which is offloaded to the graphics card if one is present. Most of the time-consuming filters in Vegas look like they could benefit from this capability as well. ATI's CrossFire products are newer but also support multi-card configs for faster rendering operations. Since nVidia and ATI are the two leading mainstream GPU vendors, it's not like there would be a ton of APIs to support if Sony wanted to depart from its software-only rendering platform.
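To make that concrete, here is a rough C++ sketch of the kind of structure I mean: a filter that always carries a plain software path and only takes a GPU path when shader support is detected at startup. All the names are made up for illustration (nothing here is from Vegas or Magic Bullet), and the GPU path itself is omitted.

    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Hypothetical capability probe. A real host would create an OpenGL or
    // Direct3D context and ask the driver about shader support; this is just
    // a stand-in so the sketch compiles on its own.
    bool gpu_shaders_available() { return false; }

    // Software path: a plain per-pixel loop that runs on any machine.
    void brighten_cpu(std::vector<uint8_t>& frame, int amount) {
        for (uint8_t& px : frame) {
            int v = px + amount;
            px = static_cast<uint8_t>(v > 255 ? 255 : v);
        }
    }

    int main() {
        std::vector<uint8_t> frame(720 * 480 * 4, 128);  // one RGBA NTSC frame
        if (gpu_shaders_available()) {
            // Optional hardware assist: upload the frame as a texture, run a
            // fragment shader over it, and read the result back. Omitted here.
        } else {
            brighten_cpu(frame, 16);  // guaranteed fallback, no hardware lock-in
        }
        std::printf("first pixel after filter: %d\n", frame[0]);
        return 0;
    }

The whole point is that the choice is made at runtime, so anyone without a suitable card loses nothing.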

I figure if Red Giant can do it, Sony could too if it were a priority for them. The Sony Media Software development teams obviously have a lot on their plate, with several products in an upgrade cycle to nurture and manage, but software-only rendering starts to sound like a mantra, as if hardware lock-in were an inescapable consequence of OPTIONAL hardware assist.

See http://www.redgiantsoftware.com/mbe2whatsnew.html for their claimed speed advantages with hardware assist.

And yes, I just downloaded UltimateS v2 with Reelpak1, so I'll see how it compares. I'm also researching an ASUS A8N32-SLI mobo with support for both dual-core Athlons and SLI GPU configs, to get the benefit of fast CPUs along with the option of multiple graphics cards that could someday be employed by Vegas for time-sensitive functions. Network rendering doesn't provide enough of an advantage as far as I'm concerned. Even on a gigabit network, the data transfer overhead makes it a wash in all but a few scenarios.

- John

mark-woollard wrote on 12/27/2005, 5:12 AM
I certainly support the view that Sony should add optional GPU processing. It would help keep Vegas competitive in some production environments and might bring more Magic Bullet users into the Sony camp.

I love a couple of the Magic Bullet effects but haven't made use of them enough to justify upgrading my video card. If Vegas is updated to use GPU processing, I'd get that new video card in a flash.

Maybe nVidia could give Sony an incentive to undertake the development.
Coursedesign wrote on 12/27/2005, 6:54 AM
Fry's is selling nVidia FX5200 cards for $0 after rebate.

This is their lowest-performance current card, but it is rock stable and even has OpenGL 2.0 drivers, plus DVI, VGA, and TV out. I have been using one of these in one of my Vegas machines for a long time; highly recommended if you don't need the maximum Magic Bullet boost.

SF/Sony's original reasoning for not wanting to bother with hardware support was sound: it is expensive to develop new hardware products, they didn't have much volume, and hardware quickly becomes obsolete.

With the current OpenGL/DirectX standards, the functionality is well defined, the hardware is made in very high volume by somebody else, and the processing just gets faster every year instead of remaining stagnant.

It really should be a no-brainer for the Vegas team to do this, unless they just want to be last on the market.
GlennChan wrote on 12/27/2005, 11:03 AM
I'm sure the development team does the best they can to make the program run faster. GPU/video card acceleration may not be the best way to do it, since every filter has to be re-programmed in a special way to run on the video card. They would also have to make versions for both ATI and Nvidia (notice how Magic Bullet Editor doesn't run on ATI). And anyone with a Matrox card or integrated graphics would be left out of the party.

Essentially the programmers would have to write three versions of each plug-in (CPU, ATI, and Nvidia)... this gets messy really quickly.

An alternative approach is to optimize each filter for the CPU, so everyone sees the benefits regardless of their video card.
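As a toy example of what I mean by optimizing a filter on the CPU (my own made-up snippet, not anything from Vegas): replace per-pixel floating-point math with a small lookup table that is computed once. Gamma correction is shown here, but the same trick works for levels, curves, and similar filters, and the speedup shows up on every machine no matter what video card is installed.

    #include <cmath>
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    int main() {
        // Build the table once: 256 entries instead of one pow() per pixel.
        const double gamma = 2.2;
        uint8_t lut[256];
        for (int i = 0; i < 256; ++i)
            lut[i] = static_cast<uint8_t>(255.0 * std::pow(i / 255.0, 1.0 / gamma) + 0.5);

        std::vector<uint8_t> frame(720 * 480 * 3, 64);  // one RGB NTSC frame
        for (uint8_t& px : frame)
            px = lut[px];                               // one table lookup per channel

        std::printf("64 maps to %d at gamma %.1f\n", frame[0], gamma);
        return 0;
    }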

Anyways, they'll do what they can to make the program faster.
Coursedesign wrote on 12/27/2005, 11:10 AM
No, they don't have to write a separate version for each card manufacturer.

They just need to decide on the version of OpenGL and DirectX they want to utilize, and then it's up to the customer to buy a card from any vendor that supports this level.

Not all nVidia cards support OpenGL 2.0, for example, but I think at this point they all support 1.5.

What they mainly need is certain shaders, but those shaders have been standardized.
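To illustrate, here is a made-up GLSL fragment shader (a simple saturation control) held in a C++ string constant. Any card whose driver supports GLSL compiles this same source through glShaderSource()/glCompileShader(); the developer targets the shading-language version, not a vendor.

    #include <cstdio>

    // Standard GLSL fragment shader source. In a real host this string would
    // be handed to the driver with glShaderSource() and glCompileShader();
    // NVIDIA and ATI each compile it for their own hardware.
    const char* kSaturationShader =
        "uniform sampler2D frame;\n"
        "uniform float saturation;\n"
        "void main() {\n"
        "    vec4 c = texture2D(frame, gl_TexCoord[0].st);\n"
        "    float luma = dot(c.rgb, vec3(0.299, 0.587, 0.114));\n"
        "    gl_FragColor = vec4(mix(vec3(luma), c.rgb, saturation), c.a);\n"
        "}\n";

    int main() {
        std::puts(kSaturationShader);  // no GL context here; just print the source
        return 0;
    }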

So what's with ATI? They have lost their way. They used to have the best drivers, but that was a long time ago, and today I don't know of any serious software vendor who recommends them, except for their very expensive workstation cards. The rest are purely for gaming, or of course "general office use".
Steve Mann wrote on 12/27/2005, 9:18 PM
Yes, you can use higher-level APIs to send instructions to any compliant GPU, but the problem with this approach is that you need the CPU in the workflow to feed those instructions to the GPU. This overhead defeats the gains of using the GPU, but your game will run on any compliant card.

If you want the maximum benefit from the GPU, then you need a custom driver unique to a specific GPU chipset, leaving the processor out of it.

Also note that most, if not all, high-performance GPUs are designed for gaming: rendering textures and so on. They may not add anything of value to the rather mundane needs of rendering or encoding video.

Finally, if I recall correctly, the NLEs that do use a GPU only do so for previews, not for the final render, which is still done on the CPU to a file.

Steve Mann
Coursedesign wrote on 12/27/2005, 10:11 PM
Overhead? You're kidding, right?

With PCI Express, the paths both to and from the GPU are very fast, faster than any prior bus. AGP is fast in one direction only, but even that can still speed things up quite a bit compared to doing the work on the main CPU.

It may sound a bit surprising, but video work is particularly well suited to the new standard shaders (I have a technical article about this somewhere; I'll see if I can dig it out). That is what Magic Bullet uses to speed up its VIDEO rendering by a factor of 10-20 or more. That is FINAL rendering, not a preview.

Perhaps you are thinking of 3D applications, which generally use OpenGL for previews only. That is because different cards have slightly different implementations, and by always using the CPU you get a consistency that matters when rendering is done on render farms and there is a lot of teamwork on a project. This may change as OpenGL 2.0 and DirectX 9 become more widely available, offering enough functionality for better-than-preview quality.
Steve Mann wrote on 12/28/2005, 9:00 PM
I have mixed opinions about using any hardware speed-ups in Vegas. I would worry that adding all the extra hardware-dependent code would bloat the size of Vegas, slow it down, and make it far, far less reliable and stable than it is now.

I'm not sure the speed is worth the risks.

Steve