Comments

Steve Mann wrote on 3/19/2011, 4:48 PM



Might be nice, but be careful what you wish for.
GPU support is not the Brass Ring that you imagine it to be.

In my opinion, the architecture of Vegas will not support using the GPU for preview, because by the time the CPU has prepared the data for the GPU to finish rendering, it may as well just send the frame to the display itself. No one with knowledge of the internals of Vegas has ever said otherwise.

Did it ever occur to anyone that Adobe, Avid, etc, use the GPU as a crutch because they aren't using the CPU efficiently?

GPUs excel at things like textures and shading, but they are not much faster than the CPU at basic rendering.

Besides, why would you want to tie your editing hardware to a select list of "supported" GPU cards?
farss wrote on 3/19/2011, 5:33 PM
"Did it ever occur to anyone that Adobe, Avid, etc, use the GPU as a crutch because they aren't using the CPU efficiently?"

Quite the contrary.
Vegas relies on Moore's law, and that has been failing us for years now. CPU clock speeds have not increased, so instead we are using more cores. Even that has limitations for what we do; bus bandwidth is also stuck and not moving forward, due to physics.
Adobe did sort of hint at this in one of their presentations when they pointed out that spending money on a system with more than four cores would be a waste with their software; more cores yield no improvement.
Today's top-shelf video runs at 3 Gbps; cameras and recording systems capable of this are now coming down in price, and Sony is now getting serious about cameras that will increase the data rate by a factor of 4 as 4K moves towards being the norm. Professional S3D also doubles the data rate if you want to feed two streams of 10-bit video over HD-SDI to your monitor.
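
For what it's worth, that factor of 4 falls straight out of the pixel count, assuming "4K" here means UHD at the same frame rate and bit depth:

$$\frac{3840 \times 2160}{1920 \times 1080} = \frac{8\,294\,400}{2\,073\,600} = 4$$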

Leaving the CPU to handle what it has done well for a long time, and switching the native streams of video over to hardware capable of the decoding, rendering and packetizing for the display system, makes a lot of sense. Of course this hardware is not cheap. I and the average Vegas user simply don't have that kind of money. Recent developments driven largely by the gaming community have brought the cost of high-end GPUs down somewhat, although it is still pretty high if you throw in the HD-SDI daughter board from NVIDIA.

I would say the three "A"s have taken the correct course for their market, although Adobe have kind of broken out a bit by bringing the cost of using off-the-shelf hardware within the reach of the upper segment of the prosumer market. I suspect you are right when you say Vegas would need a complete redesign to work in this fashion, and much of what appeals to the average Vegas user would be lost in the process. The market already has the three "A"s vying for share, and arguably SCS would be nuts to try to crack into that marketspace.

For the most part the way Vegas works, and its limitations, are fine for what I do. Sure, the few times I've found myself doing a supervised edit session I've been wishing I wasn't using Vegas, as the client complains about dropped frames and "something wrong with my video", but I invent some excuse and it all comes out OK when they get the final result and the invoice. I can recommend a post facility where frames don't get dropped if that's a real show stopper for them; it helps to have their rate card on hand. I also don't have leather lounges and a barista :)

Bob.
ushere wrote on 3/19/2011, 5:48 PM
actually i'd rather have a bug-free (ha!) 10d than have to pay for an upgrade to 11.

in off-forum conversations it seems there are quite a number of people wondering just where scs is heading, and whether after many years they (scs) have lost the plot.

my experience was that up till 7, vegas was as good as it gets for reliability and robustness - while all around me had endless crashes and bsods, vegas just plodded along. since 7 it's been going (albeit slowly) downhill.

thankfully i no longer have to depend on vegas in a 'working' environment, that is, with a client sitting over my shoulder, nor do i really need to stretch vegas 10 to its limits with the work i do nowadays - however, other than adding features, little seems to have been done with the basics: bugs aren't repaired or caught before release, or worse, are introduced with new releases; vegas must be the last nle using vfw; and little attention is paid to trying to maintain vegas's reputation as a reliable and robust system.

i am much too old to bleed to death on the cutting edge. if 10d does not address my concerns it will be the last version i install. where i might go is another matter, but i will certainly not continue paying for superfluous features at the expense of reliability.
Hulk wrote on 3/19/2011, 7:03 PM
While I would like a serious GPU offloading effort from Vegas, I disagree that Moore's Law has failed the Vegas engineers.

While clock speed has not increased, IPC (instructions per clock) and overall efficiency have increased dramatically per tick of the CPU. The Core 2 Duo was roughly twice as fast in Vegas, core-for-core, as a like-clocked P4. So a 3 GHz C2D, with two of those cores, is FOUR times as fast as a single-core P4 at 3 GHz. That's a huge increase.

In addition, with the move to the Penryn die shrink the core not only gained clock speed but picked up additional efficiency per clock as well.

Another increase came with the 1st-generation i7, and now a very significant leap ahead with quad-core Sandy Bridge. I think Sony has quite a bit to work with as far as native CPU power.

Also, video editing is by nature a highly parallel operation. A "dumb" parallel routine could simply divide the screen into 4 horizontal sections for rendering, or even have each core render a frame so that 4 frames are rendered at once (a rough sketch of the frame-per-core idea follows below). There would be a slight delay in playback for preview, but it would be insignificant, and all available cores would be floored at 100% for rendering and preview.
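
A minimal sketch of that frame-per-core batching in C++, assuming a hypothetical render_frame() as a stand-in for the real per-frame decode/composite work (this only illustrates the scheduling idea, not Vegas internals):

```cpp
#include <cstdint>
#include <thread>
#include <vector>

// Hypothetical frame container; a real engine would use its own buffers.
struct Frame {
    int index = 0;
    std::vector<std::uint8_t> pixels;  // packed RGBA, for illustration only
};

// Hypothetical stand-in for the real per-frame work:
// decode the sources, composite the tracks, apply FX.
Frame render_frame(int frame_index, int width, int height) {
    Frame f;
    f.index = frame_index;
    f.pixels.assign(static_cast<std::size_t>(width) * height * 4, 0);
    // ... real compositing would happen here ...
    return f;
}

// Render a small batch of consecutive frames, one per worker thread,
// so every core is busy while the previous batch is being played back.
std::vector<Frame> render_batch(int first_frame, int num_workers,
                                int width, int height) {
    std::vector<Frame> batch(num_workers);
    std::vector<std::thread> workers;
    workers.reserve(num_workers);
    for (int i = 0; i < num_workers; ++i) {
        workers.emplace_back([&batch, first_frame, i, width, height] {
            // each worker writes a distinct element, so no locking is needed
            batch[i] = render_frame(first_frame + i, width, height);
        });
    }
    for (auto& w : workers) w.join();
    return batch;  // hand these to the playback queue in display order
}

int main() {
    // e.g. a quad-core machine: 4 frames in flight per batch
    std::vector<Frame> frames = render_batch(0, 4, 1920, 1080);
    return frames.size() == 4 ? 0 : 1;
}
```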

Finally, I would bet there are many routines that could be coded to the various SSEx instructions in the Intel chips, not to mention the possibility of hand-coding the most-used and CPU-intensive routines in assembly. Perhaps the color correction plug-in to start; maybe add a few with each update of the program (the sketch below shows the kind of thing I mean). I have done some assembly programming (long ago) and I know how time-consuming and tedious it is. But I also know there are enormous performance gains, especially when you're talking about doing the same basic operations over and over again.
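
As a rough illustration of that kind of SSE hand-optimization (my own toy example with made-up names like apply_gain_sse, not code from Vegas or any of its plug-ins), a simple gain pass over float RGBA pixels might be vectorized like this:

```cpp
#include <cstddef>
#include <vector>
#include <xmmintrin.h>  // SSE intrinsics (__m128, _mm_mul_ps, ...)

// Plain scalar version: multiply every channel of every pixel by a gain.
void apply_gain_scalar(float* pixels, std::size_t count, float gain) {
    for (std::size_t i = 0; i < count; ++i)
        pixels[i] *= gain;
}

// SSE version: process four floats (one RGBA pixel) per instruction.
void apply_gain_sse(float* pixels, std::size_t count, float gain) {
    const __m128 g = _mm_set1_ps(gain);
    std::size_t i = 0;
    for (; i + 4 <= count; i += 4) {
        __m128 px = _mm_loadu_ps(pixels + i);  // load 4 channels
        px = _mm_mul_ps(px, g);                // multiply by the gain
        _mm_storeu_ps(pixels + i, px);         // store them back
    }
    for (; i < count; ++i)                     // scalar tail, if any
        pixels[i] *= gain;
}

int main() {
    // one 1080p RGBA float frame, mid-grey
    std::vector<float> frame(1920 * 1080 * 4, 0.5f);
    apply_gain_sse(frame.data(), frame.size(), 1.2f);
    return frame[0] > 0.5f ? 0 : 1;  // trivial check that the gain was applied
}
```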

I'd like to see some of these measures taken before any serious effort is given to the GPU. Just another opinion to put in the grinder....

So I think it's actually the Vegas engineers who have failed Moore's Law.

- Mark
jwcarney wrote on 3/20/2011, 4:22 PM
Adding support for OpenFX was a significant rewrite. Vegas has not failed Moore's law, and when the Sandy Bridge workstation CPUs roll out this summer, things will get even more interesting.

Then there is the upcoming release of the new FCP. Waiting to see what that will bring. It will probably be more Vegas-like, hehehe. (Just joking.)

Anyway, every time I look at Adobe and Avid, there just isn't enough to make me want to switch.

I'm going off the wall on this one.
Add better integration with VFX programs like Nuke and Fusion.
Enhanced support for OpenEXR, which is finally becoming popular as a viable alternative to 16-bit TIFF (like support for multiple layers).
Other than that, no complaints with Vegas and what I'm using it for (independent movies).