Will Sony Vegas EVER Use The Power Of GPUs?

Simanticus wrote on 8/5/2009, 10:52 AM
Hello Sony Vegas Team,

When will you make use of the full power of today's computers by supporting SLI multi-GPU systems, or even just using the GPU on single-card systems, as other NLEs are starting to do? It would certainly increase the frame rate of the preview, which is the most important part of editing.

9.0a does fix some of the choppiness of 9.0, which made editing simply impossible on timelines with multiple effects, but tapping the GPU or an SLI configuration would make it smooth as butter.

Doesn't it seem a bit bizarre that an NLE would ignore a technology that can do nothing but enhance the performance of its software?

Will this feature ever be addressed?

Comments

gwailo wrote on 8/5/2009, 11:02 AM
If Sony can't even make generated text stay on the timeline without screwing up, do you really think they can handle reprogramming Vegas to take advantage of GPU acceleration?
bStro wrote on 8/5/2009, 11:33 AM
Hello Sony Vegas Team,

This is a peer support forum. If you want to address a complaint or request to Sony, write to Sony. There's nothing any of us can do about it.

Rob
Simanticus wrote on 8/5/2009, 11:42 AM
If Sony Vegas developers don't even read their own forums, abandon all hope, ye who are Sony Vegas customers... They DO read this forum; how else would they know what their customers are saying?

Quote from Sony:

"We monitor these boards periodically."

I just finished searching the web and found plenty of people asking the very same question, so what better place to open a dialogue about this missing and very important feature than here, rather than scattered all over the web.
Simanticus wrote on 8/5/2009, 11:50 AM
Has anyone in this group found a way to speed up the preview? If so, please let me know. I do not know whether SLI is of any benefit to an NLE, but making use of the GPU for rendering sure would be.

UPDATE:

I did a search and found this product, the Divide Frame GPU decoder for Sony Vegas. I just installed it and will check it out.

Here's the link: http://www.divideframe.com/?p=gpudecoder
drmathprog wrote on 8/5/2009, 12:06 PM
"We monitor these boards periodically."
"periodically" covers a lot of territory! ;-)
wilvan wrote on 8/5/2009, 12:36 PM
"Has anyone in this group found a way to speed up the preview, if so, please let me know. I do not know if SLI is of any benefit to NLE, but making use of the GPU for rendering sure would be" .

Oh yes, for sure. Ever worked with even a few seconds of HD material in After Effects? Gosh, that crawls, even on my Dell T7500 workstation with dual Nehalem Xeons, 24 GB of RAM and dual FX3800 cards.
When you then return to Vegas, you'd swear the speed of the machine had tripled or more.
(I am teaching my youngest daughter Vegas 7 on an old Medion running XP with a strict minimum of services, and even there it runs HD smoothly enough, as long as there aren't too many FX.)

PerroneFord wrote on 8/5/2009, 1:25 PM
That's awesome... if you're using AVCHD. I tend to work with editing or mastering formats on the timeline, not highly compressed capture formats.
warriorking wrote on 8/5/2009, 1:28 PM
What type of footage are you importing into Vegas? I found that when I purchased NeoScene and converted my AVCHD footage, it played flawlessly on the timeline while editing, even when doing multicamera editing with three different video files in Vegas 9.0a, whereas before the playback was very choppy and very hard to edit. NeoScene has been a lifesaver for me.
blink3times wrote on 8/5/2009, 2:20 PM
"If sony can't even make generated text stay on the timeline without screwing up, do you really think they can handle re-programming Vegas to take advantage of GPU acceleration?"

First, GPU acceleration is not all it's cracked up to be. I spent a fair bit of money upgrading for the GPU acceleration abilities of PP and was surprised to see that all the extra money spent gets me just a BIT further... but once I get into the more complex effects/transitions, the video preview goes right back into the gutter. No doubt it is better than Vegas's present VFW system... but this GPU stuff is not THE answer.

Between Vegas's dynamic RAM system and PP's GPU acceleration, Vegas's system seems to be better in the long run. No matter what you throw at it (as long as you have enough memory), you get full frame rate playback every time. With PP, once you max out the GPU abilities (which isn't hard), it's back into jerk-ville again with no way out.

Without a doubt SCS DOES need to come into this century with preview playback... and I think they know that. Their efforts to bolster the VFW playback in V9 are a pretty clear indication that they recognize it's just not up to par, and I wouldn't be surprised to hear they have a few engineers in some back room figuring out the next move as we speak.

But I think GPU acceleration alone is quite overrated; what we really need is maybe a combination of the GPU and Vegas's dynamic RAM system, greatly expanded and more detailed.
Simanticus wrote on 8/5/2009, 2:21 PM
I'll check out Neoscene.
enespacio wrote on 8/5/2009, 4:37 PM
I am using Divide Frame's gpudecoder 1.05. The program does have several issues. One is that a clip will go offline if you reverse it or change its velocity; they are currently working on that problem. It will also occasionally drop frames and/or turn your clip "red". I've also noticed problems when using Auto Ripple. My workaround is to save and close the project. Upon reopening there are no dropped frames or "red" clips, and I immediately render at that point.
In spite of these issues, it does improve playback during editing. I film everything on my HFS-100 maxed out at 24 Mbps. There is also an added benefit that I believe was unintended by the programmer: the gpudecoder "import media" settings bring my .mts files in with more clarity and crispness. I have tried to match that quality every way possible in Vegas without gpudecoder installed, from changing the project settings to adjusting color, balance, etc., but nothing seems to match the initial import quality, which in turn improves the final output.
The link below is a Camtasia video I made showing the difference in the "import media" settings. The differences aren't as noticeable on the video as they are live. If there is a codec out there that would match the gpudecoder "import media" settings, I would love to know about it.



BTW, I feel like Vegas is an incredible program. Having done a lot of beta testing for other software, I know it's not realistic to expect any software to be flawless unless it is NEVER enhanced. I've never worked with any software that didn't require some type of "workaround" at some point.
cliff_622 wrote on 8/5/2009, 5:26 PM
I could easily be wrong on this.

However, I have always been under the impression that most GPUs are optimized for rendering "VECTOR" graphics. How well do today's GPUs handle full RGB "RASTER" data?

We all know that video NLEs are not handing off vector points, polygon geometry and shading data to a video card. Vector-rendering GPUs have specific hardware chips to handle that task. Raster video math is completely different.

I don't know... but which cards today will accept, process, and send RGB raster data back to the program (NLE) that sent it? (And send it back exactly as the NLE is expecting to see it returned.)

I'm not saying it doesn't or can't exist... I'm just saying it's not as easy as some might think to program and process.

CT
farss wrote on 8/5/2009, 5:37 PM
You're certainly correct that GPUs mostly excel at rasterising vector graphics. They can, however, be used to process RGB raster data, and they are used for exactly this. Getting the data back from them is also possible, but the bus is not designed for returning it very quickly, which is a limitation.

That said, I played around with Scratch a few years ago at NAB. Truly awesome for the money, and it can only do what it does so cheaply thanks to the nVidia card and its GPU. However, the realtime 4K output is only available on the HD-SDI port, and that's just fine for its purpose: previewing during grading. The final render is still done through the CPU.

The thing is, GPUs are very useful, no denying that. The problem is that people need to understand the limitations of using them. Certainly Adobe was quite upfront about this during their CS4 roadshow: it could be the greatest thing since sliced bread or a quite useless feature, depending on what you're doing and where you work in the market.

Bob.
cliff_622 wrote on 8/5/2009, 6:18 PM
How does this process work in the simplest terms?

1.) NLE reads media from the drive.
2.) NLE decodes the media (whatever it is, via the CPU) and sends raw 0s and 1s to the GPU with instructions on what to do with it (i.e., video processing or audio FX).
3.) GPU knows what to process and how, and executes the instructions.
4.) GPU sends the newly processed data back to the NLE for preview or for file rendering.

I dunno. Is the GPU processing and execution workflow done in the exact same manner as the current CPU data processing design?

Does a GPU require a hardware codec chip (e.g., AVCHD) onboard to really accelerate this stuff? Or is the new data handed back to the NLE uncompressed?

I have no idea. Anybody know?

CT
rmack350 wrote on 8/6/2009, 8:06 AM
I think if you took a break and slept on it you'd find that "knowing" how it works at that level doesn't help you much.

In general, the NLE has a processing chain where the media stream goes from codec to filter to filter to filter to codec (it's a rough example). Any of these points could send the media stream to the GPU and receive it back to send it to the next point on the chain.

The media stream *should* be in an uncompressed state between the point where the codec decompresses it and the point where the codec recompresses it. In other words, it's uncompressed as it hits each of the filters - so the filters should be returning uncompressed streams.
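
As a rough illustration of that chain (invented names, plain C++, nothing to do with Vegas's actual internals), each stage just takes an uncompressed frame and hands it on, so a GPU-backed filter could slot in without the neighbouring stages knowing:

    // Hypothetical sketch of a codec -> filter -> filter -> codec chain.
    // The frame is uncompressed between the codec stages; any one filter
    // could upload it to a GPU, process it, and download it again.
    #include <cstdint>
    #include <functional>
    #include <vector>

    struct Frame {
        int width;
        int height;
        std::vector<uint8_t> rgba;   // uncompressed RGBA pixels
    };

    using Filter = std::function<void(Frame&)>;

    // Example CPU-only filter: darken the frame slightly.
    void darken(Frame& f) {
        for (uint8_t& c : f.rgba) c = static_cast<uint8_t>((c * 9) / 10);
    }

    // decode() and encode() stand in for the codec at each end of the chain.
    Frame decode() { return Frame{1920, 1080, std::vector<uint8_t>(1920 * 1080 * 4, 128)}; }
    void encode(const Frame&) { /* recompress and write out */ }

    int main() {
        Frame f = decode();
        std::vector<Filter> chain = {darken /* a GPU-backed filter could go here too */};
        for (auto& apply : chain) apply(f);
        encode(f);
    }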

A graphics card can process vector data and newer GPUs can deal with generalized calculations. The vector data processing is a traditional function of a graphics card and it can be useful to an NLE if you are manipulating a frame in 3D space. The card can map the frame to a surface and manipulate it in that way.

Newer cards can also do general purpose processing such as financial calculations, scientific calculations, audio and video processing, etc. In this case a stream of data is fed into the GPU, processed, and fed back out to the CPU. The data has to first be fed to a program or library that is capable of addressing the GPU. In the case of Vegas you'd want a filter or codec that was specifically designed to do this.
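
For anyone curious what that in-and-back-out pattern looks like in code, here is a minimal standalone CUDA sketch of a hypothetical gain filter. The names are invented for illustration; this is not any real Vegas plug-in API, just the upload / kernel / download round trip:

    // Minimal CUDA example: copy an uncompressed RGBA frame to the GPU,
    // run a per-pixel kernel on it, and copy the result back to the CPU.
    #include <cuda_runtime.h>
    #include <cstdint>
    #include <vector>

    __global__ void scale_pixels(uint8_t* data, size_t n, float gain) {
        size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
        if (i < n) {
            float v = data[i] * gain;
            data[i] = static_cast<uint8_t>(v > 255.0f ? 255.0f : v);
        }
    }

    // One "filter" stage: upload, process, download.
    void gpu_gain_filter(std::vector<uint8_t>& rgba, float gain) {
        uint8_t* d = nullptr;
        size_t bytes = rgba.size();
        cudaMalloc((void**)&d, bytes);
        cudaMemcpy(d, rgba.data(), bytes, cudaMemcpyHostToDevice);   // CPU -> GPU

        int threads = 256;
        int blocks = (int)((bytes + threads - 1) / threads);
        scale_pixels<<<blocks, threads>>>(d, bytes, gain);

        cudaMemcpy(rgba.data(), d, bytes, cudaMemcpyDeviceToHost);   // GPU -> CPU
        cudaFree(d);
    }

    int main() {
        std::vector<uint8_t> frame(1920 * 1080 * 4, 100);   // one 1080p RGBA frame
        gpu_gain_filter(frame, 1.2f);                        // brighten by 20%
        return 0;
    }

Error checking is left out to keep the shape of it visible; a real filter would also reuse the device buffer rather than allocate it for every frame.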

One of the snags to this is that NVIDIA and ATI GPUs would each require their own custom codecs and filters, which makes it unappealing for an NLE vendor to try to write filters that can use the GPU. There are lots of other snags - for example, a GPU crash can leave you without a display until you reboot.

As far as bandwidth goes, PCI Express provides equal throughput to and from the graphics card so as long as the card can support it there shouldn't be a bottleneck.

The older AGP bus had poor bandwidth coming back from the GPU, so AGP card designs didn't need to support much return bandwidth. Initial PCIe graphics cards were built by leveraging AGP designs, so they also suffered from that restriction. I don't know what the state of this is today, but the potential definitely exists for a graphics card to have high bandwidth in both directions.
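
If you want to check the two directions on your own card rather than take my word for it, a rough timing sketch like this (assuming the CUDA toolkit is installed; pinned memory and repeated runs would give steadier numbers) prints the host-to-device and device-to-host copy rates:

    // Time a 256 MB copy in each direction across the PCIe bus.
    #include <cuda_runtime.h>
    #include <cstdio>
    #include <vector>

    static float timed_copy(void* dst, const void* src, size_t bytes, cudaMemcpyKind kind) {
        cudaEvent_t start, stop;
        cudaEventCreate(&start);
        cudaEventCreate(&stop);
        cudaEventRecord(start);
        cudaMemcpy(dst, src, bytes, kind);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);
        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        cudaEventDestroy(start);
        cudaEventDestroy(stop);
        return ms;
    }

    int main() {
        const size_t bytes = 256 * 1024 * 1024;          // 256 MB test buffer
        std::vector<unsigned char> host(bytes, 7);
        void* dev = nullptr;
        cudaMalloc(&dev, bytes);

        float up   = timed_copy(dev, host.data(), bytes, cudaMemcpyHostToDevice);
        float down = timed_copy(host.data(), dev, bytes, cudaMemcpyDeviceToHost);
        printf("to GPU:   %.0f MB/s\n", bytes / (up   / 1000.0f) / 1.0e6f);
        printf("from GPU: %.0f MB/s\n", bytes / (down / 1000.0f) / 1.0e6f);

        cudaFree(dev);
        return 0;
    }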

Rob Mack
CClub wrote on 8/6/2009, 6:43 PM
It'll be interesting to see if Windows 7 with DirectX 11 could help with an NLE: http://tech.yahoo.com/news/pcworld/20090806/tc_pcworld/microsoftboostswindows7graphicswithhardware_1

In the linked article, it states, "Windows 7 could speed up conversion of video for playback on portable devices. Users will be able to drag and drop video from PCs to portable devices, with DirectX 11 enabling video conversion on the fly."

And later, "The ability to break up tasks is an evolutionary step for Microsoft in developing operating systems, said Dan Olds, principal analyst at Gabriel Consulting Group. As users demand heavier graphics from PCs, it is in Microsoft's best interests to offer an operating system that breaks up tasks across multiple graphics cores and CPUs, he said."
rmack350 wrote on 8/6/2009, 10:59 PM
If MS builds GPU coprocessing into the OS Vegas will probably adopt it. That seems like an easy prediction :-)

Rob Mack
Coursedesign wrote on 8/7/2009, 9:20 AM
Yes, that seems a very safe bet. So in Vegas Pro 10.

Hopefully Microsoft looked at how Apple did the FxPlug and Core APIs in OS X; those seem to have worked very well for years at providing GPU support to application developers who don't want the hassle of writing it themselves.
Himanshu wrote on 8/7/2009, 7:02 PM
The limitation with DirectX 11 is that it looks like a Windows 7 plus new hardware combination. That means even Windows Vista may not be able to make use of it, just as DirectX 10 features aren't available on Windows XP today. Can SCS risk losing WinXP and WinVista customers? Do they feel like supporting separate binaries and, more importantly, pixel-for-pixel incompatibility between those versions? I doubt it.

A saner alternative would be to jump on board the OpenCL bandwagon like several other companies have.

The most expensive alternative is for SCS to write a wrapper on top of third-party GPU acceleration toolkits so that they can future-proof themselves, but that would require a good amount of development investment up front. The benefit would be keeping compatibility in the future and having full control of the results/computations.
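
Just to illustrate what I mean by a wrapper, here's a bare-bones C++ sketch of the idea: the application codes against a small interface of its own, and a CUDA or OpenCL backend can be slotted in behind it later. Every name here is made up; this isn't anything SCS has announced:

    // Hypothetical abstraction layer: filters call this interface, never a
    // vendor API directly, so the backend can change without touching them.
    #include <cstdint>
    #include <memory>
    #include <vector>

    class GpuBackend {
    public:
        virtual ~GpuBackend() {}
        // Apply a gain to an uncompressed RGBA frame in place.
        virtual void applyGain(std::vector<uint8_t>& rgba, float gain) = 0;
    };

    // Fallback backend: the same math on the CPU, so results stay
    // pixel-identical when no usable GPU is present.
    class CpuBackend : public GpuBackend {
    public:
        void applyGain(std::vector<uint8_t>& rgba, float gain) override {
            for (uint8_t& c : rgba) {
                float v = c * gain;
                c = static_cast<uint8_t>(v > 255.0f ? 255.0f : v);
            }
        }
    };

    // A CudaBackend or OpenClBackend would implement the same interface;
    // the host application only ever holds a GpuBackend pointer.
    std::unique_ptr<GpuBackend> pickBackend() {
        return std::unique_ptr<GpuBackend>(new CpuBackend());  // runtime choice in practice
    }

    int main() {
        std::vector<uint8_t> frame(1920 * 1080 * 4, 64);
        pickBackend()->applyGain(frame, 1.1f);
        return 0;
    }

The pixel-for-pixel concern above is exactly why the CPU fallback matters: every backend has to produce the same result, or renders become machine-dependent.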

PS: This topic about GPU acceleration should be in an FAQ by now...this comes up every few weeks now it seems!
K Riley wrote on 8/8/2009, 2:55 PM
Great way of spelling things out here. I'm happy I came across these discussions. I have 8.0, 8.1, 9.0, and 9.0a, and I use them all. The text thing surprised me, though, and got me upset briefly after redoing the editing about five times over a day spent on a one-hour job. ARRGGHH!!! Oh well. I found out the text was on some strange tracks I wasn't familiar with; I deleted them and the problem went away. I don't know if that was really it, but I think I'm switching back to 9.0 until I hear some good news about things.
Seth wrote on 8/13/2009, 12:59 PM
Hey, here's something that may have flown under the radar of many of our user base:

http://www.divideframe.com/
Seth wrote on 8/21/2009, 1:24 PM
Again, it's a third party solution, but that's why it will be developed better, faster:

http://www.divideframe.com/

Comments? Anyone tried it?
Himanshu wrote on 8/21/2009, 4:14 PM
A search of the forum would have yielded this thread about divideframe.