Comments

richard-courtney wrote on 6/17/2008, 7:58 PM
In video editing I have not seen it.

In programs for 3D design such as TrueSpace and Blender, yes.
Nvidia cards seem to get recommended for Blender.
blink3times wrote on 6/17/2008, 8:14 PM
Vegas is one of the few programs I know of that uses the CPU and memory alone and does not involve the GPU in some fashion. PP uses the GPU for 3D rendering, while Pinnacle Studio and Avid Liquid use the GPU for both 3D rendering and what they call background rendering, similar to our RAM render except of course the GPU is used instead of the CPU (leaving the CPU free to continue on with the editing).

At one time it was a pride thing to be able to say Vegas is software driven and therefore can be used on any computer, but now I wonder if it isn't rather like shooting one's self in the foot. Today's machines are fast and powerful and the NLE's we use (IMO) should be taking full advantage of the hardware that's out there and available. The graphics cards on the market today are fast, efficient.... and cheap. There is no reason anymore not to take advantage of this.
Himanshu wrote on 6/17/2008, 8:24 PM
One issue with using the GPU may be getting exactly the same results from each graphics card as is possible with the software version of the renderer. Surely one would not want different output depending on whether you had an NVIDIA or an ATI card in your PC! What if you use network rendering and the 10 different PCs in your render farm had different types of hardware? Can you imagine how differences in the rendered frames might show up in your final output?

Perhaps the problem can be abstracted away by APIs such as DirectX and OpenGL, if they provide the functionality that Vegas needs to render. If the application is cross-platform, then you can rule out DirectX as well. SCS would have to make a significant investment in writing their own library that is optimized for different GPUs while producing identical results. This is my attempt at rationalizing why we're not seeing GPUs being used widely at the moment.
richard-courtney wrote on 6/17/2008, 8:33 PM
Ironically, I don't think 3D programs such as Blender use the GPU for rendering,
just for 3D movement; the CPU is used for rendering.

Best money is on multiple cores.
Lyris wrote on 6/17/2008, 9:15 PM
I'm on a Quad Core system and previewing is still slower than in the likes of Adobe Premiere. Hopefully Sony does the necessary work to bring this part of Vegas up to speed.
farss wrote on 6/17/2008, 10:01 PM
High end color grading apps such as Scratch and Speedgrade both make extensive use of the NVidia cards with the optional HD SDI boards.
The power of the GPU today is so great that NVidia and Intel are in quite a spat that'll probably end up in the courts. Intel for over a year have been making much of this, seemingly begging programmers to stop using the GPU.
So yes, it does kind of beg the question of why SCS has painted themselves into quite a corner over this. To change now would be a pretty big loss of face but maybe the new blood will force a change.
As to getting different results from different cards, I kind of doubt it. While the differing GPUs have different features, the results of the calcs are the same, although NVidia are the king in the video realm.

I could also point out that our SI-2K camera uses the GPU to provide RT preview with a 3D LUT applied, I doubt there's a CPU around that could handle the debayer and LUT calcs in realtime.
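For anyone curious what the LUT half of that involves, here is a minimal pure-Python sketch of a 3D LUT lookup. Everything here is illustrative, not anything from the SI-2K: the lattice size of 17 is just a common LUT resolution, the LUT is the identity, and it uses nearest-neighbour lookup where real grading systems interpolate trilinearly, per pixel, on the GPU.

```python
def make_identity_lut(n):
    # lut[r][g][b] -> an (r, g, b) output triple in 0..1
    # the identity LUT maps every colour to itself
    step = 1.0 / (n - 1)
    return [[[(r * step, g * step, b * step)
              for b in range(n)] for g in range(n)] for r in range(n)]

def apply_lut(lut, rgb):
    # nearest-neighbour lookup; a real grader would interpolate
    # trilinearly between the 8 surrounding lattice points
    n = len(lut)
    idx = [max(0, min(n - 1, round(c * (n - 1)))) for c in rgb]
    return lut[idx[0]][idx[1]][idx[2]]

lut = make_identity_lut(17)  # 17x17x17 is a common LUT resolution
print(apply_lut(lut, (1.0, 0.0, 0.5)))  # identity: (1.0, 0.0, 0.5)
```

Doing this (plus debayering) for every pixel of every frame in real time is exactly the kind of embarrassingly parallel work GPUs excel at.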

There was a lot of talk about 'something' happening between AMD and SCS 18 months ago. Given AMD's performance of late, and even back then, you gotta wonder about the wisdom of that.

Bob.
Himanshu wrote on 6/18/2008, 7:07 PM
Bob,

SCS haven't painted themselves into a corner by not using GPU power... for one, there probably isn't a single high-level library that would satisfy Vegas' requirements, and second, if they wrote one, they would have to make sure it worked across all GPUs from different vendors. That's a significant investment. Given that for the last so many years CPU speeds have been increasing and multi-core has become reality, SCS is reaping the benefits of that. Harnessing GPU power will come... it will need an underlying library such as NVIDIA's CUDA, or ATI/AMD's SSEPlus (for CPUs), or some other that can work across multiple CPUs/GPUs and guarantee the same results.

As far as getting the same results... yes, 2+2 will (should?) always return 4 from each GPU. That's not the point. My point was, if NVIDIA had a routine implemented in hardware or their API for some rendering algorithm, and ATI/AMD had another, and they both were called "ray traced rendering" - even claiming to implement the same algorithm published by <pick your favorite CG pioneer here> - who's to say you're going to get the same look in the rendered output? It depends on the implementation! To avoid this issue, SCS would have to write all the high-level routines themselves and ensure they work on all GPUs, no matter how different the architecture.
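A toy illustration of that point, assuming nothing about either vendor's actual code: here are two plausible ways of handling an out-of-gamut value after a brightness boost. Both are defensible implementations of "handle out-of-range values", yet they produce visibly different pixels.

```python
def clamp_range(v):
    # implementation A: clip out-of-range values to [0, 1]
    return max(0.0, min(1.0, v))

def wrap_range(v):
    # implementation B: wrap out-of-range values around
    return v % 1.0

boosted = 0.9 + 0.3          # a bright pixel pushed out of gamut
print(clamp_range(boosted))  # 1.0 - stays bright
print(wrap_range(boosted))   # ~0.2 - becomes a dark pixel instead
```

Same input, same nominal operation, very different frame - which is why identical output across vendors can't simply be assumed.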

So the question then becomes, how much is this going to cost SCS, and how much are we willing to pay for it? Are we willing to trade new functionality for performance?
farss wrote on 6/18/2008, 8:33 PM
Odd, then, that NVidia supply the drivers for their FX5600 SDI card for AE. How is it that I can plug all manner of wondrous hardware AND software into Adobe's products and it just kind of works?
The answer might be that Adobe have an interface in their software that's sort of up to date. They publish the specs and the hardware / software vendors do the work.
Yes, there are lots of issues with rasterising vectors and AA not coming out the same on gaming video cards, but why is that an issue with video? There are video cards built for gamers and cards built for graphics / video work. Gaming cards have other drawbacks as well; reading data back from them is slow.

I've spoken to several hardware vendors, as have others here, about getting support for Vegas. It's not that they don't want to, but the VfW (Video for Windows) interface is so out of date. One I spoke to was quite direct and just told me it was time to move on to an NLE from a vendor that had a clue.

Now I'll be the first to admit that not many here are ever going to have the big dollars for an FX5600 SDI card or much of the other desirable hardware out there, but so what. Buy a poverty-pack PC / Mac and run an NLE from Adobe/Avid/Apple, and if you get the work and the dollars you can keep upgrading and upgrading. Throw enough money at it and you can get 4K RT playback. Where can we go with Vegas, buy the fastest PC money can buy and pray?

Bob.
GlennChan wrote on 6/18/2008, 10:16 PM
As far as getting the same results...yes, 2+2 will (should?) always return 4 from each GPU.
I don't believe they do that. There are differences in the floating-point math. 2+2 will be 4, but other math operations/combinations will yield different results... e.g. ATI and Nvidia handle out-of-range numbers differently.
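That point is easy to demonstrate even on a single CPU: floating-point addition isn't associative, so any hardware or compiler that evaluates operations in a different order can legitimately produce different bits.

```python
a, b, c = 1e20, -1e20, 1.0

left = (a + b) + c   # cancel the big terms first, then add 1.0
right = a + (b + c)  # 1.0 is swallowed by the huge magnitude first

print(left, right)   # 1.0 0.0 - same math, different answers
```

Two GPUs that merely group their additions differently (say, in a parallel reduction) would disagree in exactly this way, before any deliberate implementation differences even enter the picture.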

2- That being said, there are systems out there demonstrating very good performance with GPUs (e.g. Mistika, Flame). These are six-figure systems... Flame previously used expensive SGI workstations. So when you see high-end systems suddenly adopting GPUs, I think there's something there.

That's not to say that GPUs don't raise potential issues - e.g. depending on how you code for the card, only specific cards will work - but I think the benefits are worth exploring.

Where can we go with Vegas, buy the fastest PC money can buy and pray?
Well, what if background rendering was implemented? (Or even a weak version of it, where a project will auto-render when it's idle. So you can work with multiple .vegs and the inactive ones will render away.)
That would improve performance without needing specific hardware or any hardware upgrades.
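That weak version could be sketched as a simple idle watcher. Everything here is hypothetical - the `render_next_chunk` callback and the idle threshold are made up for illustration - but the idea is just: record the time of the last user action, and let a background thread render whenever the app has been quiet long enough.

```python
import threading
import time

IDLE_THRESHOLD = 0.1  # seconds of inactivity before rendering (illustrative)

class IdleRenderer:
    def __init__(self, render_next_chunk):
        self.render_next_chunk = render_next_chunk  # hypothetical render step
        self.last_activity = time.monotonic()
        self.lock = threading.Lock()
        self.stop = threading.Event()

    def user_did_something(self):
        # the editor would call this from every edit/keystroke handler,
        # pushing the idle deadline back
        with self.lock:
            self.last_activity = time.monotonic()

    def _idle_for(self):
        with self.lock:
            return time.monotonic() - self.last_activity

    def run(self):
        # background thread: render chunks only while the user is idle
        while not self.stop.is_set():
            if self._idle_for() >= IDLE_THRESHOLD:
                self.render_next_chunk()
            time.sleep(0.02)
```

A real implementation would also have to throw away rendered chunks the moment an edit invalidates them, which is where most of the complexity lives.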
DrLumen wrote on 6/18/2008, 10:28 PM
I too would like to see Vegas be able to take advantage of the advanced hardware that is out there. However, I certainly wouldn't want Vegas to lose stability (a la Pinnacle) for that extra render ability. By the same token, I really believe Sony should be working on alternative ports to get away from their dependence on Windows. Since Adobe started with Apple, they already have more experience across platforms - the technology gap will only get larger if Sony doesn't get off their duff.

Given the current state of CPU's and taking into account Moore's law, the extra power of the GPU may not be needed. Even with the 2K or 4K cameras/media the CPU(s) should, eventually, overtake any GPU advantage in regards to simple video demands. Simple as opposed to photo-realistic 3D rendering of huge, highly detailed environments.

intel i-4790k / Asus Z97 Pro / 32GB Crucial RAM / Nvidia GTX 560Ti / 500GB Samsung SSD / 256 GB Samsung SSD / 2-WDC 4TB Black HDD's / 2-WDC 1TB HDD's / 2-HP 23" Monitors / Various MIDI gear, controllers and audio interfaces

GlennChan wrote on 6/18/2008, 10:44 PM
Since Adobe started with Apple, they already have more experience across platforms
Going multi-platform also hinders where you can take your software. Adobe Photoshop, for example, is a patchwork of many different frameworks and APIs, some new and some old. A lot of Photoshop code uses Carbon... unfortunately for Adobe, not all of Carbon will support 64-bit. So if Photoshop were to become a 64-bit app (very helpful for people who want to use >4GB RAM), Adobe would have to port everything away from Carbon. They have so much legacy code around that it would take them a fair bit of work.

Adobe also had to spend a lot of development effort dealing with the numerous changes Apple has made... it took Adobe some time to catch up after Apple switched to Intel and after they released their new OSes (which break Photoshop). So I don't think porting would really help, as you'd just become dependent on another company on top of Microsoft. Also, I'm not sure what need it would solve (I have both a Mac and a PC and am fine with Vegas only running on Windows).

Given the current state of CPU's and taking into account Moore's law, the extra power of the GPU may not be needed.
I doubt it, as I think your expectations will rise in the future.

In the past, we've moved from editing highly compressed SD (e.g. early Avid) to uncompressed SD, then uncompressed HD, and now 4K. In the future, I think we will be looking at 1080p60, 4k @ 24fps, stereo formats (double the bandwidth), and maybe even ultra HDTV (practically 8K).
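Those bandwidth jumps are easy to put numbers on. A rough uncompressed, 8-bit, 3-bytes-per-pixel calculation (real pipelines use 10-bit and chroma subsampling, so treat these as ballpark figures only):

```python
def rate_mb_per_s(width, height, fps, bytes_per_pixel=3):
    # uncompressed video data rate in megabytes per second (1 MB = 10^6 bytes)
    return width * height * bytes_per_pixel * fps / 1e6

print(rate_mb_per_s(720, 576, 25))        # SD PAL: ~31 MB/s
print(rate_mb_per_s(1920, 1080, 60))      # 1080p60: ~373 MB/s
print(rate_mb_per_s(4096, 2160, 24))      # 4K @ 24fps: ~637 MB/s
print(rate_mb_per_s(7680, 4320, 24) * 2)  # stereo 8K: ~4800 MB/s
```

Each step is roughly an order of magnitude over SD, which is why hardware that looks overpowered today keeps getting consumed by the next format.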
farss wrote on 6/18/2008, 11:24 PM
Background rendering would be great.
Background rendering subordinate layers / tracks would be very handy. Trying to use Vegas to composite even a few tracks of HD becomes very frustrating.
Aside from the compositing issue (and that could easily just be left to purpose-written apps like AE and Fusion), there are two key areas where it seems to me we need RT full-raster playback.

Cuts-only editing and color correction. I don't think the GPU would help much with the former, but with the latter, judging by Scratch and Speedgrade, it certainly can, and they're a bit less than six figures, not much but a bit :)

Bob.
DrLumen wrote on 6/18/2008, 11:27 PM
In the past, we've moved from editing highly compressed SD (e.g. early Avid) to uncompressed SD, then uncompressed HD, and now 4K. In the future, I think we will be looking at 1080p60, 4k @ 24fps, stereo formats (double the bandwidth), and maybe even ultra HDTV (practically 8K).

True, it's like the adage that you will always spend what you make. That is true to a point, but look at the tech now as opposed to some years ago. It used to be that a fairly simple effect on a 256-color GIF, using NeoPaint or Corel, took minutes to render. The same effect is almost instantaneous now, and the graphic resolutions and color depths have grown as well. Picture processing is almost a moot point now that we have moved on to NLE's. In the short term there is a performance deficit, but video resolutions and color depths will have to level off. It really does no good to keep increasing resolution if there is no way, or need, to display it.

Even if the hardware doesn't always keep up, other techniques and technologies will come into being that will reduce the need of brute force hardware even more.

Going multi-platform also hinders where you can take your software.
I would have to disagree with this, as multi-platform capability directly affects where you can take the software - if it works on more systems, you can take it more places. It would also increase market share, i.e. those people that only use Macs would be able to buy Vegas. If by "where you can take your software" you are referring to functionality, then in some instances it should expand the functionality in some places and restrict it in others. As it is right now, Sony is restricted to Windows, using an API that is out of date even by Windows standards. What would happen is there would be software forks, like currently in the Linux OS's, but those do not necessarily limit the overall use or functionality. I know it would be harder, as Sony would be doubling or tripling their developers, software base and potential problems. But I believe Sony should keep in mind the difference between companies that only make windshield wipers (and revenue) for AMC Pacers and companies that make wipers for all cars and brands.
