GPU-based video transcoding

jabloomf1230 wrote on 8/18/2008, 11:22 AM
http://www.nvidia.co.uk/content/forcewithin/uk/download.asp

Okay, so it's not all that practical at the moment. But it does demonstrate that your GPU can be used to encode video. There's a lot of computing horsepower in most of the new nVidia and ATI GPUs, so maybe in a short time, this path for encoding might be viable.

Comments

blink3times wrote on 8/18/2008, 3:30 PM
"Okay, so it's not all that practical at the moment. But it does demonstrate that your GPU can be used to encode video."

The GPU has been used for a LOOONG time now in other programs. I have said for some time that it's time Vegas got into it. Video cards (and other related hardware) are cheap now. I can't think of any reason why one would not want to take advantage of the wealth of cheap hardware out there.
Seth wrote on 8/18/2008, 3:56 PM
Amen blink. Anyone not catching onto this trend is missing out big time.
The company responsible for creating the 'Badaboom' video transcoder has already created a pro plugin for... drum-roll, PREMIERE PRO. It will require a CUDA-enabled Nvidia card from the Quadro line though, not just a GeForce 8000-series card. I spoke to the Elemental Technologies "Strategic Account Executive" at Nvidia's booth at Siggraph last week, and he assured me that Adobe was very willing to help them market their new H.264 GPU encoder/transcoder. CS4 will have faster-than-real-time AVCHD output (on cuts-only material) from AVCHD or HDV timeline content when using this plugin. Nvidia gave Elemental Technologies about $7 million in venture capital to get their product off the ground.

If Sony still has a strategic alliance with AMD, they could release exactly the same type of product for the new ATI cards using Brook+ (instead of CUDA), and SCS would further penetrate ENG and EFP markets based on speed of editing and output. The big problem is that neither AMD nor SCS has $7 million lying around to invest in third-party partners.
Seth wrote on 8/18/2008, 4:16 PM
Ok, I just submitted a feature request.
jabloomf1230 wrote on 8/18/2008, 6:35 PM
I think that you missed my point on practicality. I was addressing only the Badaboom demo, which is crippled. There are other pieces of software that use the shaders on nVidia GPUs to encode, instead of your CPU(s). But your point about Vegas is correct.
johnmeyer wrote on 8/18/2008, 6:52 PM
If you have a multi-core or multi-CPU computer (or both), the advantage of using some of the computing power in a graphics board is going to be pretty small. Even if the GPU processor were more powerful than your computer's CPU -- which it isn't -- and even if you could use 100% of the GPU -- which you can't -- if you have, say, four cores/CPUs, you'd get at most a 20% boost.

If you want more speed, then get more CPU power.

At one time (3-4 years ago) when this GPU assist first appeared in some programs, it seemed like an OK idea. At this point, I really don't think it makes much sense and I'd be surprised to see many companies continue to support it.
GlennChan wrote on 8/18/2008, 7:53 PM
A lot of the high-end finishing systems are using GPU power of some sort.

Baselight - parallel GPU
Mistika
Inferno, Flame - formerly ran on SGI machines
Da Vinci - CORE (CUDA something Rendering Engine)
Nucoda (GPU + 8 CPUs)
Assimilate Scratch
(Apple Color)

Lustre (kinda; many colorists disable the GPU acceleration since it is not the full feature set; Lustre also works with distributed CPU acceleration onto a bunch of blades)

No GPU (AFAIK):
Quantel iQ
Pandora (FPGA I believe)

?:
Avid


Those are all the high-end systems that come to mind right now. Many of them used to use more expensive hardware solutions... so even though they have the money for those, they choose to go with a GPU card (or a variation on it).
farss wrote on 8/18/2008, 8:18 PM
The only one I'm familiar with is IRIDAS Speedgrade, which uses the GPU to get RT grading on NVidia's rather expensive hardware. The way it's done reveals the limitation, though: you only get the graded output out of the HD-SDI port.
The same goes for the SI-2K camera: you get many wonders in real time thanks to the GPU, but only out to the monitor.

To really use GPU acceleration sensibly, Vegas would need to not be a Swiss army knife combining NLE and compositing functionality in one app.

Bob.

jwcarney wrote on 8/19/2008, 12:25 PM
If they would just release a decent SDK and let others try their hand at it.
blink3times wrote on 8/19/2008, 3:26 PM
"?:

Yes.
Avid MC, Avid Liquid.... even Pinnacle Studio. In fact liquid and studio use gpu for back ground renders

Whether GPU assist helps a little or a lot is, I think, irrelevant. If you've got it and you're not using it... then it's nothing but a waste.
rmack350 wrote on 8/19/2008, 3:50 PM
One of the things that PCI Express gets you is equal speeds out to the card and back to the motherboard. I think there's a lot more potential now than there was with AGP cards.

The sort of processing that's being attempted with CUDA or whatever AMD calls their Stream API is a bit different than what's been used in the past. The idea here is to use a GPU for more generalized jobs.

The GPU is supposed to be many times faster than even a modern CPU, but only for jobs that are well suited to the GPU. Compare it to a pasta machine - it'd be much better at rolling out noodles than it would be at making ravioli. In other words, feed a stream of data in and get a processed stream out, but don't expect a lot of logic.

You can use the GPU to process video, audio, financial data - all sorts of things that need bulk processing. This is very different from the way PPro (for example) currently uses the GPU for 3D effects.
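
Just to make the "stream in, stream out" idea concrete, here's a rough sketch (hypothetical, untested, CUDA-style) of the kind of job that maps well to a GPU - every thread does the same trivial arithmetic on one pixel of a frame:

// Sketch: apply a gain to every pixel of one 8-bit luma plane.
// Each GPU thread handles exactly one pixel - no real branching logic.
#include <cuda_runtime.h>

__global__ void brighten(const unsigned char* in, unsigned char* out, int n, float gain)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        float v = in[i] * gain;
        out[i] = v > 255.0f ? 255 : (unsigned char)v;
    }
}

int main()
{
    const int n = 1440 * 1080;            // one HDV-sized luma plane
    unsigned char *d_in, *d_out;
    cudaMalloc(&d_in, n);
    cudaMalloc(&d_out, n);
    // ...cudaMemcpy a frame into d_in here...

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    brighten<<<blocks, threads>>>(d_in, d_out, n, 1.2f);
    cudaDeviceSynchronize();

    cudaFree(d_in);
    cudaFree(d_out);
    return 0;
}

Thousands of those threads are in flight at once, which is where the speed comes from - but notice there's almost no decision-making in the kernel.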

It seems to me that codecs are good candidates for using the GPU because they are usually self contained and very limited in the scope of what they do. You could conceivably get codecs that are tailored to a specific GPU and these would be a pretty good application because Vegas can use the codec and then the codec can use the GPU.

For Vegas itself to use the GPU it seems like you'd need some level of abstraction so that Vegas wouldn't really have to be aware of which GPU was present or even that it was present at all. You wouldn't really want to build Vegas for NVIDIA products but not for ATI products.
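
Something like this is what I mean by abstraction - purely made-up names, just a sketch - where the app only ever talks to a generic encoder interface and the vendor-specific backend (CUDA, Brook+, or plain CPU) is picked behind the scenes:

#include <cstdint>
#include <memory>

struct Frame { const uint8_t* data; int width; int height; int stride; };

// The app codes against this and never needs to know which GPU (if any) is present.
class VideoEncoder {
public:
    virtual ~VideoEncoder() {}
    virtual bool encodeFrame(const Frame& f) = 0;
    virtual void flush() = 0;
};

// Plain-CPU fallback so the app always works, GPU or not.
class CpuEncoder : public VideoEncoder {
public:
    bool encodeFrame(const Frame&) override { /* CPU encode path would go here */ return true; }
    void flush() override {}
};

// Factory probes the hardware and returns the best available backend.
std::unique_ptr<VideoEncoder> createEncoder()
{
    // if (cudaAvailable())  return std::unique_ptr<VideoEncoder>(new CudaEncoder());
    // if (brookAvailable()) return std::unique_ptr<VideoEncoder>(new StreamEncoder());
    return std::unique_ptr<VideoEncoder>(new CpuEncoder());
}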

Rob
Himanshu wrote on 8/19/2008, 7:28 PM
Along with the defragging thread, this topic comes up often. Here are two recent threads with many of the same participants :)

GPU-based h264 compressor.

Does better GPU help HDV preview framerate?

Recent info - Intel has recently released more information on the "Nehalem" concept/design as a production microarchitecture.
jabloomf1230 wrote on 8/20/2008, 7:16 AM
It comes up often because HD video encoding is not real-time. We are in the prehistoric era of encoding, just like when it took two hours to burn a DVD with one of the early 1X DVD drives. No one likes to sit around forever waiting for a file to encode, only to find that the end result isn't quite perfect. And what happens when we get to 2K and 4K formats? Do we have to sit around for weeks?

I also disagree with the point that a multi-core CPU is all you need. Keep in mind that in a GPU, the shader processors do the encoding, which means you might have up to 240 individual cores working in parallel, not just 4. The trick in NLE programming will be to make the CPU do all the FX calculations and then have the GPU do the encoding.
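
To be clear about what "working in parallel" means here: a big chunk of encoding is motion search, and that boils down to something like this (a hypothetical, simplified CUDA sketch), where every shader thread scores one candidate block position at the same time:

// Sketch: sum-of-absolute-differences (SAD), the inner loop of motion estimation.
// Each thread scores one candidate 16x16 block position; the host keeps the best score.
#include <cuda_runtime.h>

__global__ void sad16x16(const unsigned char* cur,   // current 16x16 block, row-major
                         const unsigned char* ref,   // reference frame
                         int refStride,
                         const int2* candidates,     // candidate top-left positions
                         int numCandidates,
                         int* scores)
{
    int c = blockIdx.x * blockDim.x + threadIdx.x;
    if (c >= numCandidates) return;

    int2 p = candidates[c];
    int sad = 0;
    for (int y = 0; y < 16; ++y)
        for (int x = 0; x < 16; ++x)
            sad += abs((int)cur[y * 16 + x] - (int)ref[(p.y + y) * refStride + (p.x + x)]);

    scores[c] = sad;
}
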
rmack350 wrote on 8/20/2008, 8:36 AM
"The trick in NLE programming will be to make the CPU do all the FX calculations and then have the GPU do the encoding."

Perhaps that connection to the GPU can be something handled by codecs rather than by Vegas itself, which might free Vegas from having to communicate with a variety of GPUs. Most likely you'd want either a dual core graphics controller or dual cards.

One of the CUDA articles I read recently noted that when a CUDA application crashed it took the whole graphics system with it. The suggestion was that applications ought to be run on a GPU that isn't otherwise in use at the moment.

As far as wanting to edit 2K and 4K on my home computer: sure! Sign me up! I'd also like to tow the Maltese Falcon behind my Miata. (I'd have to get a Miata first)

Rob Mack
johnmeyer wrote on 8/20/2008, 8:37 AM
What you want is faster encoding, not "GPU-based video transcoding." Using a portion of the video card's GPU is only one way -- and probably not the best way -- to provide this. You shouldn't care if a vendor does it one way (GPU) or another (support for multi-core, multi-CPU). The only thing that matters is how fast one program benchmarks vs. another. That is what the discussion should be about.

rmack350 wrote on 8/20/2008, 9:11 AM
Here's a link to what AMD is working on in the same vein:

http://ati.amd.com/technology/streamcomputing/

Rob
Seth wrote on 8/20/2008, 8:45 PM
John, what you might not know is that the newer GPU architectures have literally hundreds of cores, and are capable of crunching through codecs like MPEG2 and H.264 in a highly parallel fashion. The Badaboom video transcoder is just a consumer toy, but what it's packing under the hood is mind-blowing; I talked to the owner of the company at Siggraph and he showed me a working demo of the Premiere Pro plugin running on CS3, which uses the same transcoding engine under the hood as the Badaboom video converter. With a 31-second clip of HDV (1440x1080) material it took only 29 seconds to render out a Blu-ray-ready H.264 file (same resolution, at 15 Mbps)... No one here can boast that kind of speed with any existing CPU configuration.

What's even more awesome is that their software will decode H.264 material once it's released for CS4 next year. That means real-time AVCHD editing at full resolution... but only for Premiere.

Rmack: unfortunately the language that AMD is using (Brook+ as I mentioned in my first post) is scheduled for replacement by OpenCL, which is precisely why the company designing the Badaboom software did not and will not program for AMD GPUs; it's a really big risk for a new startup to release code that will be scrapped soon.

Again, adding this kind of feature to Vegas Pro's renderer would be a great shot in the arm for their sales, and would not inhibit the Vegas timeline experience.
rmack350 wrote on 8/20/2008, 9:41 PM
Yah, I was peripherally aware that AMD is changing some things in their system. The point was just to show that there's a breadth to these efforts and also to provide a link to a little reading material that was more than I could write without a lot more reading of my own. It's been a year and a half since I looked into the topic so I'm really rusty on it.

Rob



jabloomf1230 wrote on 8/21/2008, 8:45 AM
Umm, I think that was my point. GPU encoding uses the GPU's shader cores, and video cards that are readily available (not exotic ones) have up to 240 shader cores. The GPU shaders are not elaborate computational beasts like the cores found in CPUs. The shaders have a limited instruction set, but can be tailored to do tasks like video encoding.
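
For anyone curious, CUDA will report how much of that parallel hardware your card actually has. A quick sketch (the 8-cores-per-multiprocessor figure is my understanding of the current GeForce parts, so treat it as approximate):

#include <cuda_runtime.h>
#include <cstdio>

int main()
{
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        printf("No CUDA-capable device found\n");
        return 1;
    }
    // Current GeForce parts have 8 scalar shader cores per multiprocessor,
    // so a card with 30 multiprocessors reports ~240 cores (e.g. GTX 280).
    printf("%s: %d multiprocessors (~%d shader cores)\n",
           prop.name, prop.multiProcessorCount, prop.multiProcessorCount * 8);
    return 0;
}
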
Himanshu wrote on 8/22/2008, 6:37 PM
jabloomf1230,

If you're interested, you should read up on Nehalem/Larrabee by Intel - they are basically packing several x86 CPUs on a die to replace GPUs... they demonstrate how their architecture is more flexible because it is general-purpose and programmable, while also containing some fixed algorithms/paths for things that are faster to implement in silicon. Interesting stuff... I'm not all the way through the architecture paper yet.
jabloomf1230 wrote on 8/23/2008, 6:15 PM
Thanks. I've been following the Intel stuff since they first proposed having the CPU do the GPU's work. So far, Intel just seems to be reacting to opposing claims made by AMD (ATI) and nVidia. Intel is saying that it will have these new cores available some time in mid to late 2009. nVidia has their model working right now. Don't you love it when a monopolistic company like Intel gets pushed into doing the right thing? Eventually, the distinction between the GPU and CPU may disappear. What's hastening that at the moment is the decline in PC gaming and the growth of the video game console market. The demand for add-on GPU cards is not going to keep increasing unless something happens in the gaming industry to reverse the present trend. But the demand for desktop/laptop CPUs seems influenced only by global economic conditions. Just like in Intel's "war" against AMD over CPUs, I'd put my money on Intel in the GPU arena in the long run.
Terje wrote on 8/23/2008, 7:38 PM
"I'd also like to tow the Maltese Falcon behind my Miata. (I'd have to get a Miata first)"

For all but the most obvious reasons, why would a man want to drive a Miata? With or without the Maltese Falcon in tow?