I just read in another forum that Vegas 7 is still using the stone-age VfW (Video for Windows) framework rather than the newer DirectShow approach that everyone else is now using. Can somebody please tell me this isn't so!
And they've never given an explanation as to why. Deigning to address the peons (I mean customers) just wouldn't do; so beneath them. (Yes, I'm baiting, hehehe.)
I almost wish I hadn't asked David the question. But I was really really hoping for that one.
Really though, let's wait for more info and a full review by someone. Hopefully that will clear up any questions. So far this appears to be a paper release; the actual one will come soon, I'm sure, and all the folks under NDA will be able to talk.
I thought I was being over-the-top enough to show I was kidding ;-). VfW isn't a showstopper for me. Maybe not for anyone. There are other things that might have a more real impact on whether one decides to use Vegas: things like whether you can do film matchback (filmy), whether it supports 10-bit (farss), etc.
Preface: I know what VfW is, and I know what DirectShow is. I'm just not sure how they fit in here since I don't use many third party apps with Vegas. Therefore, the below is not an argument, but actual questions.
Could someone clue me in on just what specific effect this VfW/DirectShow issue has on workflow? Is the problem that other companies will eventually drop VfW support, leaving Vegas users hangin', or that the VfW interface/connection/whatever is not as fast or robust as the DirectShow equivalents? Or...? What exactly is the concern, other than "keeping up with the industry"?
Well, DirectShow harnesses the power of a graphics card's GPU, using it to decode the video compression instead of putting all that load on the CPU. My 3.06 P4 can play back HD MPEG or WMV video at about 30% or 40% CPU load with GPU acceleration. With a non-DirectShow playback engine it will be using 100% and still not keeping up. Cineform uses DirectShow as well if it is available, and with DirectShow, file sizes are something like 20% smaller for the same quality. I know to many of you it will seem like pointless bitching, but to me it is kind of like offering a 2007 car without electronic ignition. It just seems way backwards. I mean, most of us have decent graphics cards that could be doing a huge portion of the work our CPUs are lumbering along with. The technology is available. Everyone else is using it. Why not Vegas?
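For anyone curious what the DirectShow side actually looks like, here's a minimal C++ sketch of playback through a filter graph (the file name is a placeholder, error handling omitted). The point is that the app just builds a graph; the decoder and renderer filters the graph picks are where DXVA/GPU acceleration can slot in, which is what keeps the CPU load down.

    // Minimal DirectShow playback sketch: build a graph, let it pick filters.
    #include <dshow.h>
    #pragma comment(lib, "strmiids.lib")
    #pragma comment(lib, "ole32.lib")

    int main() {
        CoInitialize(NULL);
        IGraphBuilder* graph = NULL;
        IMediaControl* control = NULL;

        // The graph builder assembles source -> decoder -> renderer for us.
        CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                         IID_IGraphBuilder, (void**)&graph);
        graph->RenderFile(L"clip.wmv", NULL);   // "clip.wmv" is a placeholder

        // Decoding happens inside whichever filters got picked; if the
        // driver exposes hardware-accelerated ones, the GPU does the work.
        graph->QueryInterface(IID_IMediaControl, (void**)&control);
        control->Run();
        Sleep(10000);                           // play for ten seconds
        control->Stop();

        control->Release();
        graph->Release();
        CoUninitialize();
        return 0;
    }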
With an ATI 92xx+ card you can run HD MPEG, WMV, or DivX AVI at 30-40% load even on an AMD 1800XP. I'm not sure that has to do with DirectShow, though; that's enhancements on the card to run certain codecs faster, i.e., hardware decompression.
The only MAJOR drawback I see with GPU acceleration is that many GPU-accelerated apps do NOT like sharing the graphics card. Many programs will not run if another OGL/DX app is running. That would take out running multiple Vegases at once. In fact, I've only seen one set of games that don't conflict with each other; all others do.
Maybe Sony is counting on dual- and quad-core CPUs to eliminate the need for any GPU acceleration??
I wish Vegas 7 had the ability to use my GPU; a P4 3.0 GHz with HT seems pretty anemic at times... Meanwhile my Radeon X1600 Pro is taking a nap during multi-track renders.
One downside (I hope I haven't got my wires crossed here) to DirectShow and using the GPU is that it's another thingy to go wrong and cause a system hang, crash, or bad render.
However, from my limited understanding, even the pro video interface cards use DirectShow now. When you want to PTT (print to tape), the stream just goes from the HDD to the card and the card does the hard work; I'm thinking of SDI cards here.
Could be totally wrong on all of this; it's hard enough getting the grey matter focused on editing.
"Another thingy to go wrong" is exactly what I was listening to all day today. My employer was 8 feet away editing on a PPro2/Axio system. The hardware failures sounded strangly like cursing...
This setup definitely uses directshow, and it definitely ain't stable, at least not by Vegas standards.
Kind of a Tortoise and Hare comparison between the two.
I don't think I'll pass that on :-) I'll just hear more cursing.
The point, though, is that the more points of failure you have, the more cursing you'll do.
A 16-disk array has at least 16 points of failure, not counting the Fibre Channel interface, the metadata server, the crunchy little fiber cables, etc., etc.
Same with having to rely on a GPU, although that's only one additional failure point. The main thing I'd think you'd want to allow for is the ability to degrade your FX to non-GPU-driven, like when you move a project from a suite to a laptop (see the sketch below).
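A hedged sketch of that degrade-gracefully idea, with made-up names (nothing Vegas-specific): probe for a GPU path when the project loads and quietly fall back to the CPU path when the hardware isn't there.

    #include <cstdio>

    struct FxRenderer {
        bool gpuAvailable;                           // imagine a real capability probe sets this
        bool initGpuPath() { return gpuAvailable; }  // would compile shaders, etc.
        bool initCpuPath() { return true; }          // software path always works, just slower
    };

    bool initFx(FxRenderer& r) {
        if (r.initGpuPath()) {
            std::printf("Using GPU-accelerated FX\n");
            return true;
        }
        std::printf("No usable GPU; degrading FX to the CPU path\n");
        return r.initCpuPath();                      // same output, more render time
    }

    int main() {
        FxRenderer laptop{ false };                  // e.g. the laptop without a capable GPU
        initFx(laptop);
        return 0;
    }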
with DirectShow, file sizes are something like 20% smaller for the same quality.
Really? How so? Cineform's Aspect HD (for PPro 2) uses HD Link for capturing and encoding. This is the same HD Link that Vegas users are getting with the Connect HD product that we use to encode CFDIs for use in Vegas. Nothing in HD Link asks if you are using DirectShow or VfW. If it is using DirectShow, clearly Vegas has no problem with that.
My understanding (limited as it is) was that VfW provides an API for encoding and playback. While a particular codec may be engineered to respond to a particular API, the underlying engine that creates the actual files comes from the vendor (e.g., Cineform).
It kind of sounds like saying a JPEG is bigger depending on which API you used to create it. It seems more likely it would depend on the compression algorithm, which probably couldn't care less about the API layer riding on top of it (see the sketch after this post).
Or maybe I misunderstood you completely (a big possibility). Can you elaborate? I do remember hearing the Cineform CTO say that HD Link can create files that are smaller but similar in quality to what Vegas produces.
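Since the question came up, here's roughly what that VfW layering looks like in code; a sketch, with Cineform's 'CFHD' FOURCC used purely as an example. The host only ever talks to the Installable Compression Manager; whatever engine is registered under that FOURCC does the actual compressing, which is why the API layer by itself shouldn't change the bits.

    #include <windows.h>
    #include <vfw.h>
    #pragma comment(lib, "vfw32.lib")

    int main() {
        // Ask the ICM for whatever compressor is registered under the FOURCC.
        // VfW compresses nothing itself; it dispatches to the vendor's driver.
        HIC hic = ICOpen(ICTYPE_VIDEO, mmioFOURCC('C','F','H','D'), ICMODE_COMPRESS);
        if (hic) {
            // From here, ICCompressGetFormat / ICCompress calls all route into
            // the vendor's engine; the API layer never touches the bitstream.
            ICClose(hic);
        }
        return 0;
    }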
There is an option to "enable Vegas smart rendering" within the HDLink capture utility. There is a little note under it stating that checking this option will enable Vegas smart rendering at the expense of increased file size. What this option in essence does is use the non-DirectShow approach to writing the Cineform AVIs, so that Vegas can smart-render in the same non-DirectShow way it normally works. Keep in mind that Vegas can read DirectShow-enabled renders; it just can't render or smart-render them. In any case, the file size for the same quality when you enable this option is 20% or so larger.
By the way, I always check this option. It makes a tremendous difference in Cineform-to-Cineform renders, both in terms of speed and quality, as in many cases it avoids generation loss.
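For anyone wondering what smart rendering actually buys you, here's a purely conceptual C++ sketch (hypothetical types, not Vegas's or Cineform's real code): untouched segments whose codec matches the output get their compressed frames copied straight through, so only edited segments pay the decode/re-encode generation.

    #include <cstdio>
    #include <string>
    #include <vector>

    struct Segment {
        std::string codec;
        bool hasEffects;
    };

    // Stand-ins for the real work; only the branching matters for the idea.
    void copyCompressedFrames(const Segment&) { std::puts("pass-through copy"); }
    void decodeApplyFxReencode(const Segment&) { std::puts("full re-encode"); }

    void smartRender(const std::vector<Segment>& timeline, const std::string& outCodec) {
        for (const Segment& s : timeline) {
            if (!s.hasEffects && s.codec == outCodec)
                copyCompressedFrames(s);     // no decode, so no generation loss
            else
                decodeApplyFxReencode(s);    // only here can quality degrade
        }
    }

    int main() {
        smartRender({ {"CFHD", false}, {"CFHD", true} }, "CFHD");
        return 0;
    }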
Vegas is no more of a pickle of drivers, APIs, and multimedia eras than any other environment. To say that the interface to Cineform is limited to VfW, thereby branding all other I/O and filter handling with the same virtues and those alone, is bunkum.
Microsoft provides the recommendations; programmers follow them and then thrash out why they don't work, usually writing all manner of code to fix the unforeseen foibles. I'm far more interested in how much private coding and optimization goes on than in whether a recommendation is adhered to.
If speed and efficiency are what you are after, running an editor on a general-purpose CPU and OS is probably not the choice. Over time, the whole industry has turned to the PC to do this work because of its commodity nature, not its appropriateness.
Folks think VfW and start thinking 16-bit drivers, poor-quality overlay surfaces, lockups, and A/V synchronization issues. Does that sound like Vegas to you?
I'd go back to the fact that Vegas bridges the eras, and therefore it has both VfW and DirectShow engineering entwined in it. The more it advances, the higher the probability that the older stuff gets kicked out, but even if there is old stuff in there, it could well be optimized to death (within the code that calls on these APIs). Vegas 4 was not great on dual CPUs, but 5 was better and 6 was scaled to support multiple CPUs and hosts. Sound threaded enough to you?
I'll stop now, as the whole "is it a white elephant?" talk is pointless. Anyone ought to be interested in demoing the competitive products once in a while. That should drive your decision more than the spec sheet or the rumor mill.
Don't get me wrong, I love Vegas and am not switching to the competition. I love my wife too, but that doesn't stop me from complaining once in a while ;-)
And if you want Adrenaline with that, it's way more expensive.
And, oh, don't forget the laptop. You still have to buy that.
You could get one of the speedier MacBook Pro laptops for $2K and use eSATA to work with uncompressed footage without dropping any frames, per a current article in Studio Daily.
BTW, it was just reported from IBC that Final Cut Pro 5.1.2 edits XDCAM HD at all data rates, including 35 Mbps.