TBH, if that card were certified by SCS to give me $3.5k worth of advantage (realtime playback with all FX, triple rendering speeds, etc.) when using Vegas, I'd buy one tomorrow.
I've said this before...
SCS should come up with three user builds that are guaranteed to work:
Budget
Moderate
Pro
List all specific components, from motherboard all the way through to RAM.
We can build our own and know they will work.
One step further:
Design a cool case (round tidy bins seem to be the go at the moment), contract out the building to a local PC builder (at first, before moving to China), and sell Vegas-certified PCs online.
Upgrade components with each new version, sell lots of hardware, build a Sony Vegas empire.
It's a business model that works.
:)
There is no such thing as "guaranteed to work", regardless of identical hardware. There are too many other degrees of freedom in the NLE context, and software in general, that account for issues.
Specific source files, in a specific order in a timeline and so on.
Bugs in the app itself are one thing that can never be guaranteed against.
Bugs in a GPU driver are another. Specifying a particular driver version does nothing: AMD driver X might have bug 23, and driver X+1 might fix bug 23 but introduce bug 24. So which driver version is "guaranteed"? Vendors can only say "use the current drivers" and hope the hardware vendors deliver stable ones, the hope being that more bugs get eliminated than created over time, so things slowly get better.
Sony corporate was doing so well selling PC hardware they got rid of the business altogether. You think the tiny SCS division would do any better? Maybe if they guaranteed the hardware, but they would want an awful lot of money in return for that guarantee (which would have loopholes of course).
Add to that running other software that might conflict with what's already installed - and not to forget installing the K-Lite codec pack and then asking why your NLE isn't working ;-)
What I'd like is simply to know which drivers SCS use with which video cards to give reliable GPU rendering/previewing.
I remember once seeing SCS selling a complete PC system with Vegas Pro already installed, perhaps in the Vegas Pro 8 or 9 era?
But it looks like 'the package' didn't last long?
K-Lite does not affect me. I have 'Standard' installed (Standard is DirectShow only). I avoid AVI files, which helps (AVI uses Video for Windows).
Vegas 12 does not install any video codecs into any codec subsystem on Windows (VFW, DMO, DirectShow, Media Foundation) - verified with GSpot and a registry search.
For everything but AVI (Video for Windows), and of course QuickTime, Vegas 12 goes directly to its own internal file readers, IMO. Vegas even special-cases some QuickTime variants and goes internal, like my DSLR MOV files. Vegas tells us in the media properties dialog which of its file I/O plug-ins is used for input.
Vegas does not use DirectShow. The reason to have K-Lite is to be able to play virtually any video file out there in your media player, and you only need DirectShow for that.
If Vegas installed any of its special codecs into ANY subsystem, then I could play an HDCAM SR file in Windows Media Player - and I cannot. WMP supports all the codec subsystems of Windows.
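For anyone who wants to try a similar registry check themselves, here's a rough sketch of my own (not necessarily how the check above was done) in Python, listing the VFW codecs Windows has registered under Drivers32. The DirectShow, DMO, and Media Foundation subsystems live under other registry keys, so this only covers the Video for Windows piece; if Vegas had dropped a VFW codec in, it would show up in this list.

# Minimal sketch: list the VFW (Video for Windows) codecs registered on this PC.
# Assumes Python 3 on Windows; winreg is part of the standard library.
import winreg

DRIVERS32 = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Drivers32"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DRIVERS32) as key:
    index = 0
    while True:
        try:
            name, value, _type = winreg.EnumValue(key, index)
        except OSError:
            break  # no more values under this key
        # Video codecs register as "vidc.xxxx"; audio ACM codecs as "msacm.xxxx".
        if name.lower().startswith("vidc."):
            print(f"{name:16} -> {value}")
        index += 1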
You all make excellent and valid points.
I guess what I'm dreaming of just isn't ever going to happen for Vegas (on PC anyway).
:(
Not to get into the Mac vs. PC argument again, but it is good the way they make FCPX, supply hardware they know works with it, limit what you can throw in there, and your experience is pretty much the same regardless of which Mac machine you choose.
Makes me wonder if a $450 AMD 290 GPU would be a good match for Vegas.
I would think so.
Historically, the workstation cards have used the same GPU chips as the consumer/gaming cards. You have to dig into the specs and check online reviews that examine the parts on the cards. Sometimes they (AMD/Nvidia) have been accused of crippling, via the drivers, some features of the consumer cards in the GPU APIs considered workstation-type APIs.
Workstation cards used to be much better with multi-monitor features, but the hard core gamers are into the multi-monitor thing so this is not a real difference anymore.
At one point Nvidia users hacked the BIOS/hardware on consumer cards to identify the card as a Quadro card and use the Quadro driver. I would bet Nvidia has since closed that loophole in current chips.
Basically the three main GPU APIs are Direct3D, OpenGL and OpenCL. Nvidia has CUDA which is just like OpenCL but only for their cards.
Direct3D - 3D rendering used mostly by games.
OpenGL - 3D rendering, but historically used by "workstation" 3D apps. A subset of games also supports this, and some video apps use it: NewBlueFX, BorisFX, FXhome HitFilm.
OpenCL - no 3D rendering; a pure compute API, thought of as a "workstation" API. Vegas uses this, and Adobe does as well. Look at OpenCL benchmarks for indications of Vegas GPU performance - AnandTech actually uses Vegas as a benchmark.
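If you're curious what OpenCL actually exposes on your card before buying, here's a minimal sketch of my own (Vegas itself obviously isn't Python; this just uses the third-party pyopencl module, which you'd have to install) that lists the platforms and devices the driver reports:

# Minimal sketch: enumerate the OpenCL platforms/devices a driver reports.
# Assumes Python 3 with the third-party pyopencl package installed
# (pip install pyopencl) and a working OpenCL driver (AMD, Nvidia, or Intel).
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.version})")
    for device in platform.get_devices():
        kind = cl.device_type.to_string(device.type)
        print(f"  Device: {device.name} [{kind}]")
        print(f"    Compute units : {device.max_compute_units}")
        print(f"    Global memory : {device.global_mem_size // (1024 ** 2)} MB")
        print(f"    OpenCL C ver. : {device.opencl_c_version}")

The OpenCL benchmarks mentioned above are exercising these same reported devices with real compute kernels, so if the list here only shows a CPU device, the GPU path presumably has nothing to work with.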
Isn't the W9100 similar to the 290? Makes me wonder if a $450 AMD 290 GPU would be a good match for Vegas 13.
Yup... the "Hawaii" GPU in the FirePro W9100 is very similar to the GPU in the R9 290X, obviously with 16GB of GDDR5 DRAM rather than 4GB, probably a slightly slower clock, and of course the FirePro drivers. There's rarely much, if anything, actually different between "desktop" and "workstation" GPUs. The economies of scale pretty much guarantee this: neither AMD nor nVidia sells enough "pro" cards to justify anything more than perhaps small tweaks, maybe a different package, or perhaps nothing at all for the pro-series GPUs. And nVidia's doing much the same thing now with their "compute" line of GPU cards.
Oh, and as far as I can tell, the regular R9 290 is that same "Hawaii" GPU with some of the stream processors disabled and some slower clocks. That's probably a yield issue; knocking out a few hundred (out of 2,800-something) stream processors lets them ship GPU chips that would otherwise have to be scrapped.
The big advantage of the "pro" cards is the extra RAM and the CAD-tuned drivers. AMD and nVidia are actively developing and tweaking the "pro" drivers for use with CAD programs, computing applications, and other 3D or OpenCL/CUDA things. And they're not revising them just for the latest games. That alone may be enough for many professional users to pay the extra.
There's ample evidence that nVidia is intentionally crippling desktop drivers, versus professional drivers, on a few select OpenGL primitives. You can find these articles all over the place... the high-end gaming card does very well, maybe beating the pro card, up until -- boom! -- some benchmark, maybe CAD or film rendering or whatever, and it just smacks into a brick wall... maybe now it's at 1/10th the performance of the pro card. The only way that happens is by intent. And in a few of these cases, the article's author went on to implement that same operation on the consumer card, either using multiple OpenGL ops or an OpenGL program, and the consumer card was back up to par, more or less, with the pro card. AMD may be doing this too, but it's less obvious.
You'll also find, going back, that nVidia didn't always do this. It started in a particular revision of their drivers (I'm not an nVidia user and don't recall offhand just which... AMD is so much better on Vegas most of the time that I haven't bothered with nVidia since Vegas 11), and so you had the awkward period in which the new cards dramatically underperformed the older cards on some pretty common 3D stuff. Most of the 3D stuff isn't critical to every video toolchain, but certainly BCC and some Vegas effects DO use OpenGL. I don't know if it ever matters; the crippled ops all seem to be 64-bit.
More recently, it's been demonstrated that nVidia is doing the same thing with OpenCL and probably CUDA... consumer and workstation GPU cards don't perform as well on OpenCL benchmarks as their compute cards... but again, only on very specific benchmarks.
I'm not trying to judge this practice (much), just pointing out that it exists.
Oh yeah... on that "driver hack" thing. The original one I saw was simply hacking the .inf file that comes with the drivers. The first obstacle was getting Windows to even load the driver, since it's tagged for a different (from the card-ID perspective) board. So you go in, change a few strings, and you have a Quadro driver that will bind to the GeForce card.
Perhaps there are other checks in place today. An article last year on Hack-a-Day (http://hackaday.com/2013/03/18/hack-removes-firmware-crippling-from-nvidia-graphics-card/) showed that, at least in the case of the GK104 GPU, the chip uses a couple of sense resistors to tell the software about the intended use of that GPU. This sounds like something easily checked by the drivers, versus registry or other system-level stuff. Anyway, the article found that changing a couple of resistors was sufficient, though I'm surprised they didn't enforce this in the BIOS too. It's also kind of funny why this hack was done: the author simply wanted multiple-monitor support in Linux, and that was only available in the K5000 driver. As a rule, it's good not to go around annoying smart people for no good reason :-)
Hmm, interesting. I have been thinking of trying AMD, but I've been with Nvidia so long, mostly because I got sick of AMD driver issues and basically gave up on them years ago. My gaming PC doubles as my overnight Vegas Pro render box, and Nvidia has so far been a good fit there for total reliability. I'm nervous about going back to AMD, but maybe I'll muster the courage to give them a try again. Since Sony is showing off Vegas Pro 13 + 4K on AMD hardware, it makes me wonder if that is the way to go for full GPU support on Vegas Pro 13.