Next bug-fix build has a price tag?

Comments

OldSmoke wrote on 3/27/2015, 8:13 AM
Here is a good article to read for anyone who wants to draw their own conclusions about GPU acceleration and whether we need it or not.

And I still say that there is no single CPU that can do as much as a CPU and GPU together. I will even go further and say that even the fastest dual Xeon system will still benefit greatly from the addition of a GPU.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A

Ram: G.Skill 4x8GB DDR4 2666 XMP

CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970Pro NVME, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" Hotswap bays, 1x 3.5" Hotswap Bay, 1x LG BluRay Burner

PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

farss wrote on 3/27/2015, 9:25 AM
[I]"Here is a good article to read for anyone to draw its own conclusion about GPU acceleration and whether we need it or not."[/I]

Well I've read it, and my biggest challenge is working out which part to quote, as pretty much every part contradicts any notion that WE need the GPU for what we do.
You don't seem to be grasping the difference between Graphics and Video.

Graphics and video games are all vector based. That's why all the talk about shaders, anti-aliasing, textures etc. All of that is useless for video processing; for 3D modelling etc. it's extremely useful.

[I]"And I still say that there is no single CPU that can do as much as a CPU and GPU together. I will even go further and say that even the fastest dual Xeon system will still benefit greatly from the addition of a GPU."[/I]

I'd suggest you talk to a high-end systems integrator who specializes in systems for video editing. They'll probably recommend one of the pro NVidia cards, but only because those are the only cards that support 10-bit video output, and you're wasting your money unless you can afford the very expensive monitors that can make use of 10-bit video. It's far from certain Vegas would even support such a thing anyway.

Bob.

VidMus wrote on 3/27/2015, 10:01 AM
"AFAIK there IS a better way, a faster CPU."

I cannot afford one. My credit got used up on cameras, which were/are much more important for my needs than a super-fast computer. Also, three Zoom H6's for quality 18-track audio recordings.

What good is a super fast computer if one does not have quality video and audio to import into it?

Sub par cameras and audio recorders = sub par videos and audio.

Still, I like having a computer system that is as fast as possible with the GPU I use.

As I type, Vegas is using MC to render for Blu-ray (1920x1080) a video that is one hour and 17 minutes long, and it should take approximately one hour to do. It would take seriously longer without the GPU!

---------------------
Edit and update: It took 01:01:24 to render the video. A little bit faster than real time, but a whole lot faster than it would have been without the GPU. Remember, this is using the Main Concept codec. I will NOT get the same results with other codecs!
---------------------
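
As a quick sanity check on those timings, the render-speed ratio can be worked out from the numbers quoted above; this little Python sketch just does that arithmetic (the figures are the ones from the post, nothing more):

```python
# Render-speed check using the figures quoted above:
# a 1 h 17 min video rendered in 1 h 1 min 24 s.
video_s = 1 * 3600 + 17 * 60        # 4620 seconds of source video
render_s = 1 * 3600 + 1 * 60 + 24   # 3684 seconds to render it

ratio = video_s / render_s          # > 1 means faster than real time
print(f"{ratio:.2f}x real time")    # 1.25x real time
```

So "a little bit faster than real time" checks out: about 1.25x with the Main Concept codec and GPU assist.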

I will deal with a faster computer system approx two to three years from now. Another three years, I will deal with cameras again. Another three years, I will be all messed-up from arthritis or whatever and will no longer care. LOL!

OldSmoke wrote on 3/27/2015, 10:20 AM
[I]"You don't seem to be grasping the difference between Graphics and Video."[/I]

Far from it. But you have to understand that it all depends on the codec. If the codec is written for GPUs, like Mainconcept's AVC, then you get the full benefit. That is the reason why only a few codecs in the current Vegas versions work with GPU acceleration, not because the GPU isn't suitable for the job.

Currently the fastest single CPU is the 5960X, and not even such a system is powerful enough to work with native 1080 60p XAVC-S files and be as fast as a lower-end system with a working GPU setup.


larry-peter wrote on 3/27/2015, 10:37 AM
Bob knows where I'm coming from. Dedicated (even cheap) parallel processors on a board could far outperform what we get from the "sometimes" acceleration from GPUs.

IMO the pluses: once properly coded it will simply work, always, with the potential of accelerating every part of an editing/FX/compositing/encoding workflow.

The negatives - it will cost. But those who choose not to go this route can still edit with proxies, or reduced framerate, or selective rendering - which most of us seem to be doing now even with GPU acceleration - at least on occasion.

It would also position Vegas as one of the few NLEs that can be properly configured as a "pro" solution, rather than an acceptable prosumer solution - which virtually all of today's NLEs are.

Edit: I should add that GPU acceleration works with my older video cards and gives me a good timeline performance boost. I don't like the differences I see with GPU-assisted rendering, so I never use it during renders.

wwjd wrote on 3/27/2015, 2:11 PM
just installed a globally famous professional subscription-based competitor NLE... wow. Fast, smooth, stable - plays 4K >>AND<< GoPro without ANY hiccups. I'm sad I resorted to this....

NormanPCN wrote on 3/27/2015, 2:37 PM
[I]"Graphics and video games are all vector based. That's why all the talk about shaders, anti-aliasing and textures etc. That's all useless to video processing. 3D modelling etc., extremely useful."[/I]

Mostly correct. The shaders are what image processing (still, video) uses for compute on the GPU. Shader is a term used in 3D rendering APIs like OpenGL and Direct3D. They do pixel processing, and they are written in a programming language not unlike OpenCL and CUDA, but more restrictive than the latter.

Certainly vertex processing, texture mapping and tessellation have no application to still or video image processing.

In modern GPUs the "shaders" are the predominant part of the silicon. They are used for computations that help the flat-ish looking texture mapping look/feel more "real".

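
To make the "shaders do plain per-pixel math" point concrete without a GPU at all: here is a rough CPU-side sketch in Python/NumPy of the kind of embarrassingly parallel operation (a simple gain/lift brightness adjust) that a pixel shader or OpenCL kernel would run once per pixel. The function name and values are purely illustrative, not anything from Vegas or a real shader API.

```python
import numpy as np

def adjust(frame, gain=1.2, lift=10):
    """Per-pixel op: each output pixel depends only on the matching
    input pixel, which is exactly what makes this kind of work a good
    fit for the thousands of shader cores on a GPU."""
    out = frame.astype(np.float32) * gain + lift
    return np.clip(out, 0, 255).astype(np.uint8)

# A fake 1080p RGB frame, all pixels at value 100; a real NLE pushes
# every frame of video (not vector geometry) through math like this.
frame = np.full((1080, 1920, 3), 100, dtype=np.uint8)
print(adjust(frame)[0, 0])  # [130 130 130]
```

No vertices, textures or anti-aliasing involved, which is why the same shader hardware can accelerate video even though video isn't vector based.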
farss wrote on 3/27/2015, 5:47 PM
[I]"Dedicated (even cheap) parallel processors on a board could far outperform what we get from the "sometimes" acceleration from GPUs."[/I]

This is true; there are computers costing zillions, with a matching number of cores, that excel at certain tasks and would probably be an epic fail at editing video. Render farms used by the CGI people are one example. Video is a different matter.

Sure, there are going to be some things where the GPU will help with what we need to get done, but the cost is not just limited to the hardware and the cost of writing the code. It also has to include the ongoing updating, and that's significant in a world where we've gone from SD to 2K to 3D to 4K and who knows what next, and new codecs are a constant threat.

I spent $7K on a system to edit Digibetacam, and not that long after, SCS stopped supporting the BMD cards and told us to use a different manufacturer's cards. I don't know if any of those choices are supported today by Vegas.

Whatever [I]can[/I] be wrangled from the GPU is only the first issue; then there's the cost of support and ongoing development. Clearly the three big "A"s are in a better position to do this, with many more users and more programs to spread the cost over. When there are other long-term bugs and improvements needed in the existing code base, adding another considerable chunk of code, even [I]if[/I] it does sometimes give some users some benefits, seems a bad decision to me.

Bob.

VidMus wrote on 3/27/2015, 5:48 PM
@wwjd

That "globally famous professional subscription based competitor NLE" automatically makes proxy files instead of giving one a choice like Vegas does, from what I remember reading somewhere. If a video cannot be played at full rate, then instead of automatically using lower quality settings as Vegas can, it automatically makes proxy files.

Are you sure that you are actually playing the 'native' videos and not the proxy files instead?

Then again, do they actually have the GPU figured out while SCS has not? They certainly are not able to make the native videos play 100% using CPU only. They are not going to make our computers magically faster than they are. And even then, a GPU can only do so much.

Even Vegas will play proxy files without a hiccup!

Just some thoughts...

wwjd wrote on 3/27/2015, 6:41 PM
the files sure looked sharp and nice, and there was no lag - a quick "Import" and it was playing. I suppose it could have made a proxy during the import, but it seemed too fast for that. Any way to tell?

OldSmoke wrote on 3/27/2015, 6:54 PM
For me personally, GPU acceleration since VP11 finally got me preview of HDV files at Best/Full with FXs applied, and render times that would have been twice as long with CPU only; I used a Q6600 with a GTX 460 v2 at that time. Today, on my 3930K with 2x R9 290, I can preview 4K MXF files, those created by Catalyst Browse, at Best/Full with FXs applied, and even in 32-bit if I wanted to. I would have to spend $6-8K if I wanted to achieve the same with CPU only, if I could even get that.

And because of the continuously evolving codecs, we can't rely on hardware encoder cards but need software-based solutions that are flexible.


larry-peter wrote on 3/27/2015, 7:25 PM
@wwjd, Did you set up a 4K timeline to import your file? The reason I ask is I don't think your new NLE automatically creates proxies. Lots of threads on the Adobe forum ask how TO create them.
But one suggestion in their RED workflow has been to edit 4K video in a 1080 timeline. The video will scale to the timeline size and give the appearance of full-res while only reading 1/4 of the information. From my reading, it seems in your NLE's case it's not like switching from Best/Full to Preview or scaling in the manner our editor does, but simply disregarding most of the pixels and pretending the clip is just full-res 1080.
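
That "1/4 of the information" figure follows directly from the raw pixel counts; a quick Python check of the arithmetic:

```python
# Pixel counts behind the "1/4 of the information" remark:
# UHD 4K vs 1080p, per frame, at the same frame rate.
uhd = 3840 * 2160    # 8,294,400 pixels per frame
hd = 1920 * 1080     # 2,073,600 pixels per frame

print(uhd / hd)  # 4.0 - a 1080 timeline touches a quarter of the pixels
```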

I will admit the timeline performance of that editor has really impressed me over the last few versions, but working with it... I don't have enough hair left to learn its workflow.

wwjd wrote on 3/27/2015, 8:56 PM
I honestly don't know anything about it. I just dragged a file to the timeline, and after a 1-2 second import message it plays back just fine. It doesn't seem like it is making proxies, and it looks the same as playback in Vegas at half size. I'm having trouble understanding where to set things (literally the first time I've used it) compared to Vegas's simpler controls. But I want to learn it and expand my abilities. I'm just shocked at how fast, smooth and solid it feels.

hahahhaa yeah, I do know the work flow will annoy me as well. But, gotta jump in sometime, right?
VidMus wrote on 3/27/2015, 10:07 PM
@wwjd

Whatever it does, it apparently does not do what Vegas does when playing a video natively with the settings of Best/Full.

Some time ago when playing with the elements version, it said that the project has to match the video file and that there cannot be a mix of video file types. I do not know if that is still the case, but with a mix of cameras and video file types that can be a major pain to work with!

As for what was said in another post. I also do not have enough hair left (at least on top) to mess with it. I do have a ton of beard and hair on the lower parts of my head but I prefer to leave that alone.

wwjd wrote on 3/28/2015, 11:35 AM
just dropped GH4 UHD 4K and GoPro 2.7K on the same OTHER NLE timeline; it worked fine. The frame was smaller on the 2.7K and needs adjusting, but it played back flawlessly side by side. I haven't added ANY effects yet though. But I noticed that at the start of a project it asked me if I wanted to use GPU or CPU. But I've derailed this thread enough.

Like they mention above... the cost per month of a yearly "bug fix" version upgrade may not be that much different. The workflow will be. I'm not abandoning Vegas, but learning the options.