Elephant In Room time: GPU or NOT GPU?

Comments

Musicvid wrote on 9/10/2020, 3:57 PM

Just want you to realize that you are one of a handful of individuals who will ever realize that there is even a difference in the application of the technology to different purposes. Certainly any education to that purpose has not come from the gaming side, where diversifying any concept is one tough sell. With your understanding of both, which I do not have, you may be a bigger player than you realize.

john_dennis wrote on 9/10/2020, 5:25 PM

[Techno-Philosophical Value Judgement]

"Without gaming GPUs, video editing would still be in the stone age"

I suspect society could manage with 94.84672% fewer videos to watch.

[/Techno-Philosophical Value Judgement]

Former user wrote on 9/10/2020, 8:13 PM

[Techno-Philosophical Value Judgement]

"Without gaming GPUs, video editing would still be in the stone age"

I suspect society could manage with 94.84672% fewer videos to watch.

[/Techno-Philosophical Value Judgement]

100% agree. I found this fact online today, referring to YouTube only:

"300 hours of video are uploaded every minute of the day, on average currently. That's 157,680,000 hours of video in 1 year. That number grows every minute"
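As an aside, the quoted yearly figure is just the per-minute rate scaled up; a quick sketch of the arithmetic (the 300 hours/minute rate is the quote's number, not independently verified):

```python
# Yearly YouTube upload volume implied by the quoted rate.
hours_per_minute = 300                 # figure from the quote above
minutes_per_year = 60 * 24 * 365      # 525,600 minutes in a non-leap year

hours_per_year = hours_per_minute * minutes_per_year
print(f"{hours_per_year:,} hours of video per year")  # 157,680,000
```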

Musicvid wrote on 9/10/2020, 8:32 PM

I'd take 1% of that action -- as long as I didn't have to watch it.

Former user wrote on 9/10/2020, 8:34 PM

I would be happy to be the guy that sells them the data storage equipment. They must have to double it every few months.

RogerS wrote on 9/10/2020, 9:42 PM

FWIW, I don't care about gaming but there are quite valid reasons to use GPUs for rendering.

For those of us who also do corporate work, it can be handy to go to the GPU for quick rough cuts to show coworkers before doing the final render. I also had jobs like filming a 2-hour slideshow/lecture, adding subtitles, cutting from the projector to the original JPEGs, and cutting between two cameras. Nobody cared about the video quality, and I don't need the render to tie up the work computer for hours. Unfortunately, Vegas Pro 15 (I believe) had GPU bugs which inserted black frames inappropriately on render, and finding these in a 2-hour video was a frustrating waste of time. (I did end up resolving it with tricks that are now standard advice: disable GPU preview, set Dynamic RAM Preview to 0 MB.)

GPUs also have uses beyond gaming or content creation, including AI, physics simulations, scientific research, data analysis and more. If the volumes sold for gaming make them more affordable for the rest of us, I'll gladly say "thank you, gamers!"

fr0sty wrote on 9/10/2020, 10:37 PM

The quality is good enough that I use GPU renders as deliverables in some cases (especially in lower-budget projects or projects that require a quick turnaround). Never had any complaints. The untrained eye would never spot the difference. I most definitely will use it for my HDR renders whenever VEGAS supports that, as CPU HEVC 4K HDR rendering in VEGAS is horrendously slow (2-3 fps max).
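To put that render speed in perspective, here's a back-of-the-envelope sketch (the 2-3 fps figure is from the post above; the clip length and source frame rate are hypothetical examples):

```python
# Rough CPU-only HEVC render-time estimate at the speed quoted above.
clip_minutes = 10    # hypothetical 10-minute 4K HDR clip
source_fps = 30      # hypothetical source frame rate
render_fps = 2.5     # midpoint of the 2-3 fps quoted in the post

total_frames = clip_minutes * 60 * source_fps
render_hours = total_frames / render_fps / 3600
print(f"{total_frames:,} frames -> {render_hours:.1f} hours to render")
# prints: 18,000 frames -> 2.0 hours to render
```

So even a short clip ties up the machine for hours at those speeds, which is the complaint being made.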

Last changed by fr0sty on 9/10/2020, 10:38 PM, changed a total of 1 times.

Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64GB 3000 MHz DDR4

GeForce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

Former user wrote on 9/11/2020, 12:07 AM

The vast majority of Twitch streamers use GPU encoding, and as long as they have the resolution/frame rate/bitrate combination right, the image consistently looks higher quality and sharper than anything on YouTube. YouTube is a low bar, of course, but that's where much of people's content ends up.

There is always the argument for uploading the highest quality possible to YouTube, as they may eventually improve quality the way they did with the changeover from H.263 to H.264. Currently, YouTube is increasing the bitrate of many 'trending' videos to over 5 Mbit/s (1080p), which is about double their normal bitrate.
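For scale, here's a quick sketch of what those bitrates mean in storage per hour of video (the 5 Mbit/s figure is from the post above; the unit conversion is standard, using decimal gigabytes):

```python
# File size per hour of video at a given average bitrate.
def gb_per_hour(mbit_per_s: float) -> float:
    """Convert a video bitrate in Mbit/s to gigabytes (decimal GB) per hour."""
    bits_per_hour = mbit_per_s * 1_000_000 * 3600  # bits in one hour
    return bits_per_hour / 8 / 1_000_000_000       # bits -> bytes -> GB

print(f"{gb_per_hour(5.0):.2f} GB/hour at 5 Mbit/s")    # 2.25 GB/hour
print(f"{gb_per_hour(2.5):.2f} GB/hour at 2.5 Mbit/s")
```

Multiplied by the upload volumes quoted earlier in the thread, that is why the storage-vendor joke above lands.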

Steve_Rhoden wrote on 9/11/2020, 7:46 AM

 

Grazie, haven't heard back from you on the matter, as it was you whom we were responding to.

As for my CPU recommendation, the AMD Ryzen 9 3950X and even the AMD Ryzen Threadripper 3990X (woooo, those things are beasts) are high on my list and currently the best in the biz for Vegas Pro content creators (sits beautifully with Vegas and its occupants). Yes, on the expensive side, but then again, what isn't, lol.

As for GPU, fr0sty has a good combination working well with the slightly older AMD Ryzen 7 1800x in Vegas.

Musicvid wrote on 9/11/2020, 8:55 AM

As @Steve_Rhoden says, I have used GPU for rough cuts and dailies for slide presentations, where quality is not in contention. However, those have never seen the light of day. With video projects, where quality does matter to me, XDCAM is almost perfect quality, plenty fast for my needs, and I am proud to share it with my occasional clients.

Steve_Rhoden wrote on 9/11/2020, 9:37 AM

 

Musicvid, that's true. My main focus is on the highest output quality and stability, as I create a vast array of digital content, branding, television commercials, and corporate media daily for media houses, corporations, broadcast, etc. It's a fast-paced industry where I sit, but my years of experience in this biz and technology teach me how to balance it out beautifully with no compromises, lol.

Musicvid wrote on 9/11/2020, 10:02 AM

If your clients are sophisticated and educated enough to understand that a machine proof may not be wholly representative of delivery quality, then you are in the minority, and are to be congratulated. With my archival scanning work, which I do almost exclusively now, I would consider second-tier proofs to be a liability.

Grazie wrote on 9/11/2020, 1:18 PM

Grazie, haven't heard back from you on the matter, as it was you whom we were responding to.

@Steve_Rhoden - Correct. I’m taking my responses very seriously. There’s much to take in, from some truly wise heads. And much I’m considering as a set of replies. I won’t be hurried along on this one. It’s too important, I feel, to give quick replies. 😉

fr0sty wrote on 9/11/2020, 5:34 PM

As for GPU, fr0sty has a good combination working well with the slightly older AMD Ryzen 7 1800x in Vegas.

Once again, though, if I disable my Radeon VII, playback performance in VEGAS gets really bad. My GPU cannot decode my 10 bit 4:2:2 video from my S1's/EVA1/FS5/FS7/GH5/GH5s, but it definitely is doing something to help with playback, because I cannot edit even 1 4k 10 bit clip on the timeline without proxy if I have my GPU disabled.

Howard-Vigorita wrote on 9/11/2020, 6:08 PM

The only time I don't use GPU is when doing old-fashioned DVD authoring, which requires interlaced video. But GPU seems just fine, and I prefer it for performance reasons for Blu-ray 1080p and everything else I do. I find just benchmarking 4K HEVC material without GPU acceleration almost unbearable... I wouldn't dream of trying to do any actual HEVC work that way.

Vegas can use a GPU to accelerate encoding, decoding, certain effects, and preview display. Vegas puts the choice of GPU encoding in the render template. Decoding options are in File I/O preferences. GPU preview optimization is in the Preview preferences. GPU usage for effects depends on which effects sub-list you choose from. Then there are Video and General preferences with settings for board choices, ACES, and Intel options that either conflict with or override all the other GPU preference locations. Very confusing. It would be nice if Vegas centralized control over each specific GPU function in one and only one place. And while I'm ranting: "timeline acceleration" encompasses all categories except encoding, and I think rendering acceleration encompasses them all except preview display. FWIW, I also see GPU utilization going on for copy and 3D operations even when I turn GPU usage off everywhere in Vegas.

Former user wrote on 9/11/2020, 6:34 PM

As for GPU, fr0sty has a good combination working well with the slightly older AMD Ryzen 7 1800x in Vegas.

Once again, though, if I disable my Radeon VII, playback performance in VEGAS gets really bad. My GPU cannot decode my 10 bit 4:2:2 video from my S1's/EVA1/FS5/FS7/GH5/GH5s, but it definitely is doing something to help with playback, because I cannot edit even 1 4k 10 bit clip on the timeline without proxy if I have my GPU disabled.

In the Wikipedia entry for Vegas Pro it says:

 on October 17, 2011. Updated features include GPGPU acceleration of video decoding, effects, playback, compositing, pan/crop, transitions, and motion

So elements of decoding, timeline playback, and preview are accelerated via GPU compute, which you can see and feel if you turn your GPU off. Its ability to accelerate basic timeline playback is poor compared to competitors, but like you say, anyone should be able to notice the difference in timeline playback that can't be helped by the CPU.

 

Musicvid wrote on 9/11/2020, 11:32 PM

Thought we covered that?

fr0sty wrote on 9/12/2020, 12:42 AM

Well, those are pretty convincing reasons to get a powerful GPU if you plan on editing 4K. 1080p folks may do fine CPU-only, but beyond that, I don't see the point in giving up the GPU for stability; the performance hit is too much.

Grazie wrote on 9/12/2020, 3:31 AM

(@Steve_Rhoden) The more I read the responses, the more complex and, at the same time, the more edifying the "picture" becomes. What I'm wanting to do is to capture screenshots of those FXs from companies that DO utilise GPU and/or combinations of CPU and GPU, and present them here.

 

Last changed by Grazie on 9/12/2020, 3:33 AM, changed a total of 1 times.

Grazie

PC 10 64-bit 64gb * Intel Core i9 10900X s2066 * EVGA RTX 3080 XC3 Ultra 10GB - Studio Driver 551.23 * 4x16G CorsVengLPX DDR4 2666C16 * Asus TUF X299 MK 2


Cameras: Canon XF300 + PowerShot SX60HS Bridge

fr0sty wrote on 9/12/2020, 4:14 AM

Neat Video is one that hugely benefits from using the GPU. RSMB does as well if I remember correctly.


Grazie wrote on 9/12/2020, 4:54 AM

Neat Video is one that hugely benefits from using the GPU.
 

@fr0sty - Indeed. And this is one that has had to work with the VEGAS software team to have it work, and work very well. I have a few more too. I have one that USES it and then is ignored once away from the GUI.

I'm looking at, and pursuing, the convergence of managing GPU and CPU: those companies that do it, and those that need to review their approaches. I've travelled to Amsterdam in pursuit of this quest.

 

Steve_Rhoden wrote on 9/12/2020, 7:08 AM

Grazie, Red Giant Software is one such company that now incorporates GPU acceleration in every one of their plugins, all now compatible with Vegas Pro (Magic Bullet Suite 13 and Universe 3).

Dexcon wrote on 9/12/2020, 7:42 AM

I've been using Magic Bullet Looks for years, as it has been compatible with Sony Vegas Pro and VEGAS Pro for a long time, but I am not so sure that the entire Magic Bullet Suite 13 is VP compatible. On Magic Bullet's website, out of the 7 products in the Suite, 5 have VP compatibility, but Denoiser and Colorista are shown as not being VP compatible. @Steve_Rhoden ... if you have another source showing that the entire Suite range of products is indeed VP compatible, it would be great if you could provide that link.

Going back years and years, Colorista looked great because of its tracking capability (I wanted it, but sadly it wasn't SVP or VP compatible), but a similar result can now be obtained by using BorisFX BCC or an applicable Unit - which are all VP compatible.

Cameras: Sony FDR-AX100E; GoPro Hero 11 Black Creator Edition

Installed: Vegas Pro 15, 16, 17, 18, 19, 20, 21 & 22, HitFilm Pro 2021.3, DaVinci Resolve Studio 20, BCC 2025, Mocha Pro 2025.0, NBFX TotalFX 7, Neat NR, DVD Architect 6.0, MAGIX Travel Maps, Sound Forge Pro 16, SpectraLayers Pro 11, iZotope RX11 Advanced and many other iZ plugins, Vegasaur 4.0

Windows 11

Dell Alienware Aurora 11:

10th Gen Intel i9 10900KF - 10 cores (20 threads) - 3.7 to 5.3 GHz

NVIDIA GeForce RTX 2080 SUPER 8GB GDDR6 - liquid cooled

64GB RAM - Dual Channel HyperX FURY DDR4 XMP at 3200MHz

C drive: 2TB Samsung 990 PCIe 4.0 NVMe M.2 PCIe SSD

D: drive: 4TB Samsung 870 SATA SSD (used for media for editing current projects)

E: drive: 2TB Samsung 870 SATA SSD

F: drive: 6TB WD 7200 rpm Black HDD 3.5"

Dell Ultrasharp 32" 4K Color Calibrated Monitor

 

LAPTOP:

Dell Inspiron 5310 EVO 13.3"

i5-11320H CPU

C Drive: 1TB Corsair Gen4 NVMe M.2 2230 SSD (upgraded from the original 500 GB SSD)

Monitor is 2560 x 1600 @ 60 Hz

Steve_Rhoden wrote on 9/12/2020, 7:56 AM

 

Dexcon..... No, unfortunately Colorista and Denoiser are the only two in the suite that are not compatible with Vegas.

What I use to replace those, and what makes them look like they are not even trying, are 3D LUT Creator and Neat Video.