Denoise is too slow to use

john-rappl wrote on 11/21/2020, 4:28 PM

I just upgraded from 17 to 18, mainly because I saw the new Denoise plug-in. I was debating buying Neat but decided to upgrade instead - big mistake! The new Vegas Denoise is really way too slow to be of any use in normal video editing!

Machine is an i7-8700K (6 cores), 32GB RAM, all fast NVMe SSDs, built-in Intel UHD 630 + Radeon RX 580 - both enabled.

I've tried a few different usages with the same result; here's a sample: a 14 sec 4K XAVC clip from a Sony NX-80, dropped on a 1080 timeline. The clip takes about 14 seconds to render using MainConcept (software only) to a 1080/8Mbps mp4 file. Add the Denoise plug-in to the clip (luma 1.5, chroma 3.0, sharpen 0) and the same render that took 14 secs before now takes almost 10 minutes - for a 14 sec clip! The only difference is the Denoise Video FX added to the clip!!!
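Back-of-the-envelope, those numbers work out as follows (a sketch assuming ~30 fps for the clip, i.e. about 420 frames):

```python
# Rough slowdown math for the 14 s test clip (assumes ~30 fps, ~420 frames).
clip_frames = 14 * 30            # ~420 frames
baseline_s = 14                  # plain MainConcept software render
denoise_s = 10 * 60              # "almost 10 minutes" with Denoise added

print(f"slowdown: {denoise_s / baseline_s:.0f}x")        # ~43x
print(f"throughput: {clip_frames / denoise_s:.2f} fps")  # ~0.70 fps
```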

Here's the interesting part. If I pull up Task Manager during the render I see usage of - CPU: 2%, Memory: 10%, Disks: 0-1%, Intel GPU: 0-1%, RX 580: 0-1%. So it's not using any resources - what is it doing????

If I try a GPU-accelerated render (either Intel or AMD) it gets slightly faster (7-8 minutes), but again the usage is almost nothing, with the usage for the selected GPU increasing by maybe 1%.

Am I missing something? Is there a secret switch somewhere to make this work? Why isn't it using resources and rendering faster?

Comments

john-rappl wrote on 11/21/2020, 4:37 PM

I understand there are other products. That doesn't mean the built-in should be unusable.

michael-harrison wrote on 11/21/2020, 5:06 PM

"I was debating buying Neat but decided to upgrade instead - big mistake! This new Vegas Denoise is really way too slow to be of any use in normal video editing!"

Hard lesson but maybe you're still within the return window.

When there's a trial, always try the trial before buying.

System 1:

Processor        Intel(R) Core(TM) i7-8700 CPU @ 3.20GHz, 3192 Mhz, 6 Core(s), 12 Logical Processor(s)

BaseBoard Product        ROG STRIX Z390-E GAMING

Installed Physical Memory (RAM)        32.0 GB

Adapter Description        NVIDIA GeForce GTX 1660

Driver Version        Studio <the latest stable>

Resolution        1920 x 1080 x 60 hertz

Video Memory 6G GDDR5

 

System 2:

Lenovo Yoga 720

Core i7-7700 2.8Ghz quad core, 8 logical

16G ram

Intel HD 630 gpu

Nvidia GTX 1050 gpu

Musicvid wrote on 11/21/2020, 7:23 PM

All kernel- and wavelet-based noise reduction filters are terribly slow, in any program. You can have good noise reduction, or you can have fast noise reduction. Neat is good temporally, but it isn't exactly a speed demon.

The same render that took 14 secs before now takes almost 10 minutes - for a 14 sec clip! The only difference is the Denoise Video FX added to the clip!!!

That's actually quite good!!!!!!!!!!!!!!!!!!!!!!!

Play around with NLMeans, QTGMC, Hqdn3d, and other command-line solutions if you don't trust this advice. The problem is your expectations, not what the filter actually does.
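For reference, here's a minimal sketch of how one of those command-line denoisers could be driven through ffmpeg. The filter strengths and file names are illustrative placeholders, not tuned recommendations:

```python
# Sketch: assemble an ffmpeg command using the hqdn3d spatial+temporal denoiser.
# The strength values below are illustrative, not tuned settings.
hqdn3d = "hqdn3d=luma_spatial=4:chroma_spatial=3:luma_tmp=6:chroma_tmp=4.5"
# A heavier (slower) alternative is ffmpeg's nlmeans filter,
# e.g. "nlmeans=s=3:p=7:r=15" (strength, patch size, research window).

cmd = ["ffmpeg", "-i", "input.mp4", "-vf", hqdn3d, "-c:a", "copy", "output.mp4"]
print(" ".join(cmd))
```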

lenard wrote on 11/21/2020, 8:39 PM

One thing I noticed is that it's using 100% of the GPU's CUDA engine, which is unusual; that's what is creating the performance bottleneck. 1.7 fps on a 1080p project doesn't seem right.

EDIT: Could people with stronger GPUs try this filter - 1080 Ti, 2080 or greater, Vega 7, etc.? I wonder if it bottlenecks all GPUs.

RogerS wrote on 11/21/2020, 10:15 PM

Yes, the noise reduction FX is very slow. Alternatively, you can try a temporal approach using Flicker Control. It introduces blurring, so reduce sensitivity (try around 0.1) depending on how much non-noise motion there is in the image.

Then, under Tools/Video, get motion tracker data. Magix explains in the help: "To get the motion data for the entire image, move the motion tracking region completely out of the image and activate the Track outside mask option"
Then apply this data to the Flicker Control FX.

I'm able to get this to play back in real time with Good/Auto.

john-rappl wrote on 11/21/2020, 10:33 PM

I understand that some FX are slow, but 10 minutes to denoise 14 seconds (420 frames) is terrible! In addition, Task Manager reports no usage, so what is Vegas doing? Spending time waiting on something instead of using the resources to encode? I've been in software for 40+ years (systems and production, with lots of experience optimizing compilers and systems), and it seems like something is very wrong here. I have also used other video editors with built-in denoise plug-ins that are much faster - much...

Dexcon wrote on 11/21/2020, 10:58 PM

Here's a comparison in VP18. The footage is 3840x2160x25fps from a Sony AX100 and is exactly 14 seconds long. Project properties are 1920x1080x25fps and the render was to 1920x1080x25p using the MAGIX Internet mp4 selection. The same video event was used for each render, with each noise reduction FX at default or basic settings.

Vegas Pro Denoiser 13' 15"

Neat Video 26' 32"

BCC Noise Reduction 64' 35"

Using the same shot and the same or similar timeline and render settings but in HitFilm Pro, the render with BCC Noise Reduction (default setting) took just 1' 40" - that is 2.6% of the time it took in Vegas Pro.
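Converting those times to seconds makes the comparison, and the 2.6% figure, easy to check:

```python
def secs(minutes, seconds):
    """Convert a m' s\" duration to total seconds."""
    return minutes * 60 + seconds

renders = {
    "Vegas Pro Denoiser":   secs(13, 15),   # 795 s
    "Neat Video (Vegas)":   secs(26, 32),   # 1592 s
    "BCC NR (Vegas)":       secs(64, 35),   # 3875 s
    "BCC NR (HitFilm Pro)": secs(1, 40),    # 100 s
}

ratio = renders["BCC NR (HitFilm Pro)"] / renders["BCC NR (Vegas)"]
print(f"HitFilm BCC vs Vegas BCC: {ratio:.1%}")  # ~2.6%
```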

Dell Alienware Aurora 11

Windows 10 Home

10th Gen Intel i9 10900KF - 10 cores (20 threads) - 3.7 to 5.3 GHz

NVIDIA GeForce RTX 2080 SUPER 8GB GDDR6 - liquid cooled

64GB RAM - Dual Channel HyperX FURY DDR4 XMP at 3200MHz

C drive: 1TB M.2 PCIe NVMe SSD

D drive: 6TB WD 7200 rpm Black HDD 3.5"

E & F drives: 2 x 2TB Barracuda HDDs 2.5"

 

lenard wrote on 11/21/2020, 11:05 PM

I understand that some FX is slow, 10 minute to denoise 14 seconds (420 frames) is terrible! In addition, task manager reports no usage so what is Vegas doing? Spending time waiting on something instead of using the resources to encode? I've been in software for 40+ years

@john-rappl With an Nvidia GPU, Vegas will use two different engines, CUDA and 3D, but by default Windows only shows 3D. Check the other engine views while using the denoiser - there may be activity under OpenCL or Compute. Your GPU heating up will also confirm it's actually doing something. I'm not saying it's working properly, but your GPU might be doing something.

 

lenard wrote on 11/21/2020, 11:12 PM

Vegas Pro Denoiser 13' 15"

Neat Video 26' 32"

BCC Noise Reduction 64' 35"

HitFilm Pro, the render with BCC Noise Reduction (default setting) took just 1' 40" - that is 2.6% the time it took in Vegas Pro.

I don't know what units you're using. I am assuming this is American freedom unit notation. Can you confirm 13' 15" = 13 minutes and 15 seconds, or is it 13.15 seconds?

Dexcon wrote on 11/21/2020, 11:51 PM

Yes - 13 minutes 15 seconds. Australian (so presumably British) notations of time duration. For clarity, I'll use 'm' and 's' in future.

Last changed by Dexcon on 11/22/2020, 12:05 AM, changed a total of 1 times.


RogerS wrote on 11/22/2020, 12:06 AM

Did you try the method I suggested above to do temporal noise reduction? See what you think for quality and render time.

Grazie wrote on 11/22/2020, 12:11 AM

Neat Video - ‘Nuf said, and move on

Musicvid wrote on 11/22/2020, 1:13 AM

Vegas Pro Denoiser 13' 15"

Neat Video 26' 32"

BCC Noise Reduction 64' 35"

Believe @Dexcon's numbers. CLI filters are even slower. Nick knows too. Not sure the OP understands the bipredictive matrix calculations going on. Both spatial and temporal domains must be considered simultaneously.

I also wonder how it came to be that we were taught "Australian" notation in 1950s rural Nebraska ;?)

Dexcon wrote on 11/22/2020, 2:08 AM

@Musicvid

I also wonder how it came to be that we were taught "Australian" notation in 1950s rural Nebraska ;?)

Just to clarify ... I only meant 'used in Australia'. Most weights, measures, distances, speeds, etc used here are metric. Before conversion to metric during the 1970s, the British Imperial system was in use. I'd be very surprised if AU has ever developed a standard or notation in these areas ... unless it is considered that Crocodile Dundee defined the size of knives in the 1986 Australian movie of the same name.

 

Last changed by Dexcon on 11/22/2020, 3:48 AM, changed a total of 1 times.


lenard wrote on 11/22/2020, 2:12 AM

Vegas Pro Denoiser 13' 15"

Neat Video 26' 32"

BCC Noise Reduction 64' 35"

HitFilm Pro, the render with BCC Noise Reduction (default setting) took just 1' 40" - that is 2.6% the time it took in Vegas Pro.

Yes - 13 minutes 15 seconds. Australian (so presumably British) notations of time duration. For clarity, I'll use 'm' and 's' in future.

I wasn't sure because of the times. As a comparison, I tried Resolve with a 2160p source and a 1080p25, 14-second project. The difference in GPU use is that instead of 100% CUDA, Resolve uses 100% 3D (which doesn't necessarily mean anything), but it completes in 11 seconds (fast mode) and 19 seconds (better mode).

I also tried the same settings with Premiere Pro and Neat Video. Using the recommended CPU+GPU mode for Neat Video, the time was about 1 minute, and CPU only took about 2 minutes. So times are all over the place. Neat Video has the best quality though, so 1 minute of processing for a 14-second 1080p25 clip seems about what it should be.

 

 

Dexcon wrote on 11/22/2020, 2:24 AM

Neat video has best quality

100% agreed - it is by far superior to any other NR VFX that I've used or tested over the years. I only recently got Neat Video and ended up redoing all the NR processing in my current projects where BCC NR had previously been used. And because of the rendering time issue with NR VFX, I usually create a VP project just for NR processing and then render to an intermediate for use on the main project timeline.


john-rappl wrote on 11/22/2020, 8:03 AM

I do understand there is lots of processing going on, but it would get done faster if the resources of the computer were being used instead of left sitting at 1-3% usage on all fronts!

Musicvid wrote on 11/22/2020, 10:15 AM

@Dexcon

It was a joke, mate. I believe the standard notation was almost universal in English-speaking countries then.

Musicvid wrote on 11/22/2020, 10:20 AM

It is Neat Video's temporal smoothing that I think most people prefer. For spatial (frame-based) accuracy, it is maybe not quite as good. It is also a favorite for analog source, which is what it was originally developed for.

JN- wrote on 11/22/2020, 10:20 AM

@john-rappl There's a test method that comes with NV so you can get the most out of your CPU and GPU. I have only ever used NV; I even had a need to reduce some flicker, and it was a great help. Great value - a super tool for what it does.

---------------------------------------------

VFR2CFR, Variable frame rate to Constant frame rate etc

Benchmarking thread

Codec Render Quality tables

---------------------------------------------

PC ... Corsair case, own build ...

CPU .. i9 9900K, iGpu UHD 630

Memory .. 32GB DDR4

Graphics card .. MSI RTX 2080 ti

Graphics driver .. latest studio

PSU .. Corsair 850i

Mboard .. Asus Z390 Code

 

Laptop ... (Acer Predator G9-793-77AC)

CPU .. i7-6700HQ Skylake-H

Memory ..32 GB DDR4, was previously 16 GB

Graphics card .. Nvidia GTX 1070

Graphics driver .. latest studio

john-rappl wrote on 11/22/2020, 10:40 AM

I just tried Flicker Control and I can see where it may be useful for some of the video I deal with. It also seems to be much faster.

That doesn't detract from the fact that Denoise would be unusable in most of my cases. I do lots of recordings of 2-20 minute scenes in poorly lit high school rooms and/or on poorly lit stages. Some shows are 2+ hours with many dim scenes. Upgrading the lighting is not an option - I keep trying, though!

It's not unusual for me to record an event with twenty 3-4 minute performances for chorus or theater departments. At the above rates, denoising would take about 3 hours for each, or 60+ hours for the complete event. I currently turn these around for teaching reviews, or for families to have the videos, in 24 hours or less. 60 hours for just the encoding is not an option.
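Extrapolating from the earlier 14-second test (a 14 s clip taking roughly 10 minutes, i.e. ~43x real time), the event-level numbers come out roughly as stated:

```python
# Extrapolate the 14 s test (~10 min to denoise) to a full event.
slowdown = 600 / 14                     # ~43x real time
clip_s = 4 * 60                         # one 4-minute performance
per_clip_h = clip_s * slowdown / 3600   # hours to denoise one performance
total_h = 20 * per_clip_h               # twenty performances

print(f"~{per_clip_h:.1f} h per performance, ~{total_h:.0f} h for 20")
```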

I guess we will live with a little noise!

Musicvid wrote on 11/22/2020, 11:30 AM

https://www.bhphotovideo.com/explora/video/buying-guide/top-10-camera-video-lights