Denoise is too slow to use

Comments

walter-i. wrote on 11/22/2020, 11:30 AM

@john-rappl

Would a more light sensitive camera be an option?

john-rappl wrote on 11/22/2020, 1:28 PM

I have decent cameras: a Panasonic HC-X1, a Sony NX-80, and a full-frame EOS R. The video is decent; I'm looking for the last 5% here. I generally don't have ideal conditions, and in most cases certainly not studio/set lighting.

I also have some camera lights but in many cases they cannot be used for many different reasons.

I'd just like the Denoise feature shown in the marketing material to work in a usable manner.

Musicvid wrote on 11/22/2020, 9:17 PM

It's of course better to take care of lighting problems at the shoot, rather than attempt it in post. Now that you're aware of the state of the art and choices available, you can be thankful for the choices you do have, and not belabor the collateral. I'm sure it will all be done with AI and THz processors someday, ...

Here's a teaser for stills -- I trialed it and am impressed with the spatial results. They look great but are structurally dissimilar as hell compared to the originals.

https://topazlabs.com/denoise-ai-2/

john-rappl wrote on 11/22/2020, 9:55 PM

Yes, but not an option in many cases.

Musicvid wrote on 11/22/2020, 10:22 PM

😶

lenard wrote on 11/23/2020, 1:05 AM

It's of course better to take care of lighting problems at the shoot, rather than attempt it in post. Now that you're aware of the state of the art and choices available, you can be thankful for the choices you do have, and not belabor the collateral. I'm sure it will all be done with AI and THz processors someday, ...

Here's a teaser for stills -- I trialed it and am impressed with the spatial results. They look great but are structurally dissimilar as hell compared to the originals.

https://topazlabs.com/denoise-ai-2/

I have tried their video upscaler, though not in upscale mode; I had hoped the AI would excel at noise reduction. With the video I gave it, the results were poor compared to Resolve and Neat Video, and it took 7x longer than Neat Video while using a huge amount of GPU. I don't normally see my GPU at 70 degrees.

lenard wrote on 11/23/2020, 4:40 PM

I would actually say don't use this software; it's dangerous. I was completing a project in Resolve yesterday, with a render time of 15 minutes at 100% 3D GPU load due to the FX chain. The GPU was stable at 50°C, but this Topaz software can make it run at 72°C, and I've never seen temperatures like that.

Grazie wrote on 11/23/2020, 5:47 PM

Just had an evening session denoising some crappy footage I shot. Lotsa noise. I used NV with the new MagicYUV codec thru' HOS RE - blisteringly fast. The Ole MONSTA! fans were revving and the CPU was shoveling the frames through. I used a combo of temporal and spatial variables, with the outcome looking great.

G

RogerS wrote on 11/23/2020, 7:49 PM

I wouldn't worry about GPU heat. My laptop GPU regularly gets to around 70°C. NVIDIA states they can run between 40°C and 90°C. If it gets too hot, the system will automatically protect itself:

NVIDIA GPUs are designed to operate reliably up to their maximum specified operating temperature. This maximum temperature varies by GPU.  Refer to the nvidia.com product page for individual GPU specifications. If a GPU hits the maximum temperature, the driver will throttle down performance to attempt to bring temperature back underneath the maximum specification. If the GPU temperature continues to increase despite the performance throttling, the GPU will shutdown the system to prevent damage to the graphics card.

https://nvidia.custhelp.com/app/answers/detail/a_id/2752/~/nvidia-gpu-maximum-operating-temperature-and-overheating
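
For anyone who'd rather watch temperatures during a render than guess, the NVIDIA driver ships with the nvidia-smi command-line tool, which can report them directly. A minimal sketch in Python (assumes nvidia-smi is on the PATH; NVIDIA cards only):

```python
import subprocess

def parse_temps(output: str) -> list[int]:
    """Parse one core temperature (deg C) per GPU from nvidia-smi CSV output."""
    return [int(line.strip()) for line in output.splitlines() if line.strip()]

def gpu_temps() -> list[int]:
    """Sample the current core temperature of every NVIDIA GPU."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_temps(out)
```

Calling gpu_temps() every few seconds during a render gives a simple temperature log you can compare against the card's specified maximum.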

lenard wrote on 11/23/2020, 9:12 PM

I was thinking of the advice not to buy ex-crypto-mining cards, which would have been run just as hard, though that's more a warning about 100% load for weeks and months, possibly with poor ventilation. This AI software is more likely to cook, or force a shutdown of, a power supply where previously the system had no problem.

RogerS wrote on 11/23/2020, 9:37 PM

I don't think you can "cook" a card no matter what you do. Systems read the temperature for CPU and GPU and throttle and then shut down before damage can occur. Software like this would be run intermittently, anyway. Rendering, performance benchmarks, and gaming can all put high load on a CPU and GPU. They are designed for it.

On my laptop I get heavy throttling (CPU frequency drops from 3.4 GHz to 700 MHz) when the CPU and GPU are used heavily and system temperatures get too hot. The fans run and performance is reduced until temperatures drop. Then the cycle repeats. It can be mitigated by undervolting the CPU and cleaning the fans.

Crypto mining specialty systems may not have the same safeguards in place; I don't know. This article suggests such concerns are unfounded, as long as the GPU's ventilation system is working:

Despite rumors to the contrary, cryptomining does not degrade your GPU anymore than high-performance gaming would… which is to say, not much at all.

https://medium.com/@saladchefs/does-mining-for-cryptocurrency-damage-my-gpu-5a74827a0742

lenard wrote on 11/23/2020, 10:13 PM

I don't think you can "cook" a card no matter what you do. Systems read the temperature for CPU and GPU and throttle and then shut down before damage can occur.

I was mentioning my other concern: the power supply. You may have had a perfectly stable system for the longest time, then this AI software pulls a record power draw, causing premature capacitor failure and possibly system shutdown. It draws all that power yet, in my limited testing with limited AI models and settings, adds no improvement over regular noise reduction, and it is very slow compared to Neat Video.

People rave about Topaz Gigapixel for uprezzing images to recover detail and reduce noise, but this software, used purely for noise reduction of video, is not something Vegas people should use, especially if you're not certain about your power supply's quality or wattage. Mine is 450 W and has never been a problem, but using this software could be fatal.

Illusion wrote on 11/24/2020, 1:59 PM

I did a test with Denoise on a 1080p video on my system out of curiosity. With my RTX 2070 I was getting around 1.8 fps, which is pretty bad. I then tried a CPU denoise by disabling "GPU acceleration of video processing" in Preferences > Video. It's one of the first times I've seen all 12 hyperthreaded cores this busy, and I got only 0.1 fps! The GPU is still better.
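
For scale, those two figures put the GPU path at roughly an 18x advantage, even though 1.8 fps is itself painful. A quick check using the numbers from the test above:

```python
# Figures from the test above: RTX 2070 vs. 12-core CPU denoise.
gpu_fps = 1.8
cpu_fps = 0.1

speedup = gpu_fps / cpu_fps          # how much faster the GPU path is
print(f"GPU denoise is {speedup:.0f}x faster than CPU denoise")
```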

  • ASUS ROG Strix X570-E
  • Ryzen 9 3900x 12-core
  • 128GB RAM (4x32GB)
  • Nvidia RTX 2070 8GDDR6
  • 1TB WD Black NVMe M.2 for OS/Prog
  • 1TB WD Black NVMe M.2 for Media
  • 1TB/2TB/4TB SATA3 SSDs for projects/media
  • 4TB WD Red drive for local cache
  • 10TB EXOS Enterprise, 14TB Toshiba drive for local cache
  • 32in 4K main monitor, 24in 1080 second
  • Win 11 Pro
  • 28TB NAS for long term archive storage
  • Sony a6000
  • Sony A7C
  • GoPro Hero 8
  • GoPro Hero 11
  • Sony BRAVIA XR X90J 75" 4K HDR10
  • Nvidia Shield TV Pro media player (GigE wired)
  • JBL Bar 5.1

Reyfox wrote on 11/24/2020, 4:12 PM

@Illusion ouch!!!! 0.1fps??? That's enough to force you to buy NeatVideo!

Newbie😁

Vegas Pro 22 (VP18-21 also installed)

Win 11 Pro always updated

AMD Ryzen 9 5950X 16 cores / 32 threads

32GB DDR4 3200

Sapphire RX6700XT 12GB Driver: 25.5.1

Gigabyte X570 Elite Motherboard

Panasonic G9, G7, FZ300

Grazie wrote on 11/25/2020, 1:41 AM

NV + HOS RE + MagicYUV + i9 is stunningly fast. How do I know this? Rapid FPS and my PC fans going into overdrive 😂! And watching Program Manager hitting upper-80s % redrawing at speed. Luv it. 😎

michael-harrison wrote on 11/25/2020, 1:47 AM

@Grazie FPS was?

System 1:

Windows 10
i9-10850K 10 Core
128.0G RAM
Nvidia RTX 3060 Studio driver [most likely latest]
Resolution: 3840 x 2160 @ 60 Hz
Video Memory 12G GDDR5

 

System 2:

Lenovo Yoga 720
Core i7-7700 2.8Ghz quad core, 8 logical
16G ram
Intel HD 630 gpu 1G vram
Nvidia GTX 1050 gpu 2G vram

Grazie wrote on 11/25/2020, 2:45 AM

@Grazie FPS was?

@michael-harrison - It was going THAT fast, it was all a bluuuurrrr! I’ll do it again, just for you, and report back 😉.

Grazie wrote on 11/25/2020, 3:56 AM

@michael-harrison - I'm not being shown FPS in the Render Tool? Weird. Anywhooos, if my maths ain't wonky, that's about 6.93 FPS. Here are the numbers:

Running 29.97 fps media

Event length: 5 min 15 sec = 315 sec ≈ 9,441 frames @ 29.97 fps

Render duration: 22:43 = 1,363 seconds

Or: ≈ 6.93 FPS

So, roughly, 5 min 15 sec took 22 min 43 sec to render, or about 4.3x longer than real time. So an hour would come in at around 4 hours 20 minutes.

Please check my maths!
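
Grazie's arithmetic drops straight into a few lines of Python for anyone who wants to re-check it:

```python
SOURCE_FPS = 29.97
event_secs = 5 * 60 + 15                   # 5 min 15 sec of media -> 315 s
frames = round(event_secs * SOURCE_FPS)    # about 9,441 frames
render_secs = 22 * 60 + 43                 # 22:43 render time -> 1,363 s

render_fps = frames / render_secs          # effective render speed
slowdown = render_secs / event_secs        # multiple of real time

print(f"{render_fps:.2f} fps, {slowdown:.2f}x real time")
```

This comes out at about 6.93 fps and 4.33x real time.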

Hulk wrote on 11/25/2020, 10:19 AM

I don't believe this thread has addressed the OP's primary concern. It's not that the denoise algorithm is inherently slow, and it's not the quality of the output; it's simply that Vegas isn't utilizing a significant portion of the available compute, whether CPU or GPU.

It seems to me this is a valid concern. When you are waiting 10 minutes for a 14-second clip to render while CPU and GPU loading is under 5%, I think it's safe to say there is a problem. Perhaps it's simply the OP's setup, or something deeper in Vegas?

Would be nice for one of the developers to step in and provide some "under the hood" details on this.

Grazie wrote on 11/25/2020, 10:25 AM

Would be nice for one of the developers to step in and provide some "under the hood" details on this.

@Hulk - Indeed!

john-rappl wrote on 11/25/2020, 11:24 AM

@Hulk, thank you

Musicvid wrote on 11/25/2020, 1:16 PM

Mapping every pixel to spatial and temporal matrices simultaneously requires a lot of thinking. I tried it last night, and didn't sleep a wink.

lenard wrote on 11/25/2020, 6:55 PM

I don't believe this thread has addressed the OP's primary concern. It's not that the denoise algorithm is inherently slow, it's not the quality of the output, it's simply that Vegas isn't utilizing a significant portion of the available compute, whether it be CPU or GPU.

It seems to me this is a valid concern. When you are waiting 10 minutes for a 14 second clip to render while CPU and GPU loading is less than 5% I think it's safe to say there is a problem. Perhaps it's simply the OP's setup or something deeper in Vegas?

Would be nice for one of the developers to step in and provide some "under the hood" details on this.

I noted before that Vegas, much like Premiere Pro, uses two different GPU engines: Cuda (for Nvidia cards) and 3D, but Windows by default only shows you 3D.

This ran my GPU at 55°C, which is as hot as it gets under 100% 3D load (the Topaz AI stuff raises temps by almost 20 degrees, but that's unusual), so it's possible it is using all of your GPU; I don't know enough about this stuff. Or should more of that Cuda activity be in 3D, meaning the card is inefficiently wasting power?
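
On the Task Manager point: Windows exposes every GPU engine (3D, Cuda/Compute, Copy, Video Decode, ...) through performance counters, so you can read the Cuda load without switching Task Manager's graph dropdowns. A rough Windows-only sketch using the stock typeperf tool (the exact counter instance names vary by driver and machine):

```python
import csv
import io
import subprocess

COUNTER = r"\GPU Engine(*)\Utilization Percentage"

def engine_utilization(csv_text: str) -> dict[str, float]:
    """Map each GPU-engine counter path to its sampled utilization %."""
    # typeperf emits CSV: a header row of counter paths, then sample rows.
    rows = [r for r in csv.reader(io.StringIO(csv_text)) if len(r) > 1]
    header, sample = rows[0], rows[1]
    # Column 0 is the timestamp; the rest are per-engine values.
    return {name: float(value)
            for name, value in zip(header[1:], sample[1:]) if value}

def query_gpu_engines() -> dict[str, float]:
    """Take one typeperf sample of every GPU engine on this machine."""
    out = subprocess.check_output(["typeperf", COUNTER, "-sc", "1"], text=True)
    return engine_utilization(out)
```

Filtering the returned keys for "Cuda" versus "3D" shows where the load actually lands during a render.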

john-rappl wrote on 11/25/2020, 8:25 PM

As I specified in the original post, I have two GPUs enabled: the built-in Intel UHD 630 and an AMD RX 580. Task Manager shows the overall usage of each device in the left-hand pane. When using Denoise for an encode, Task Manager shows 1-2% CPU usage, 10-15% memory usage, 1-3% on the RX 580 (the main display GPU), and 0% on the Intel GPU. Neither GPU is being used much at all, and neither is the CPU. Temps all stay the same as they are at idle.

The GPUs are not being used and neither is the CPU. I don't know what Vegas is doing, but it looks like it is spending most of its time blocked waiting for something instead of doing the work required.