Magix AVC NV rendering issue

ion-marin wrote on 10/28/2017, 7:58 AM

Hello!

Does anyone know what could be wrong with the MAGIX AVC - NV codec in Vegas 15?

I am trying to render some 4K color-graded footage and I'm having a hard time with it. First of all, MAGIX AVC is supposed to be a really fast codec, but it still renders 4 minutes of video in about 20 minutes of real time. AND the bigger problem: every time it gets to a crossfade, the preview window goes almost completely black (and so does the video in the rendered file from that point on).

It seems like a problem with GPU rendering, but using CPU rendering is not an option, since it takes about 1.5 hours to render the 4-minute video.

*I have an NVIDIA GTX 1080 Ti video card. It runs very cool (I use water cooling) and I have the latest NVIDIA drivers.

So is this a bug, or am I doing something wrong?

Comments

alexander-nNevis wrote on 10/28/2017, 1:55 PM

When you ask about 4K video, please specify the model of the camcorder and the format of the original video: its frame rate, its bit rate, its container, and the version of your video driver. All the settings for 4K video, especially video with high fps and a high bitrate, are still in the testing phase.

As an example: rendering 35 seconds of 4K 4:2:0 video at 59.94 fps and 150 Mbit/s, with 3 transitions, 2 color filters, and Mercalli stabilization, using the NVENC codec at maximum quality, takes 15 minutes (see my configuration in my signature).
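Render speed is easiest to compare across systems as a multiple of realtime. A minimal sketch of that arithmetic, using the numbers from the example above (the function name is just for illustration):

```python
def realtime_ratio(render_seconds: float, footage_seconds: float) -> float:
    """How many times longer than the footage itself the render took."""
    return render_seconds / footage_seconds

# The example above: 35 s of 4K footage rendered in 15 minutes
print(f"{realtime_ratio(15 * 60, 35):.1f}x realtime")  # prints "25.7x realtime"
```

Anything well above ~1x realtime for a hardware-encoded 4K render is worth investigating.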

P.S.

Also check that you have Microsoft .NET Framework 4.6.2 installed.

Last changed by alexander-nNevis on 10/28/2017, 1:58 PM, changed a total of 3 times.

Camcorders: Panasonic Lumix-DC-GH5, Sony DSC-RX10M4, Sony FDR-X3000

Windows 10 Pro 64-bit 2004, Intel Core i9 10920X, ASUSTeK "X299 PRIME EDITION 30" (SOCKET 2066), 64 Gb DDR4-3200 G.Skill RAM, 200GB INTEL SSD (System), 800 Gb INTEL SSD for NLE, Same HDD for data, Creative Sound Blaster AE-9, NVIDIA GeForce RTX 2080Ti.

alexander-nNevis wrote on 10/28/2017, 5:49 PM


Cornico, thanks for your demo. I am shocked! I rendered my video file using your settings, but I got the same damn result: 30 seconds rendered in 15 minutes. I need to understand what the problem is. What are the differences between the two systems?

1. Operating system (if I'm not mistaken, you wrote that you use Windows 10, but I have Windows 7).

2. CPU (my CPU has no graphics core and cannot use Intel QSV). Although that shouldn't matter in this case.

3. The format of the original video. It seems Vegas doesn't like the format of my video. My camcorder effectively outputs RAW 4K, packed into an AVI container and encoded with the Canopus HQX codec at a 1 Gbit/s bitrate! Maybe this is the problem.

If possible, could you upload 10-15 seconds of your source video somewhere so I can test it on my system? I would be very grateful.

Last changed by alexander-nNevis on 10/28/2017, 5:52 PM, changed a total of 1 times.


alexander-nNevis wrote on 10/29/2017, 9:25 AM

Thank you again Cornico for the video!

As I expected, the rendering speed depends heavily on the quality and format of the original video. Using your rendering settings and your video, I got your result.

However, just applying the Mercalli filter to this video doubles the rendering time.

Nevertheless, I'm glad you helped me find the true cause of my long render times, and I'm relieved that my system is OK. 😊

Best regards.


dream wrote on 10/29/2017, 10:36 AM

So is this a bug, or am I doing something wrong?

It doesn't look like a bug to me.
I'm very satisfied with the render options of that codec, and especially with the Intel and NVIDIA possibilities.
My normal renders are HD 50p, and for my average projects those take less than half realtime.
For you I did a test with footage from a DJI drone, in a 4K 50p project of 20 seconds, with color correction (SeMW full color correction) and 4 cross-dissolves, and played and rendered it with that codec while OBS was screen-capturing at 50p. Result: 58 seconds, less than 3x realtime.
My NVIDIA is the GTX 1050 Ti; look for yourself.

You can do it faster with QSV (Intel).

Please check, and update us.

joseph-w wrote on 10/29/2017, 11:03 AM


Cornico: Which NVidia driver number are you using?

matthew-zisk wrote on 11/6/2017, 2:56 PM

I used Vegas for about two decades until recently, when I moved over to DaVinci Resolve after upgrading to UHD video (Vegas at that time -- I think it was 13, but I also tried 14 -- could not handle UHD even on my Win 10 Pro (64-bit) HP Z840 with a Xeon E5-2650 v3 and 64 GB RAM).

With the marketing promise to take advantage of a GPU in a meaningful way, I upgraded to Vegas 15.

I have an NVIDIA Quadro M6000 24GB card (in addition to an M4000) dedicated to rendering, and it shows up and is selected for video acceleration on the Video tab of the Preferences menu.

Any idea why rendering using the MAGIX AVC Internet HD codecs (at various resolutions, with or without NVIDIA NVENC) uses only about 5% of the GPU (it peaks briefly at 9%, then levels off at about 4 or 5%)? The CPU is maxing out (all 20 virtual cores) during the render, so it seems as though the codec is not using the GPU.
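One thing worth checking when GPU load looks suspiciously low (a suggestion, assuming the NVIDIA driver's `nvidia-smi` tool is on your PATH): Task Manager's default GPU graph shows the 3D engine, not the video encoder, so NVENC activity can be invisible there. `nvidia-smi dmon` breaks utilization down per engine:

```shell
# Print per-engine GPU utilization once per second while the render runs:
# "sm" = compute/3D load, "enc" = NVENC hardware encoder, "dec" = NVDEC decoder.
nvidia-smi dmon -s u -d 1
```

If "enc" stays near 0% during an NVENC render, the encoder really isn't being used.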

Rendering times are pretty pathetic, I'll add, in contrast to DaVinci Resolve: 25-30 minutes to render in Vegas what takes about 3-5 minutes in Resolve -- same footage, same resolutions, same LUT to convert Slog3/Cine3 to 709. Resolve uses between 50% and 70% of the GPU, I'll add.

Is there another codec in Vegas that takes real advantage of a GPU?

Former user wrote on 11/7/2017, 12:01 AM

DaVinci Resolve is able to use about 57% of my GTX 1070 GPU; Vegas only about 25-30%. The max I've seen is 37%.

When I do a straight render to MainConcept AVC -- basically a transcode, doing nothing to the video -- GPU use is only 5%, as there's almost no video processing to be done. It's the plugins that use most of the GPU, and only if the plugin is GPU-accelerated. There are some that aren't.

You sound like you have a commercial setup, so you would never use NVENC hardware encoding, as its quality is noticeably lower than a software encode. A hardware encode without plugins takes 55 seconds, compared to 2m15s for software MainConcept AVC.

So I think something is wrong with your setup, though I can't imagine your 25-minute software render will ever match Resolve's 3m30s. That said, if your LUT conversion is GPU-accelerated and you switch to NVENC hardware encoding -- and you currently have NO GPU video processing or encoding -- it will get much faster. The quality of an NVENC encode is usable, but not ideal.

liork wrote on 11/7/2017, 3:34 AM

Is there another codec in Vegas that takes real advantage of a GPU?

NVENC rendering should work on your system and is miles faster than CPU-only rendering. Maybe it's a matter of drivers?