VP17 - State of GPU Encoding and Playback?

rtbond wrote on 10/30/2019, 1:30 PM

I have VP 15 and a 10-year-old desktop. I plan to update both and wanted to get some VP17 perspectives on the practical benefits of GPU-based rendering and playback for 1080p and 4K content (AVC and HEVC). I am considering an Nvidia RTX 2070 Super or GTX 1660 GPU, with an 8+ core CPU (~64 GB RAM, NVMe M.2-based storage).

I would generally not be applying a large number of effects (primarily titles, transitions, etc.).

So, is GPU-based encoding and playback mainstream, or are the benefits uneven?

Thanks!

Rob Bond

My System Info:

  • Vegas Pro 22 Build 194
  • OS: Windows 11.0 Home (64-bit), Version: 10.0.26100 Build 26100
  • Processor: i9-10940X CPU @ 3.30GHz (14 core)
  • Physical memory: 64GB (Corsair Vengeance LPX 64GB (2 x 32GB) DDR4 DRAM 3200MHz C16 memory kit)
  • Motherboard Model: MSI x299 Creator (MS-7B96)
  • GPU: EVGA GeForce RTX 2070 SUPER XC ULTRA (Studio Driver Version =  536.40)
  • Storage: Dual Samsung 970 EVO 1TB SSD (boot and Render); WDC WD4004FZWX, 7200 RPM (media)
  • Primary Display: Dell UltraSharp 27, U2723QE, 4K monitor with 98% DCI-P3 and DisplayHDR 400 with Dell Display Manager
  • Secondary Display: LG 32UK550-B, entry-level 4k/HDR-10 level monitor, @95% DCI-P3 coverage

Comments

j-v wrote on 10/30/2019, 2:07 PM

So, is GPU-based encoding and playback mainstream, or are the benefits uneven?

It also depends on whether or not you have plenty of time.
I have the time and render mostly to FHD AVC and HEVC with the Magix codecs, but on the desktop in my signature it goes pretty fast and looks good.
A 50p UHD project with 3 timelines of 4K HEVC GoPro Hero7 files and titles previews in realtime at Best (Full) quality without proxies, and rendering to FHD AVC 50p runs at 0.76x realtime (0.95x for HEVC) with NVENC encoding and NVDEC decoding enabled.
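To make those speed factors concrete: a factor below 1x means the render takes longer than the timeline's duration. A quick sketch of the arithmetic, assuming the factor is speed relative to realtime (the 10-minute timeline is just an illustrative assumption):

    def render_minutes(timeline_minutes: float, realtime_factor: float) -> float:
        """Wall-clock render time for a given speed factor relative to realtime."""
        return timeline_minutes / realtime_factor

    # Using the figures above for a hypothetical 10-minute timeline:
    print(render_minutes(10, 0.76))  # FHD AVC:  ~13.2 minutes
    print(render_minutes(10, 0.95))  # FHD HEVC: ~10.5 minutes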

with kind regards
Marten

Camera: Pan X900, GoPro Hero7 Black, DJI Osmo Pocket, Samsung Galaxy A8
Desktop: MB Gigabyte Z390M, W11 home version 24H2, i7 9700 4.7GHz, 16 GB DDR4 RAM, GeForce GTX 1660 Ti with Studio driver 566.14 and Intel HD Graphics 630 with driver 31.0.101.2130
Laptop: Asus ROG Strix G712L, W11 home version 23H2, CPU i7-10875H, 16 GB RAM, NVIDIA GeForce RTX 2070 with Studio driver 576.02 and Intel UHD Graphics 630 with driver 31.0.101.2130
Vegas software: VP 10 to 22 and VMS(pl) 10, 12 to 17.
TV: LG 4K 55EG960V

My slogan is: BE OR BECOME A STEM CELL DONOR!!! (because it saved my life in 2016)

Musicvid wrote on 10/30/2019, 2:15 PM

... get some VP17 perspectives on the practical benefits of GPU-based rendering and playback for 1080p and 4K content (AVC and HEVC).

It's faster.

So, is GPU-based encoding and playback mainstream, or are the benefits uneven?

It is mainstream with consumers, and the benefits for producers are uneven, quality being the undisputed loser in this decade. Start around QC 14 (constant quality) if you must have the product now, but the files will probably be larger than x264/x265 software encodes of comparable quality.
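For readers who want to see this size/quality trade-off for themselves outside of Vegas, here is a minimal sketch that encodes the same clip with NVENC at constant quality and with x264 at a roughly comparable CRF. It assumes ffmpeg with NVENC support is installed and on PATH; clip.mp4 is a hypothetical test file, and ffmpeg's -cq scale is not identical to the QC slider in Vegas's render templates:

    import subprocess

    SOURCE = "clip.mp4"  # hypothetical test clip; substitute your own footage

    # NVENC hardware encode at constant quality (lower -cq = higher quality).
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-c:v", "h264_nvenc", "-rc", "vbr", "-cq", "14", "-b:v", "0",
        "-an", "nvenc_cq14.mp4",
    ], check=True)

    # x264 software encode at a roughly comparable quality target.
    subprocess.run([
        "ffmpeg", "-y", "-i", SOURCE,
        "-c:v", "libx264", "-preset", "slow", "-crf", "18",
        "-an", "x264_crf18.mp4",
    ], check=True)

    # Compare the two file sizes: for similar visual quality the NVENC
    # output is typically the larger file, which is the trade-off above.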

rtbond wrote on 10/30/2019, 2:20 PM

It is mainstream with consumers, and the benefits for producers are uneven, quality being the undisputed loser in this decade.

Would you elaborate on the quality issue? Are you referring to the encode quality if you use GPU-based encoding (relative to CPU-based) for the same bit rate?

rtbond wrote on 10/30/2019, 2:44 PM


A 50p UHD project with 3 timelines of 4K HEVC GoPro Hero7 files and titles previews in realtime at Best (Full) quality without proxies
Is the real-time playback performance largely attributable to the GPU? I am guessing yes.

j-v wrote on 10/30/2019, 2:50 PM

Is the real-time playback performance largely attributable to the GPU? I am guessing yes.

I think so, because the option (new in VP17) under File I/O to use the NVDEC decoder of my GTX 1660 Ti makes a big difference in playback performance.
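One way to check how much the decoder alone contributes, outside of Vegas, is to time a decode-only pass with and without NVDEC. A rough sketch, assuming an ffmpeg build with NVDEC support and a hypothetical 4K HEVC clip named gopro.mp4:

    import subprocess
    import time

    SOURCE = "gopro.mp4"  # hypothetical 4K HEVC test clip

    def decode_seconds(hwaccel=None):
        """Time a decode-only pass; output is discarded, nothing is encoded."""
        cmd = ["ffmpeg", "-v", "error"]
        if hwaccel:
            cmd += ["-hwaccel", hwaccel]
        cmd += ["-i", SOURCE, "-f", "null", "-"]
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        return time.perf_counter() - start

    print(f"CPU decode:   {decode_seconds():.1f} s")
    print(f"NVDEC decode: {decode_seconds('nvdec'):.1f} s")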


Musicvid wrote on 10/30/2019, 3:20 PM

Would you elaborate on the quality issue?

1. I did elaborate, above. Bit-for-bit, it sucks next to x264/x265.

Are you referring to the encode quality if you use GPU-based encoding (relative to CPU-based) for the same bit rate?

2. Yes. I do use it for quick previews, proofs, and dailies. Not anything I would charge a customer for.

rtbond wrote on 10/30/2019, 3:54 PM

Would you elaborate on the quality issue?

1. I did elaborate, above. Bit-for-bit, it sucks next to x264/x265.

Are you referring to the encode quality if you use GPU-based encoding (relative to CPU-based) for the same bit rate?

2. Yes. I do use it for quick previews, proofs, and dailies. Not anything I would charge a customer for.

Perfectly clear. Thanks!

Musicvid wrote on 10/30/2019, 6:57 PM

It's a personal opinion, of course. But my first tests looked so bad that I didn't bother analyzing the data. That may be the only time anyone hears me say that.

NickHope wrote on 10/31/2019, 12:51 AM

This thread shows that AVC encoding with AMD's VCE can actually be very good: https://www.vegascreativesoftware.info/us/forum/cpu-render-vs-vce--114009/

After reading that, read this, and the posts it links to: https://www.vegascreativesoftware.info/us/forum/nvenc-avc-rendering-quality--117269/
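For anyone running their own comparisons like the threads above do, an objective metric is more repeatable than eyeballing stills. A minimal sketch that scores an encode against its source with VMAF; it assumes an ffmpeg build compiled with libvmaf, and both file names are hypothetical:

    import subprocess

    REFERENCE = "master.mp4"   # hypothetical high-quality source
    ENCODED = "nvenc_out.mp4"  # hypothetical hardware encode to score

    # libvmaf takes the distorted clip as the first input and the
    # reference as the second; the VMAF score is printed to the log.
    subprocess.run([
        "ffmpeg", "-i", ENCODED, "-i", REFERENCE,
        "-lavfi", "libvmaf", "-f", "null", "-",
    ], check=True)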

Former user wrote on 10/31/2019, 5:59 AM

NVENC has different settings depending on the application. I know that at 6 Mbit/s for 1080p60 streaming with fast motion, NVENC (Pascal) falls apart and x264 fast is noticeably better to anyone's eye; but NVENC used for recording enables look-ahead with dynamic B-frames, so it should do better.

NVENC can be an acceptable solution at higher bitrates, such as uploads to video-sharing sites where re-encoding occurs anyway.

It's much less acceptable where you want to encode at end-user, mass-distribution bitrates, e.g. Facebook, Twitter, Instagram, YouTube, Vimeo, or even higher-bitrate platforms like Netflix.
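To illustrate the recording-style settings mentioned above, here is roughly how they map onto ffmpeg's NVENC encoder. A hedged sketch: the source file name is hypothetical, and -b_ref_mode requires a Turing-or-newer GPU (the Pascal cards discussed above don't support it):

    import subprocess

    # 6 Mbit/s 1080p60 NVENC encode with the "recording" features enabled:
    # rate-control look-ahead plus B-frames usable as reference frames.
    subprocess.run([
        "ffmpeg", "-y", "-i", "gameplay.mp4",          # hypothetical source
        "-c:v", "h264_nvenc",
        "-b:v", "6M", "-maxrate", "6M", "-bufsize", "12M",
        "-rc-lookahead", "32",                         # look-ahead window (frames)
        "-bf", "3", "-b_ref_mode", "middle",           # dynamic B-frame refs
        "-spatial-aq", "1", "-temporal-aq", "1",       # adaptive quantization
        "recorded.mp4",
    ], check=True)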

Musicvid wrote on 11/1/2019, 11:13 AM

I don't have any issues with the comments posted (or deleted) in this thread or the two linked by Nick, including the things I said. Relative differences between hardware encoders are both acknowledged and appreciated. It seems like only a year or so ago that QSV was listed at the top of the heap.

Based on the differences we've discussed, I went ahead and sought out side-by-side comparisons, and I still wasn't blown away. As I said, this subject is one big departure from my "empirical" attitude, because I'm pretty certain I'll never own a card with a spiffy on-board GPU. If I need a fast cut to send to someone, QSV is spiffy enough. (Note to self: revisit these conjectures in two years.)
Based on the differences that we've discussed, I went ahead and sought out side-by-side comparisons, and I still wasn't blown over. As I said, this subject is one big departure from my "empirical" attitude, because I'm pretty certain I'll never own a card with a spiffy on-board GPU. If I need a fast cut to send to someone, QSV is spiffy enough. (Note to self: Revisit conjectures in two years.)