RTX 3080 Ti Disappointing Render Time but Hopeful for Future Update

CrespoFTW wrote on 6/19/2021, 4:17 AM

I got extremely lucky and got an Asus 3080 Ti TUF, so before switching from the 1080 Ti I rendered a few videos in 4K60 with MAGIX AVC/AAC MP4 (NVENC) to see the render time difference.

I rendered a 5:40-long video using the 1080 Ti, which took 8:40 to complete.

But with the 3080 Ti it took 9:09.
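
That's 340 seconds of video taking 520 seconds on the old card vs. 549 seconds on the new one, so the 3080 Ti actually came out about 6% slower.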

 

I'm just surprised that the GPU isn't supported, even with the amount of power it has. A quick question, but what do these presets actually mean? They just seem very confusing to me. I tend to use "High Quality", but I tried "Lossless", which seemed to render faster.

 

Sorry for the odd post. I know the CPU is more important, but I just thought the GPU would have had a small impact on performance.

 

My Specs

CPU: Ryzen 3900X

RAM: 64GB 3200MHz

GPU: RTX 3080 Ti

Comments

j-v wrote on 6/19/2021, 4:21 AM

What is the program and build number you are using?
Which driver version is installed for your new GPU?

Kind regards,
Marten

Camera: Pan X900, GoPro Hero7 Black, DJI Osmo Pocket, Samsung Galaxy A8
Desktop: MB Gigabyte Z390M, W11 Home version 24H2, i7 9700 4.7GHz, 16GB DDR4 RAM, GeForce GTX 1660 Ti with Studio driver 566.14 and Intel HD Graphics 630 with driver 31.0.101.2130
Laptop: Asus ROG Strix G712L, W11 Home version 23H2, CPU i7-10875H, 16 GB RAM, NVIDIA GeForce RTX 2070 with Studio driver 576.02 and Intel UHD Graphics 630 with driver 31.0.101.2130
Vegas software: VP 10 to 22 and VMS(pl) 10, 12 to 17.
TV: LG 4K 55EG960V

My slogan is: BE OR BECOME A STEM CELL DONOR!!! (because it saved my life in 2016)

 

JN- wrote on 6/19/2021, 4:32 AM

@CrespoFTW Nvenc render settings: items 2 and 7 are pretty similar and good, see chart. I use a Preset of HQ and an RC mode of VBR HQ. Some combinations with high data rates typically produce zero-byte files (items 3, 8 and 12). Other combinations are suitable for streaming, but therefore have extremely long seek times and are not suitable for editing (for example item 11).


I found a similar issue with a laptop RTX 3080 card. In VP it appears that these newer cards are not optimised for rendering when the project is non-trivial. Hardware-accelerated playback performance within VP appears fine, though.

I did a render comparison outside of VP and found that the card performed as expected, very well indeed.

I used my VFR2CFR ffmpeg-based util (link via signature) to test with a single video clip, no special effects. You could also use, say, HandBrake or similar; a minimal ffmpeg sketch is below.

My guess is that you are getting the benefit within VP if it's a trivial project, but not if FX are used; then you might as well be using a previous-generation card.
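
If you want a quick timing test outside VP, something like this ffmpeg one-liner would do (a sketch only; clip.mp4 and the 40M bitrate are placeholders, and -preset hq / -rc vbr_hq are the ffmpeg NVENC options corresponding to the Preset HQ / RC mode VBR HQ settings mentioned above):

ffmpeg -benchmark -i clip.mp4 -c:v h264_nvenc -preset hq -rc vbr_hq -b:v 40M -an -f null -

The -benchmark flag prints the elapsed time when the run finishes, and "-f null -" discards the output, so you are timing only the decode plus the NVENC encode.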

Last changed by JN- on 6/19/2021, 4:56 AM, changed a total of 5 times.

---------------------------------------------

VFR2CFR, Variable frame rate to Constant frame rate link to zip here.

Copies Video Converts Audio to AAC, link to zip here.

Convert 2 Lossless, link to ZIP here.

Convert Odd 2 Even (frame size), link to ZIP here

Benchmarking Continued thread + link to zip here

Codec Render Quality tables zip

---------------------------------------------

PC ... Corsair case, own build ...

CPU .. i9 9900K, iGpu UHD 630

Memory .. 32GB DDR4

Graphics card .. MSI RTX 2080 ti

Graphics driver .. latest studio

PSU .. Corsair 850i

Mboard .. Asus Z390 Code

 

Laptop… XMG

i9-11900k, iGpu n/a

Memory 64GB DDR4

Graphics card … Laptop RTX 3080

Howard-Vigorita wrote on 6/19/2021, 1:15 PM

@CrespoFTW you need to post the media info for the footage you shot. I know if it's MOV or MP4 with AVC or HEVC, GPUs can decode it in hardware... which has an impact on render speed. Not sure about MXF clips from Canon... I heard Vegas was working on that if the streams are AVC or HEVC in the container. Also, if it's AVC, I think it has to be 4:2:0 color... the GPU makers are only just starting to support 4:2:2 color, but I think it's only for HEVC at the moment using the Intel Iris/Xe. You can evaluate that by watching the utilization charts in Win10 Task Manager while playing in Vegas. Or better yet, do a transcode with ffmpeg using either the h264_nvenc or hevc_nvenc codec with "-hwaccel auto" ahead of the input stream. Just make sure you have the latest and greatest Nvidia Studio drivers... they were updated specifically for the 3080s.
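
Something like this, for example (just a sketch; the file names and bitrate are placeholders):

ffmpeg -hwaccel auto -i input.mov -c:v hevc_nvenc -b:v 40M -c:a copy output.mp4

The "-hwaccel auto" goes ahead of "-i" so decoding runs in hardware where possible; swap hevc_nvenc for h264_nvenc to test AVC. If the GPU can decode your footage, you'll see the Video Decode engine light up in Task Manager during the run.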

studio-4 wrote on 6/19/2021, 9:32 PM

From Magix's marketing material, they state Vegas has built-in acceleration for AMD GPUs, as if it's a feature exclusive to AMD products (i.e., not NVIDIA). That's why I chose an AMD over NVIDIA for my desktop. Am I missing something here?

asus laptop system specifications:
Asus 17.3" Republic of Gamers Strix G17 model: 77H0ROG1.
Ryzen 9 5900HX 3.3GHz (4.6GHz boost), eight-core CPU.
Nvidia GeForce RTX 3060 (6GB GDDR6).
32GB Crucial 3200MHz DDR4 (x2 16GB 120-pin SO-DIMMs).
512GB M.2 NVMe PCIe SSD (available second M.2 slot).

OS: installed on 7/1/2021:
Windows 10 Home 64-bit; OS version 20H2; build 19042.1052.
Windows Feature Experience Pack 120.2212.2020.0.

asus laptop installed applications:
Vegas Movie Studio 17 Platinum; version 17.0 (build 221); purchased via download 29 May 2021.
Microsoft Edge (default browser; no plug-ins).

asus laptop OpenFX add-ons:
BorisFX Continuum 2021.5 (subscription).
NewBlue Elements 3 Overlay.

HP desktop system specifications:
HP Z440 Intel Xeon E5-1650 v3 3.5GHz (4GHz-boost), quad-core CPU.
32GB DDR4 ECC RAM.
1TB SATA SSD.
AMD Radeon RX470 4GB
AMD Radeon R7200.

OS:
Windows 10 Pro 64-bit; OS version 20H2; build 19042.985.
Windows Feature Experience Pack 120.2212.2020.0.

HP desktop installed applications:
Vegas Movie Studio 17 Platinum; version 17.0 (build 221); purchased via download 29 May 2021.
Blackmagic Design Media Express 2.3 for Windows 10.
WinDV 1.2.3.
Microsoft Edge (default browser; no plug-ins).

HP desktop OpenFX add-ons:
FXhome Ignite Advanced VFX pack.
BorisFX' Stylize Unit 2020.5.
NewBlue Elements 3 Overlay.

cameras/VTRs:
Sony NEX-FS100 Super35 1080p24/50/60 digital-cine camera.
Sony NEX-FS700 Super35 1080p24/50/60/240/960 high-speed digital-cine camera.
Sony NEX-5R APS-C 1080p60 cameras (x3).
Sony DSR450WSL 2/3" 480p24 16:9 DVCAM camera.
Sony VX1000 1/3" 480i60 4:3 miniDV camera.
Sony DSR11 DVCAM VTR.

personal websites:

YouTube channel: modularfilms

photography/lighting website: http://lightbasics.com/

Musicvid wrote on 6/19/2021, 10:45 PM

Look up "RTX 3080 Disappointing" on the internet. It's not a Vegas shortcoming.

FernC wrote on 6/20/2021, 12:00 PM

Look up "RTX 3080 Disappointing" on the internet. It's not a Vegas shortcoming.

This is most likely why the 3000 series performs badly with Vegas:

The RTX 3070 has a far greater CUDA core count, but it’s not quite as simple as the numbers suggest. The RTX 3000-series includes its INT32 cores in the count, as they can also operate as more typical FP32 CUDA cores when needed. That means that the RTX 3070 has 2,944 typical CUDA cores and an additional 2,944 that can be used for the same purpose. The RTX 2080 Ti has 4,352 (FP32) CUDA cores and an additional 4,352 INT32 cores. That means that it has significantly more CUDA cores in total than the RTX 3070, but they are less versatile. They’re also based on a last-generation design and operate at a lower clock speed, so in many games, the RTX 3070 could well pull ahead — especially when it can leverage more of its multi-faceted cores for FP32 calculations.

Vegas doesn't know what to do with the new cores; it can only use the old ones.

TheRhino wrote on 6/20/2021, 4:35 PM

In this post on 12-19-2020, I compared a 3060 Ti to my AMD VEGA 64... For the type of work I do, the new Nvidia was slower than my old AMD GPU... On a 1-minute portion of video from an actual/average paid project, the 3060 Ti rendered it 17% slower. On the "Red Car" sample project discussed on these Vegas forums, the 3060 Ti was 21% slower... And the Nvidia GPU could not render more than (3) instances of Vegas to MP4 at once. Since I like to edit new paid projects on one PC while batch-rendering 4-5 clients' finished projects on a 2nd PC (with a copy of the source files & final VEG), the Nvidia is not able to handle my current workflow... I ended up selling the 3060 Ti to fund a 2nd PC build using a VEGA 56 I got for $200 before the mining craze hit...

All of that said, old GPUs are way overpriced today, including my VEGAs, which now go for $600-$850 used... Also, most apps, including NLEs like DaVinci & Adobe PP, benefit from the new GPUs, as benchmarked by Puget Systems here. So a future release of Vegas could actually utilize the new GPUs to a greater extent, as well as CPUs with more cores...

However, I'm all about best bang/buck NOW since I can always upgrade parts when they are better supported... My 2 main editing workstations have Intel 9900K & 11700K CPUs and VEGA 64 & 56 GPUs. In V18 they are faster than many single systems that cost as much as I paid for both...

 

Workstation C with $600 USD of upgrades in April, 2021
--$360 11700K @ 5.0ghz
--$200 ASRock W480 Creator (onboard 10G net, TB3, etc.)
Borrowed from my 9900K until prices drop:
--32GB of G.Skill DDR4 3200 ($100 on Black Friday...)
Reused from same Tower Case that housed the Xeon:
--Used VEGA 56 GPU ($200 on eBay before mining craze...)
--Noctua Cooler, 750W PSU, OS SSD, LSI RAID Controller, SATAs, etc.

Performs VERY close to my overclocked 9900K (below), but at stock settings with no tweaking...

Workstation D with $1,350 USD of upgrades in April, 2019
--$500 9900K @ 5.0ghz
--$140 Corsair H150i liquid cooling with 360mm radiator (3 fans)
--$200 open box Asus Z390 WS (PLX chip manages 4/5 PCIe slots)
--$160 32GB of G.Skill DDR4 3000 (added another 32GB later...)
--$350 refurbished, but like-new Radeon Vega 64 LQ (liquid cooled)

Renders Vegas11 "Red Car Test" (AMD VCE) in 13s when clocked at 4.9 ghz
(note: BOTH onboard Intel & Vega64 show utilization during QSV & VCE renders...)

Source Video1 = 4TB RAID0--(2) 2TB M.2 on motherboard in RAID0
Source Video2 = 4TB RAID0--(2) 2TB M.2 (1) via U.2 adapter & (1) on separate PCIe card
Target Video1 = 32TB RAID0--(4) 8TB SATA hot-swap drives on PCIe RAID card with backups elsewhere

10G Network using used $30 Mellanox2 Adapters & Qnap QSW-M408-2C 10G Switch
Copy of Work Files, Source & Output Video, OS Images on QNAP 653b NAS with (6) 14TB WD RED
Blackmagic Decklink PCIe card for capturing from tape, etc.
(2) internal BR Burners connected via USB 3.0 to SATA adapters
Old Cooler Master CM Stacker ATX case with (13) 5.25" front drive-bays holds & cools everything.

Workstations A & B are the 2 remaining 6-core 4.0ghz Xeon 5660 or I7 980x on Asus P6T6 motherboards.

$999 Walmart Evoo 17 Laptop with I7-9750H 6-core CPU, RTX 2060, (2) M.2 bays & (1) SSD bay...

FernC wrote on 6/20/2021, 7:50 PM



All of that said, old GPUs are way overpriced today, including my VEGAs, which now go for $600-$850 used... Also, most apps, including NLEs like DaVinci & Adobe PP, benefit from the new GPUs, as benchmarked by Puget Systems here. So a future release of Vegas could actually utilize the new GPUs to a greater extent, as well as CPUs with more cores...

The 2080 Ti is a classic card: about 10% more powerful than the RTX 3070 in software that's fully compatible with the 3070. Vegas is not, so the 2080 Ti is much more powerful in Vegas, yet it sells cheaper than 3070s.

alifftudm95 wrote on 6/21/2021, 3:07 AM

@TheRhino so to make things simple: VEGAS Pro is optimized very well if we use an Intel CPU and an AMD GPU?

 

Editor and Colorist (Kinda) from Malaysia

MYPOST Member

Laptop

MacBook Pro M4 Max

16 Core CPU and 40 Core GPU

64GB Memory

2TB Internal SSD Storage

Anti-Glare 4K HDR Screen

 

PC DESKTOP

CPU: Ryzen 9 5900x

GPU: RTX3090 24GB

RAM: 64GB 3200MHz

MOBO: X570-E

Storage:

C DRIVE NVME M.2 1TB SSD GEN 4

D DRIVE NVME M.2 2TB SSD GEN 4

E DRIVE SATA SSD 2TB

F DRIVE SATA SSD 2TB

G DRIVE HDD 1TB

Monitor: Asus ProArt PA279CV 4K HDR (Bought on 30 August 2023)

Monitor: BenQ PD2700U 4K HDR (RIP on 30 August 2023)

TheRhino wrote on 6/21/2021, 12:47 PM

For Vegas 18, I prefer Intel CPUs & AMD GPUs. Current versions of Vegas thrive on CPU core SPEED vs. core count as long as the CPU is paired with a capable GPU... According to Techgage:

"...a GPU is really important to video encoding, and if for some reason you still wish to stick strictly to CPU-only, you’ll want a beefy chip. Adding a GPU to the mix almost breathes new life into a CPU, allowing both to work in tandem to get the encode done much quicker. In some cases, the faster clock speeds of smaller core-count chips can help propel them high in a chart."

Paired with an RTX 3070 GPU, Techgage demonstrated how the higher-clocked 6- & 8-core Intel CPUs outperformed CPUs with more cores running at slower clocks...

My 9900K & 11700K CPUs can run all cores at 5 GHz stable, & on certain tasks the processors' included UHD 730 & 750 iGPUs also add to the performance. Because the 11700K also has more Instructions per Cycle (IPC), on certain tasks it performs up to 30% faster than my 9900K, although they have the same 5 GHz clock speed...

Regarding GPUs, I expect future releases of Vegas to better utilize Nvidia 3xxx GPUs, as BM DaVinci & Adobe PP have done... However, V18 runs great on older AMD VEGA & Radeon VII GPUs, as noted in another Techgage benchmark. A while back I only paid $350 for my liquid-cooled VEGA 64 & $200 for my fan-cooled VEGA 56... I have no intention of throwing money at new GPUs unless I see significantly better numbers... However, Vegas 19 will likely be released within a couple of months & it may better utilize multi-core CPUs & newer GPUs, so the best bang/buck CPU & GPU could change once we see some new benchmark numbers with V19...

Last changed by TheRhino on 6/22/2021, 9:01 AM, changed a total of 1 times.


FernC wrote on 6/22/2021, 5:18 AM


My 9900K & 11700K CPUs can run all cores at 5 GHz stable, & on certain tasks the processors' included UHD 730 & 750 iGPUs also add to the performance. Because the 11700K also has more Instructions per Cycle (IPC), on certain tasks it performs up to 30% faster than my 9900K, although they have the same 5 GHz clock speed...

Vegas is accelerated by OpenVINO. Gigapixel, a photo enhancer, and Vegas have some things in common: both are fastest with Intel and with the new AMD GPUs rather than the new Nvidia GPUs. Gigapixel, like Vegas, has not been updated to support Nvidia 3000 cards, and AMD CPUs do not support OpenVINO.