Comments

fr0sty wrote on 9/19/2020, 11:48 AM

AMD. I am using a Ryzen 7 1800X on my desktop and an i9-9980HK on my laptop, and the laptop doesn't get much, if any, performance advantage in VEGAS over my desktop's 3-year-old CPU.

Last changed by fr0sty on 9/19/2020, 11:48 AM, changed a total of 2 times.

Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64GB 3000mhz DDR4

Geforce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

john_dennis wrote on 9/19/2020, 12:24 PM

@fr0sty

While I might not disagree with your conclusion, I don't think using a processor targeted for low-power work (TDP = 45 W) as an example in a comparison of two desktop CPUs supports your case.

i9-9900K

AMD Ryzen 9 3900X

i9-9980HK

Reyfox wrote on 9/19/2020, 12:53 PM

There is no "standard" for coming up with a TDP for a processor; each company decides how it determines the figure. Although I would think laptop chips run on the less powerful side to save battery life.

Still, Techgage has already done the CPU testing....

https://techgage.com/article/magix-vegas-pro-18-processor-graphics-card-performance/2/

Steve_Rhoden wrote on 9/19/2020, 12:59 PM

The AMD Ryzen 9 3900X is the best one for Vegas Pro....

TheRhino wrote on 9/19/2020, 1:23 PM

Still, Techgage has already done the CPU testing....

https://techgage.com/article/magix-vegas-pro-18-processor-graphics-card-performance/2/

The problem with Techgage's V18 CPU testing is that they turn off GPU rendering, so that particular benchmark doesn't give you the big picture. For VEGAS, you really want to know how well a particular CPU & GPU work together, unless you intend to turn off GPU rendering because you believe there is a quality difference, which I personally have not been able to visually notice after thousands of MP4 renders... (My clients receive the high-quality intermediates along with an easy-to-view MP4 file set, so if they don't like my MP4 renders they can make their own...)

In the case of my 9900K, when GPU rendering is turned ON, I get the benefit of the 9900K's onboard Quick Sync (iGPU) PLUS my PCIe GPU, a VEGA 64 LQ. The end result is MP4 & HEVC render speeds that rival a much pricier AMD 3950X & 2080 Ti combo... For instance, my 9900K @ 5.1 GHz does the RedCar MP4 benchmark in 13 seconds... If I unplug the monitor connected to my iGPU, renders slow a bit, so I know the iGPU is contributing...
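If you want to see that double-barrelled effect outside of Vegas, here is a minimal sketch, assuming an ffmpeg build with both the Intel (h264_qsv) and AMD (h264_amf) hardware encoders enabled; the file names are placeholders:

```python
# Minimal sketch: run an Intel Quick Sync (iGPU) encode and an AMD VCE/AMF (PCIe GPU)
# encode in parallel, to check whether the two engines really do add up.
# Assumes an ffmpeg build with h264_qsv and h264_amf; "source.avi" is a placeholder.
import subprocess, time

JOBS = [
    ["ffmpeg", "-y", "-i", "source.avi", "-c:v", "h264_qsv", "qsv_out.mp4"],  # Intel iGPU
    ["ffmpeg", "-y", "-i", "source.avi", "-c:v", "h264_amf", "amf_out.mp4"],  # AMD PCIe GPU
]

start = time.perf_counter()
procs = [subprocess.Popen(j, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
         for j in JOBS]
for p in procs:
    p.wait()
print(f"Both encodes finished in {time.perf_counter() - start:.1f} s")
```

If the parallel wall time is close to the slower of the two encodes run alone, the engines are genuinely working side by side rather than contending.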

Personally I am waiting to upgrade a 2nd workstation until new CPUs are released, but if I needed something ASAP, especially with TB3, I would likely consider the <$450 10-core 10850K and one of the new Nvidia GPUs, provided one can be purchased at normal retail pricing....

The AMD Ryzen 9 3900X is the best one for Vegas Pro....

Another issue I have with Techgage's benchmarks is that we cannot replicate the tests on our own systems. However, these forums have benchmarks available to all & we're getting similar performance from similar configurations, drivers, etc... The fastest times are from a 9900K & 5700XT combo running V18... Show me a 3900X & currently available GPU (5700XT, VEGA 64, Radeon VII) that completes the RedCar benchmark in <11 seconds or the RedCar 4K benchmark in <25 seconds and I will believe you... Also, I can render (3-4) instances of Vegas at once, render to/import from Thunderbolt 3 RAID devices, etc. with 100% stability... When multiple instances are running, all three (CPU, iGPU & PCIe GPU) contribute to each render instance...
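For anyone who wants a repeatable number of their own, a minimal stopwatch sketch for timing any command-line render step and converting it to an average fps; the render command and frame count below are placeholders, not the actual RedCar project:

```python
# Minimal stopwatch sketch: time any command-line render step and report average fps.
# The command and frame count are placeholders; substitute your own benchmark clip.
import subprocess, time

RENDER_CMD = ["ffmpeg", "-y", "-i", "redcar_source.mp4",
              "-c:v", "libx264", "out.mp4"]  # hypothetical stand-in render
TOTAL_FRAMES = 1798                          # e.g. a 60 s clip at 29.97 fps

start = time.perf_counter()
subprocess.run(RENDER_CMD, check=True,
               stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
elapsed = time.perf_counter() - start
print(f"{elapsed:.1f} s -> {TOTAL_FRAMES / elapsed:.1f} fps average")
```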
 

Last changed by TheRhino on 9/19/2020, 1:48 PM, changed a total of 2 times.

Workstation C with $600 USD of upgrades in April, 2021
--$360 11700K @ 5.0ghz
--$200 ASRock W480 Creator (onboard 10G net, TB3, etc.)
Borrowed from my 9900K until prices drop:
--32GB of G.Skill DDR4 3200 ($100 on Black Friday...)
Reused from same Tower Case that housed the Xeon:
--Used VEGA 56 GPU ($200 on eBay before mining craze...)
--Noctua Cooler, 750W PSU, OS SSD, LSI RAID Controller, SATAs, etc.

Performs VERY close to my overclocked 9900K (below), but at stock settings with no tweaking...

Workstation D with $1,350 USD of upgrades in April, 2019
--$500 9900K @ 5.0ghz
--$140 Corsair H150i liquid cooling with 360mm radiator (3 fans)
--$200 open box Asus Z390 WS (PLX chip manages 4/5 PCIe slots)
--$160 32GB of G.Skill DDR4 3000 (added another 32GB later...)
--$350 refurbished, but like-new Radeon Vega 64 LQ (liquid cooled)

Renders Vegas11 "Red Car Test" (AMD VCE) in 13s when clocked at 4.9 ghz
(note: BOTH onboard Intel & Vega64 show utilization during QSV & VCE renders...)

Source Video1 = 4TB RAID0--(2) 2TB M.2 on motherboard in RAID0
Source Video2 = 4TB RAID0--(2) 2TB M.2 (1) via U.2 adapter & (1) on separate PCIe card
Target Video1 = 32TB RAID0--(4) 8TB SATA hot-swap drives on PCIe RAID card with backups elsewhere

10G Network using used $30 Mellanox2 Adapters & Qnap QSW-M408-2C 10G Switch
Copy of Work Files, Source & Output Video, OS Images on QNAP 653b NAS with (6) 14TB WD RED
Blackmagic Decklink PCIe card for capturing from tape, etc.
(2) internal BR Burners connected via USB 3.0 to SATA adapters
Old Cooler Master CM Stacker ATX case with (13) 5.25" front drive-bays holds & cools everything.

Workstations A & B are the 2 remaining 6-core 4.0ghz Xeon 5660 or I7 980x on Asus P6T6 motherboards.

$999 Walmart Evoo 17 Laptop with I7-9750H 6-core CPU, RTX 2060, (2) M.2 bays & (1) SSD bay...

fred-w wrote on 9/19/2020, 8:34 PM

@TheRhino
Thanks, you've covered more actual ground than most in making REAL WORLD determinations.

 

RedRob-CandlelightProdctns wrote on 3/25/2022, 5:47 PM

Re-opening this thread...

Looking to upgrade my Lenovo laptop ASAP.

Considering a Legion 5 gaming laptop to replace my current machine (Core i7-7700HQ @2.8, Intel graphics + AMD Radeon RX560 w/8GB + 16GB laptop RAM)... two options, both of which will offer a CPU boost of over 3x:

  1. Ryzen 7 5800H, RTX 3060 6GB, 32GB RAM OR
  2. 11th Gen Intel Core i7-11800H, RTX 3060 6GB, 16GB RAM

I'm leaning towards the Ryzen config... thoughts?

 

Vegas 21.300

My PC (for finishing):

CyberPowerPC Intel Core i7-7700K CPU @ 4.2GHz, 64GB mem @ 2133MHz RAM, AMD Radeon RX470 (4GB dedicated) with driver recommended by Vegas Updater (reports as 30.0.15021.11005 dated 4/28/22), and Intel HD Graphics 630 driver version 31.0.101.2112 dated 7/21/22 w/16GB shared memory. Windows 10 Pro 64bit version 10.0.19045 Build 19045.

My main editing laptop:

Dell G15 Special Edition 5521, Bios 1.12 9/13/22, Windows 11 22H2 (10.0.22621)

12th Gen Intel Core i7-12700H (14 cores, 20 logical processors), 32 GB DDR5 4800MHz RAM, Intel Iris Xe Graphics, NVIDIA GeForce RTX 3070 Ti Laptop GPU w/8GB GDDR6 RAM, Realtek Audio

 

 

Former user wrote on 3/25/2022, 9:46 PM

I'd get an Intel 12700H, if it's priced reasonably and is an option with your preferred laptop. For this benchmark, he says the main difference is due to the CPU; you might be able to gauge the GPU factor by comparing the two 12700H results and extrapolating to your CPU choices with the lower-powered GPU.
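As a rough worked example of that extrapolation (all times below are hypothetical placeholders, not actual benchmark entries):

```python
# Rough extrapolation sketch: if two 12700H entries differ only in GPU, their time
# ratio approximates the GPU factor; apply that ratio to a candidate CPU's time with
# the faster GPU to estimate its time with the slower one. All times hypothetical.
t_12700h_fast_gpu = 40.0   # s, 12700H + stronger GPU (placeholder)
t_12700h_slow_gpu = 48.0   # s, 12700H + weaker GPU (placeholder)
gpu_factor = t_12700h_slow_gpu / t_12700h_fast_gpu   # 1.2x slower here

t_candidate_fast_gpu = 44.0  # s, candidate CPU + stronger GPU (placeholder)
print(f"Estimated with weaker GPU: {t_candidate_fast_gpu * gpu_factor:.1f} s")
```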

 

Yelandkeil wrote on 3/26/2022, 1:00 AM

Just for fun, I run an online bench test now and then.

I think such comparisons can go on without end. It depends on how you combine and configure the hardware, assuming you understand something about PCs.

I'm planning to replace my GPU with an RX6800XT, but as time goes on I'm not so sure about investing more than 1000 Euro in such a replacement.

Here you see my online bench test from just minutes ago:

-- Hard&Software for 5.1RealHDR10 --

ASUS TUF Gaming B550plus BIOS3202: 
*Thermaltake TOUGHPOWER GF1 850W 
*ADATA XPG GAMMIX S11PRO; 512GB/sys, 2TB/data 
*G.SKILL F4-3200C16Q-64GFX 
*AMD Ryzen9 5950x + LiquidFreezer II-240 
*XFX Speedster-MERC319-RX6900XT <-AdrenalinEdition 24.12.1
Windows11Pro: 24H2-26100.3915; Direct3D: 9.17.11.0272

Samsung 2xLU28R55 HDR10 (300CD/m², 1499Nits/peak) ->2xDPort
ROCCAT Kave 5.1Headset/Mic ->Analog (AAFOptimusPack 6.0.9403.1)
LG DSP7 Surround 5.1Soundbar ->TOSLINK

DC-GH6/H-FS12060E_HLG4k120p: WB=manual, Shutter=125, ISO=auto/manual
HERO5_ProtuneFlat2.7k60pLinear: WB=4800K, Shutter=auto, ISO=800

VEGASPro22 + XMediaRecode/Handbrake + DVDArchi7 
AcidPro10 + SoundForgePro14.0.065 + SpectraLayersPro7 
K-LitecodecPack17.8.0 (MPC Video Renderer for HDR10-Videoplayback on PC) 

RogerS wrote on 3/26/2022, 2:45 AM

For Vegas I'd rather have the Intel QSV decoding as it is a known quantity and performs well.

Is Core i7-7700HQ @2.8 the old laptop? It's what I have on a 2017 machine.

Here's a Vegas benchmark you can try on your current and new machine. You can also see how different CPUs and GPUs compare: https://docs.google.com/forms/d/1Exbi4K3hbxw6snJuisR1ble-0tCPVNcIcNnx0BAtSIM

TheRhino wrote on 3/26/2022, 9:34 AM

@RedRob-CandlelightProdctns
I'd definitely choose Intel over AMD, especially in a laptop, because IMO Intel CPUs handle the heat better/last longer and support Thunderbolt 3/4 connections, and Vegas can utilize BOTH the Intel iGPU and the additional Nvidia GPU at the same time.

A while back I got a $999 Walmart Evoo 17" laptop with the I7-9750H CPU & Nvidia 2060 GPU that has (2) M.2 slots and (1) 2.5" bay. It's based on the Eluktronics 17" case.

Today you can get the Eluktronics MAX 17 with a I7-11800H & Nvidia 3080 for $1800 USD:
https://www.eluktronics.com/max17-i7h-3080-512ssd-16r/

The 12700H is an even better CPU option, but if you pair it with a lesser GPU to keep the price down, the CPU/GPU combo will not perform as well in Vegas as a slightly lesser CPU with a better GPU. A lot of times Vegas doesn't even max-out the CPU unless I have (2-3) instances of Vegas running at the same time...


Howard-Vigorita wrote on 3/26/2022, 11:09 AM

For Vegas I'd rather have the Intel QSV decoding as it is a known quantity and performs well.

+1 on what @RogerS said. The built-in iGPU added to a PCIe GPU yields a double-barrelled video-processing setup. I have 9900K & 11900K systems; the 11900K is the faster and cooler-running of the two. Newer 12th gen should be even better.

TheRhino wrote on 4/2/2022, 10:43 AM

Here's an example to put the whole (current) Intel vs. AMD CPU matter to rest if your primary NLE is Vegas...

My current client's work is based on 10+ hours of 16mm films that were transferred to AVI and then stabilized in Mercalli 5.0 SOL to Lossless AVI. He wants 4K HEVC, 1080p AVC, and 4K ProRes. If I render the 4K HEVC using an AMD VCE template & the 1080p using an Intel QSV template, Windows Task Manager reports 91% CPU, 87% VEGA 64 & 82% Intel iGPU usage. The VCE files are created at 60 fps at the same time the QSV files are created at 75 fps, so 135 fps combined.
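To put that combined throughput in perspective, a back-of-the-envelope sketch; it assumes 24 fps source material for the film transfers, which is hypothetical since the actual transfer frame rate isn't stated:

```python
# Back-of-the-envelope: two render templates running in parallel vs. back-to-back.
# Assumes 24 fps source for the film transfers (hypothetical; not stated above).
hours, src_fps = 10, 24
frames = hours * 3600 * src_fps            # frames in each output file set
vce_fps, qsv_fps = 60, 75

sequential = frames / vce_fps + frames / qsv_fps    # one render after the other
parallel = max(frames / vce_fps, frames / qsv_fps)  # both at once; slower one gates
print(f"sequential: {sequential / 3600:.1f} h, parallel: {parallel / 3600:.1f} h")
```

Under those assumptions the two file sets finish in about 4 hours in parallel versus roughly 7.2 hours back-to-back, which is why the combined-fps figure matters.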

While both MP4 file sets are batch-rendering, I have my 2nd workstation (11700K / VEGA 56) creating intermediates (ProRes / AVID DNxHR) from a copy of the source video & final VEG shared across an affordable 10G network (using $30 Mellanox PCIe cards...)

I was able to upgrade (2) older I7-980X / Xeon workstations to the 9900K / VEGA 64 & 11700K / VEGA 56 for about $2,000 (combined). Therefore, IMO, Intel makes the best bang/buck CPU for Vegas... The newer 12700K combined with an affordable DDR4 motherboard & memory would make the current best bang/buck combo...

However, IMO Intel currently also offers the fastest Vegas-performing CPU with the recent release of the 12900KS, clocked at 5.5 GHz. Since Vegas thrives on core speed, not core count, and benefits from the internal iGPU, the 12900KS is gonna have some great Vegas numbers, especially when paired with an AMD 6800 XT GPU...

I actually tested a $1300 AMD 6800 XT & was very happy with the increased performance over my VEGA 64 & 56 GPUs. However, I returned it & will wait until the price drops back down to the initial retail release prices of $700-ish... If you are building a new workstation, you could get by with the 12900KS' iGPU for now until GPUs drop in price. The money saved by waiting would actually offset the higher price of the 12900KS & DDR5 memory, etc...

 

Last changed by TheRhino on 4/2/2022, 11:52 AM, changed a total of 2 times.


Former user wrote on 4/2/2022, 11:04 AM

@RedRob-CandlelightProdctns I can't go into details like in the comments above, but I bought an AMD CPU and I'm stuck with it now; I sort of wish I'd gotten an Intel. It seems a lot of FX/software are built with Intel in mind and AMD comes second, so Intel is probably a safer bet. And as shown in the benchmark results, an RX GPU beats my RTX 3090; at the moment the best I can get is about 0:43 secs.

RedRob-CandlelightProdctns wrote on 4/15/2022, 12:11 AM

Today you can get the Eluktronics MAX 17 with a I7-11800H & Nvidia 3080 for $1800 USD:
https://www.eluktronics.com/max17-i7h-3080-512ssd-16r/

The 12700H is an even better CPU option, but if you pair it with a lesser GPU to keep the price down, the CPU/GPU combo will not perform as well in Vegas as a slightly lesser CPU with a better GPU. A lot of times Vegas doesn't even max-out the CPU unless I have (2-3) instances of Vegas running at the same time...

I jumped the gun and purchased the Lenovo with the lovely screen, 32 GB, RTX 3060 and Ryzen 7 5800H. What I found is performance about the same as my 3- or 4-year-old desktop. Not bad.. but not "HOLY (creative) COW!" Think that's going back to Amazon... still have 10 days.

Considering the DELL G15 i7-12700 w/ RTX 3060 and 16GB RAM, and swapping out one or both RAM sticks with Crucial RAM to boost it to 24 or 32 GB (thunderbolt.. yay!). OR.. getting the better display (2560x1440, 400 nit) and RTX 3070 Ti (same processor) would cost $400 more.. wondering if the 3070 Ti would really be worth the extra $$, or only the better display.

Looked at the Eluktronics MAX.. don't want the 17" and definitely would still want to upgrade the RAM.. never heard of Eluktronics before now... looks interesting, but don't love having to wait until mid-May.


RedRob-CandlelightProdctns wrote on 5/15/2022, 11:54 PM

Update:

The Lenovo (32 GB DDR4, RTX 3060, Ryzen 7 5800H) actually performed *excellently* for playback of an MV that contained 8 cameras -- 4@1080p + 3@4K + 1@2K. Full 29.97 fps even at 3x speed! It rendered out a 1-minute segment of my footage (some 4K, some 1080p) using the Magix NVENC codec in 0:33 (avg 53.1 fps).

I returned it and just got in my Dell G15 Special Edition (32GB DDR5, i7-12700H, RTX 3070 Ti). Ran identical tests and found that the new config:

  • Renders 2x faster -- the same render took 0:18 (quick math on both times after this list).. similar results with the other render CODECs I tested too.
  • FFmpeg ran 2x faster and used the full 100% CPU (latest version of ffmpeg)
  • Neat Video was SUPER happy using CPU cores + GPU, giving a 14 fps preview!
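As referenced in the first bullet, a quick arithmetic check of both render times; a 1-minute segment at 29.97 fps is about 1798 frames:

```python
# Quick sanity check: convert the two reported render times into average fps.
# 1798 / 33 ≈ 54.5 fps, close to the 53.1 fps Vegas reported for the Lenovo render.
frames = round(60 * 29.97)   # ~1798 frames in the 1-minute segment
for seconds, label in [(33, "Lenovo (RTX 3060 / 5800H)"),
                       (18, "Dell (RTX 3070 Ti / 12700H)")]:
    print(f"{label}: {frames / seconds:.1f} fps average")
```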

BUT -- playback of that same MV isn't as smooth, bouncing around 27-28 fps and much slower in fast forward. And I only got that speed when Dynamic RAM Preview was 200 -- if I set that to 0 (like I usually do) playback was horribly slow (7fps-ish).

I tried different combinations of Video -> GPU-Acceleration-of-Video-Processing (NVIDIA vs Intel Iris Xe) and File I/O -> Hardware-Decoder-To-Use (NVIDIA vs Intel QSV). And I watched Performance Monitor for the CPU and GPUs to settle before testing, as it seems media is cached and decoded even while sitting idle on the timeline, and I didn't want that to skew the results. Accepting the defaults (NVIDIA for GPU acceleration and Intel for decode), Performance Monitor confirms the Intel is doing the decoding for playback... but I didn't see better performance when the NVIDIA was doing both, either.
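For watching the same thing from a console instead of Performance Monitor, a minimal sketch; it assumes the NVIDIA driver's nvidia-smi tool is on PATH, and that `dmon -s u` reports per-engine utilization (sm/mem/enc/dec columns) as it does in current drivers:

```python
# Minimal sketch: poll NVIDIA per-engine utilization from the console while playback
# runs, to see whether the dGPU's decoder ("dec" column) is actually being used.
# Assumes nvidia-smi is on PATH; "dmon -s u" prints sm/mem/enc/dec utilization.
import subprocess

# -c 10 -> ten one-second samples, then exit
subprocess.run(["nvidia-smi", "dmon", "-s", "u", "-c", "10"], check=True)
```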

 

Gotta say -- I'm surprised and disappointed by the playback, and that the Ryzen+RTX3060 seemed to be smoother. How can this be? Is the Intel Iris Xe the problem?

Last changed by RedRob-CandlelightProdctns on 5/16/2022, 12:04 AM, changed a total of 2 times.


fr0sty wrote on 5/16/2022, 12:39 AM

The Ryzen 5800H's integrated GPU is likely faster for playback decoding than the Intel's. Maybe that is what contributed to the better performance.

Last changed by fr0sty on 5/16/2022, 12:40 AM, changed a total of 1 times.


Former user wrote on 5/16/2022, 1:45 AM
 

BUT -- playback of that same MV isn't as smooth, bouncing around 27-28 fps and much slower in fast forward. And I only got that speed when Dynamic RAM Preview was 200 -- if I set that to 0 (like I usually do) playback was horribly slow (7fps-ish).

 

Gotta say -- I'm surprised and disappointed by the playback, and that the Ryzen+RTX3060 seemed to be smoother. How can this be? Is the Intel Iris Xe the problem?

It doesn't sound right. Your encode speed being 2x compared to the AMD laptop means Vegas is able to utilize the extra resources of the Intel laptop, but then you have this playback problem. Assuming all your camera files are AVC and compatible with Legacy AVC, what happens when you choose that for the decoder?

You won't have GPU decode, but the 12700H may be strong enough to take up the extra processing load of those 8 cameras, and Legacy AVC is more efficient at using the CPU than SO4. Do you get your full 30 fps now, and what about playback at 2x or 3x?

This isn't a fix; I'm just interested in the result without GPU decoding or SO4. (If any of your AVCs are 10-bit, they will force SO4.)
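If you want to check that quickly, a minimal sketch, assuming ffprobe is installed; a 10-bit pixel format such as yuv420p10le marks the AVC files that would force SO4 (the glob pattern is a placeholder for your media folder):

```python
# Minimal sketch: use ffprobe to list each clip's codec and pixel format, flagging
# 10-bit pixel formats (which for AVC force the SO4 decoder in Vegas, per above).
# Assumes ffprobe is on PATH; the glob patterns are placeholders.
import glob, json, subprocess

for path in glob.glob("*.mp4") + glob.glob("*.mov"):
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,pix_fmt", "-of", "json", path],
        capture_output=True, text=True, check=True).stdout
    stream = json.loads(out)["streams"][0]
    ten_bit = "10" in stream.get("pix_fmt", "")
    print(f"{path}: {stream['codec_name']} {stream['pix_fmt']}"
          + ("  <-- 10-bit, forces SO4" if ten_bit else ""))
```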