New Techgage rendering benchmarks for VP 18

RogerS wrote on 3/12/2021, 5:49 AM

Techgage includes Vegas Pro in hardware benchmarks. This time it looks at rendering with different high-end AMD and Intel CPUs, comparing many cores at lower clock speeds against fewer cores at faster speeds. There are also tests using GPUs and one of the Vegas AI options.

https://techgage.com/article/best-cpu-for-rendering-video-encoding-spring-2021/3/

Comments

Reyfox wrote on 3/12/2021, 6:24 AM

Thanks for the link!

It's always good to see Rob (who is also in this forum somewhere) including Vegas Pro 18.

lenard wrote on 3/12/2021, 6:57 AM

Thank you, RogerS, for bringing that to our attention. The benchmarks show how poorly adapted VP18 is to using a GPU. With CPU alone we get the expected results, but with CPU + GPU everything is very similar. This ties in perfectly with users' complaints that their expensive multi-core CPUs are under-utilised, running at maybe 25% CPU, while a 4-6 core may run close to 100%.

This is something Vegas NEEDS to fix.

JN- wrote on 3/12/2021, 7:58 AM

@RogerS Good spot.

---------------------------------------------

VFR2CFR, Variable frame rate to Constant frame rate link to zip here.

Copies Video Converts Audio to AAC, link to zip here.

Convert 2 Lossless, link to ZIP here.

Convert Odd 2 Even (frame size), link to ZIP here

Benchmarking Continued thread + link to zip here

Codec Render Quality tables zip

---------------------------------------------

PC ... Corsair case, own build ...

CPU .. i9 9900K, iGPU UHD 630

Memory .. 32GB DDR4

Graphics card .. MSI RTX 2080 Ti

Graphics driver .. latest studio

PSU .. Corsair 850i

Mboard .. Asus Z390 Code

 

Laptop… XMG

i9-11900K, iGPU n/a

Memory 64GB DDR4

Graphics card … Laptop RTX 3080

Howard-Vigorita wrote on 3/12/2021, 1:59 PM

... The benchmarks show how poorly adapted VP18 is to using a GPU. With CPU alone we get the expected results, but with CPU + GPU everything is very similar. This ties in perfectly with users' complaints that their expensive multi-core CPUs are under-utilised, running at maybe 25% CPU, while a 4-6 core may run close to 100%.

This is something Vegas NEEDS to fix.

That benches the Vegas implementation of Median FX, which they cite as the most grueling Vegas FX. Another Vegas bench is of the Colorize FX... I assume because that's grueling too. If they're going to bench FX in an article discussing different NLEs, it would be nice if they used the same footage and FX on more than one of them.

Reyfox wrote on 3/13/2021, 5:57 AM

Rob Williams (of Techgage.com) gave me this reply when I asked about the new crop of GPUs and testing with Vegas Pro:

"Yes, we're in the process of refining the test scripts, and plan to tackle Vegas and a whack of other tests soon. We need to rework the playback test a bit to make sure it's as accurate as possible. We also have been in touch with MAGIX recently to make sure our current tests are not too bad, and we've had the a-OK so far, so I guess we'll keep using these until VP19 comes out. I am hoping for some new AI tests with that one :D"

My suggestion for those who have questions or suggestions concerning the tests is to ask Rob. He answers. This might help him refine the testing and give us answers to the questions some are asking.

https://techgage.com/article/best-cpu-for-rendering-video-encoding-spring-2021/#comment-5301878875

 

TheRhino wrote on 3/13/2021, 7:46 AM

...with CPU + GPU, everything is very similar. ...

I've been preaching for the last 2 years that because Vegas is not optimized to take advantage of higher core counts or newer GPUs, my 2-year-old 5 GHz 9900K & VEGA 64 LQ are about as fast as any of the newer CPU/GPU combos costing 2X to 4X as much... For my longer paid projects, I have always chosen to use (2) affordable workstations (one for editing & the other for rendering) vs. overpaying for a single high-end system...

I expect Vegas will get a boost from the NEW 8-core Intel 11900K CPU that is supposed to clock higher, have 19% higher instructions per cycle, an improved iGPU, and improved HEVC/AVC encoding. I'm looking at upgrading another Vegas workstation around the time V19 is released... My local Microcenter has the 9900K for just $250 & the 10850K for $320, so I'll need to see how Vegas 19 performs on the 11900K (or any other CPU) to justify the price difference before making a decision... As noted in my other posts, I already returned a new RTX 3060 Ti because in Vegas it did not perform any better than my VEGA 64 LQ...


Workstation C with $600 USD of upgrades in April, 2021
--$360 11700K @ 5.0ghz
--$200 ASRock W480 Creator (onboard 10G net, TB3, etc.)
Borrowed from my 9900K until prices drop:
--32GB of G.Skill DDR4 3200 ($100 on Black Friday...)
Reused from same Tower Case that housed the Xeon:
--Used VEGA 56 GPU ($200 on eBay before mining craze...)
--Noctua Cooler, 750W PSU, OS SSD, LSI RAID Controller, SATAs, etc.

Performs VERY close to my overclocked 9900K (below), but at stock settings with no tweaking...

Workstation D with $1,350 USD of upgrades in April, 2019
--$500 9900K @ 5.0ghz
--$140 Corsair H150i liquid cooling with 360mm radiator (3 fans)
--$200 open box Asus Z390 WS (PLX chip manages 4/5 PCIe slots)
--$160 32GB of G.Skill DDR4 3000 (added another 32GB later...)
--$350 refurbished, but like-new Radeon Vega 64 LQ (liquid cooled)

Renders Vegas11 "Red Car Test" (AMD VCE) in 13s when clocked at 4.9 ghz
(note: BOTH onboard Intel & Vega64 show utilization during QSV & VCE renders...)

Source Video1 = 4TB RAID0--(2) 2TB M.2 on motherboard in RAID0
Source Video2 = 4TB RAID0--(2) 2TB M.2 (1) via U.2 adapter & (1) on separate PCIe card
Target Video1 = 32TB RAID0--(4) 8TB SATA hot-swap drives on PCIe RAID card with backups elsewhere

10G Network using used $30 Mellanox2 Adapters & Qnap QSW-M408-2C 10G Switch
Copy of Work Files, Source & Output Video, OS Images on QNAP 653b NAS with (6) 14TB WD RED
Blackmagic Decklink PCIe card for capturing from tape, etc.
(2) internal BR Burners connected via USB 3.0 to SATA adapters
Old Cooler Master CM Stacker ATX case with (13) 5.25" front drive-bays holds & cools everything.

Workstations A & B are the 2 remaining 6-core 4.0ghz Xeon 5660 or I7 980x on Asus P6T6 motherboards.

$999 Walmart Evoo 17 Laptop with I7-9750H 6-core CPU, RTX 2060, (2) M.2 bays & (1) SSD bay...

Paul-Jonack wrote on 4/18/2022, 5:44 PM

...with CPU + GPU, everything is very similar. ...

I've been preaching for the last 2 years that because Vegas is not optimized to take advantage of higher core counts or newer GPUs, my 2-year-old 5 GHz 9900K & VEGA 64 LQ are about as fast as any of the newer CPU/GPU combos costing 2X to 4X as much... For my longer paid projects, I have always chosen to use (2) affordable workstations (one for editing & the other for rendering) vs. overpaying for a single high-end system...

I expect Vegas will get a boost from the NEW 8-core Intel 11900K CPU that is supposed to clock higher, have 19% higher instructions per cycle, an improved iGPU, and improved HEVC/AVC encoding. I'm looking at upgrading another Vegas workstation around the time V19 is released... My local Microcenter has the 9900K for just $250 & the 10850K for $320, so I'll need to see how Vegas 19 performs on the 11900K (or any other CPU) to justify the price difference before making a decision... As noted in my other posts, I already returned a new RTX 3060 Ti because in Vegas it did not perform any better than my VEGA 64 LQ...

Hello, have you returned the RTX 3060 Ti? Because my RTX 3060 12 GB works far worse than my RX 470 8 GB, which I purchased 3 years ago. I uninstalled all the old drivers, I have the latest Studio drivers from Nvidia, and I gave Vegas 17 a reset.

Nothing works... this card works perfectly in some games I have, but Vegas runs at half the speed of my old RX 470.

The RX 470 was at 100% usage and my RTX 3060 is at maybe 40%...

TheRhino wrote on 4/18/2022, 6:44 PM

@Paul-Jonack A while back I got an Nvidia 3060 Ti for $400 from Best Buy online, tested it using V18 with the "Sample Project 4K" benchmarks, and got UHD render scores (1:20) slightly better than my AMD VEGA 64 LQ (1:33). However, I decided to sell it while resale prices are high & wait for something better... The only problem I had with V18 & NVENC encoding was that I could only open & render using (3) instances of Vegas, whereas with AMD VCE I think the only limitation is system RAM...

In December 2021 I got an (overpriced IMO) AMD 6800 XT for $1,300, which completed the "Sample Project 4K" UHD test in 0:59 and the FHD test in 0:30, compared to my VEGA 64's scores of 1:33 and 0:48 respectively... So I was getting about 35% better performance according to that benchmark...
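
For reference, here is how those render times convert into the speedup figures above (a minimal Python sketch; the seconds are just the mm:ss times quoted in this post):

```python
# Convert the "Sample Project 4K" render times quoted above into percent-faster figures.
def percent_faster(old_s, new_s):
    """Reduction in render time, as a percentage of the old time."""
    return (old_s - new_s) / old_s * 100

uhd = percent_faster(93, 59)   # VEGA 64 LQ 1:33 vs 6800 XT 0:59 (UHD test)
fhd = percent_faster(48, 30)   # VEGA 64 LQ 0:48 vs 6800 XT 0:30 (FHD test)
print(f"UHD: {uhd:.0f}% faster, FHD: {fhd:.0f}% faster")  # ~37% and ~38%, close to the "35% better" above
```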

HOWEVER, for my day-to-day work the 6800 XT in reality only gave me about 15% better performance, tested across about 5 different actual projects, so I returned it within the return window. Once the AMD 6800 XT comes back down to its original release price of $650 I will reconsider, or at least wait until the next generation of GPUs is tested within Vegas. It did have faster Preview Window FPS while editing multiple streams of 4K, so it's not just render speed that benefits from a faster GPU...

Note, one year ago I upgraded an ENTIRE workstation to an 11700K CPU ($350), Asus W480 motherboard ($200), DDR4 ($100), and a (used) VEGA 56 ($200) for just $750 total, so spending $1300 for just the GPU was not enough bang/buck when I could upgrade another whole system for that price...


 


Paul-Jonack wrote on 4/18/2022, 7:23 PM

Thank you very much for the fast response.

I think I will also return my RTX 3060 12 GB.

It is way slower than my RX 470 with 8 GB.

I think it is a problem with the Vegas 17 software. Games run well on the RTX 3060, but I need the GPU only for video editing.

The RX 470 also runs at 100% and it works great even after 2.5 years.

The RTX 3060 is only at 50%, but Vegas cannot handle it. Even at "only" 50% it should be way faster than the old RX 470. But it isn't, and that is very strange.

TheRhino wrote on 4/18/2022, 7:42 PM

@Paul-Jonack I'm really happy with the bang/buck performance of my Intel CPUs combined with AMD VEGA GPUs... I get the benefit of having the CPU, internal Intel iGPU, and AMD GPU all working together. Recently I posted in a Vegas forum how I open (3) instances of Vegas and batch render (3) file types at once, utilizing the CPU to render the DNxHR or ProRes files, the AMD VEGA to render the 4K HEVC VCE MP4s, and the iGPU to render the 1080p QSV MP4s. With this setup, my CPU, iGPU, and AMD GPU are all above 80% usage, which is about as good as it gets. My 9900K & VEGA 64 LQ system is all liquid-cooled with separate radiators for the CPU & GPU, so that one's a great one for sound work.
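
To make the split-the-encoders idea concrete, here is a minimal sketch that launches three encodes in parallel, one per engine, with ffmpeg standing in for the Vegas render jobs purely for illustration; the encoder names (prores_ks, hevc_amf, h264_qsv) are real ffmpeg encoders, but the file names are made up and an ffmpeg build with AMF and QSV support is assumed:

```python
# Sketch: run three encodes in parallel, each on a different engine
# (CPU -> ProRes, AMD VCE -> 4K HEVC, Intel iGPU QSV -> 1080p H.264).
# Assumes an ffmpeg build with AMF and QSV support; file names are made up.
import subprocess

source = "master_timeline.mov"  # hypothetical exported master

jobs = [
    # CPU-only ProRes mezzanine file
    ["ffmpeg", "-y", "-i", source, "-c:v", "prores_ks", "-c:a", "pcm_s16le",
     "master_prores.mov"],
    # AMD VCE hardware HEVC for the 4K delivery file
    ["ffmpeg", "-y", "-i", source, "-c:v", "hevc_amf", "-c:a", "aac",
     "delivery_4k_hevc.mp4"],
    # Intel Quick Sync H.264 for the 1080p delivery file
    ["ffmpeg", "-y", "-i", source, "-vf", "scale=1920:1080", "-c:v", "h264_qsv",
     "-c:a", "aac", "delivery_1080p.mp4"],
]

procs = [subprocess.Popen(cmd) for cmd in jobs]  # start all three at once
for p in procs:
    p.wait()  # wait for every encode to finish
```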

Since the VEGA 64 is liquid-cooled, it takes up a lot less space, so replacing it with a large, hefty air-cooled 6800XT requires removing another PCIe card I need and rethinking my cable layout. Although it sounds easy to just replace a single component like the GPU, in reality it can take some time to make everything fit and run the tests needed to ensure the new card is stable. This all takes time away from my paid work, so IMO it's not worth it for a mere 15% increase in performance. I really need to see 50% performance gains to make it worth my time.


RogerS wrote on 4/18/2022, 7:52 PM

Thank you very much for the fast response.

I think I will also return my RTX 3060 12 GB.

It is way slower than my RX 470 with 8 GB.

I think it is a problem with the Vegas 17 software. Games run well on the RTX 3060, but I need the GPU only for video editing.

The RX 470 also runs at 100% and it works great even after 2.5 years.

The RTX 3060 is only at 50%, but Vegas cannot handle it. Even at "only" 50% it should be way faster than the old RX 470. But it isn't, and that is very strange.

What scores do you get on this benchmark with the 3060?
https://docs.google.com/forms/d/1Exbi4K3hbxw6snJuisR1ble-0tCPVNcIcNnx0BAtSIM/

Former user wrote on 4/18/2022, 8:12 PM

@RogerS RTX 3090 SUPRIM. If I were to put my time on that spreadsheet I'd be 14th, with 43 secs FHD NVENC.

RogerS wrote on 4/18/2022, 8:21 PM

You should add yourself - the more data points the better!

Makes sense, as the 3070 encoder is the same from what I understand, and that time is about the same as the best NVENC render.

TheRhino wrote on 4/19/2022, 6:37 AM

@RogerS RTX 3090 SUPRIM. If I were to put my time on that spreadsheet I'd be 14th, with 43 secs FHD NVENC.

For a while now Vegas has performed better on Intel CPUs paired with AMD GPUs. Even my older VEGA 56 & 64 compare well with modern Nvidia 3xxx GPUs. Before the mining craze I was able to get an "open box" VEGA 64 LQ (liquid cooled) from Newegg for $350. The radiator is part of a sealed all-in-one design & because it does not have fans on the actual PCIe card, it does not block the adjacent PCIe slot or blow heat onto other internal components. Looking back, I should have purchased a bunch for that price... Later, also before the mining craze, I got a used VEGA 56 for $200, so it's hard for me to justify sinking money into pricier GPUs that really don't make a difference in my day-to-day workflow...

I mean, I'm not going to spend $1000 extra just so my renders complete 15% faster. Because I have (2) comparable workstations, I just start the next paid project on the other system while the first batch renders all of the clients' video choices...

Note that many other Apps favor the Nvidia 3xxx series over the AMD 6xxx series, so you have to weigh all factors when choosing a GPU. However, I use Vegas for 90% of my paid NLE work, so I'm sticking with AMD for best bang/buck performance.


Former user wrote on 4/19/2022, 7:23 AM

@TheRhino I'm not too pleased about the amount of money I spent, but I'm OK with it at the moment. Hopefully Vegas will update their system one day.

Yeah, different apps - I've got/had MEP Premium since 2004. I had a 4K 25-minute project: Vegas renders it in about 16 minutes, while MEP exports it in about 8 minutes (I haven't got it anymore so can't remember the exact numbers), but MEP uses my system to its max. It plays 4K transitions at good resolution fluidly the first time around, whereas Vegas often needs to loop-play once or twice before playing them at full frame rate - and yes, I've tried all the settings suggestions. I don't really notice any difference between variable and constant, not with the files I use anyway 😉. If I include pictures or text in the project I either prerender that section to a new track or export the project using the option without NVENC at the end; rendering pics & text with NVENC seems to mess them up - they either flicker, one pic alternates with another pic from the timeline, or they are just totally out of place and messed up in the rendered file. Even saying that, though, Vegas is my first choice when editing 🤷‍♂️. When I see your numbers in the benchmark spreadsheet I sort of wish I'd seen that list before and got an RX instead, but it is what it is now, and I hope Vegas, as I say, makes some changes one day.

RogerS wrote on 4/19/2022, 10:36 AM

As the Vegas video engine improves and addresses bottlenecks, it should make better use of high-end NVIDIA cards. You should still get a benefit with FX that support it well, like NeatVideo.

The AMD/NVIDIA/QSV timeline playback and encode times are all pretty strong even if AMD is in the lead at the moment.

rendering pics & text with NVENC seems to mess them up,

I'm surprised you're still seeing this - maybe file a support request on it, as 19 was supposed to fix issues along these lines. They were the bane of my existence a few years back when I did a work project that was a slideshow + talk and couldn't for the life of me figure out what was going on!

pierre-k wrote on 4/19/2022, 12:09 PM

I will also write up my upgrade experience.
I had a 4-core Xeon and a GTX 970 and was very unhappy with preview playback while editing. When Vegas came up with GPU I/O support, I was relieved.
I felt like I was working in completely different software. My old GTX 970 got another chance.

I decided to upgrade my PC to a 12-core AMD Ryzen, testing first an AMD Vega 56 and then an RTX 2060.
The rebuild cost crazy money, but no dramatic revolution took place. For example, 4K MXFs were still tearing for me. I went back to the Xeon and the GTX 970.

I am convinced that the future of Vegas lies in automatic pre-rendering at runtime (as in Premiere) and a good working proxy with the option to choose a codec with GPU support.

Seb-o wrote on 8/12/2022, 6:28 PM

I will also write up my upgrade experience.
I had a 4-core Xeon and a GTX 970 and was very unhappy with preview playback while editing. When Vegas came up with GPU I/O support, I was relieved.
I felt like I was working in completely different software. My old GTX 970 got another chance.

I decided to upgrade my PC to a 12-core AMD Ryzen, testing first an AMD Vega 56 and then an RTX 2060.
The rebuild cost crazy money, but no dramatic revolution took place. For example, 4K MXFs were still tearing for me. I went back to the Xeon and the GTX 970.

I am convinced that the future of Vegas lies in automatic pre-rendering at runtime (as in Premiere) and a good working proxy with the option to choose a codec with GPU support.

I think your basic premise is correct for most hardware purchases, up to a certain level. IOW, they're not difference-makers.

Studying the Techgage test results on playback, the Vega 64 is pretty decent, and I would imagine the new AMDs will (or should) match that performance at - all things considered - a lower price point. A used 2080 Ti also might be an option, and the sweet spot with their new stuff looks like the 3080, and I'm talking about a timeline-playback "difference maker." I have the 2060 as well; it helped a bit, but to really feel a significant difference, I think you have to go a bit higher. You sort of "went there," but not really far enough.
 

Former user wrote on 8/12/2022, 8:20 PM

He says he intends to do testing for VP20, but he also says the following:

  • As usual, we intend to explore the latest VEGAS Pro from a performance stand-point soon, and get an article published showing how current GPUs (and possibly CPUs) fare. Unfortunately, we didn’t get this done for VEGAS Pro 19, as initial testing showed sporadic behavior (where slower GPUs would perform better than faster ones). We hope to have better luck this go around, and will work with the developer if we do encounter any particular issue that stifles progress again. It’s been too long since the last in-depth look, so we want to get caught up!

Unfortunately, nothing has changed. The problem of NVENC transcodes using MagixAVC being slower than AMD and Intel for reasons unrelated to GPU processing and encoding power still exists, making that test - or any test which doesn't significantly load the GPU - a poor test of the GPU. And while using Voukoder, a 3rd-party plugin, to get around this bug would give a more accurate picture of true GPU power, it means he won't be testing pure Vegas anymore.

 

Seb-o wrote on 8/12/2022, 8:46 PM

He says he intends to do testing for VP20, but he also says the following:

  • As usual, we intend to explore the latest VEGAS Pro from a performance stand-point soon, and get an article published showing how current GPUs (and possibly CPUs) fare. Unfortunately, we didn’t get this done for VEGAS Pro 19, as initial testing showed sporadic behavior (where slower GPUs would perform better than faster ones). We hope to have better luck this go around, and will work with the developer if we do encounter any particular issue that stifles progress again. It’s been too long since the last in-depth look, so we want to get caught up!

Unfortunately, nothing has changed. The problem of NVENC transcodes using MagixAVC being slower than AMD and Intel for reasons unrelated to GPU processing and encoding power still exists, making that test - or any test which doesn't significantly load the GPU - a poor test of the GPU. And while using Voukoder, a 3rd-party plugin, to get around this bug would give a more accurate picture of true GPU power, it means he won't be testing pure Vegas anymore.

 

I don't think "testing Vegas" is the idea, and I may certainly be misinterpreting your statement. Techgage is testing the hardware (I'm probably being too obvious there) and how it works with Vegas - exactly how it is, not how it's supposed to be or could be. Techgage saying there'll be a Vegas Pro 20 test in the near future (and apologizing for the Vegas 19 gap) is really about the hardware that has come out since. AND I say that because I do very much AGREE with you: not much has changed with VP (actually from the beginning, and they need a ground-up rewrite, IMHO), but HARDWARE is almost to the point that it can, at least somewhat, compensate for whatever it is VEGAS is not maximizing internally. And not only that, but the newer hardware has had time to settle in, and drivers will have improved as well over that two-year gap. I'm not letting Vegas 'off the hook.' I bemoan this lack of playback speed, so much so that I've become a broken record and the butt of jokes, but numbers don't lie: the uber-high-end cards are difference-makers according to the test results, and that will only improve.

Former user wrote on 8/12/2022, 9:36 PM
 

I don't think "testing Vegas" is the idea, and I may certainly be misinterpreting your statement. Techgage is testing the hardware (I'm probably being too obvious there) and how it works with Vegas - exactly how it is, not how it's supposed to be or could be.

@Seb-o He's mostly a hardware guy who likes using various software as comparisons for testing GPUs/CPUs. The way Vegas works - with a bug affecting only NVENC, not the Intel and AMD hardware encoders - makes the transcode tests accurate for how Vegas actually works, but a poor test of the hardware. The results of NVENC transcodes will always be artificially poor: 20-25% lower performance at 1080p in the last testing I did.

Interesting for Vegas users, but terrible as a benchmark tool. The tests where he loads up the GPU with FX reduce the NVENC latency problem because more time is spent on GPU processing. Even so, Voukoder is still faster under high GPU loads such as the Vegas benchmark test, although there is a section towards the end of that benchmark where it becomes basically a transcode test. So brutal GPU FX does make the testing more accurate, as long as the test doesn't contain a portion that is equivalent to a transcode.

The results for CPU encoding will accurately reflect the real performance of the GPUs, but then it's also a benchmark of that particular CPU more so than of hardware encoding. The encoder, be it CPU or hardware, should not be the bottleneck when testing GPU processing performance.
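
A toy throughput model makes the point (a hedged sketch; the frame rates below are invented for illustration, not measured): end-to-end speed is roughly capped by the slowest stage of the decode → GPU FX → encode pipeline, so once the encoder is the limiting stage, a faster GPU no longer changes the result.

```python
# Toy pipeline model: throughput is capped by the slowest stage.
# All frame rates below are invented for illustration, not measurements.
def pipeline_fps(decode_fps, fx_fps, encode_fps):
    return min(decode_fps, fx_fps, encode_fps)

# Light FX load, encoder capped at 120 fps: doubling GPU FX speed changes nothing.
print(pipeline_fps(400, 300, 120))  # 120
print(pipeline_fps(400, 600, 120))  # 120 -> same end-to-end result

# Heavy FX load: the GPU is now the bottleneck, so the difference shows up.
print(pipeline_fps(400, 40, 120))   # 40
print(pipeline_fps(400, 80, 120))   # 80 -> twice as fast
```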

RogerS wrote on 8/13/2022, 2:07 AM

I hope he waits a bit before testing 20, as I don't see significant changes from 19 (or 18) and don't see the point at this time. He's included incidental tests of 19 over the past year, which are enough for me:

https://techgage.com/article/intel-i9-12900k-i5-12600k-workstation-performance-review/3/

https://techgage.com/article/amd-ryzen-7-5800x3d-workstation-performance-review/3/

One odd thing here is AMD CPUs winning over Intel on the AI FX benchmarks. I wonder if the iGPU is disabled and if that's part of why "each time we revisit, we seem to encounter odd performance scaling, or just odd behavior in general. This is the reason why there is no CPU+GPU test above; the end results are just unpredictable."

I think it's useful to see how different hardware fares with the Vegas environment, flaws, bottlenecks and all. His Premiere and Resolve tests also show differences in their design architectures.

jetdv wrote on 8/13/2022, 8:58 AM

@RogerS, well, I went through the form and did the first three but got tired after that. What I originally tried to do was open the form 6 times and fill in each field one at a time on all six. When I got to the second one it popped up an error message that it needed to "update form". I figured it would be easier to copy the information for each field and paste it 6 times rather than having to redo everything each time.

So after the last one, I clicked on the "submit another" option. I was hoping that all the base information would be preserved so that I only needed to change the 3 fields that really changed. But it was all blank again, so I got tired and quit after the first three...

And messaging on this forum only allows you to attach images.

RogerS wrote on 8/13/2022, 9:10 AM

I see what you tried - good idea, and too bad it doesn't work. I haven't figured out a better batch alternative for this, given how Forms are structured.

If the data were laid out like the Google Sheets, I could easily just copy and paste it into the master sheet - in messages I can click on a link to a spreadsheet saved online or with Dropbox, etc.