Impressions: Moving from VP14/4770K to VP19/12700K

Comments

Hulk wrote on 12/23/2021, 8:43 AM

Results below, as well as the templates I used. As expected, and confirmed by the extremely low CPU usage statistics from HWiNFO, the test is entirely bound by GPU performance. Just to confirm this conclusion I ran the test using the same render template but without Intel QSV at FHD. The result was nearly identical at 1:41 (vs. 1:38). Also as expected, the GPU usage dropped to 76% and the CPU usage increased to 26%: the CPU picked up the transcode load dropped by the GPU.
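
As a quick sanity check of those two times (a minimal sketch in Python; the numbers are just the ones quoted above):

```python
# Compare the two FHD render times quoted above: 1:38 with Intel QSV
# encoding vs. 1:41 with CPU-only encoding.
qsv_seconds = 1 * 60 + 38   # 98 s, QSV hardware encode
cpu_seconds = 1 * 60 + 41   # 101 s, software encode

slowdown = (cpu_seconds - qsv_seconds) / qsv_seconds
print(f"CPU-only encode is only {slowdown:.1%} slower")  # ~3.1%
# A ~3% swing from swapping the encoder supports the conclusion that the
# render is bound by timeline/Fx processing on the GPU, not by encoding.
```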

RogerS wrote on 12/23/2021, 9:10 AM

Yes, it makes sense that you are GPU limited, not having a dedicated GPU installed. With a GPU you could expect results more fitting of that CPU (QSV encode in the 30-50 sec. range depending on the GPU, I'd guess).

Feel free to upload your results as it's still interesting even if it's unlikely many others will have a similarly configured system.

For more on GPU vs CPU on encodes I thought this was interesting:
https://techgage.com/article/best-cpu-for-rendering-video-encoding-spring-2021/3/

and

https://techgage.com/article/intel-i9-12900k-i5-12600k-workstation-performance-review/3/

If you have ideas for meaningful test projects, try finding TechGage on Twitter before they run a battery of tests on VP 19.

john_dennis wrote on 12/23/2021, 10:09 AM

@Hulk

"I'm not in love with the black color scheme."

If you haven't found the setting by now, Vegas 19 allows you to pick one of four color schemes.

Hulk wrote on 12/23/2021, 11:38 AM

@john_dennis wrote: "If you haven't found the setting by now, Vegas 19 allows you to pick one of four color schemes."

Thank you John. Lots of little (and not so little) changes all over the place since version 14. It's like I was sleeping for a few years and woke up to a totally new Vegas!

john_dennis wrote on 12/23/2021, 12:52 PM

@Hulk

Interface colors (or the lack of them) have been a sore spot for me in the past:

https://www.vegascreativesoftware.info/us/forum/black-and-white-icons-are-wasting-my-time--103958/?page=1

I'm happy with the current options in Vegas 19.

Over the life of my i7-6850K/RX480, I've gotten Vegas performance improvements whenever support for decode was added. I use VCE renders for "dailies" to send to people's cell phones even if the final render is CPU only.
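
Outside of Vegas, a rough stand-in for that dailies workflow might look like this (a sketch only: it assumes an ffmpeg build with AMD's h264_amf/VCE encoder, and the file names and bitrate are illustrative, not from the post):

```python
import subprocess

# Hypothetical stand-in for a VCE "dailies" render: a quick, phone-friendly
# hardware encode. input.mp4 and the settings are placeholders.
subprocess.run(
    ["ffmpeg", "-y", "-i", "input.mp4",
     "-c:v", "h264_amf",      # AMD VCE/AMF hardware encoder
     "-b:v", "4M",            # modest bitrate for cell-phone viewing
     "-vf", "scale=-2:720",   # downscale to 720p, preserve aspect ratio
     "daily_720p.mp4"],
    check=True,
)
```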

I did a "paper upgrade" to a bleeding edge system:

Motherboard

https://www.gigabyte.com/us/Motherboard/Z690-AORUS-XTREME-rev-10/sp#sp

CPU

https://ark.intel.com/content/www/us/en/ark/products/134599/intel-core-i912900k-processor-30m-cache-up-to-5-20-ghz.html

Cooler

https://www.corsair.com/us/en/Categories/Products/Liquid-Cooling/Dual-Radiator-Liquid-Coolers/Hydro-Series%E2%84%A2-H115i-280mm-Extreme-Performance-Liquid-CPU-Cooler/p/CW-9060027-WW

but the whole thing broke down when I was unable to find DDR5 memory.

I give you even-money odds that my next build will contain a five-year-old Sapphire Radeon Nitro RX480-8GB video adapter.

Howard-Vigorita wrote on 12/23/2021, 4:53 PM

@Hulk Thanks for running that. Looks like your 12th-gen iGPU is indeed quite a bit quicker than my 11th gen. Your FHD template is identical to mine but the UHD one is slightly different... I have the original benchmark-specified rates of 28 and 50 rather than the default 24 and 48. Don't know if that would be terribly significant. I should also mention my CPU is single-fan/radiator AIO cooled, which runs it steadily at its stock 5.3 GHz; that probably gives it an advantage if yours is air cooled. It's not overclocked, but I did hand-optimize core affinity in the BIOS to give greater affinity to the cores that run coolest under load. Btw, this benchmark is dominated by the performance of a GPU-intensive media generator FX that is used throughout on multiple concurrent tracks.
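
Howard did this in the BIOS, but a rough software analogue of favoring particular cores is OS-level process affinity. A minimal Python sketch with psutil (the process name match and core list are purely illustrative assumptions, not Howard's mapping):

```python
import psutil

# Illustrative only: pin any running VEGAS process to a hand-picked set of
# logical cores. The indices below are placeholders.
PREFERRED_CORES = [0, 1, 2, 3, 8, 9, 10, 11]

for proc in psutil.process_iter(["name"]):
    name = proc.info["name"] or ""
    if name.lower().startswith("vegas"):
        try:
            proc.cpu_affinity(PREFERRED_CORES)  # restrict scheduling to these cores
            print(f"Pinned {name} (pid {proc.pid}) to {PREFERRED_CORES}")
        except psutil.AccessDenied:
            print(f"No permission to change affinity for pid {proc.pid}")
```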

Former user wrote on 12/23/2021, 6:11 PM

@Hulk wrote: "Just to confirm this conclusion I ran the test using the same render template but without Intel QSV at FHD. The result was nearly identical at 1:41 (vs. 1:38). Also as expected the GPU usage dropped to 76% and the CPU usage increased to 26%. The CPU picked up the transcode load dropped by the GPU."

I've noticed you use different terms from what most people use here; the one that jumps out the most is "transcode". This isn't a transcode benchmark: there's no transcoding going on at all except in the last 6 seconds of the project. It's a test of GPU+CPU processing with encoding via CPU or GPU.

The reason I bring this up is that in an actual transcode benchmark the results could be very different; the 12th series may make significant gains over other CPUs even though you don't see that here, with the heavy use of CPU+GPU processing and the latency it introduces. There is a difference, as @Howard-Vigorita points out, but in a CPU transcode benchmark it could be much greater.

This may also not be true, because Vegas will happily use 100% CPU in a transcode, so the more cores the merrier. It could be that the 12th gen sees its gains more on the processing side, as that is where many complain about Vegas's limited CPU processing even when paired with the most powerful GPUs.
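
One crude way to see which regime a render is in, offered only as a sketch (psutil covers the CPU side; GPU counters still need HWiNFO or Task Manager):

```python
import psutil

# Log average CPU utilization while a render runs. Sustained ~100% suggests a
# transcode-style CPU-bound load; low CPU with a busy GPU (per HWiNFO) matches
# the processing-bound behavior seen in this benchmark.
def log_cpu(duration_s=60, interval_s=2):
    for _ in range(int(duration_s / interval_s)):
        pct = psutil.cpu_percent(interval=interval_s)  # averaged over the interval
        print(f"CPU: {pct:5.1f}%")

if __name__ == "__main__":
    log_cpu()
```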

Hulk wrote on 12/26/2021, 12:21 PM

Looks like the Intel 7 (10ESF) process is quite a bit better than the Rocket Lake 14+++, as this guy overclocked the iGPU from 1550 to almost 2400 MHz. https://wccftech.com/intel-alder-lakes-integrated-uhd-graphics-770-gpu-overclocks-like-crazy-almost-hits-2-4-ghz-on-water-cooled-setup-with-over-60-performance-gains/

I took mine to 2000MHz with no changes other than increasing the BCLK.

FHD render time using Intel QSV dropped from 1:38 to 1:18, or 20 seconds faster. It's still a relatively weak iGPU, but that's a 33% faster clock for 25% faster render performance.
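
For anyone checking the scaling, a minimal worked version of that arithmetic (assuming the 12700K's 1500 MHz stock iGPU clock, which is what makes the quoted 33% come out):

```python
# iGPU overclock vs. render-time scaling, using the numbers in the post above.
stock_mhz, oc_mhz = 1500, 2000   # 1500 MHz stock max is an assumption for the 12700K
stock_s, oc_s = 98, 78           # 1:38 -> 1:18 FHD QSV render

clock_gain = oc_mhz / stock_mhz - 1    # +33%
render_gain = stock_s / oc_s - 1       # +26% throughput (~25% as quoted)
print(f"clock +{clock_gain:.0%}, render throughput +{render_gain:.0%}")
# Near-linear scaling with iGPU clock, consistent with a GPU-bound render.
```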

Former user wrote on 12/26/2021, 6:03 PM

THAT'S CRAZY!

The other unusual thing about its graphics, which has to do with hardware encoding and which I haven't heard explained yet, is the following:

  1. Multi-Format Codec Engines: 2 (Multi-Format Codec Engines provide hardware encoding and decoding for amazing video playback, content creation, and streaming usages.)
  2. Intel® Quick Sync Video: Yes

The previous Intel series haven't mentioned the codec engine stuff, and apparently the 12 series has 2 such engines. I'm still left thinking it means the iGPU potentially has twice the encoding/decoding ability of previous generations, in the same way Nvidia PRO cards can have 2x NVENC and NVDEC encoders/decoders. If that is what it is, even if you're not interested in hardware encoding, it could prove very useful in NLEs. If you are working in 4K and above at 60fps, the decoders themselves can max out if you're blending or transitioning enough clips together; possibly the decoders have twice the capacity of previous versions.
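
One way this could be tested, purely as a hypothetical sketch (it assumes an ffmpeg build with the h264_qsv encoder and a local test clip; neither comes from this thread): run two QSV encodes at once and see whether combined throughput roughly doubles versus a single run.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Hypothetical probe of the "2 Multi-Format Codec Engines" spec: if two engines
# really are exposed, two simultaneous QSV encodes should finish in close to
# the time of one. input.mp4 and the bitrate are illustrative placeholders.
def encode(out_name: str) -> None:
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4",
         "-c:v", "h264_qsv",   # Intel Quick Sync hardware encoder
         "-b:v", "8M", out_name],
        check=True,
    )

with ThreadPoolExecutor(max_workers=2) as pool:
    list(pool.map(encode, ["out_a.mp4", "out_b.mp4"]))  # run both concurrently
```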

Problem is, if this is true, why isn't it being promoted, and why aren't people talking about it?