Anyone running Vegas on an i9-9900 yet?

Comments

Mohammed_Anis wrote on 7/19/2019, 1:04 PM

Thanks again for the response. One thing is for sure: Vegas utilizes the GPU only to a limited extent, and only for effects and transitions on the timeline. It's all about the CPU, but even on my 4th-gen i7, it rarely approaches 100%. Your comment about the 4K playback was helpful. Apparently, we will still need to rely on proxies. At least this might save me a chunk of money on a big upgrade. It also has me thinking that it may be time to move on.




As a person who's worked on multiple NLEs, I'm wondering what your next go-to would be, considering that all of them face similar issues keeping up with 4K.

The only distinction I find between Vegas Pro and Premiere, for instance, is the fact that Premiere allows you to configure your own custom proxies.

Sounds nifty and dandy on the surface, until, of course, you have to research and engineer a profile for each different camera.

Also, Adobe seems to be particularly fond of rendering hardware completely useless for a few months after every release. When 2019 dropped, their forums were a massacre. It took ages before NVIDIA released a driver that helped to cope, and even with that, we still needed a few patches from Adobe.

Even without a proxy, I find VP's playback far more favorable.

DaVinci Resolve is a sublime package that has an advantage with GPU usage, but even with my 1080ti, I failed to see a drastic difference.

AVID is satan-software incarnate. If you truly want to take advantage of its power, you'd have to be in an editing suite that houses its proprietary hardware, which is expensive as hell. Otherwise, it performs just as well as any other package. So again, no monumental difference.

Hardware and software package aside... you need to consider a workflow.

Last changed by Mohammed_Anis on 7/19/2019, 1:06 PM, changed a total of 2 times.

"I'm a part of all that I've met." Alfred Lord Tennyson

Youtube Channel: https://www.youtube.com/c/VEGASCREATIVEACADEMY


Card name: AMD Radeon RX 6800 XT
Processor: AMD Ryzen 9 5900X 12-Core Processor (24 CPUs), ~3.7GHz
Memory: 32768MB RAM
Monitor Id: PHLC18F
Native Mode: 3840 x 2160(p) (59.997Hz)
Storage Devices: 2 SSDs, one large HDD. VEGAS is installed on an SSD

 

Wolfgang S. wrote on 7/19/2019, 1:25 PM

Sure. One will benefit from a performance increase with many kinds of footage. The difference could be the use of the GPU to decode the footage, which brings a performance increase for long-GOP footage in some cases. All-I footage seems to rely more on the number of cores. So, it depends...

Desktop: PC AMD 3960X, 24 x 3.8 GHz * RTX 3080 Ti (12 GB) * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb LAN * Resolve Studio 18 * Edius X * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * internal HDR preview * i9 12900H with iGPU Iris Xe * 32 GB RAM * GeForce RTX 3070 Ti 8GB * internal HDR preview on the laptop monitor * Blackmagic Ultrastudio 4K mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX)-card in an old XP-System. Edius 4.6 and other systems

JoeAustin wrote on 7/19/2019, 1:26 PM



DaVinci Resolve is a sublime package that has an advantage with GPU usage, but even with my 1080ti, I failed to see a drastic difference.

DaVinci is pretty amazing software. Unfortunately for me, multicam sync is broken in Resolve 16. Otherwise, I'd be tempted. And it does not support Pluraleyes, which killed the deal for me.

Having tried all that you mentioned, including some you didn't, like Edius Pro, Vegas is definitely more user-friendly than most. Things that take 11 steps in Premiere take one or two in Vegas.

 

MelvinGonsalvez wrote on 7/29/2019, 1:33 PM

I inquired with Magix recently whether their upcoming v17 build will take better advantage of multicore CPU and GPU systems, and got this response:

"Thank you for your message.
Vegas Pro is capable of recognizing and utilizing any number of cores. I would guess that your system is just powerful enough that task manager is showing that it doesn't need to use large amounts of system resources. I can inquire about the beta testing as well as the current status of optimization for large core counts such as yours. 
--
Thank you for contacting Magix Support."
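
If you'd rather measure this than eyeball Task Manager, here is a minimal sketch of a per-core CPU logger you can run in a separate console while VEGAS renders. It assumes Python 3 with the psutil package installed; it's nothing VEGAS-specific, just a generic way to see whether the load is actually spread across all cores:

import psutil  # pip install psutil

SAMPLES = 30  # take thirty one-second samples
for i in range(SAMPLES):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # % per logical core
    avg = sum(per_core) / len(per_core)                       # average across cores
    print(f"sample {i:02d}: avg {avg:5.1f}% | "
          + " ".join(f"{p:4.0f}" for p in per_core))

If most cores sit near idle while one or two stay pegged, the bottleneck is more likely a single-threaded stage than the raw core count.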

fr0sty wrote on 7/29/2019, 2:00 PM

Vegas 17 features GPU-accelerated video decoding now, so you may want to consider that GPU upgrade over a new CPU after all. I'd go with Nvidia; their cards are more stable and seem to offer more bang for the buck currently.

Systems:

Desktop

AMD Ryzen 7 1800X, 8 cores / 16 threads at stock speed

64GB 3000 MHz DDR4

GeForce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

JoeAustin wrote on 7/29/2019, 4:19 PM

Vegas 17 features GPU-accelerated video decoding now, so you may want to consider that GPU upgrade over a new CPU after all. I'd go with Nvidia; their cards are more stable and seem to offer more bang for the buck currently.

Even though I have an old R9 380, it never gets over 30% utilization. I was thinking of just sticking with it. That's good to know. Thanks.

fr0sty wrote on 7/29/2019, 4:38 PM

Also worth noting: Nvidia has just released a Creator driver that they say is optimized for Vegas 17.

Systems:

Desktop

AMD Ryzen 7 1800X, 8 cores / 16 threads at stock speed

64GB 3000 MHz DDR4

GeForce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

MelvinGonsalvez wrote on 7/29/2019, 6:24 PM

Thanks for the tip, fr0sty. I checked as recently as last Friday for an update to nVidia Studio driver 430.86 (5/27/19), which I've been running, and I just updated to 431.70, which, as you mentioned, was just released today. I'll redo some tests with v16 and this new driver to see if there are any rendering performance enhancements.

Their driver release page states, "And with SIGGRAPH comes updates for many of the industry's top creative applications. To support these latest updates, NVIDIA is releasing our latest Studio Driver which delivers the best performance and reliability for creative applications via extensive testing of creator workflows.

Available today, the latest NVIDIA Studio Driver provides optimal support for the latest releases of top creative applications including Magix VEGAS Pro v17, Autodesk Arnold, …"

In addition, nVidia mentions that this "driver introduces support for 30-bit color across all product lines … allowing for seamless color transitions without banding."

JoeAustin wrote on 7/29/2019, 6:36 PM

As Melvin said, thanks to fr0sty. This is exactly the sort of info that helps in this expensive upgrade decision.

Wolfgang S. wrote on 7/30/2019, 8:07 AM

Even though I have an old R9 380, it never gets over 30% utilization. I was thinking of just sticking with it. That's good to know. Thanks.

 

The question is how well the new Nvidia cards will really be utilized, and whether a 1080 Ti is enough or it should be one of the new 2080 Ti cards.

Desktop: PC AMD 3960X, 24 x 3.8 GHz * RTX 3080 Ti (12 GB) * Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb LAN * Resolve Studio 18 * Edius X * Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * internal HDR preview * i9 12900H with iGPU Iris Xe * 32 GB RAM * GeForce RTX 3070 Ti 8GB * internal HDR preview on the laptop monitor * Blackmagic Ultrastudio 4K mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX)-card in an old XP-System. Edius 4.6 and other systems

JoeAustin wrote on 7/30/2019, 9:10 AM

 

The question is how well the new Nvidia cards will really be utilized, and whether a 1080 Ti is enough or it should be one of the new 2080 Ti cards.

Absolutely correct, and something I would have to see some clear results on before I spent that kind of money. In fact, my 380 was nearly useless when I first got it, due to driver issues and issues with Vegas.

fr0sty wrote on 7/30/2019, 12:04 PM

According to Magix, the improvements on older versions of Vegas should come in the form of additional stability, but I wouldn't expect much, if any, render-time improvement. Vegas 17, since it supports GPU decoding, will likely see a huge benefit.

Systems:

Desktop

AMD Ryzen 7 1800X, 8 cores / 16 threads at stock speed

64GB 3000 MHz DDR4

GeForce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

JoeAustin wrote on 8/5/2019, 11:29 AM

According to Magix, the improvements on older versions of Vegas should come in the form of additional stability, but I wouldn't expect much, if any, render-time improvement. Vegas 17, since it supports GPU decoding, will likely see a huge benefit.

I finally got around to trying the VP 17 demo, and sure enough, there is a notable improvement in timeline performance. Pretty dramatic when compared to VP 15. Multitrack 4K is actually usable.

Very tempting to pull the trigger on an upgrade, but being an early adopter has bitten me pretty hard in the past. They also now have a monthly option, which I was not aware of until now. Definitely suggest checking out the demo.

MelvinGonsalvez wrote on 8/5/2019, 1:58 PM

I also got the notification that v17 is available, but I don't see any link in the purchase process of this new software to download the free upgrade that those of us who recently bought v16 were promised. I'll reach out to Magix for some advice on that, or I'll wait to hear back from one of you in case I missed something along the way.

*** Update: Never mind, I just got my email link to the v17 software download, so I'm good.

 

In the meantime, these are some GPU-accelerated tidbits that were mentioned on their webpage for v17:

 

"GPU accelerated rendering

With support for cards from the industry's leading manufacturers including NVIDIA, AMD and Intel, VEGAS harnesses the power of GPU processing from the most popular graphics cards to accelerate rendering of popular AVC and HEVC formats to as much as twice the speed."

*** (I don't know how much of an improvement this statement is over what's offered in v16 already.)

 

"GPU accelerated decoding for AVC/HEVC

Take advantage of GPU acceleration for smoother timeline playback in VEGAS. Use the power in your graphics card to preview your project more efficiently than ever, even with effects, filters, and multiple video streams. VEGAS leverages your computer’s GPU for smoother, quicker playback."

*** (Some of you guys are reporting back favorably in this regard so I can't wait to try it out and see the difference myself.)

 

"Hardware-accelerated lossless intermediate format

Demand the highest quality you can get with our new hardware-accelerated lossless intermediate format. Leveraging NVIDIA GPUs, take advantage of this master-quality intermediate for efficient, fast editing or for top-quality archiving of your video. Requires specific NVIDIA hardware."

 

"NVENC 10-bit HEVC rendering

Leverage a qualifying NVIDIA graphics card to use the NVENC codec for 10-bit rendering to HEVC. Take advantage of the expanded bit depth for more robust and deeper color in your video than available in 8-bit, ideal for HLG, HDR, or heavy color correction or grading."
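
*** (A side note for anyone wondering what "10-bit HEVC via NVENC" looks like concretely outside of VEGAS: below is a rough, illustrative sketch driven from Python through ffmpeg's hevc_nvenc encoder. It assumes an ffmpeg build with NVENC support on the PATH and an NVENC-capable NVIDIA card; the file names and bitrate are made up, and this is not how VEGAS invokes NVENC internally, just the same hardware encoder exercised by other means.)

# Illustrative only: 10-bit HEVC on the NVENC hardware encoder, via ffmpeg.
import subprocess

cmd = [
    "ffmpeg", "-y",
    "-i", "input.mov",        # hypothetical source clip
    "-c:v", "hevc_nvenc",     # NVIDIA hardware HEVC encoder
    "-profile:v", "main10",   # 10-bit HEVC profile
    "-pix_fmt", "p010le",     # 10-bit 4:2:0 pixel format
    "-b:v", "60M",            # example bitrate for UHD material
    "-c:a", "copy",           # pass the audio through untouched
    "output_10bit_hevc.mp4",
]
subprocess.run(cmd, check=True)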

JoeAustin wrote on 8/5/2019, 4:27 PM

@MelvinGonsalvez

Those last two items are good bits of info in choosing my next GPU. How ironic that Vegas has gone from zero Nvidia support to having NVENC specific features.

One bummer to note about the demo is that it's limited to a two-minute project. A shame, since past demos were fully functional.

 

 

 

MelvinGonsalvez wrote on 8/5/2019, 5:50 PM

Congratulations, Magix, and welcome news to VP17 users: it looks like they got the playback performance in the preview window much more to our liking!

I reran my earlier view transform test that I did for Wolfgang, and instead of the older, approximately 7 fps playback results with ACES at Best (Full) in v16, I now got real-time/near-real-time playback with v17!

Again, this was just a simple build of cross-dissolving UHD clips off my Sony AX100 that I was testing, but the results were dramatic.

Glad I upgraded my PC to high-end hardware and can't wait to really throw a complex project build at it to see how well it's handled.

(Note: Task Manager reported CPU usage between 22 and 38 percent, GPU 0 usage at 28 percent, and GPU 1 usage at 3 percent for this project playback in v17.)

I'll now try a re-render of that same project in v17, and I'll post the time results just for comparison's sake.

*** Update: This re-render test just dropped from 1 hr 13 min 52 s (in v16) down to 17 min 27 s (roughly a 4x reduction) with the render option of "Internet 4K 2160P 29.97fps (Nvid NVenc)" chosen as my output file format. Wow, that's an impressive rendering-time reduction for me with the same 32-bit FP (full range) file setup settings!

For this faster render, Task Manager read CPU usage at 24-66%, GPU 0 at 14-34%, and GPU 1 at 1-51%. This is the first time I've seen both GPUs hitting higher usage loads during a project render.
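
(If you want to capture those GPU numbers in a log rather than watching Task Manager, here is a minimal sketch that samples utilization on every detected NVIDIA GPU once per second. It assumes Python 3 with the pynvml bindings installed, e.g. via the nvidia-ml-py3 package; NVML only sees NVIDIA cards, so an integrated or AMD GPU would not appear.)

# Minimal NVIDIA GPU utilization logger; run it alongside a render.
import time
import pynvml

pynvml.nvmlInit()
handles = [pynvml.nvmlDeviceGetHandleByIndex(i)
           for i in range(pynvml.nvmlDeviceGetCount())]

for _ in range(30):                      # about thirty one-second samples
    readings = []
    for idx, handle in enumerate(handles):
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)  # .gpu / .memory in %
        readings.append(f"GPU {idx}: {util.gpu:3d}% core, {util.memory:3d}% mem")
    print(" | ".join(readings))
    time.sleep(1)

pynvml.nvmlShutdown()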

vkmast wrote on 8/5/2019, 6:07 PM

One bummer to note about the demo is that it's limited to a two-minute project. A shame, since past demos were fully functional.

The VP 15 and VP 16 trial versions already had the same less-than-two-minutes render limit. Please also read here.

JoeAustin wrote on 8/22/2019, 11:52 AM

Just wanted to follow up on the original post. I finally did build an i9-9900 system, and oh boy, was it worthwhile. Preview on a 4K multicam project with three tracks is nearly the full 29.97 frame rate in 1/4 preview. This is a huge improvement over what I had with the R9 380 / i7 4790 rig, and this is just with the Intel graphics; no external GPU yet. Render times with Quick Sync are nearly halved. Curious to see what happens with a dedicated GPU.

I ended up going with the non-overclockable CPU. True, I don't care about overclocking, and I thought a few bucks would be saved, as they throw in a fan on this model. Sadly, the Intel fan is pathetically inadequate; Prime95 had the temp at nearly 100°C. Pretty shameful on Intel's part to include this thing. So, just budget for a good cooler.

 

SimplyNon_sense wrote on 8/22/2019, 3:54 PM

Sounds like you are only interested in Intel, but I will say I upgraded from an i5 3570K to a Ryzen 7 2700X and it has shortened render times, in one case from 30 min to around 10 min. My only regret is not waiting for the 3700X or 3900X.

fr0sty wrote on 8/22/2019, 3:56 PM

I agree, the new Ryzen chips are an amazing value, and my original Ryzen 1800x still does almost everything I throw at it with ease.

Systems:

Desktop

AMD Ryzen 7 1800X, 8 cores / 16 threads at stock speed

64GB 3000 MHz DDR4

GeForce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

JoeAustin wrote on 8/22/2019, 6:15 PM

Sounds like you are only interested in Intel, but I will say I upgraded from an i5 3570K to a Ryzen 7 2700X and it has shortened render times, in one case from 30 min to around 10 min. My only regret is not waiting for the 3700X or 3900X.

I do feel that Intel was best for my situation, but AMD is Intel's only real competition in this space. I have owned a number of AMD CPUs over the years, including the famous 486/40 (yes, I am that old) that embarrassed Intel quite badly at nearly 2x the performance. I think they are doing great work today. Without them, Intel would probably not be where they are now.

Howard-Vigorita wrote on 8/22/2019, 11:22 PM
Sadly, the Intel fan is pathetically inadequate; Prime95 had the temp at nearly 100°C.

I threw a Noctua NH-D15 on mine.

JoeAustin wrote on 8/23/2019, 7:47 AM
Sadly, the Intel fan is pathetically inadequate; Prime95 had the temp at nearly 100°C.

I threw a Noctua NH-D15 on mine.

I did look at those, but ended up with a Cooler Master 212 due to the smaller footprint. Works great, and keeps temps in the 60s during a full Prime95 assault.

fr0sty wrote on 8/23/2019, 9:25 AM

"I do feel that Intel was best for my situation, but AMD is Intel's only real competition in this space. I have owned a number of AMD CPUs over the years. Including the famous 486/40 (yes I am that old) that embarrassed Intel quite badly at nearly 2X the performance. I think they are doing great work today. Without them, Intel would probably not be where they are today."

As far as Vegas goes, there's no evidence that the Intel chips work any faster than equivalent AMD chips; it seems to be more GPU-dependent. My Ryzen 7 1800X / Radeon VII combo renders only a few seconds slower than an i9-9900K with a 2080 Ti attached, despite my combo of parts costing over $500 less than the Intel/Nvidia kit, and I get faster timeline playback performance.

However, for apps like Cinema 4D, there's no competition: AMD wins every time.

Off topic: I see Nvidia eventually either being bought out by Intel, or starting to make their own CPUs (they already are with Tegra). There could be a third horse in the race before long.

Last changed by fr0sty on 8/23/2019, 9:36 AM, changed a total of 4 times.

Systems:

Desktop

AMD Ryzen 7 1800X, 8 cores / 16 threads at stock speed

64GB 3000 MHz DDR4

GeForce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)