New PC Build For Vegas 20

Sassylola wrote on 1/3/2023, 5:46 PM

I am looking at building a new PC within the next few weeks, depending on what is announced at CES this week. Rumors are AMD will announce a Ryzen 9 7950X3D, and Intel will announce a Core i9-13900KS, although leaked benchmarks show only about a 10% improvement over the 13900K in certain tests.

I am looking at the following:

CPU: Intel 13900K, no OC. Thermaltake 360 mm AIO, Corsair HX1000i PSU. I will keep my Ryzen VII.

MB: Asus ProArt Creator Z690 (I cannot find the Z790 version anywhere) or Asus Prime Z790. I know there is not much of a performance difference between Z690 and Z790, but you can run faster RAM with the Z790.

Memory: If I go with the ProArt, G.Skill Ripjaws S5 64 GB (2x32 GB) DDR5-5600. If I go with the Prime board, the $100 or so I save could go toward faster 32 GB RAM; the Prime board will run RAM up to 7000 MHz, and I was looking at 6000 MHz or 6400 MHz. Would faster RAM at 32 GB be better than 64 GB of slower RAM in Vegas?
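
For a rough sense of what those speed grades mean in raw numbers, here is a quick back-of-the-envelope calculation of theoretical peak dual-channel bandwidth (just arithmetic on the kit speeds mentioned above, not a Vegas benchmark):

```python
# Theoretical peak bandwidth of a dual-channel DDR5 kit:
# transfers per second x 8 bytes per 64-bit channel x 2 channels.
# Real-world throughput is lower and also depends on timings.
def ddr5_dual_channel_gb_s(mt_per_s: int) -> float:
    return mt_per_s * 1_000_000 * 8 * 2 / 1e9

for speed in (5600, 6000, 6400, 7000):
    print(f"DDR5-{speed}: ~{ddr5_dual_channel_gb_s(speed):.1f} GB/s peak")
```

The jump from DDR5-5600 to DDR5-6400 is only about 14% in peak bandwidth, which is one reason capacity is usually the safer buy for editing, unless a benchmark of your own footage says otherwise.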

I am leaning towards Intel because of the onboard graphics. I have not run an Intel PC for quite a while.

My question is: with Intel's built-in graphics, do I have to enable it in the BIOS to use it in Vegas? I know I have to do that, but will it then conflict with the Ryzen VII? To get the CPU graphics video signal to the monitor, do I need an HDMI cable going from the MB to the monitor and also an HDMI cable from my Ryzen VII to the monitor? Or will Vegas, under File I/O, offer both the Intel and Ryzen hardware decoders to pick from?

To be honest, I just make movies in Vegas for my family, but I like building a new PC every 3 to 4 years; it's fun for me. I then sell my old parts on eBay to help offset the cost of the new PC.

Thanks for any suggestions.

My System Home Built

Intel 13900K, latest Intel chipset drivers and Intel Management Engine installed. No OC (PL1 set to 253 W, PL2 set to 253 W)

Arctic Liquid Freezer II 360 mm AIO

Gigabyte Aero G Z790 MB Latest BIOS

ZOTAC GAMING GeForce RTX 4090 AMP Extreme AIRO, latest NVIDIA Studio drivers installed. With a GPU OC, results were about the same as with the GPU at default settings, so I have kept the GPU at default settings.

64 GB G.Skill DDR5-6000 RAM (2x32 GB)

Corsair 1000W RMx Power Supply

SK Hynix P41 1 TB NVMe operating drive (boot drive)

WD SN850X 2 TB NVMe, 1st render drive

SK Hynix P41 2 TB NVMe storage drive

SK Hynix P41 2 TB NVMe, 2nd render drive for final renders

Win 11 Pro, all current updates applied

Fractal Design R6 Case

Samsung - M7 Series 43" Smart Tizen 4K UHD Monitor

Vegas Pro Suite 22 Ver 194

Sound Blaster Audigy FX sound card

SSK USB C External NVMe drive Enclosure with 500GB Samsung 970 EVO Plus for backups.

PROBOI 10G hard drive enclosure, USB-C, with four 4 TB spinning drives for backups.

 

 

Comments

fr0sty wrote on 1/3/2023, 7:54 PM

So far, Intel/AMD systems are topping out the VEGAS benchmarks. However, I was pretty disappointed with my Radeon VII, as well as my Radeon Pro VII, in VEGAS. You might consider one of the newer cards, as those are the ones topping the benchmarks.

Systems:

Desktop

AMD Ryzen 7 1800X, 8 cores / 16 threads at stock speed

64 GB 3000 MHz DDR4

GeForce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

RogerS wrote on 1/3/2023, 8:10 PM

I built a 13th-gen Intel system this fall for use in Vegas. I chose to use the onboard iGPU, as it offers big theoretical performance gains with certain formats (10-bit HEVC, etc.) as well as better stability than NVIDIA decoding.

As to the signals, I'm still experimenting. With one monitor plugged into my NVIDIA GPU and the other into my motherboard (Z690), I do have QSV encoding and decoding. With both monitors plugged into the NVIDIA card, QSV encoding is gone (I don't really care, as I have NVENC or could access QSV through Voukoder), but QSV decoding is inconsistent: it stops working sometimes. I think this is a Vegas, Windows, or driver issue rather than a hardware limitation, and I'm experimenting to figure that out. I am also going to try a $5 dummy HDMI adapter in the motherboard HDMI socket to see whether that's a workaround to keep decoding consistent while the actual signals are sent from my NVIDIA card to my two monitors.
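
As a quick sanity check on which adapters Windows is actually exposing (and therefore what Vegas can list for decoding), something like this works; it just shells out to the built-in wmic tool (deprecated on newer Windows builds but still present on most systems), so treat it as a rough sketch rather than anything Vegas-specific:

```python
# List the video adapters Windows currently exposes. If the iGPU is
# disabled in the BIOS, it should not appear here at all.
import subprocess

out = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "Name,Status"],
    capture_output=True, text=True, check=True,
)
print(out.stdout)
```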

I opted for fast-enough 64 GB RAM (DDR5-5200), as you can't just add more later without sacrificing performance, and I thought 64 GB should suffice for the life of the system. I don't think differences in RAM speed are noticeable, though I can't do A/B testing, as that would require another motherboard and DDR4 RAM.

Personally I don't see the point in ProArt MBs, i9 CPUs, or other high-end solutions that have obscene heat and power requirements and elevated prices for marginal performance gains. My CPU is barely above idle most of the time when editing (8-bit UHD AVC) anyway. I saved money on the MB and got two 2 TB M.2 SSDs, with the second used for cache and working video projects, so HDDs can just be for archival storage. I also picked up a high internal bit depth Eizo monitor, which is a pleasure to work with, and a fast ProGrade card reader and SD cards to cut down on the transfer time of getting files onto the M.2s. All of this makes a bigger difference for my editing experience.

You can see benchmark data on a standardized project here: https://forms.gle/moMQJJRGjiVhGtMTA

Do contribute results when your system is built!

RogerS wrote on 1/4/2023, 9:34 PM

I got a dummy HDMI plug to test, plugged into my motherboard, with my real monitors connected to the NVIDIA card.

TL;DR: I think this is unnecessary.

I did a few tests of media, and decoding does disappear when I skip back and forth, but I think that's because Vegas is storing frames in RAM and doesn't need to decode them again. Having 64 GB makes this more apparent than on my other 32 GB system.

QSV doesn't reappear for rendering with MagixAVC in Vegas with the dummy plug inserted. It is available in Shutter Encoder and Voukoder, and in Vegas with MagixHEVC, so I think this is a Vegas or Windows issue.

The spike in iGPU activity on the left was a QSV render; on the right is the same render but with x264 (CPU) in Voukoder from VP 20.

I can bring back QSV encoding by changing the Windows graphics setting for Vegas to power saver. You then have to change the main GPU to NVIDIA (or AMD) in Vegas preferences/video. I get almost the same render time for an NVENC render (slightly slower) and exactly the same as my best previous time for QSV. Neat Video performance in this mode was the same as when Vegas was the high-performance application, so it doesn't appear to impact GPU performance.
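
For reference, the per-app setting described above appears to live in the registry, so it can also be scripted. This is only a sketch: the key layout is what current Windows 10/11 builds seem to use, and the path to the Vegas executable is an example that will differ per install.

```python
# Set the Windows per-app GPU preference for Vegas to "Power saving"
# (GpuPreference=1), which maps the app onto the iGPU. The same thing
# can be done interactively under Settings > System > Display > Graphics.
import winreg

VEGAS_EXE = r"C:\Program Files\VEGAS\VEGAS Pro 20.0\vegas200.exe"  # example path, adjust to your install

key = winreg.CreateKey(
    winreg.HKEY_CURRENT_USER, r"Software\Microsoft\DirectX\UserGpuPreferences"
)
# "GpuPreference=1;" = power saving, "GpuPreference=2;" = high performance
winreg.SetValueEx(key, VEGAS_EXE, 0, winreg.REG_SZ, "GpuPreference=1;")
winreg.CloseKey(key)
```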

Howard-Vigorita wrote on 1/4/2023, 10:27 PM
My question is: with Intel's built-in graphics, do I have to enable it in the BIOS to use it in Vegas? I know I have to do that, but will it then conflict with the Ryzen VII? To get the CPU graphics video signal to the monitor, do I need an HDMI cable going from the MB to the monitor and also an HDMI cable from my Ryzen VII to the monitor? Or will Vegas, under File I/O, offer both the Intel and Ryzen hardware decoders to pick from?

@Sassylola There is a setting in the Asus BIOS, under the System Agent (SA) configuration, to disable the iGPU, which also disables the HDMI port on the motherboard. It's always been enabled by default on my systems. It also determines whether the iGPU will appear in Windows and Vegas for decoding and QSV rendering. You should be able to leave it enabled for Vegas to use and plug your monitor into the Radeon VII.

I generally set AMD or NVIDIA as the main GPU in video prefs. If the iGPU shows as Optimal there, you may need to fiddle with the Windows power setting under System > Display > Graphics. I set I/O prefs to use Intel graphics.

MH7 wrote on 1/4/2023, 10:40 PM

@RogerS | Out of curiosity, do you have VEGAS Pro 18, and does 18 utilise the Intel iGPU better than VEGAS Pro 20, or is it about the same?

Last changed by MH7 on 1/4/2023, 10:40 PM, changed a total of 1 times.

John 14:6 | Romans 10:9-10, 13, 10:17 | Ephesians 2:8-9
————————————————————————————————————

Aussie VEGAS Post 20 User as of 9th February 2023 — Build 411 (Upgraded from VEGAS Pro 18)

VEGAS Pro Help: VEGAS Pro FAQs and TROUBLESHOOTING GUIDES

My YouTube Channel: https://www.youtube.com/@TechWiredGeek

Video Cameras: Sony FDR-AX700 and iPhone 12 Pro Max (iOS 17)

============================================

My New Productivity Workstation/Gaming PC 2024

CPU: AMD R7 7800X3D

Motherboard: ASRock X670E Steel Legend (AM5)

RAM: Corsair Vengeance 32 GB (2 x 16 GB) DDR5-6000 CL30 Memory

Main SSD: Samsung 980 Pro 1 TB SSD
Storage SSD: Western Digital Black SN850X 2 TB SSD

GPU: Asus TUF GAMING OC Radeon RX 7800 XT (16 GB)

OS: Windows 11 (Build: 23H2)

Main Monitor: LG 27UD88-W 4K IPS

Secondary Monitor: LG 27UL850 4K HDR IPS

RogerS wrote on 1/4/2023, 10:57 PM

Out of curiosity, do you have VEGAS Pro 18, and does 18 utilise the Intel iGPU better than VEGAS Pro 20, or is it about the same?

I did have VP 18 (actually VP 15-20). I don't use it any longer, as 19 did everything it can do and more. No, 18 doesn't utilize the iGPU better; it's basically the same, except each version of Vegas expands decoding compatibility and improves reliability.

Former user wrote on 1/5/2023, 3:20 AM

@MH7 @RogerS I've tested the decoders of VP18-20; they behave very similarly with compatible codecs, although from memory I think only the final VP18 build would hardware render at maximum speed with a 30-series NVIDIA GPU.

I'll show some comparisons, and when the new Vegas GPU decoder gets here I'll revisit this thread. By the way, the testing might be considered controversial, but I'll describe the reasoning. The idea is to get maximum output from the decoder, but I know that's not possible with NVENC at 1080p since it becomes the bottleneck, so I used a 1080p25 AVC source but encoded to 720p25, which moves the bottleneck to something else.

Resolve (via Voukoder): 636 fps, decoder 77%

CapCut: 801 fps, decoder 97%

VP20 (via Voukoder): 139 fps, decoder 17%

Notes:

Resolve appears to be constrained by GPU processing, which looks to be its bottleneck.

CapCut is so fast and uses so little GPU and CPU that I initially thought it was doing an auto copy (like remuxing) and wasn't actually encoding, so I added an 'auto looks' filter to ensure it showed up in the encode.

Vegas is possibly being constrained by the GPU in the same way Resolve is; the problem is the huge amount of GPU processing relative to the lower throughput.

If you look at the NVENC use, it appears that CapCut and Resolve should be encoding at a similar speed, but the Voukoder preset called 'good quality' seems to be using a more computationally complex encode than CapCut.
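
For anyone who wants to reproduce a similar decode-bound test outside the editors, here is a rough sketch using ffmpeg with NVDEC/NVENC. It is only an approximation of the pipeline described above (1080p25 AVC in, 720p25 out), not what CapCut, Resolve, or Vegas actually do internally, and the file names are placeholders.

```python
# Decode 1080p25 AVC with NVDEC, scale to 720p on the GPU, encode with
# NVENC, and print ffmpeg's speed/benchmark output. Requires an ffmpeg
# build with CUDA support.
import subprocess

cmd = [
    "ffmpeg", "-y", "-benchmark",
    "-hwaccel", "cuda", "-hwaccel_output_format", "cuda",
    "-i", "source_1080p25.mp4",
    "-vf", "scale_cuda=1280:720",
    "-c:v", "h264_nvenc", "-b:v", "5M",
    "-an", "out_720p25.mp4",
]
result = subprocess.run(cmd, capture_output=True, text=True)
print(result.stderr)  # ffmpeg writes progress and benchmark info to stderr
```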

 

Sassylola wrote on 1/5/2023, 6:00 AM
My question is: with Intel's built-in graphics, do I have to enable it in the BIOS to use it in Vegas? I know I have to do that, but will it then conflict with the Ryzen VII? To get the CPU graphics video signal to the monitor, do I need an HDMI cable going from the MB to the monitor and also an HDMI cable from my Ryzen VII to the monitor? Or will Vegas, under File I/O, offer both the Intel and Ryzen hardware decoders to pick from?

@Sassylola There is a setting in the Asus BIOS, under the System Agent (SA) configuration, to disable the iGPU, which also disables the HDMI port on the motherboard. It's always been enabled by default on my systems. It also determines whether the iGPU will appear in Windows and Vegas for decoding and QSV rendering. You should be able to leave it enabled for Vegas to use and plug your monitor into the Radeon VII.

I generally set AMD or NVIDIA as the main GPU in video prefs. If the iGPU shows as Optimal there, you may need to fiddle with the Windows power setting under System > Display > Graphics. I set I/O prefs to use Intel graphics.

Howard: Radeon VII, not Ryzen VII. I just caught that.

My understanding is (and I am not sure) that in Vegas I would use the Intel iGPU to preview events on the timeline. I mostly use clips from my Sony AX700 (XAVC S).

I have older videos from MiniDV, old VHS tapes I digitized, and videos from my Samsung Galaxy S8; I convert these files to ProRes with Shutter Encoder, and they play really nicely in Vegas once converted.

So for previewing events on the timeline I would use Intel's iGPU? Then for the final render switch over to the Radeon VII in I/O preferences?

I am now looking at Gigabyte's Aero G (Z790) MB, but cannot find it in stock anywhere. Newegg claims they will have them in stock by 1/6/23. Same with the Asus ProArt Creator Z790; I cannot find that board anywhere. Z690s are available.

I am also eyeing AMD's 7950X3D, to be released in February, which means I would be waiting longer for my build. On the productivity side, AMD is promising up to 52 percent performance improvement in file compression (7-Zip) over the i9-13900K, 17 percent in Adobe Premiere Pro (PugetBench Live Playback score), and 4 percent in file encryption (VeraCrypt AES). No pricing has been released yet.

Thanks for your help.

Last changed by Sassylola on 1/5/2023, 6:01 AM, changed a total of 1 times.


RogerS wrote on 1/5/2023, 7:27 AM

My understanding is (and I am not sure) that in Vegas I would use the Intel iGPU to preview events on the timeline. I mostly use clips from my Sony AX700 (XAVC S).

This part is confused. You set preferences/video to your AMD GPU. File I/O should be set to the Intel iGPU for decoding (not previewing, but decoding/viewing).

For rendering (encoding) you can choose whatever you prefer: AMD VCE, Intel QSV, or MainConcept (CPU).

These three GPU functions are all set independently.

Hulk wrote on 1/5/2023, 3:16 PM

I have a 13900K, and it will basically use as much power as you feed it in something like Prime95 or other artificial benchmarks that load all cores and all structures within the cores to capacity, although performance gains are minimal for lots of extra watts beyond 200 or so. I have PL1 (the long-term power limit) set to 200 W and PL2 to 225 W with a tau of 20 seconds. This lets it "fly" a bit for short, bursty workloads but keeps long-term power in check.

The nice thing with these high-clocking 13900Ks is that many applications, including Vegas, don't utilize all 8 P-cores and don't activate all structures in the cores, so heat/power really isn't a big concern. When the rubber hits the road, this means that while my P-cores will be running at 5.5 GHz, power consumption is only 180 W or less depending on the workload. So it's not as simple as frequency creating power/heat; it's also how many cores are loaded and how heavily they are loaded. That's why I suggest limiting power in the BIOS and not frequency, because at the end of the day you may be giving up frequency when you don't need to.
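
As a toy illustration of why the PL1/PL2/tau combination behaves that way, here is a simplified model (an assumption about how the running-average limit works, not Intel's exact algorithm): the CPU can burst up to PL2 while a moving average of package power stays under PL1, and once that average catches up, sustained power is clamped to PL1.

```python
# Toy model of PL1/PL2/tau behaviour (simplified, not Intel's exact algorithm).
PL1, PL2, TAU = 200.0, 225.0, 20.0  # watts, watts, seconds
DT = 1.0                            # time step in seconds
DEMAND = 253.0                      # what an all-core load would like to draw

avg = 50.0  # assume the package starts near idle
for t in range(61):
    power = min(DEMAND, PL2) if avg < PL1 else min(DEMAND, PL1)
    avg += (power - avg) * (DT / TAU)  # exponential moving average of package power
    if t % 10 == 0:
        print(f"t={t:2d}s  power={power:5.1f} W  running avg={avg:5.1f} W")
```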

Former user wrote on 1/5/2023, 7:12 PM

@Hulk Do you encode mostly hardware or software?

When I had a 4-core CPU I'd only ever hardware encode, but now with a 12-core I'll always encode to x264 (time permitting), and that is where I'd predict your clock speeds will come down and affect your speeds. Have you done any testing like that?

Hulk wrote on 1/5/2023, 7:36 PM

Yes, I only encode via hardware. With older builds of Vegas I used to frameserve to Handbrake; now I use Happy Otter R+, which basically does the same thing. All P-cores stay pegged at 5.5 GHz and the E-cores at 4.3 GHz during the encode. CPU usage, on the other hand, isn't very high, but that's a Vegas issue of course. I generally encode to x265 and only use x264 if I want to be sure the viewer will be able to play it back, since some older hardware/phones don't support x265. I'm also starting to mess around with AV1 to see how good it actually is.

RogerS wrote on 1/5/2023, 9:10 PM

x264 and x265 aren't hardware encodes; they're open-source software (CPU-only) implementations. Do you mean H.264 or H.265 with NVENC, etc.?

Rich Parry wrote on 1/25/2023, 1:51 PM

@RogerS You mention using a second 2 TB M.2 SSD for cache. Can you expand on that? Are you talking about a global setting in the Windows OS or settings in individual apps? For example, After Effects, Photoshop, and others let you specify the location of the cache. I'm in the process of setting up a new PC; I have a spare SSD and could use it for cache.

Thanks

CPU Intel i9-13900K Raptor Lake

Heat Sink Noctua NH-D15 chromax.black

MB ASUS ProArt Z790 Creator WiFi

OS Drive Samsung 990 PRO NVMe M.2 SSD 1TB

Data Drive Samsung 870 EVO SATA 4TB

Backup Drive Samsung 870 EVO SATA 4TB

RAM Corsair Vengeance DDR5 64GB

GPU ASUS NVIDIA GeForce GTX 1080 Ti

Case Fractal Torrent Black E-ATX

PSU Corsair HX1000i 80 Plus Platinum

OS Microsoft Windows 11 Pro

Rich in San Diego, CA

RogerS wrote on 1/25/2023, 10:26 PM

Hi Rich, for my past three systems I've had the OS on one drive and cache and documents on another. The main reason was that the C: drive had limited storage.

If they are both M.2 drives, I'm not sure you'll ever hit the drive speed limits to the point that moving the cache elsewhere improves performance (the drive can hit 7000 MB/s and I barely push 100 MB/s in normal work). That said, I do have my cache on a second M.2 drive and also use it to store working video projects, which I move back to HDDs once finished. I figure that way disk speed will never be a bottleneck for video editing, and I also won't have to worry about my C: drive running out of space due to video files.
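
To put that headroom in rough numbers (the bitrates below are ballpark figures for the formats mentioned in this thread, not measurements):

```python
# Compare typical source bitrates against NVMe sequential speed to see
# how much headroom there is. All figures are ballpark assumptions.
NVME_MB_PER_S = 7000  # advertised sequential read of a fast PCIe 4.0 NVMe SSD

sources_mbit_per_s = {
    "XAVC S 4K (approx.)": 100,
    "ProRes 422 UHD 25p (approx.)": 600,
}

for name, mbps in sources_mbit_per_s.items():
    mb_per_s = mbps / 8
    streams = NVME_MB_PER_S / mb_per_s
    print(f"{name}: ~{mb_per_s:.0f} MB/s per stream, "
          f"~{streams:.0f} streams before the drive is the limit")
```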

Lukasz-Pecak wrote on 7/26/2023, 4:58 AM

What graphics card for VEGAS 20?

RogerS wrote on 7/26/2023, 6:22 AM

Who are you asking?

You can see some benchmark results in my signature with a variety of GPUs. You can even try it yourself and see how your times compare to others with 12th and 13th generation Intel and various GPUs.