Best path for next upgrade

JoeAustin wrote on 3/18/2019, 10:01 AM

Hi Folks,

Currently have a Win 10 system with an i7 4790K 4.0 GHz processor, an AMD Radeon R9 380, and VP15. It performs fine for 1080p editing, but 4K is painful. Sure, proxies help a good deal, but they take a very long time to build. Question is, what's the next step with the most bang for the buck? I've seen some good bargains on the NVIDIA GeForce GTX 980 Ti, and can't help but wonder what the real-world performance improvement would be. Or does it make more sense to just save up for a Skylake CPU and a new motherboard? That would mean new memory too, of course. I will eventually upgrade both GPU and CPU, but can't afford to do both right now.

Any thoughts would be appreciated.

Comments

john_dennis wrote on 3/18/2019, 10:10 AM

Technology, like life, is a one-way trip. Buy more cores without dropping core clock frequency next time around, and don't buy a GPU from the bargain bin.

j-v wrote on 3/18/2019, 10:18 AM

@JoeAustin

Here is a user with problems and, I think, the same processor.
A better one will give better results, I think. My Nvidia GTX 1050 delivers fast rendering and full 4K preview with Vegas Pro 15 and 16.

With kind regards,
Marten

Camera: Pan X900, GoPro Hero7 Black, DJI Osmo Pocket, Samsung Galaxy A8
Desktop: MB Gigabyte Z390M, W11 home version 24H2, i7 9700 4.7 GHz, 16 GB DDR4 RAM, GeForce GTX 1660 Ti with Studio driver 566.14 and Intel HD Graphics 630 with driver 31.0.101.2130
Laptop: Asus ROG Strix G712L, W11 home version 23H2, CPU i7-10875H, 16 GB RAM, NVIDIA GeForce RTX 2070 with Studio driver 576.02 and Intel UHD Graphics 630 with driver 31.0.101.2130
Vegas software: VP 10 to 22 and VMS(pl) 10, 12 to 17.
TV      :LG 4K 55EG960V

My slogan is: BE OR BECOME A STEM CELL DONOR!!! (because it saved my life in 2016)

 

JoeAustin wrote on 3/18/2019, 10:45 AM

@JoeAustin

Here is a user with problems and, I think, the same processor.
A better one will give better results, I think. My Nvidia GTX 1050 delivers fast rendering and full 4K preview with Vegas Pro 15 and 16.

According to the Passmark site, the 1050 would actually be a step down from the R9 380. One thing I've learned is that you need to see nearly double the benchmark score for a given CPU or GPU upgrade to make a significant difference.
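To put numbers on that rule of thumb, here's a minimal sketch in Python. The scores are placeholders standing in for whatever Passmark reports, not actual benchmark results:

```python
# Rough "is it worth it?" check using published benchmark scores.
# The scores below are illustrative placeholders, not real Passmark numbers.

def worth_upgrading(current_score: float, candidate_score: float,
                    threshold: float = 2.0) -> bool:
    """Apply the rule of thumb that an upgrade should roughly
    double the benchmark score to feel significant."""
    return candidate_score / current_score >= threshold

current_gpu = 4700     # e.g. Radeon R9 380 (placeholder score)
candidate_gpu = 11500  # e.g. GTX 980 Ti (placeholder score)

ratio = candidate_gpu / current_gpu
print(f"Speedup ratio: {ratio:.2f}x -> worth it: "
      f"{worth_upgrading(current_gpu, candidate_gpu)}")
```

By this yardstick, a card that only scores 20-30% higher isn't worth the swap.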

j-v wrote on 3/18/2019, 11:12 AM

Sorry, but I cannot compare the two on my hardware, so I told you my experience with the one I have, which is (more or less) what you asked for.

With kind regards,
Marten


Kinvermark wrote on 3/18/2019, 12:44 PM

Buy more cores without dropping core clock frequency next time around, and don't buy a GPU from the bargain bin.

+1. I just "upgraded" to an RX 580 from an R9 290 and saw very little increase in performance. I half expected this to be the case and upgraded for other reasons, but the message should be clear: minor GPU card upgrades are unlikely to make much difference.

JoeAustin wrote on 3/18/2019, 1:05 PM

Buy more cores without dropping core clock frequency next time around, and don't buy a GPU from the bargain bin.

+1. I just "upgraded" to an RX 580 from an R9 290 and saw very little increase in performance. I half expected this to be the case and upgraded for other reasons, but the message should be clear: minor GPU card upgrades are unlikely to make much difference.

Thanks for the reply. This is the sort of info I was looking for. And that's a pretty big jump between those two cards, kind of as I suspected. Might be best to just save up for the new mobo and CPU.

fr0sty wrote on 3/18/2019, 1:13 PM

Something to consider about Intel chips... a flaw was recently discovered that poses a security vulnerability at the hardware level, and no software-only patch can fix it. Fixing it will require Intel to change the architecture of the chips themselves, and even that is going to come at a performance cost, since the exploit targets the chip's speculative execution engine, which predicts upcoming computations to boost performance.

AMD processors are not affected by this vulnerability.

https://www.techacrobat.com/intel-spoiler-flaw/

 

JoeAustin wrote on 3/19/2019, 8:23 AM

Something to consider about Intel chips... a flaw was recently discovered that poses a security vulnerability at the hardware level.

AMD processors are not affected by this vulnerability.

https://www.techacrobat.com/intel-spoiler-flaw/

 

I do appreciate the heads up. I was aware of this issue, and I suppose it is something to consider before building a new machine. When I bought my i7 4790K, AMD had fallen pretty far behind Intel in terms of cost vs. performance. Just looking over the list of current desktop CPUs on Passmark, AMD looks to have made quite a comeback. The Threadripper 1950X is well into i9 territory at almost half the price.

https://www.cpubenchmark.net/desktop.html

 

fr0sty wrote on 3/19/2019, 1:43 PM

I most definitely have not been disappointed by the performance of my Ryzen 7 1800x.

Systems:

Desktop

AMD Ryzen 7 1800x, 8 cores / 16 threads, at stock speed

64 GB 3000 MHz DDR4

GeForce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

james-ollick wrote on 3/19/2019, 2:15 PM

An Intel spokesperson said in a statement that software can be protected from Spoiler attacks, and that DRAM modules with Rowhammer mitigations should remain shielded.

"Intel received notice of this research, and we expect that software can be protected against such issues by employing side channel safe software development practices. This includes avoiding control flows that are dependent on the data of interest. We likewise expect that DRAM modules mitigated against Rowhammer style attacks remain protected. Protecting our customers and their data continues to be a critical priority for us and we appreciate the efforts of the security community for their ongoing research."

https://www.zdnet.com/article/all-intel-chips-open-to-new-spoiler-non-spectre-attack-dont-expect-a-quick-fix/

 

Home built PC - Corsair case, ASUS ROG Maximus XI Code motherboard, i9 9900k, 64GB Corsair Vengeance RGB DDR4 DRAM 3200MHz,  Sapphire Nitro+ Radeon RX 7900 XTX 24GB graphics card, Corsair 1000 watt power supply. Windows 11.

VP 21 BCC 2024 Boris FX Continuum Complete, Titler Pro v7. Various NewBlue effects.

fan-boy wrote on 3/19/2019, 3:30 PM

Try DaVinci Resolve for free; it might accelerate your present machine so much that you won't even be thinking about new hardware. Resolve accelerates both the timeline and rendering on my AMD 7850K APU's SoC GPU; it has some serious optimizations in there. Hitfilm does NOT accelerate well here. The Vegas timeline accelerates, but Vegas rendering is CPU-only in Vegas 14.

fr0sty wrote on 3/20/2019, 2:48 AM

Vegas can do GPU rendering in versions 13+, but only via OpenCL. NVENC and the AMD equivalent (whose name I keep forgetting) are only supported in versions 15+.
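For anyone curious what NVENC hardware encoding looks like outside of Vegas, here's a minimal sketch that drives ffmpeg's h264_nvenc encoder from Python. It assumes an ffmpeg build with NVENC support on the PATH, an NVIDIA card, and a hypothetical input.mp4:

```python
import subprocess

# Hardware H.264 encode on the GPU via NVIDIA NVENC.
subprocess.run([
    "ffmpeg", "-i", "input.mp4",  # hypothetical source clip
    "-c:v", "h264_nvenc",         # NVENC hardware encoder
    "-preset", "slow",            # quality-oriented NVENC preset
    "-b:v", "40M",                # ~40 Mbps target, a typical UHD bitrate
    "-c:a", "copy",               # pass audio through untouched
    "output.mp4",
], check=True)
```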


JoeAustin wrote on 3/20/2019, 8:08 AM

Vegas can do GPU rendering in versions 13+, but only via OpenCL. NVENC and the AMD equivalent (whose name I keep forgetting) are only supported in versions 15+.

Sure, and accelerating rendering is important too, but I was mostly looking at timeline performance in this case.

JoeAustin wrote on 3/20/2019, 10:49 AM

Try DaVinci Resolve for free; it might accelerate your present machine so much that you won't even be thinking about new hardware. Resolve accelerates both the timeline and rendering on my AMD 7850K APU's SoC GPU; it has some serious optimizations in there. Hitfilm does NOT accelerate well here. The Vegas timeline accelerates, but Vegas rendering is CPU-only in Vegas 14.

I had tried the free version of Resolve quite a while ago, and thought I would give it a go again. Then as now, I cannot get multicam to work as it should. Even with a couple of simple five-minute clips containing a few claps for sync, it fails to create a proper multicam timeline object. However, playback of 4K video straight from the GH5 was at full frame rate. I may spend some more time with this when I can, but it is definitely a departure in terms of simplicity and intuitive operation compared to VP15.

UPDATE: 3/21/19

Spent some more time with Resolve and got a bit of help from their forum. Multicam is not only working, it's working very well: full frame rate playback of a two-camera 2160/30p multicam timeline. Cuts can cause a couple of frames to drop, but it's quite usable. And apparently there's no need to build time-consuming proxies. The built-in audio capabilities are very impressive too.

As it turns out, I don't think it's the computer or video card that needs to be replaced. 😉


AVsupport wrote on 3/20/2019, 11:36 PM

Timeline-wise I see VP being a CPU hog, and I cannot see much GPU acceleration there, really. OpenCL support is basic (version 1.1 only, whereas the Nvidia 1060 supports 1.2 and the Intel 630 supports 2.1); I believe that may benefit some 3rd-party plugins more than VP natively, and rendering is a different story altogether. Currently, I believe a faster CPU might serve you better.
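If you want to see exactly what OpenCL version your own devices report, here's a quick sketch using the pyopencl package (assuming it's installed, e.g. via pip install pyopencl):

```python
import pyopencl as cl

# List every OpenCL platform/device and the OpenCL version it reports.
for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.version})")
    for device in platform.get_devices():
        print(f"  Device: {device.name}")
        print(f"    OpenCL version: {device.version}")
        print(f"    Compute units:  {device.max_compute_units}")
```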

my current Win10/64 system (latest drivers, water cooled) :

Intel Coffee Lake i5 Hexacore (unlocked, but not overclocked) 4.0 GHz on Z370 chipset board,

32GB (4x8GB Corsair Dual Channel DDR4-2133) XMP-3000 RAM,

Intel 600series 512GB M.2 SSD system drive running Win10/64 home automatic driver updates,

Crucial BX500 1TB EDIT 3D NAND SATA 2.5-inch SSD

2x 4TB 7200RPM NAS HGST data drive,

Intel HD630 iGPU - currently disabled in Bios,

nVidia GTX1060 6GB, always on latest [creator] drivers. nVidia HW acceleration enabled.

main screen 4K/50p 1ms scaled @175%, second screen 1920x1080/50p 1ms.

eikira wrote on 3/22/2019, 12:58 PM

Technology, like life, is a one-way trip. Buy more cores without dropping core clock frequency next time around, and don't buy a GPU from the bargain bin.


Well, easier said than done. Try to find, in place of a 4-core at 4 GHz, an 8-core that can hold 4 GHz on all cores without paying the price of a good GPU, which would benefit you in encoding via NVENC.

As an example, I have a 6-core 6850K and tried a 14-core Xeon in the same system; for editing and encoding it did nothing better or worse. The 6850K was set to 4 GHz and the 14-core Xeon ran at 2.2 GHz per core.

So one has to wonder whether more cores really make a difference when they can't hold the same frequency.
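That observation is easy to put rough numbers on. A minimal sketch using Amdahl's law with a made-up 75% parallel fraction (purely an assumption for illustration):

```python
# Compare aggregate vs. effective throughput of the two CPUs mentioned above.
# The parallel_fraction is a made-up assumption for illustration.

def effective_speedup(cores: int, clock_ghz: float,
                      parallel_fraction: float) -> float:
    """Amdahl's law scaled by per-core clock: serial work runs on one core."""
    serial = 1 - parallel_fraction
    return clock_ghz / (serial + parallel_fraction / cores)

for name, cores, clock in [("6850K @ 4.0 GHz", 6, 4.0),
                           ("Xeon @ 2.2 GHz", 14, 2.2)]:
    aggregate = cores * clock
    eff = effective_speedup(cores, clock, parallel_fraction=0.75)
    print(f"{name}: {aggregate:.1f} GHz-cores aggregate, "
          f"{eff:.2f} relative throughput at 75% parallel work")
```

At 75% parallel work, the 6 cores at 4 GHz come out ahead (about 10.7 vs. 7.2 in relative throughput) even though the Xeon has more aggregate GHz-cores, which matches the experience above.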

fan-boy wrote on 3/22/2019, 4:23 PM

JoeAustin,

Testing more with DaVinci Resolve. It seems Vegas and Resolve do have some things in common.

I used Vegas's MAGIX Apple ProRes 422 LT template to make two short UHD 3840x2160p clips. These files work really well in Vegas, Resolve, and Hitfilm. I wanted to see how this SoC GPU (around 30 GB/s memory bandwidth) compares to your R9 380 (256 GB/s bandwidth?), which has about 10x more crunch.

Resolve was playing both of those UHD clips in multicam at 29.97 fps in the Edit tab, and the monitor view was playing both clips too, three in total. I did some camera switching to further test multicam, and I was surprised! I kept looking, though, and, well, it's not totally impressive.

The Resolve menu File --> Project Settings --> Master --> Timeline resolution is core to this. Once that is set to 1920x1080 it stays that way until manually changed, so your timeline may be playing at 1920x1080, as it was here. Manually set it to UHD 3840x2160 and re-run the speed test (FPS dropped to around 18-20 here). Also, if you are working on a UHD project and the timeline is set to 1920x1080, the Deliver page will default to 1920x1080. Before rendering, be sure to set the timeline resolution to the actual resolution of the source footage.

If it really were just about the GPU, then you should still be hitting 30 FPS with the timeline at 3840x2160.

The Resolve manual says to set the timeline resolution to the same resolution as the source footage to see the most accurate results in the Color tab. If that is too hard on the computer, the manual says it is OK to set the timeline resolution to something lower than the source footage while in the Edit tab, to get real-time playback there. Again, be sure to set the timeline resolution back to the source footage resolution before rendering (somewhere around page 195 in the Resolve 15 manual PDF).
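For what it's worth, that setting can also be flipped programmatically through Resolve's scripting API. A minimal sketch, assuming scripting is enabled in Resolve's preferences and the DaVinciResolveScript module is on the Python path; the setting key names are my reading of the scripting README, so treat them as assumptions:

```python
import DaVinciResolveScript as dvr  # ships with Resolve; needs path setup

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()

# Match the timeline resolution to UHD source footage before rendering.
# Setting keys assumed from Resolve's scripting documentation.
project.SetSetting("timelineResolutionWidth", "3840")
project.SetSetting("timelineResolutionHeight", "2160")
print(project.GetSetting("timelineResolutionWidth"))  # verify: "3840"
```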

Vegas has "32-bit full range" with gamma 2.222 in project settings. Vegas help says it is OK to set it to 8-bit for performance improvements while editing. I have noticed, though, that color or gradient adjustments made in 8-bit mode can look different when switching back to 32-bit full with gamma 2.222. Both programs should use the highest settings during color or gradient adjustments.
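To see why 8-bit edits can shift when you switch back to 32-bit, here's a minimal sketch of the round-trip error that 8-bit quantization introduces under a 2.222 gamma curve. This is a pure illustration of the quantization effect, not how Vegas computes internally:

```python
# Round-trip a linear value through a gamma-2.222 transfer curve,
# once in 32-bit float and once quantized to 8 bits, and compare.

GAMMA = 2.222

def encode(linear: float) -> float:
    return linear ** (1 / GAMMA)

def decode(encoded: float) -> float:
    return encoded ** GAMMA

linear_in = 0.0123  # a dark shadow value, where banding shows first

# 32-bit float path: effectively lossless round trip.
float_out = decode(encode(linear_in))

# 8-bit path: the encoded value is snapped to one of 256 levels.
code = round(encode(linear_in) * 255)
eight_bit_out = decode(code / 255)

print(f"float:  {float_out:.6f}")
print(f"8-bit:  {eight_bit_out:.6f} (code value {code})")
print(f"error:  {abs(eight_bit_out - linear_in) / linear_in:.2%}")
```

In the shadows the 8-bit value lands a percent or two away from the float result, which is the kind of shift that shows up when flipping the project back to 32-bit.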