AMD Radeon VII or Nvidia 2070 Super?

Sassylola wrote on 1/10/2020, 3:08 PM

I know that AMD is not yet well optimized for VP 17, and most users say to go with Nvidia for VP 17.

I will be building a new system with an AMD 3900X, 32 GB of RAM, two Sabrent 1 TB Rocket NVMe Gen4 PCIe M.2 SSDs, an MSI MEG X570 Unify motherboard, an 850 W power supply, and Windows 10.

I was going to go with a 2070 Super, but for $10 more I can get a Radeon VII. The Radeon has 16 GB of memory vs. 8 GB for the 2070 Super. I know AMD has stopped, or is about to stop, making the Radeon VII. Is the 5700 XT taking its place?

I will be running VP 17. The video is 4K from a Sony AX700 camcorder. I am looking more for smooth timeline playback than render speed.

Any suggestions would be appreciated. Thanks

My System Home Built

Intel 13900K, latest Intel chipset and Intel Management Engine installed. No OC (PL1 set to 280 W, PL2 set to 280 W)

Arctic Liquid Freezer II 360 mm AIO

Gigabyte Aero G Z790 MB Latest BIOS

ZOTAC GAMING GeForce RTX 4090 AMP Extreme AIRO, latest Nvidia Studio drivers installed. With a GPU OC, results were about the same as with the GPU at default settings, so I have kept the GPU at defaults.

64 GB (2x32 GB) G.Skill DDR5 6000 MHz RAM

Corsair 1000W RMx Power Supply

SK Hynix P41 1 TB NVMe operating drive (boot drive)

WD SN850X 2 TB NVMe SSD, 1st render drive

SK Hynix P41 2 TB NVMe storage drive

SK Hynix P41 2 TB NVMe SSD, 2nd render drive for final renders

Win 11 Pro 23H2

Fractal Design R6 Case

Samsung - M7 Series 43" Smart Tizen 4K UHD Monitor

Vegas Post 365 21 Ver 208

Sound Blaster 5.1 Audigyfx Sound Card

SSK USB C External NVMe drive Enclosure with 500GB Samsung 970 EVO Plus for backups.

PROBOI 10G hard drive enclosure, USB-C, with four 4 TB spinning drives for backups.

 

 

Comments

NCARalph wrote on 1/10/2020, 5:45 PM

I have a Radeon VII, and with 4K video at Best (Full) preview I get about 25 fps, although it is also using the Intel HD GPU on the CPU.

https://www.vegascreativesoftware.info/us/forum/rendering-gh5-5k-vs-gh4-4k-with-amd-radeon-vii-and-intel-hd-gpus--118333/

You might want to be a bit cautious with the Sabrent SSD. I had a 2 TB version and it would repeatedly crash my entire PC when rendering a 90-minute video with a 4K and a GH5 5K stream. For shorter renders and everything else it worked fine.

I replaced it with a Samsung SSD and it works perfectly.

tripleflip18 wrote on 1/10/2020, 9:06 PM

My Sabrent 2 TB is working fine, rendering three videos a week, about an hour of footage...

NCARalph wrote on 1/10/2020, 10:25 PM

My Sabrent 2 TB is working fine, rendering three videos a week, about an hour of footage...

Are you doing an intensive render with multiple 4K or higher streams? Your signature says you have a Samsung SSD; is that correct?

It's possible I got a bad drive, but the fact that my Samsung works fine is pretty suspicious.


Windows 10 Pro 64, i7 6700K @ 4 GHz (with Intel Graphics 530), 32 GB, AMD Radeon VII, Samsung 1 TB NVMe, 9 TB Storage Spaces RAID, 3 monitors running off the Radeon

fred-w wrote on 1/11/2020, 4:09 AM

For playback with a LUT engaged, every major card is about equal; once you start loading on heavier effects, the Radeon VII stands out, big time. Look at the second playback chart: https://techgage.com/article/magix-vegas-pro-17-cpu-gpu-performance/2/

TheRhino wrote on 1/11/2020, 9:44 AM

The Radeon VII was $500 USD over the holidays & IMO it is a better choice than a $500 Nvidia. AMD will release their next top-end card sometime in 2020, called "Big Navi" or something... Meanwhile, the most recent batch of $300 - $400 AMD cards like the 5700, etc. are considered "mid-range" & don't have the compute / content creation capabilities of the Radeon VII, or even my liquid-cooled VEGA 64 purchased for $350 back in April... For even less money, the $350 VEGA 64 & $250 VEGA 56 are still great performers in Vegas Video & you can put the money saved towards the 3950X CPU...

AMD 3900X vs. 3950X... Because Vegas Video is built around older code, it does not perform significantly faster on the 3950X vs. the 3900X or even my 9900K... However, many other apps, like Handbrake, do... IMO it is MUCH easier to sell a used GPU & install a new one vs. a CPU... If I were building a best bang/buck system today I would probably pair a $750 3950X CPU with a $250 VEGA 56 GPU vs. spending $500 for the 3900X & $500 for the Radeon 7...
 


Workstation C with $600 USD of upgrades in April, 2021
--$360 11700K @ 5.0ghz
--$200 ASRock W480 Creator (onboard 10G net, TB3, etc.)
Borrowed from my 9900K until prices drop:
--32GB of G.Skill DDR4 3200 ($100 on Black Friday...)
Reused from same Tower Case that housed the Xeon:
--Used VEGA 56 GPU ($200 on eBay before mining craze...)
--Noctua Cooler, 750W PSU, OS SSD, LSI RAID Controller, SATAs, etc.

Performs VERY close to my overclocked 9900K (below), but at stock settings with no tweaking...

Workstation D with $1,350 USD of upgrades in April, 2019
--$500 9900K @ 5.0ghz
--$140 Corsair H150i liquid cooling with 360mm radiator (3 fans)
--$200 open box Asus Z390 WS (PLX chip manages 4/5 PCIe slots)
--$160 32GB of G.Skill DDR4 3000 (added another 32GB later...)
--$350 refurbished, but like-new Radeon Vega 64 LQ (liquid cooled)

Renders Vegas11 "Red Car Test" (AMD VCE) in 13s when clocked at 4.9 ghz
(note: BOTH onboard Intel & Vega64 show utilization during QSV & VCE renders...)

Source Video1 = 4TB RAID0--(2) 2TB M.2 on motherboard in RAID0
Source Video2 = 4TB RAID0--(2) 2TB M.2 (1) via U.2 adapter & (1) on separate PCIe card
Target Video1 = 32TB RAID0--(4) 8TB SATA hot-swap drives on PCIe RAID card with backups elsewhere

10G Network using used $30 Mellanox2 Adapters & Qnap QSW-M408-2C 10G Switch
Copy of Work Files, Source & Output Video, OS Images on QNAP 653b NAS with (6) 14TB WD RED
Blackmagic Decklink PCIe card for capturing from tape, etc.
(2) internal BR Burners connected via USB 3.0 to SATA adapters
Old Cooler Master CM Stacker ATX case with (13) 5.25" front drive-bays holds & cools everything.

Workstations A & B are the 2 remaining 6-core 4.0ghz Xeon 5660 or I7 980x on Asus P6T6 motherboards.

$999 Walmart Evoo 17 Laptop with I7-9750H 6-core CPU, RTX 2060, (2) M.2 bays & (1) SSD bay...

tripleflip18 wrote on 1/11/2020, 3:36 PM

My Sabrent 2 TB is working fine, rendering three videos a week, about an hour of footage...

Are you doing an intensive render with multiple 4K or higher streams? Your signature says you have a Samsung SSD; is that correct?

It's possible I got a bad drive, but the fact that my Samsung works fine is pretty suspicious.

No, usually no more than two 4K 60p cams, often just one, but I'm a heavy user with everything open and running... never a problem.

Chief24 wrote on 1/11/2020, 5:03 PM

Well, I would have to agree for the most part with @TheRhino and his assessment. I have a Radeon VII paired with a 1950X (stock), 64 GB RAM (DDR4-2667), an Intel 750 800 GB NVMe PCIe drive for OS/apps, two 1 TB Samsung EVOs (960 & 970) and an Intel 660p 2 TB for all storage, another 660p for all scratch, and a WD Red 2 TB for renders; and a second machine with an RTX 2070 Nvidia Founders Edition paired with a 2920X, 32 GB RAM, and storage equivalent to the first system. Both have Magix Movie Studio 16 Platinum, and honestly, I can't tell the difference between the two systems. (I just purchased Vegas Pro 17 Edit - I already have enough DVD Architect copies, and I don't use a lot of plug-ins.)

I have four GoPro Heros (a 5, a 6, and two 7s), a Canon 80D, and a Sony AX-53 (baby kin to your AX700). Throwing footage from these cameras onto a 4K timeline (except the 80D, which is 1080 only), with most clips well longer than a few seconds and typically a mix from all the cameras, can make editing TOUGH! Hence the extra drives: I just transcode everything to either Cineform or Grass Valley HQX (both have worked fine for my needs). I would use the built-in proxy feature in Vegas Pro/Movie Studio, but with only 28-inch 4K screens my eyes aren't good enough to judge the proxy timeline for what I need. So, transcoding it is.
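For reference, here is a minimal sketch of what that kind of batch transcode can look like outside of Vegas, using ffmpeg driven from Python. It assumes ffmpeg is on the PATH and was built with the cfhd (CineForm) encoder; the folder paths are hypothetical placeholders, and Grass Valley HQX is not covered here since ffmpeg does not encode it.

```python
"""Batch-transcode 4K camera clips to a CineForm intermediate for smoother editing.

A sketch only: assumes ffmpeg is on the PATH and built with the 'cfhd' encoder.
Folder names are hypothetical; adjust to your own drive layout.
"""
import subprocess
from pathlib import Path

SOURCE_DIR = Path(r"D:\Footage\4K")    # hypothetical source folder
TARGET_DIR = Path(r"E:\Transcodes")    # hypothetical intermediate folder
EXTENSIONS = {".mp4", ".mov"}          # typical GoPro / Sony / Canon containers

def transcode(src: Path, dst: Path) -> None:
    """Rewrap the clip as CineForm video + PCM audio in a MOV container."""
    cmd = [
        "ffmpeg", "-y",
        "-i", str(src),
        "-c:v", "cfhd",        # CineForm intermediate (availability depends on the build)
        "-c:a", "pcm_s16le",   # uncompressed audio keeps scrubbing cheap
        str(dst),
    ]
    subprocess.run(cmd, check=True)

def main() -> None:
    TARGET_DIR.mkdir(parents=True, exist_ok=True)
    for clip in sorted(SOURCE_DIR.iterdir()):
        if clip.suffix.lower() in EXTENSIONS:
            transcode(clip, TARGET_DIR / (clip.stem + "_cfhd.mov"))

if __name__ == "__main__":
    main()
```

The intermediates are much larger than the camera originals, but they scrub far more easily on the timeline, which is the whole point of the exercise.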

And on either computer it doesn't really take as long as it used to. Plus, it makes you get up, stretch those legs, and get some more coffee - that's the real benefit! 😀

I have not tried Vegas Pro 17 on the second machine with the RTX 2070, so I'm not sure how well it does on the editing timeline yet. Like you, I'm more worried about the actual editing to CREATE that story than about how fast it renders. When I do have to render, I just refer to the previous paragraph!

For the first system, Vegas Pro 17 and the Radeon VII do wonderfully on the timeline, and I'd think that if you want to use Rhino's approach of a Vega 56/64, you can't go wrong. I typically don't use hardware rendering, so I'm not concerned that AMD hasn't gotten off their butts to provide the necessary support for content creators/editors and such. So render time is not much of a concern, and I now have all the James Bond movies on disc in my Blu-ray/DVD collection, so there is always that!

Self Build: #1 MSI TRX40 Pro Wi-Fi w/3960X (be Quiet! Dark Rock Pro TR4) @ stock; 128GB Team Group 3200 MHz; OS/Apps - WDSN850X PCI-e 4.0x4 4TB, Documents/Extras - WDSN850X PCI-e 4.0x4 4TB; XFX AMD Radeon 7900XTX (24.1.1); Samsung 32 Inch UHD 3840x2160; Windows 11 Pro 64-Bit (23H2 22631.3155); (2) Inland Performance 2TB/(2) PNY 3040 4TB PCI-e on Asus Quad M.2x16; (2) WD RED 4TB; ProGrade USB CFExpress/SD card Reader; LG 16X Blu-Ray Burner; 32 inch Samsung UHD 3840x2160.

VEGAS Pro 20 Edit (411) & HOS (Happy Otter Scripts); DVD Architect 7.0 (100);

Sound Forge Audio Studio 15; ACID Music Studio 11; SonicFire Pro 6.6.9 (with Vegas Pro/Movie Studio Plug-in); DaVinci Resolve (Free) 18.6.5

#2: Gigabyte TRX50 Aero D w/7960x (Noctua NH-U14S TR5-SP6) @ stock; 128GB Kingston Fury Beast RDIMM @4800 MHz; OS/Apps - Seagate Firecuda 540 2TB PCI-e 5.0x4; Documents/Extras/Source/Transcodes - 4TB WDSN850X PCI-e 4.0x4; 4TB Inland Performance PCI-e 3.0x4; 2TB Inland Performance PCI-e 4.0x4; BlackMagic PCI-e Decklink 4K Mini-Recorder; ProGrade USB SD & Micro SD card readers; LG 32 Inch UHD 3840x2160; PowerColor Hellhound RX Radeon 7900XT (24.1.1); Windows 11 Pro 64-Bit (22631.3155)

VEGAS Pro 20 Edit (411) & HOS; DVD Architect 7.0 (100); Sound Forge Audio Studio 15; Acid Music Studio 11

Canon EOS R6 MkII, Canon EOS R6, Canon EOS R7 (All three set for 4K 24/30/60 Cinema Gamut/CLog3); GoPro Hero 5+ & 6 Black & (2) 7 Black & 9 Black & 10 Black & 11 Black & 12 Black (All set at highest settings - 4K, 5K, & 5.3K mostly at 29.970); Sony FDR AX-53 HandyCam (4K 100Mbps XAVC-S 23.976/29.970)

TheRhino wrote on 1/11/2020, 8:12 PM

...I have a... Intel 750 800GB NVME PCI-E for OS/Apps, 2-1TB Samsung Evo's (960 & 970) and an Intel 660p 2TB for all storage, with another 660p for all Scratch, and a WD Red 2TB for renders...

I also use the Intel 660p M.2 in my main 9900K editing workstation... For the price of a single, faster 2TB Samsung, etc., I have (2) 2TB Intel 660p in RAID0 for source video and another (2) 2TB Intel 660p in RAID0 for either source or target video. The 660p's write speed will slow down as the drives fill up, but that is not much of a problem when I have 4TB of space in each RAID0. For really large projects I use both M.2 RAIDs for source video (8TB total) and an older internal hardware SATA RAID0 with (4) 8TB WD Red drives for the target, which is fast enough to keep up with renders. Everything is backed up to a NAS & one other location, so I am not worried about my RAID0 setups failing... My workflow involves using more than one workstation at a time, so I try to find the best bang/buck when I upgrade one.
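Since RAID0 has no redundancy, that backup step is what makes this setup safe. As a minimal sketch only (not TheRhino's actual scripts), here is how such a mirror job might be driven on Windows with robocopy from Python; the drive letters and NAS share path are hypothetical placeholders.

```python
"""Mirror RAID0 working volumes to a NAS share with robocopy (Windows).

A sketch only: paths are placeholders, and /MIR deletes destination files that
no longer exist on the source, so point it at a dedicated backup folder.
"""
import subprocess

# (source volume, destination folder on the NAS) -- placeholders
JOBS = [
    (r"D:\SourceVideo1", r"\\NAS\backup\SourceVideo1"),
    (r"E:\SourceVideo2", r"\\NAS\backup\SourceVideo2"),
]

def mirror(src: str, dst: str) -> int:
    """Run one robocopy mirror pass and return its exit code."""
    cmd = [
        "robocopy", src, dst,
        "/MIR",          # mirror the tree (copies new/changed files, removes deleted ones)
        "/R:2", "/W:5",  # retry twice, wait 5 s between retries
        "/MT:16",        # multithreaded copy
        "/NP",           # no per-file progress spam
    ]
    return subprocess.run(cmd).returncode

if __name__ == "__main__":
    for src, dst in JOBS:
        code = mirror(src, dst)
        # robocopy exit codes below 8 indicate success
        status = "OK" if code < 8 else f"FAILED ({code})"
        print(f"{src} -> {dst}: {status}")
```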

 


fr0sty wrote on 1/14/2020, 9:19 AM

I have a Radeon 7; get the Nvidia card. My Radeon sucks... It is extremely powerful, but not very configurable in its config menu vs. Nvidia, and it gives me all kinds of crap: sometimes my screen just goes black while editing; when I turn on my computer the TV overscans until I change its view mode from "wide" to "full" (or the reverse if it is already on the other); and my OLED display flickers with static and won't get a signal if I turn it on while Windows HDR mode is enabled, so I can't edit in HDR until I toggle it off and back on... The high-end RTX Nvidia cards also all rank ahead of mine on benchmarks, even if only barely. Then there's also timeline decoding, which works on Nvidia cards in Vegas.

The ever-so-slight increase in performance plus the not-so-slight improvement in reliability make the Nvidia card the best option for you.


Systems:

Desktop

AMD Ryzen 7 1800x 8 core 16 thread at stock speed

64 GB 3000 MHz DDR4

GeForce RTX 3090

Windows 10

Laptop:

ASUS Zenbook Pro Duo 32GB (9980HK CPU, RTX 2060 GPU, dual 4K touch screens, main one OLED HDR)

Kinvermark wrote on 1/14/2020, 11:57 AM

+1. The 16 GB Radeon card is tempting, but there is a lot of anecdotal evidence suggesting it can be problematic. Gamers say "great hardware, too bad about the lousy AMD drivers." The same applies to video editing use, IMHO.

fred-w wrote on 1/14/2020, 12:07 PM

Anecdotal... well, I'd trust Rhino here; he's a pro running more than one machine, as am I. If you are running Adobe products I might concur, but if your go-to is Vegas, I'd say the bang for the buck and the muscle are with the higher-end AMD cards, the VII and Vega 64 especially. (Frosty, are you simply ignoring the Techgage benchmarks? Here's another:)

https://techgage.com/article/exploring-magix-vegas-pro-16-gpu-performance/

Kinvermark wrote on 1/14/2020, 2:54 PM

Trust whomever you prefer… it's still anecdotal, i.e. based on that person's experience as opposed to empirical, broad-based testing.

Like Frosty, I got fed up with my AMD GPUs (the last two iterations for me) flickering the displays, resetting the desktop, and of course being without H.264 GPU decode support in Vegas and other NLEs. They always seem to be last to the party. Just sayin' :)

 

fifonik wrote on 1/14/2020, 3:11 PM

I've been using AMD GPUs for quite a long time already (not the Vega 64/Radeon VII discussed here), and "display flickering" is something new to me. I've never had it.

Also, I'm not sure what you mean by "resetting the desktop". Is it resetting the desktop icons' locations, or something different? If it is about icons, I had the same issue on my wife's PC with an NVIDIA GPU and worked around it on both PCs simply by installing the "Desktop Restore" program. However, it looks like the issue was related to older Windows 10 versions, as I have not had it for a long time now.

Are you sure these are general issues and not specific to the brand/model used? I did have some issues in the past with a specific brand/model; I RMAed it for another model with the same GPU chip and everything was fine.

You are right that GPU decode is not implemented for AMD GPUs in VP. I would like to have it to try (not really urgent in my case, as my footage is FullHD only and I get 60 fps on preview without decode acceleration). I still hope Magix will implement it sooner rather than later. However, I have read on the forum many times that people disable it on NVIDIA GPUs as well because of crashes.


Camcorder: Panasonic X1500 + Panasonic X920 + GoPro Hero 11 Black

Desktop: MB: MSI B450M MORTAR TITANIUM, CPU: AMD Ryzen 5700X, RAM: G'Skill 16 GB DDR4@3200, Graphics card: MSI RX6600 8GB, SSD: Samsung 970 Evo+ 1TB (NVMe, OS), Samsung 870 Evo, HDD WD 4TB, HDD Toshiba 4TB, OS: Windows 10 Pro 22H2

NLE: Vegas Pro [Edit] 11, 12, 13, 15, 17, 18, 19

fred-w wrote on 1/14/2020, 4:23 PM

Trust whomever you prefer… it's still anecdotal, i.e. based on that person's experience as opposed to empirical, broad-based testing.

Like Frosty, I got fed up with my AMD GPUs (the last two iterations for me) flickering the displays, resetting the desktop, and of course being without H.264 GPU decode support in Vegas and other NLEs. They always seem to be last to the party. Just sayin' :)

 

I've not had similar problems.

I do think the Adobe integration is better with Nvidia.

I'm not JUST humming anecdotes here. I posted TWO, count 'em, sets of very empirical tests done with both Vegas 16 and 17, which tested the usual cross-section of AMD and Nvidia cards side by side in real-world/Vegas-world routines. The results speak for themselves. (Did you even check those out?)

YOUR take is more anecdotal, if anything.

OF COURSE this sort of advice is always given and taken with a YMMV grain of salt; don't get that part twisted either. Try a card: if it works, keep it; if not, take it back; most vendors allow that. But in the face of these well-documented, well-run tests and other pros reporting here, I'm not going to say that AMD is garbage, or will cause people to get the blues, when the "real world" tests say the opposite.

fr0sty wrote on 1/14/2020, 5:26 PM

The benchmarks I refer to are the tests done on these forums, which are very thorough and well documented. While my system usually ranks 4th or 5th out of everyone else on these forums, the top systems always seem to have that RTX 2080 in them. The only thing I've seen my Radeon 7 accelerate within Vegas was Neat Video, where it scored a higher benchmark than any of the other GPUs.

As far as stability goes, I also had issues with my previous Radeon card, the 6870, which got so bad that it was crashing my system in the middle of performances (I am a video projection mapping artist), so I threw it away and bought a GTX 970. That card worked completely without issue for almost 6 years before I upgraded again to this Radeon 7, which is again giving me issues. Not only that, but the lack of configuration options really limits what I can do with it.

Another big trump card for Nvidia is that they unlocked 10-bit support when you install the Studio drivers. My Radeon can only do 10-bit when used full screen as a secondary monitor in Vegas (16 or later), and only within Vegas; no other apps that I have support it.


Kinvermark wrote on 1/14/2020, 6:06 PM

@fred-w

1) Yes, I have seen the tests you posted. Very limited, but better than nothing. I like the more extensive testing that Puget Systems does, but they don't test Vegas Pro. Radeon VII looks good for Resolve, but they still don't recommend it - much to the annoyance of AMD fans everywhere.

2) Not saying AMD is "garbage." Last two cards were AMD.

3) I am providing a rationale for why I wouldn't choose an AMD card for Vegas at this point in time.

Be aware that Newegg ran good promo deals on Radeon VIIs over the last several weeks ($650 CAD), but these cards may only be returned for a replacement if found defective - no refunds if your experience isn't what you would like.

fred-w wrote on 1/14/2020, 8:33 PM

@fr0sty YES! When the card is the 2080 Ti (not the 2080), it does seem to tower over some of the others. As far as playback goes, the 2080 Ti, the Vega 64, and the VII are the three cards that actually make a significant difference in playback smoothness/fps when you are talking compositing or effects. No dispute there, but most people are not spending at that level, so my comments go mostly to that point.

Howard-Vigorita wrote on 1/15/2020, 1:19 AM

I've been beating my Radeon VII / 9900K system relentlessly for the last few months, editing and rendering multicam 2-to-5-hour projects with VP17, and it's been rock solid, apart from occasional workarounds for some lingering Vegas bugs; no driver or rendering issues whatsoever. My experience with Nvidia is limited to the 1050 Ti in my laptop, and I have to say it doesn't seem to play well with the integrated Intel UHD 630 GPU also on board, whereas the Radeon VII works smoothly and flawlessly with the onboard Intel UHD 630 of my Asus Maximus XI Extreme motherboard, which hosts the 9900K.

I have a second system with a lesser AMD card in it and toy with the idea of updating its GPU. I'm tempted to try an Nvidia 2080 Ti, but a second Radeon VII is quite a bit less expensive. Maybe some price drops in the spring will help make up my mind.

fred-w wrote on 1/15/2020, 1:35 AM

The 2080 Ti doesn't make sense when the Radeon VII is at that sub-$600 price point. Vegas ALWAYS ran better with AMD, generally, until very recently, and THEN people started acting like AMD never played well with Vegas... IF one is spending OVER $500 and is not significantly invested in Adobe, I'd say the VII ALL THE WAY, until that 2080 Ti price comes way down. The OTHER caveat is if one is running some sort of workstation that is set up for an Nvidia/Quadro card (I have an HP Z workstation like that); AMD is possibly not a good fit there.

Marcin wrote on 1/15/2020, 6:20 AM

Interesting topic. Maybe someone who had a Radeon and has now installed an Nvidia card in the same computer can say something more?

TheRhino wrote on 1/15/2020, 5:37 PM

I agree that the 9900K's onboard QSV paired with either a $500 Radeon 7 or a $350 VEGA 64 is a great match, because Vegas is able to utilize BOTH GPUs during QSV rendering. Note: I had to select Intel GPU driver 25.20.100.6373, because after 6373 Intel changed the drivers to DCH, which requires Magix/Vegas to comply...

I found a like-new refurbished liquid-cooled VEGA 64 for just $350 last April. It only takes (1) PCIe slot, and the cooling allows it to overclock to near Radeon VII speeds... Selecting HEVC QSV, my 5.0 GHz 9900K completes the Red Car test (discussed on these forums) in 14s average. That 14s was the same speed as similar systems with a Radeon 7... However, if I select VCE (AMD only), I get 33s. CPU only is 50s.

Before Christmas I got a 17" gaming laptop with an i7-9750H 6-core CPU, RTX 2060 GPU, 1TB M.2, 16GB DDR4, etc. for $999. It completes the Red Car Test to HEVC QSV in 28s average using both the Intel & Nvidia GPUs. I've been using it to edit family videos from the comfort of my La-Z-Boy recliner vs. spending additional time in my studio, and I am happy with the performance so far...

In comparison, an old 6-core 4.0 GHz Xeon workstation, without QSV but with a newer $130 AMD 570 GPU, completes the Red Car Test in 1:47.
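To put those reported Red Car times side by side, here is a trivial sketch that converts them into speedups relative to the 9900K CPU-only run; the numbers are just the ones quoted above, nothing new.

```python
# Red Car Test times reported above, in seconds (1:47 = 107 s)
times = {
    "9900K + Vega 64, HEVC QSV": 14,
    "9900K + Vega 64, VCE": 33,
    "9900K, CPU only": 50,
    "i7-9750H laptop + RTX 2060, HEVC QSV": 28,
    "Old Xeon + AMD 570, no QSV": 107,
}

baseline = times["9900K, CPU only"]
for setup, seconds in sorted(times.items(), key=lambda kv: kv[1]):
    # speedup is relative to the 9900K CPU-only time
    print(f"{setup:40s} {seconds:4d} s  ({baseline / seconds:.1f}x vs CPU only)")
```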

I'm hoping someone will post Red Car Test scores for the new AMD 3950X soon... It really flies through Handbrake encoding although I predict that it will only be about 15% faster than a 9900K in Vegas due to Vegas' older code...


Kinvermark wrote on 1/15/2020, 9:14 PM

As the pros & cons have now been openly discussed, I would like to ask another question (as a NEUTRAL):

How is it that these cards seem to be readily available long after they were supposedly discontinued? The old story was that the 16 GB of HBM2 memory was horrifically expensive and that AMD would be losing money on every one they sold. And yet, Newegg Canada has had these on occasional special for $650 CAD (approx. $490 USD) over the last couple of months. I wonder if there is perhaps going to be continued, ongoing availability of these cards as an offshoot of production of some other enterprise-level components.

fr0sty wrote on 1/16/2020, 10:03 AM

Or maybe they just sold so poorly that there is still stock left.


tripleflip18 wrote on 1/16/2020, 11:03 AM

Can anyone comment on 9900K and Radeon 7 real-time timeline playback with iPhone 11 HEVC files, 4K 60p? TY