AMD Radeon VII (7) used with Vegas Pro 16

Former user wrote on 2/7/2019, 9:06 PM

Project & rendering settings are unknown. The author notes that two pieces of software were unusually disappointing, both OpenCL-based.

  • Not all of the performance seen throughout the article was that impressive, with Radeon falling behind in MAGIX’s Vegas, but that could possibly point to software updates being needed. With Adobe’s Premiere Pro performing just fine with the VII, there’s definitely room for some improvement with Vegas.
  • OpenCL Radeon ProRender continues to give NVIDIA the performance lead, defying logic

https://techgage.com/article/amd-radeon-vii-workstation-performance/2/

This is the second Techgage benchmark that shows AMD cards performing badly with Vegas 16:

https://techgage.com/article/amd-radeon-pro-wx-8200-review/2/

 

Comments

Deathspawner wrote on 2/8/2019, 10:54 AM

Thanks for sharing :)

Both of the projects I use include 4x 30-second clips each, one project built from light scenes and the other from dark. The tests are straight 4K/60 to 4K/60 encodes, with no filters applied. I do have plans to introduce filters into the testing at some point, and in fact have the scripts for it already, but with AMD performing so poorly, I feel like it might be best to wait for something to change. AMD performed a lot better in VP15.

Example: https://i.imgur.com/N6jcJBC.png

If it takes too long, I'll just do that testing with both vendors and be done with it... because it is what it is. The other four encodes I have to use in future testing include Dark/AutoLooks, Dark/Median, Light/LUT, and Light/Upscale - all four seem to use the GPU quite well. But it'd be nice for AMD's cards to work as we'd expect them to. The ProRender result even baffles AMD, but the reviewer's guide included performance for that with ONLY the Radeons, with the NVIDIA spots left empty. I feel like that had to have been on purpose.
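
To give a rough idea of how the timing side works (this is a simplified sketch rather than my actual script, and the output path is just a placeholder), a GUI-driven render can be timed externally by watching the output file and calling the encode done once the file stops growing:

```python
# Simplified sketch, not the actual test script: time a GUI-driven render by
# watching the output file and treating it as finished once it stops growing.
# The output path passed in below is a placeholder.
import os
import time

def time_render(output_path, poll_s=1.0, settle_s=10.0):
    """Return the approximate render duration for output_path, in seconds."""
    # Wait for the encoder to create the output file, then start the clock.
    while not os.path.exists(output_path):
        time.sleep(poll_s)
    start = time.time()

    last_size = -1
    last_change = start
    while True:
        size = os.path.getsize(output_path)
        now = time.time()
        if size != last_size:
            last_size, last_change = size, now
        elif now - last_change >= settle_s:
            # No growth for settle_s seconds: assume the encode has finished.
            return last_change - start
        time.sleep(poll_s)

if __name__ == "__main__":
    print(f"Encode took {time_render('D:/renders/dark_median.mp4'):.1f} s")
```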

I am really hoping a driver update will help this, but since the weak performance occurred in the WX 8200 review as well, I am not too hopeful on that coming soon. It could also be a Vegas thing, though.

When the time comes to do more testing, I might have to load up VP15 again just to sanity check things.

Kinvermark wrote on 2/8/2019, 12:09 PM

@Deathspawner

Well, thanks to you for taking an interest...

Questions:

1) Why the 2080 Ti? Isn't the 2080 the price equivalent of the Radeon VII?

2) It isn't clear to me what flavor of AVC/HEVC render you are performing. Are you using NVENC & VCE? If so, please re-do without those ASICs. Renders to professional formats like ProRes, DNxHD, Cineform, and XAVC-I could be very different.

Personally, I would prefer to see comparisons based on timeline performance rather than pure render speed. And applied FX are necessary for any kind of real-world test. It's possible that with a straight render the OpenCL GPU has almost nothing to do.

3) Did you notice whether Vegas is using the 16GB of VRAM on the Radeon VII? I would guess it is not. Finding a workflow that uses all that VRAM may be the only way to show this card's benefits. Neat Video noise reduction is the only plug-in I can think of that might test this.

Thanks again!

 

PS: There's a video embedded in the original post; from 5:00 the speaker talks about video editing and the potential need for large amounts of VRAM.

 

BruceUSA wrote on 2/8/2019, 12:30 PM

This testing doesn't tell us a lot of information. First, what we want to know is what percentage of the GPU is utilized during rendering. Vegas does not support this newly released card yet; that is why it performed poorly. I would say it's too early to conclude that this newly released card is bad for Vegas.


 

Deathspawner wrote on 3/13/2019, 9:08 AM

I meant to respond here sooner, but I got side-tracked updating the testing script instead. I hate to admit I spent an entire day of my life updating it, because I haven't yet been able to make use of it since Vegas performance is all over the place. I emailed MAGIX in December about the issues, but never heard back. I followed up a few minutes ago.

@Kinvermark 

1 - If I had all of the time in the world, I'd test all of the cards. I chose the Ti because it's the highest-end GeForce Turing. The RTX 4000 is included to match the price point, as is the WX 8200. The ultimate logic is: 2080 Ti = highest-end GeForce, WX 8200 = $900, RTX 4000 = $900, P6000 = last-gen top Quadro, TITAN Xp = last-gen top "GeForce" (with workstation optimizations).

2 - VCE for AMD, and NVENC for NVIDIA. 

3 - I did not test VRAM too much, because it's hard to measure that kind of thing accurately; you need to run into real workload bottlenecks to realize VRAM is an issue. Even if a GPU monitoring tool shows 11GB used of 12GB, it doesn't mean that 11GB is actually needed; it means that's how much has been allocated. Some applications (and games) are designed to simply reserve most of the buffer.
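
For what it's worth, the allocation number is easy enough to sample; here's a rough sketch (NVIDIA-only, and my own quick illustration rather than anything from the review workflow) that polls nvidia-smi during an encode. Again, it only shows what has been allocated, not what the workload actually needs:

```python
# Rough sketch, NVIDIA-only: poll nvidia-smi's memory.used during an encode.
# Keep in mind this is ALLOCATED VRAM, not what the workload actually needs.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

def sample_vram(interval_s=1.0, samples=60):
    peak_mib = 0
    total_mib = 0
    for _ in range(samples):
        # First GPU only; output looks like "1234, 16368".
        line = subprocess.check_output(QUERY, text=True).strip().splitlines()[0]
        used, total = line.split(", ")
        peak_mib = max(peak_mib, int(used))
        total_mib = int(total)
        time.sleep(interval_s)
    print(f"Peak allocation observed: {peak_mib} MiB of {total_mib} MiB")

if __name__ == "__main__":
    sample_vram()
```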

@BruceUSA

You are right about me not giving good information. One thing my script added was the ability to do playback tests, since that's what everyone wants. After I thought I had things perfect, the tool (Fraps) I use to monitor frame rates just stopped working in the application. I blame the quirkiness of Vegas more than this tool, since Fraps works in other applications without fail (including Premiere Pro). I haven't been able to find other solutions that work, but that's not even the biggest problem.
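
As a side note before getting to the bigger problem: once a frame-rate logger that actually works is found, turning its log into an average FPS figure is trivial. Here's a rough sketch; the two-column layout (frame index, then a cumulative timestamp in milliseconds) is an assumption, so check the actual format of whatever tool ends up cooperating:

```python
# Rough sketch: reduce a frame-time log to an average FPS figure.
# Assumes a two-column CSV of (frame index, cumulative timestamp in ms) with a
# header row; adjust for the actual layout of the capture tool being used.
import csv
import sys

def average_fps(csv_path):
    timestamps_ms = []
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                       # skip the header row
        for row in reader:
            timestamps_ms.append(float(row[1]))
    elapsed_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    return (len(timestamps_ms) - 1) / elapsed_s

if __name__ == "__main__":
    print(f"Average FPS: {average_fps(sys.argv[1]):.1f}")
```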

It seems like every time I want to return to Vegas testing, I get hit with a bunch of unknowns. Sometimes, AMD performs better, sometimes NVIDIA does. Lately, NVIDIA is completely broken, at least for me, on multiple PCs. I tested TITAN Xp in a completely different rig, and it gives me the same issues. 

Here's an NVIDIA encode that involves LUT: 

https://i.imgur.com/ByteI4a.png

You can see that the CPU is used 100%, and the GPU sits at about 10%. That would be fine if it meant things were working, but that's not the case here, as the encode takes 15 minutes. Meanwhile, here's AMD:

https://i.imgur.com/hGdcljs.png

Things change. CPU usage drops from 100% to 50%, and the GPU sits at the same ~10%. Yet the encode time is cut down to a quarter of what it is on NVIDIA. Median encode times are twice as long on NVIDIA as they are on this Radeon, as well.

This directly ties into playback performance, as far as I can tell. If the encode runs super-slow, it means the same filter in the playback window is going to be crippled just the same. On the AMD rig, the playback is fine at Best; on NVIDIA, it's crippled. 

I admit I feel a little defeated at this point. I've spent an obscene amount of time on these tests, and have nothing to show for it. I remember being told by MAGIX that VP16 would bring some nice performance changes, but I didn't realize those changes would be hassles. 

Former user wrote on 3/14/2019, 5:03 AM

@Deathspawner

There's a MAGIX sample project available for testing/benchmarking as well.

It's located at: C:\ProgramData\VEGAS\VEGAS Pro\16.0\SampleProject

Kinvermark wrote on 3/14/2019, 10:49 AM

@Deathspawner

I feel your pain, so thanks for the effort.

However, I do think you are making this more complicated than it needs to be. Forget the Fraps numbers, and forget NVENC and VCE (for now); just work on a simple observable metric like "is the timeline playing back at full rate, and does it scrub smoothly forwards and backwards?" Test this in a few different scenarios (i.e. footage types: HD, 4K, H.265 from a GoPro or DJI drone, Cineform... versus multiple applied FX) and you will be giving us really good information. Check out the reviews from Puget Systems for examples of reviews that are useful to editors rather than gamers (yours doesn't need to be quite that extensive, though it is nice).

This doesn't have to be hugely time consuming if you eliminate all the peripheral measuring and capturing software, which I doubt is accurate anyway.

 

Deathspawner wrote on 3/14/2019, 11:33 AM

@Kinvermark

If Vegas didn't have broken NVIDIA support, I'd have useful test results to share. The script I made works fine. It's crippled NVIDIA performance that's the problem. Before responding here, I decided to sanity check something... and sure enough, Fraps works fine on Radeon in Vegas. Meanwhile, it doesn't on NVIDIA, and likewise, performance on that side is in the toilet. 

As I mentioned before, the rendering performance seems to correlate directly with the viewport. If a Median FX'd encode runs slow on NVIDIA, it means the playback in the viewport is going to be just as bad. Both processes use the GPU the same way. Logically, you'd think Median would cripple performance before LUT, but not on NVIDIA. Even though Median can use up to 90% of a GPU's performance, the meager 10% usage with LUT manages to bring NVIDIA to its knees. We're talking 1/4th the performance of Radeon.
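
For anyone who wants to capture those usage numbers on their own system, here's a rough sketch that logs the per-process "GPU Engine" performance counters that Task Manager's GPU graphs read (Windows 10 1709+; the counter set's availability is an assumption worth verifying). It's vendor-agnostic, so it works on both Radeon and GeForce:

```python
# Rough sketch: log per-process GPU engine utilization on Windows via typeperf,
# using the "GPU Engine" performance counters (the same source Task Manager's
# GPU graphs use). Counter availability varies by Windows build and driver.
import subprocess

def log_gpu_engine_usage(out_csv, interval_s=1, samples=120):
    subprocess.run([
        "typeperf",
        r"\GPU Engine(*)\Utilization Percentage",  # all engines, all processes
        "-si", str(interval_s),   # sample interval, in seconds
        "-sc", str(samples),      # number of samples to collect
        "-f", "CSV",
        "-o", out_csv,
        "-y",                     # overwrite the output file without prompting
    ], check=True)

if __name__ == "__main__":
    log_gpu_engine_usage("gpu_engine_usage.csv")
```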

All of this said, it feels so strange to me that performance appears to be completely broken on NVIDIA, but I seem to be the only one complaining about it. I've tested three different NVIDIA GPUs in three different PCs, and all of them exhibit the exact same issues. But not on Radeon. And it wasn't always like this.

Once MAGIX fixes this issue, I'll get to benchmarking. I will likely take your suggestion and integrate a different source file for some of the tests, like a 4K or 8K RED. Currently I am using 4K/60 video straight out of a OnePlus (~120Mbps), and it's fine for scaling (when Vegas agrees to work right), but I could always enhance further.

 

Kinvermark wrote on 3/14/2019, 12:09 PM

@Deathspawner

Sorry, but there is so much wrong with your post I don't know where to start.

1) Encoding performance with NVENC and VCE is only indirectly related to timeline playback performance of the many PROFESSIONAL formats that those two ASICs cannot even decode, e.g. Cineform, ProRes, RED, Avid MXF...

2) Nvidia broken? I don't think so. Maybe some minor element, but certainly not generically. Your metrics are not even consistent - look at the GPU usage stats on your PNG screen captures.

I think you have just made up your mind that you don't want to benchmark Vegas (that's fine) and are using any minor roadblock to justify this.

Anyway, I give up. I will patiently wait for Puget Systems to report on Radeon VII and extrapolate the results.

Kinvermark wrote on 3/14/2019, 12:12 PM

Actually, the Puget System Radeon VII review is up for those that are interested:

https://www.pugetsystems.com/labs/articles/DaVinci-Resolve-15-AMD-Radeon-VII-16GB-Performance-1382/
 

Eagle Six wrote on 3/14/2019, 12:25 PM

@Kinvermark Interesting, as I use Resolve 15.3 as my primary and cost is always a factor. Thank you for the link and the testing they do.

System Specs......
Corsair Obsidian Series 450D ATX Mid Tower
Asus X99-A II LGA 2011-v3, Intel X99 SATA 6 Gb/s USB 3.1/3.0 ATX Intel Motherboard
Intel Core i7-6800K 15M Broadwell-E, 6 core 3.4 GHz LGA 2011-v3 (overclocked 20%)
64GB Corsair Vengeance LPX DDR4 3200
Corsair Hydro Series H110i GTX 280mm Extreme Performance Liquid CPU Cooler
MSI Radeon R9 390 DirectX 12 8GB Video Card
Corsair RMx Series RM750x 750W 80 Plus Gold power supply
Samsung 970 EVO NVMe M.2 boot drive
Corsair Neutron XT 2.5 480GB SATA III SSD - video work drive
Western Digital 1TB 7200 RPM SATA - video work drive
Western Digital Black 6TB 7200 RPM SATA 6Gb/s 128MB Cache 3.5 data drive

Bluray Disc burner drive
2x 1080p monitors
Microsoft Window 10 Pro
DaVinci Resolve Studio 16 pb2
SVP13, MVP15, MVP16, SMSP13, MVMS15, MVMSP15, MVMSP16

Kinvermark wrote on 3/14/2019, 12:32 PM

Just read it. Short answer: the Radeon VII is better than the far more expensive 2080 Ti for DR!

I guess it will be the same for Vegas Pro (assuming the card is supported).

Note that DR does not have AMD VCE decode (or encode) support for H.264/265 files and STILL performs better than NVIDIA.

Deathspawner wrote on 3/15/2019, 1:32 PM

@Kinvermark

"I think you have just made up your mind that you don't want to benchmark Vegas (that's fine) and are using any minor roadblock to justify this."

I remember saying that I was going to tackle things once the hardware issue I was bumping against was fixed. I didn't create a benchmarking script or spend all of that time on things just to abandon it all before I publish a single set of relevant results. You are one of the reasons I introduced a LUT and playback test, so I am glad all I've managed to do so far is annoy you.

Regardless of anything else: LUT and Median FX exhibit crippled performance for me on NVIDIA hardware (on three different PCs and three different GPUs), for both rendering and playback. When I say crippled, I'm talking single digit frame rates in playback, when it's a full 60 FPS on Radeon hardware. 

I shouldn't have said that NVIDIA support was "broken". I'll reach out to my contacts there and see if they can do internal tests. If not a software issue, it could be a driver issue. 

Kinvermark wrote on 3/15/2019, 4:15 PM

Not really annoyed, but you have to expect a bit of "push back" when you make generalizations about something being broken. Plus, I am doubtful the issue you want fixed will actually be fixed any time soon, so the end result is... no review any time soon.

I appreciate your last sentence. FYI, there are reports on other forums (not Vegas) claiming the last two NVIDIA DRIVER updates have yielded 20% performance drops.

 

ryclark wrote on 3/16/2019, 8:39 AM

Which are the last two NVIDIA driver versions that don't work properly? And which version is recommended for GTX-series cards?