GPU specifications vs Vegas Performance

Comments

NormanPCN wrote on 11/8/2013, 11:28 AM
I tried HWiNFO and it was showing some GPU info where GPU-Z does not. It was sporadic. This was on my AMD setup. HWiNFO shows D3D use, which must be ignored since Vegas is using D3D to draw to the display.

I cannot say for sure what either monitor program tests and shows for GPU load. They are basically getting the info from the video driver. I thought I once read that GPU-Z's "GPU load" is a monitor of shader use. Those are the compute units.
Hulk wrote on 11/8/2013, 2:56 PM
OldSmoke,

Sorry to belabor the point, but what GPU load are you seeing with the 60p stream matched to the output, and are you using GPU-Z to monitor GPU activity?

- Mark
OldSmoke wrote on 11/8/2013, 6:23 PM
@Hulk

I think I solved the mystery. My PC is usually in standby/sleep when I am not on it. That way it starts in a second and I can work whenever I have time for it. Yesterday, when I switched off GPU under preferences, my second graphics card was still showing activity up to 25% regardless of which render codec I used, and even when rendering 60p. Today I had to shut down and restart the computer to get iTunes updated. I then repeated what I did yesterday, with GPU still off under preferences, and this time it is really off: there is no activity on either GPU. Simply turning it off without a restart doesn't completely switch it off, maybe because there are two identical cards in the system?

If I render the AVCHD 1080-60p clip (no FX or anything else) with GPU ON to, let's say, XAVC at 1080-60p, I get around 45% GPU load. GPU-Z and HWiNFO64 both show the same results.

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System spec:
Motherboard: ASUS X299 Prime-A
RAM: G.Skill 4x8GB DDR4 2666 XMP
CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970 Pro NVMe, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" hotswap bays, 1x 3.5" hotswap bay, 1x LG Blu-ray burner
PSU: Corsair 1200W
Monitor: 2x Dell UltraSharp U2713HM (2560x1440)

D.Anne wrote on 11/9/2013, 2:15 AM
Several items:

1. I googled for "Nvidia driver 296.10", found it, and downloaded it three days ago (from the Nvidia site):

http://www.nvidia.com/object/win7-winvista-64bit-296.10-whql-driver.html

2. Data for my system is:

Windows 7 Pro, 64 bit.
AMD 8320, 3.5GHz (8 cores) 8GB memory
Intel 128GB SSD, WD 2 TB,

Vegas Pro 12, Build 726
Options->Preferences->Video->GPU 'ON'
Nvidia GTX-470, Driver 296.10
Preview performance (at 25 seconds on timeline):
Best Full: ~22 fps
Best Half: solid 29.97
Render XDCAM:
Time: 83 sec (23.01 fps)
CPU load: 40-65%, est. 55% avg
GPU load: 50-80%, est. 73% avg
Render MC AVC/AAC ("CUDA, if Available"):
Time: 73 sec (26.16 fps)
CPU load: 50-80%, est. 72%
GPU load: 55-80%, est. 74%
Render MC AVC/AAC ("CPU only"):
Time: 193 sec (9.89 fps)
CPU load: 90-99%, est. 97%
GPU load: 10-20%, est. 14%
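As a sanity check, the three reported render times and frame rates are mutually consistent with a single test clip of roughly 1,910 frames (about 64 seconds at 29.97 fps). That clip length is inferred from the numbers, not stated in the thread:

```python
# Infer the test clip's frame count from the reported render times and fps.
# The ~1910-frame figure is an inference from these numbers, not from the thread.
renders = {
    "XDCAM":             (83, 23.01),   # (render seconds, reported fps)
    "MC AVC/AAC (CUDA)": (73, 26.16),
    "MC AVC/AAC (CPU)":  (193, 9.89),
}

for name, (seconds, reported_fps) in renders.items():
    frames = seconds * reported_fps           # frames processed in the run
    print(f"{name}: ~{frames:.0f} frames")    # each comes out near 1910
```

All three runs land within a couple of frames of each other, which suggests the fps figures are simply frame count divided by wall-clock render time.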

Hulk wrote on 11/9/2013, 12:21 PM
D.Anne,

Thanks for running the test. If you get a minute, could you run the preview and XDCAM EX render with GPU turned off in preferences? I'm wondering how your CPU handles them.

- Mark
D.Anne wrote on 11/9/2013, 12:50 PM
Here's my data with GPU OFF in the Preferences->Video menu.

Preview (Best Full) 1.8 fps
Solid 29.97 at Preview Qtr

XDCAM Render 243 sec, CPU 80%, GPU <5%
AVC/AAC Render 251 sec, CPU 98%, GPU <5%

For previewing, I selected a 30-second region starting at 25 sec and played it multiple times, noting how low the fps reading dropped. This seems more consistent because the image data is cached in memory rather than read from disk.
Hulk wrote on 11/9/2013, 3:24 PM
D.Anne,

I hate to ask but could you do two more runs?
Could you run the MainConcept render with GPU off in preferences and CUDA on in the MainConcept dialog box, and report CPU/GPU loading as well as render time?

And also MainConcept with GPU acceleration on in preferences and OpenCL on in the MainConcept render-as dialog?

I'm trying to isolate these systems.

Also, please restart after turning off GPU in preferences, to make sure it is really off.


Thanks,

Mark

D.Anne wrote on 11/9/2013, 10:27 PM
Forgot to include my preview ram setting. It is currently set to 500MB. I have 8GB installed.
NormanPCN wrote on 11/10/2013, 12:45 PM
One question I have about this whole thing: are ALL render-as codecs GPU accelerated?

Sony clearly documents certain things as GPU accelerated: the video engine and the AVC encoders. Each of these three has its own separate option to enable/disable GPU acceleration. The video engine is always used for everything, both playback and render-as.

So if anything else does have GPU acceleration, why is there no option to enable/disable it? Sony seems quite deliberate in their implementation design: always give an option to work around potential problems. Also, at least for MC AVC, the GPU encoders are a completely different design and not up to the same quality level as the classic CPU-only encoder.

Hulk wrote on 11/10/2013, 1:53 PM
I'm no longer sure that MainConcept AVC and Sony AVC are the only GPU-accelerated codecs.

I believe we (the community here) initially thought MainConcept AVC and Sony AVC were the only GPU-accelerated codecs because those two are the only ones with a GPU acceleration option in their customize-template dialogs. But if you look at the choices, there's more than "off" and "on": MainConcept offers CUDA and OpenCL, and Sony offers a variety of options. It is *possible* that many of the render-as codecs are GPU accelerated, but only OpenCL accelerated, and that when GPU is on in preferences, so is GPU acceleration for these other codecs. Perhaps there was no need for Sony to include "on" and "off" for these other codecs because there is only one choice (OpenCL) and it can be controlled from the "GPU acceleration" preference.

Based on some of the scores I've been analyzing this could possibly be the case, but I'm not sure. We need more data to figure this and other things out.

I guess one way to check this would be to put a clip on the timeline with no FX of any sort, then render to a codec whose template matches the project's interlacing and frame rate. Render with and without GPU acceleration enabled in preferences. If there is a large speedup in rendering and much higher GPU loading during the render, then the codec being tested is probably accelerated.
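The proposed check can be sketched as a simple rule. The thresholds below are my own illustrative choices, not something established in the thread:

```python
def looks_gpu_accelerated(time_gpu_off_s, time_gpu_on_s, gpu_load_on_pct,
                          min_speedup=1.5, min_gpu_load=30.0):
    """Heuristic for the test described above: a codec looks GPU accelerated
    if enabling GPU in preferences gives a large speedup AND the GPU is
    clearly busy during the render. Thresholds are illustrative assumptions."""
    speedup = time_gpu_off_s / time_gpu_on_s
    return speedup >= min_speedup and gpu_load_on_pct >= min_gpu_load

# D.Anne's numbers from earlier in the thread:
print(looks_gpu_accelerated(243, 83, 73))   # XDCAM: ~2.9x speedup -> True
print(looks_gpu_accelerated(251, 73, 74))   # MC AVC (CUDA): ~3.4x -> True
```

On D.Anne's data both XDCAM and MainConcept AVC clear the bar, which is consistent with both being accelerated in some way.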
fp615 wrote on 11/11/2013, 6:19 AM
When we do a render, Vegas actually:

1 - rasterizes a frame for each video track
2 - applies FX, resizes, masks, alpha channels, etc. (in an order I don't know)
3 - compresses the frame with the codec

It may be that steps 1 and 2 use the GPU. Sony clearly states that the GPU is used for step 2.
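Those three steps amount to a per-frame loop. A minimal sketch, where every name is a stand-in rather than an actual Vegas internal:

```python
class Track:
    """Stand-in for a video track; rasterize() would return a bitmap."""
    def __init__(self, name):
        self.name = name
    def rasterize(self, n):
        return f"{self.name}:frame{n}"

def composite(layers, fx_chain):
    """Stand-in for step 2: blend the track layers, then run each FX pass."""
    frame = "+".join(layers)
    for fx in fx_chain:
        frame = f"{fx}({frame})"
    return frame

class Encoder:
    """Stand-in for step 3: the codec compressing finished frames."""
    def __init__(self):
        self.out = []
    def encode(self, frame):
        self.out.append(frame)

def render(tracks, fx_chain, encoder, frame_count):
    for n in range(frame_count):
        layers = [t.rasterize(n) for t in tracks]   # step 1: per-track raster
        frame = composite(layers, fx_chain)          # step 2: FX/compositing
        encoder.encode(frame)                        # step 3: compression
```

For example, `render([Track("V1"), Track("V2")], ["resize"], enc, 2)` produces two composited, "resized" frames in `enc.out`. The point of the sketch is that steps 1 and 2 happen per frame before the codec ever sees the data, so GPU acceleration there would speed up every render regardless of codec.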
VideoFreq wrote on 4/1/2014, 9:38 AM
And I'm sure you all know by now that the GTX 660 and ANY Kepler cards are the WORST cards to use with Vegas Pro 11 or 12. Stick with Fermi architecture and the older drivers that were current when the card was new and being benchmarked; every driver update after that is for gamers.
TVJohn wrote on 4/1/2014, 9:35 PM
Some useful GPU info here:
http://www.studio1productions.com/Articles/PremiereCS5.htm
Granted, the article was written for CS5/6; however, I suspect it is generally applicable.