GPU vs CPU Rendering Internals

VideoFreq wrote on 12/15/2015, 10:33 AM
Does anyone know how the internals of GPU and CPU rendering work within Vegas beyond the project settings?
I have a 37-minute XDCAM MXF 24p project (plus a little 60 Mbps 4K video) set to 32-bit float with a 2.222 compositing gamma. I have told Vegas to render directly to Blu-ray at the 16 Mbps setting and to use the R9 390 GPU only. The issue is that the i7-4770K runs at 90-100C and causes a BSOD. It is not overclocked, it has a copper-bodied Zalman CPU cooler and adequate Zalman case fans, and the system has 32 GB of memory.
Why is the i7 CPU working that hard at all?

MC & HH
Jim

Comments

NormanPCN wrote on 12/15/2015, 10:55 AM
GPU use is mostly driven by effects, scaling and compositing. With no effects there is little or no GPU use. All of this happens during playback and during file encoding, aka "render as"; it is the process of generating the video stream.

During "render as" the resulting video stream is passed to the file encoder. File encoders are mostly CPU only. Mainconcept AVC and Sony AVC do have GPU encoder modes. Sony AVC uses little GPU on its own. Mainconcept AVC can use the GPU heavily but it only supports very old GPU models. It has not been updated in years.

If you are hitting those temps something is probably wrong and/or inadequate with your CPU cooling. I also have a 4770k with a very slight overclock and I cannot really get it above 65C.
john_dennis wrote on 12/15/2015, 11:41 AM
With "adequate" air flow through the case and "adequate" cooler capability to transfer heat from the CPU, check the heat transfer compound for uniformity. Not too thick, not too thin and complete coverage.
John_Cline wrote on 12/15/2015, 12:24 PM
There is almost certainly an issue with the thermal interface between the CPU and the cooler as John Dennis pointed out.
Mikee wrote on 12/19/2015, 3:27 PM
Jim, I can't say much about the BSODs, but GPU-Z shows informative details of the GPU load during rendering. It works on my NVIDIA GeForce GTX 660, anyway.
https://www.techpowerup.com/gpuz/
https://tpucdn.com/gpuz/screen2.gif
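
If you want a running log of GPU load and temperature during a render instead of watching the GPU-Z window, here is a minimal sketch that polls the NVIDIA driver once a second (assumes an NVIDIA card with nvidia-smi on the PATH; an AMD card like Jim's R9 390 would need a different tool):

    import subprocess
    import time

    # Ask the NVIDIA driver for GPU utilization (%) and core temperature (C).
    QUERY = [
        "nvidia-smi",
        "--query-gpu=utilization.gpu,temperature.gpu",
        "--format=csv,noheader,nounits",
    ]

    while True:
        line = subprocess.check_output(QUERY).decode().strip()
        util, temp = [field.strip() for field in line.split(",")]
        print("GPU load: {:>3}%   GPU temp: {} C".format(util, temp))
        time.sleep(1)  # sample once per second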
astar wrote on 12/20/2015, 1:58 AM
The CPU overheating sounds like the cooler has lost contact with the chip surface, or too much thermal paste was applied. Clearly you have system hardware stability issues to overcome before worrying about GPU assist.

GPU use in Vegas is multi-tasked by Windows, in that a single GPU handles general display painting, resolution scaling, and media-related "direct" functions. On top of that, OpenCL support allows the compute units on the GPU to be used for floating point calculations; these compute units are added to the virtual OpenCL units on your CPU. The correct matching of CPU speed, PCIe bus, memory bandwidth, and memory quantity is important for GPU assist to work correctly.
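
As a rough way to see the pool of compute devices described above, here is a minimal sketch (assuming the pyopencl package and working OpenCL drivers are installed) that lists every OpenCL platform and device along with its compute units:

    import pyopencl as cl

    # Enumerate every OpenCL platform (the AMD, Intel and NVIDIA drivers each
    # expose one) and the CPU/GPU devices it offers, with compute-unit counts.
    for platform in cl.get_platforms():
        print("Platform:", platform.name)
        for device in platform.get_devices():
            kind = cl.device_type.to_string(device.type)
            print("  {} ({}): {} compute units, {} MB global memory".format(
                device.name.strip(),
                kind,
                device.max_compute_units,
                device.global_mem_size // (1024 * 1024),
            ))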

You could run multiple GPUs for Vegas and dedicate one GPU to display and the other to compute. Vegas will not use both GPUs in the way most people think, but it effectively uses both by the nature of one handling display and the other compute. Most systems people buy are not really able to supply full bandwidth to two GPUs, so I would not recommend it unless you really know what you are doing. Most people struggle with hardware bought for the "right price" rather than the best tool for the job, and have a hard time getting GPU assist to work correctly.

In 32-bit floating point mode, the CPU+GPU calculate picture information to 32-bit FP accuracy. Reducing the 32-bit mode to video levels reduces the amount of picture information that needs to be calculated, and dropping to 8-bit accuracy reduces system load even more, possibly to the point of not really seeing much of a GPU hit. If the GPU is much faster than the CPU, it may return results faster than GPU-Z and other monitors can sample the utilization. The OpenCL dev kit from AMD has tools that show the actual processes being run along with timing and utilization information. Remember that GPU-Z and the AMD system monitor only show utilization averaged over a period, not instantaneous utilization.
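
To put rough numbers on why dropping from 32-bit float to 8-bit lightens the load, here is a back-of-the-envelope sketch for a single 1920x1080 RGBA frame (illustration only; Vegas' internal buffer layout may differ):

    # Per-frame buffer size at 8-bit vs 32-bit float, 1920x1080 RGBA.
    width, height, channels = 1920, 1080, 4

    bytes_8bit = width * height * channels * 1    # 1 byte per channel
    bytes_32bit = width * height * channels * 4   # 4 bytes per float channel

    print("8-bit frame:        {:.1f} MiB".format(bytes_8bit / 2**20))   # ~7.9 MiB
    print("32-bit float frame: {:.1f} MiB".format(bytes_32bit / 2**20))  # ~31.6 MiB
    # Four times the data to move and compute for every frame, for every effect.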

Certain codecs are more optimized than others. HDCAM SR MXF, XDCAM MXF, Cineform and XAVC-I will show marked GPU utilization during playback in 32-bit mode. I am a firm believer in conforming your source media to a DI format and editing like-for-like codecs; I find this eliminates issues down the road.

The Min/Max plugin is an effect that hits the GPU really hard, and you should see the GPU being used during playback.

For an i7-47xx system, you should have 32 GB of stable system RAM and an X/XT-series AMD GPU such as a 7970, 290X, 390X, or Fury X, operating at a confirmed PCIe 16x link speed.

For Blu-ray discs, you may not need more than 8-bit mode for your project. The final video is an 8-bit format being displayed on an 8-bit display, so you could be calculating information to a degree of accuracy you will never see in the result. Specific situations MAY call for rendering color changes in 32-bit mode, but even then you would not need full range, only video levels mode.
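
A tiny sketch of the final quantization step (illustration only; Vegas' exact rounding may differ) shows why float precision beyond 8 bits never reaches an 8-bit Blu-ray stream:

    # Map a 0.0-1.0 float pixel value onto an 8-bit level, as happens when the
    # rendered frame is written to an 8-bit AVC stream for Blu-ray.
    def to_8bit(value):
        return int(round(value * 255))

    for value in (0.5000, 0.5005, 0.5010):
        print("{:.4f} -> level {}".format(value, to_8bit(value)))
    # All three distinct float values land on level 128, so the extra precision
    # is invisible on the delivered disc.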