Ideal balance between CPU and GPU in rendering?

Len Kaufman wrote on 10/24/2018, 9:11 AM

Hi all,

Right now, when I'm rendering a video, the balance of utilization for GPU and CPU is about 50/50. Of course it varies a bit as it chugs along. In an ideal world, what would be the optimum numbers for GPU and CPU?

Also, memory runs around 56%. 9 of 16 gigs.
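A quick arithmetic check of the reported figure (9 of 16 GB should indeed read as roughly 56%):

```python
# Sanity check of the memory figure from the post:
# 9 GB in use out of 16 GB installed.
used_gb = 9
total_gb = 16
percent = round(100 * used_gb / total_gb)
print(f"{percent}% of memory in use")  # prints "56% of memory in use"
```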

Not sure what I might do to change it, but it would be interesting to me to know.

Thanks.

Comments

Red Prince wrote on 10/24/2018, 9:54 AM

There is no ideal. It all depends on you. Personally, I always turn GPU rendering off because it is made to be fast at the cost of precision and quality (remember, it is designed to allow computer games to display quickly, not for high precision rendering). But others prefer to get it done quickly. So it is entirely up to your needs and wants with no common ideal.

He who knows does not speak; he who speaks does not know.
                    — Lao Tze in Tao Te Ching

Can you imagine the silence if everyone only said what he knows?
                    — Karel Čapek (The guy who gave us the word “robot” in R.U.R.)

Musicvid wrote on 10/24/2018, 9:55 AM

At least the GPU seems to be a benefit on your system. Setups with high-powered CPUs often report minimal actual GPU boost or utilization.

Don't change memory settings. They are already happy.

Personally, I always turn GPU rendering off because it is made to be fast at the cost of precision and quality

There is always that trade-off, and I would render without the GPU if I had one here to turn off.

The size/quality/speed game goes round and round in every aspect of rendering, like kids playing rock/paper/scissors. Good point, Red Prince.

j-v wrote on 10/24/2018, 10:13 AM

In an ideal world, what would be the optimum numbers for GPU and CPU?

Not so difficult, I think.
100/100, and both with the same amount of quality. 😄
But we don't live in an ideal world and, I think, we never will.

 

Kind regards
Marten

Camera : Pan X900, GoPro Hero7 Hero Black, DJI Osmo Pocket, Samsung Galaxy A8
Desktop: MB Gigabyte Z390M, W11 home version 24H2, i7 9700 4.7GHz, 16 GB DDR4 RAM, GeForce GTX 1660 Ti with Studio driver 566.14 and Intel HD Graphics 630 with driver 31.0.101.2130
Laptop  :Asus ROG Str G712L, W11 home version 23H2, CPU i7-10875H, 16 GB RAM, NVIDIA GeForce RTX 2070 with Studiodriver 576.02 and Intel UHD Graphics 630 with driver 31.0.101.2130
Vegas software: VP 10 to 22 and VMS(pl) 10,12 to 17.
TV      :LG 4K 55EG960V

My slogan is: BE OR BECOME A STEM CELL DONOR!!! (because it saved my life in 2016)

 

Musicvid wrote on 10/24/2018, 10:16 AM

That made me chuckle, J-V!

j-v wrote on 10/24/2018, 10:40 AM

That made me chuckle, J-V!

That question gave me a big smile ( as you could see).
But the answer I gave, I would like to have in MY ideal world.


Musicvid wrote on 10/24/2018, 11:18 AM

128 bit OS will save the world in a few years -- for a little while.

Len Kaufman wrote on 10/24/2018, 12:13 PM

Perhaps I should have tempered my question with reality and asked, "What is a realistic expectation?" The render time is already measured in hours, largely because of the large number of effects, plug-ins, etc., so turning off the GPU is not an option.

Musicvid wrote on 10/24/2018, 12:33 PM

Load balancing may occur in many systems around 80% CPU.

It's apparent yours is benefitting from 50-50, so no, you wouldn't want to turn it off if time wins the rock/paper/scissors game for you.

Again, the game is time, quality, speed. Pick only two.

TheDingo wrote on 10/25/2018, 2:58 PM

I always turn GPU rendering off because it is made to be fast at the cost of precision and quality (remember, it is designed to allow computer games to display quickly, not for high precision rendering)

Be careful about generalizing about specific NLE functions if you haven't done an actual comparison of the results yourself. I recently compared renders from the VP16 software MP4 codecs against the AMD, Intel QSV, and NVIDIA hardware-accelerated codecs, and found the final renders to be almost identical. With 4K footage the hardware-accelerated codecs cut the rendering time in half, and with 1080p HD footage the rendering time was cut to a quarter. So from this point on I'm going to use the hardware-accelerated codecs whenever I need to render MP4 videos, as the visual results are essentially identical between the software and hardware-accelerated codecs.
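An eyeball comparison like this can be made objective with a metric such as PSNR between decoded frames of the two renders. Below is a minimal Python sketch of the PSNR formula, using toy pixel lists standing in for real decoded frames; an actual comparison would decode the two renders with an external tool first:

```python
import math

def psnr(frame_a, frame_b, max_val=255):
    """Peak signal-to-noise ratio between two equal-length
    sequences of 8-bit pixel values. Higher means closer;
    identical frames give infinity."""
    if len(frame_a) != len(frame_b):
        raise ValueError("frames must be the same size")
    mse = sum((a - b) ** 2 for a, b in zip(frame_a, frame_b)) / len(frame_a)
    if mse == 0:
        return math.inf
    return 10 * math.log10(max_val ** 2 / mse)

# Toy frames standing in for a software render and a hardware
# (NVENC/QSV) render of the same source, off by at most 1 per pixel.
software = [100, 120, 140, 160]
hardware = [101, 119, 141, 160]
print(round(psnr(software, hardware), 1))
```

Values above roughly 45 dB are generally considered visually indistinguishable, which would match the "almost identical" observation.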

Musicvid wrote on 10/25/2018, 5:07 PM

So in YOUR size/quality/speed game you are willing to sacrifice some quality for the sake of speed because it doesn't look that bad with your material. Thanks for sharing.

astar wrote on 10/26/2018, 9:35 PM

So why would you put an expensive GPU in your system and then keep Vegas from utilizing it?

Consider a good GPU from NV and Vegas 16:

  • The GPU handles screen display extremely fast for smooth video playback.
  • The GPU has OpenCL ability to speed up complex math, far exceeding the CPU's ability, even something like a Threadripper.
  • The GPU has dedicated hardware that can decode and encode H.265 and H.264 much faster than the CPU, offloading that task from the CPU.

The days of general-purpose GPU h.264 encoding are over, and it was that type of rendering that impaired quality. You do have to work within the limits of tech like Quick Sync, and yes, those limits will constrain image quality.

  • For editing, use the GPU, because not using it is like doing math on paper vs. using a computer.
  • When rendering for quality, use high-quality professional codecs that do not use tech like Quick Sync or NVENC.

 

GPU utilization should generally remain low: low utilization means the GPU is delivering results faster than the CPU needs them, and the workload of screen display is not interrupted. I'm not sure why people think the GPU should show load like an engine going uphill. You want low utilization if your system is outclassing your workflow.
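The point about low utilization can be sketched as a simple producer/consumer model: if the CPU hands the GPU a frame at some fixed interval and the GPU finishes each frame faster than that, the GPU idles in between, yet the pipeline still runs at full speed. The timings below are purely hypothetical:

```python
def gpu_utilization(cpu_feed_ms, gpu_work_ms):
    """Fraction of time the GPU is busy when the CPU hands it a
    frame every cpu_feed_ms and each frame takes gpu_work_ms on
    the GPU. Capped at 1.0, where the GPU becomes the bottleneck."""
    return min(1.0, gpu_work_ms / cpu_feed_ms)

# CPU prepares a frame every 20 ms; GPU renders it in 5 ms:
# the GPU is busy only 25% of the time, but nothing is waiting on it.
print(gpu_utilization(20, 5))   # 0.25
# If each frame took 40 ms on the GPU, it would peg at 100%.
print(gpu_utilization(20, 40))  # 1.0
```

In this toy model, a low reading is the healthy case: raising GPU utilization to 100% would mean the GPU, not the CPU, had become the limiting stage.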

 

Kinvermark wrote on 10/26/2018, 9:42 PM

Good post!

I keep trying to point out the futility of this 100% GPU / 100% CPU load thinking... to no avail. It's like judging the speed of your car by reading the tachometer instead of the speedometer.