Comments

Former user wrote on 7/25/2018, 12:22 AM

The GPU is used for 32-bit processing; I was wondering about that a while ago. I tested a GPU-enabled filter with timeline playback, with 32-bit floating point (video levels) turned on in project properties. GPU usage was about 10 percentage points higher than with 8-bit (roughly 70% average versus 60% average).

However, when rendering, the GPU was used less in 32-bit than in 8-bit, yet the whole render was about a third slower, so I figure the lower GPU usage was due to longer wait states between a bogged-down CPU and the GPU.

As for the best card: maybe an AMD RX 580, since Vegas filters use OpenCL, which AMD excels at and Nvidia does not. Or a pro-level AMD card if you are a pro.

EDIT: Actually, the GPU probably has more work in 32-bit simply because there is more data to push through, even though it isn't actually generating any extra output??
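
A rough back-of-the-envelope sketch of that idea (assuming an uncompressed 4-channel RGBA frame at 1080p; the exact internal pixel format Vegas uses is an assumption on my part): 32-bit float frames carry roughly 4x the data of 8-bit ones, which alone would mean more bus traffic and longer waits without any extra output being produced.

```python
# Rough per-frame size estimate: 8-bit vs 32-bit float pixels.
# Assumes a 1920x1080 frame with 4 channels (RGBA); illustrative only,
# actual Vegas internals may differ.
width, height, channels = 1920, 1080, 4

bytes_8bit = width * height * channels * 1    # 1 byte per channel
bytes_32bit = width * height * channels * 4   # 4 bytes per float channel

print(f"8-bit frame:  {bytes_8bit / 1e6:.1f} MB")   # ~8.3 MB
print(f"32-bit frame: {bytes_32bit / 1e6:.1f} MB")  # ~33.2 MB
print(f"Ratio: {bytes_32bit / bytes_8bit:.0f}x")    # 4x the data to move per frame
```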