I'm sure there are specs detailing theoretical performance for those graphics cards. The fact that one costs a lot more is not necessarily an indicator of video rendering performance, which is completely different from gaming or 3D benchmark performance.
In real life, hardware-assisted rendering is a crapshoot, qualitatively different from CPU-only rendering, and your mileage will vary.
“The only advantage I see for a Quadro GPU vs a GTX is the support for 10bit color, other then that its a waist of money.” This is also touched on in the test conclusions at the URLs above.
So I can now conclude that, for my purposes, a GeForce card will be most suitable. I suppose the Quadro cards would be more relevant for professionals, but I just need something for editing promotional music videos.
It seems like an Intel Core i7-8700, 32 GB of RAM, and a GeForce 1070 or 1080 would be a prudent choice (for me, anyway).
Remember that not all GeForce models support NVENC. Choose carefully, and don't put any faith in gaming/3D benchmarks; they are unrelated to video rendering.
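One quick, practical way to see whether NVENC is usable on a given machine is to ask an installed ffmpeg build which encoders it exposes. This is only a sketch: it tells you whether the local ffmpeg was compiled with NVENC support and lists the encoder names, not whether your specific GeForce model has the NVENC hardware block (for that, check NVIDIA's encode/decode support matrix).

```python
import shutil
import subprocess

def nvenc_encoders():
    """Return the NVENC encoder names reported by the local ffmpeg build.

    Returns an empty list when ffmpeg is not installed, so callers can
    treat "no ffmpeg" and "ffmpeg without NVENC" the same way.
    """
    if shutil.which("ffmpeg") is None:
        return []  # ffmpeg not on PATH; nothing to report
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True,
    ).stdout
    # Encoder lines look like: " V....D h264_nvenc  NVIDIA NVENC H.264 encoder"
    return [
        line.split()[1]
        for line in out.splitlines()
        if "nvenc" in line and len(line.split()) > 1
    ]

if __name__ == "__main__":
    found = nvenc_encoders()
    print("NVENC encoders:", found or "none found")
```

If the list comes back empty on a machine with an NVENC-capable card, the ffmpeg build itself is the limitation, not the GPU.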
GeForce cards DO support 10-bit. I am using a 970 hooked up to my LG OLED HDR set right now, with Windows 10 set to HDR mode. It's buggy (the screen flashes black once in a while, and the TV's HDR logo pops up again when it does), but it does work. I've tested it with 10-bit test patterns, though not with gradients.
Former user
wrote on 5/23/2018, 5:06 AM
NVIDIA quote: “NVIDIA Geforce graphics cards have offered 10-bit per color out to a full screen Direct X surface since the Geforce 200 series GPUs. Due to the way most applications use traditional Windows API functions to create the application UI and viewport display, this method is not used for professional applications such as Adobe Premiere Pro and Adobe Photoshop. These programs use OpenGL 10-bit per color buffers which require an NVIDIA Quadro GPU with DisplayPort connector. A small number of monitors support 10-bit per color with Quadro graphics cards over DVI.”
"For Pascal, NVIDIA is opening things up a bit more, but they are still going to keep the most important aspect of that feature differentiation in place. 10bit color is being enabled for fullscreen exclusive OpenGL applications – so your typical OpenGL game would be able to tap into deeper colors and HDR – however 10bit OpenGL windowed support is still limited to the Quadro cards. So professional users will still need Quadro cards for 10bit support in their common applications."
So, with a 10-series or newer card you get 10-bit in full-screen preview, even through OpenGL. In a window, you do not.