Are Quadro (& FirePro) GPUs better for 10-bit than GeForce (& Radeon)?

NickHope wrote on 11/26/2016, 7:55 AM

Are NVIDIA Quadro (& maybe AMD FirePro) GPUs better for 10-bit video in Vegas than NVIDIA GeForce GTX (& AMD Radeon)?

If so, exactly how are they better? If there's something clear and significant, I'll add a note about it.

Thanks!

Comments

NormanPCN wrote on 11/26/2016, 10:08 PM

Workstation cards are the only ones that support 10-bit output to a 10-bit capable display.

As for general computation, such as effects and compositing, there should be no difference. The workstation cards use the same GPUs as the consumer cards; the workstation parts are generally just configured at the factory to enable certain extra features, like 10-bit display output, ECC RAM, and possibly others.

10-bit video data can be processed in 16-bit integer, 16-bit half float (only just) or 32-bit float. Of those, Vegas only supports 32-bit float.
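
A quick way to see Norman's "only just" point: half float carries an 11-bit significand (10 stored bits plus the implicit leading bit), so the 1024 possible 10-bit code values are all exactly representable, with essentially no headroom left. A minimal numpy sketch of my own (nothing to do with how Vegas works internally):

```python
import numpy as np

# All 1024 possible 10-bit code values.
codes = np.arange(1024, dtype=np.uint16)

# Round-tripping through half float is lossless, because float16 has
# an 11-bit significand (10 stored bits + 1 implicit bit).
assert np.array_equal(codes, codes.astype(np.float16).astype(np.uint16))

# Just past the 11-bit limit, exactness is gone:
print(np.float16(2049))  # prints 2048.0 -- the spacing between values is now 2
```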

megabit wrote on 11/27/2016, 12:00 AM

Can't say much about the Quadro in VP 14, as I replaced my Quadro M2000 card with a GTX 1080 before VP 14 was released. But I remember that with VP 13, I was only able to get a true 10-bit preview (on my 10-bit external monitor) using the Decklink 12G card; the same gradient was still banding when output through the Windows Secondary Monitor (i.e. the Quadro), even with Project Settings at 32-bit float...

Piotr

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

NickHope wrote on 11/27/2016, 1:26 AM

Thanks gents.

the same gradient was still banding when output through the Windows Secondary Monitor (i.e. the Quadro), even with Project Settings at 32-bit float...

...which indicates that it will only display 8-bit that way, same as the GTX cards.

megabit wrote on 11/27/2016, 2:00 AM

That's exactly what I mean, Nick - I'll stress again that I cannot be sure about VP 14, as I sold my Quadro card before installing it.

Piotr

Adi-W wrote on 11/27/2016, 8:25 AM

Apparently the AMD Fury X (which I use) and AMD's newer cards support 10-bit. The setting can be enabled in the Catalyst Control Center if your display supports 10-bit.

NormanPCN wrote on 11/27/2016, 10:05 AM

All correct. But just because a video card can drive a 10-bit display does not mean an app, like Vegas, is able to make use of that capability.

Adi-W wrote on 11/27/2016, 7:22 PM

Admittedly I haven't had the time to verify it yet, but what is the simplest/easiest way to do so?

NickHope wrote on 11/27/2016, 9:49 PM

Admittedly I haven't had the time to verify it yet, but what is the simplest/easiest way to do so?

Probably shoot a blue skyscape (or something similar) in 10-bit, set the Vegas Project Settings pixel format to 32-bit float, set up a Windows secondary monitor for Vegas, and look for banding in the sky.

Then render the same footage to an 8-bit lossless format, put it back on the timeline below the original footage, and mute/unmute the upper track for comparison.
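
If you don't have 10-bit footage handy, you can generate a pair of test gradients instead. A throwaway numpy sketch, assuming imageio for the PNG writing (any 16-bit-capable image library would do):

```python
import numpy as np
import imageio.v3 as iio

W, H = 1920, 1080

# Horizontal ramp with 1024 distinct steps, i.e. a 10-bit gradient,
# stored in the top bits of a 16-bit grayscale PNG.
ramp10 = np.linspace(0, 1023, W).astype(np.uint16)
img10 = np.tile(ramp10 << 6, (H, 1))
iio.imwrite("gradient_10bit.png", img10)

# The same ramp quantized to 8 bits: only 256 steps, so any banding
# is four times as coarse.
img8 = np.tile((ramp10 >> 2).astype(np.uint8), (H, 1))
iio.imwrite("gradient_8bit.png", img8)
```

Put both on the timeline and compare as above; if the 10-bit version bands exactly like the 8-bit one, the preview path is 8-bit.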

megabit wrote on 11/27/2016, 10:13 PM

There are plenty of 8-, 10- and 12-bit gradient patterns online to test just that...

Piotr

igniz-krizalid wrote on 11/27/2016, 11:23 PM

Workstation cards are the only ones that support 10-bit output to a 10-bit capable display.

As for general computation, such as effects and compositing, there should be no difference. The workstation cards use the same GPUs as the consumer cards; the workstation parts are generally just configured at the factory to enable certain extra features, like 10-bit display output, ECC RAM, and possibly others.

10-bit video data can be processed in 16-bit integer, 16-bit half float (only just) or 32-bit float. Of those, Vegas only supports 32-bit float.

Exactly the answer I was looking for. I don't even have a 10-bit display, so I should be good with my consumer card for now.

Main PC:

MSI X370 Pro Carbon, R7 1800X, OC Nitro RX 480 4Gb, 2X8GB DDR4 3200 CL 14, 850 EVO 500GB SSD, Dark Rock 3 cooler, Dark Power Pro 11 650W Platinum, Serenade PciE CM8888 Sound Card, MultiSync 1200p IPS 16:10 monitor, Windows 10 Pro 64bit

Second PC:

Z170XP-SLI, i7 6700K, Nitro R9 380 4Gb, 2X8GB DDR4 3200 CL 16, MX200 500 SSD, MasterAir Pro 4 cooler, XFX PRO 650W Core Edition 80+ Bronze, Xonar D1 7.1 Ch Sound Card, NEC MultiSync 1200p IPS 16:10 monitor, Windows 10 pro 64bit

Wolfgang S. wrote on 11/28/2016, 5:41 AM

Thanks gents.

the same gradient was still banding when output through the Windows Secondary Monitor (i.e. the Quadro), even with Project Settings at 32-bit float...

...which indicates that it will only display 8-bit that way, same as the GTX cards.

If one uses a Quadro card and sets the Vegas project properties to 32-bit floating point, one will get a 10-bit preview. I see that here with my Quadro card, and the best way to test it is to use test patterns such as the ones that can be downloaded here:

https://imagescience.com.au/knowledge/10-bit-output-support
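
Before blaming the display path, it's worth confirming that the pattern file itself really carries more than 8 bits. A small check, assuming a 16-bit PNG read via imageio (the filename is a placeholder for whichever pattern you downloaded):

```python
import numpy as np
import imageio.v3 as iio

# Placeholder filename -- substitute the test pattern you downloaded.
img = iio.imread("10bit_test_ramp.png")

# More than 256 distinct levels means the file is genuinely deeper than
# 8-bit, so any banding you see on screen comes from the display path.
print(len(np.unique(img)), "distinct levels")
```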

Desktop: PC AMD 3960X, 24x3,8 Mhz * RTX 3080 Ti (12 GB)* Blackmagic Extreme 4K 12G * QNAP Max8 10 Gb Lan * Resolve Studio 18 * Edius X* Blackmagic Pocket 6K/6K Pro, EVA1, FS7

Laptop: ProArt Studiobook 16 OLED * internal HDR preview * i9 12900H with i-GPU Iris XE * 32 GB Ram) * Geforce RTX 3070 TI 8GB * internal HDR preview on the laptop monitor * Blackmagic Ultrastudio 4K mini

HDR monitor: ProArt Monitor PA32 UCG-K 1600 nits, Atomos Sumo

Others: Edius NX (Canopus NX)-card in an old XP-System. Edius 4.6 and other systems

megabit wrote on 11/28/2016, 6:04 AM

No way, Wolfgang - having an almost A/B comparison with my Decklink, I'm definitely not seeing the same band-free gradient with NVIDIA.

Piotr

PS. True, I don't have a Quadro any more - but I have read in many places that the high-end Pascal GTX cards (1000 series) have the same capabilities as the Quadro in this regard.

Wolfgang S. wrote on 11/28/2016, 8:19 AM

No idea what your GTX card is doing with respect to 10-bit; I do not know your card. But I do know what I see with my Quadro - and that is the band-free 10-bit gradient, similar to the output of my Decklink.

And your GTX and a Quadro are different cards with different drivers, too. I would not mix them up.

megabit wrote on 11/28/2016, 9:34 AM

OK - but that must have changed with VP 14, because when I checked with VP 13 while still running the Quadro M2000, 10-bit didn't work. Cheers

Piotr

Wolfgang S. wrote on 11/28/2016, 10:16 AM

I am not talking about VP 14 alone. I have seen that with VP 13 too.
