Workstation cards are the only ones that support 10-bit output to a 10-bit capable display.
As for general computations, such as effects and compositing, there should be no difference. The workstation cards use the same GPUs as the consumer cards; the workstation GPUs are generally jumpered at the packaging stage to enable certain special features such as 10-bit display output, ECC RAM and possibly others.
10-bit video data computation can be supported via 16-bit integer, 16-bit half float (only just) and 32-bit float computation. Of those, Vegas only supports 32-bit float.
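To illustrate the "only just" part: all 1024 ten-bit code values do survive a round trip through 16-bit half float, but with almost no precision to spare, whereas 32-bit float leaves plenty of headroom for processing. A rough numpy sketch of that (purely illustrative, nothing to do with how Vegas works internally):

```
import numpy as np

# All 1024 possible 10-bit code values, normalized to the 0.0-1.0 range.
codes = np.arange(1024)
norm = codes / 1023.0

def worst_error(dtype):
    # Worst deviation (measured in 10-bit steps) after storing the
    # normalized values in the given floating-point format.
    stored = norm.astype(dtype).astype(np.float64)
    return np.max(np.abs(stored * 1023.0 - codes))

# 16-bit integers hold the raw codes 0..1023 exactly, so they are lossless.
print("half float (float16):", worst_error(np.float16))  # about 0.25 of a step - no headroom
print("full float (float32):", worst_error(np.float32))  # orders of magnitude smaller
```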
Can't say much about the Quadro in VP 14, as I replaced my Quadro M2000 card with a GTX 1080 before VP 14 was released. But I remember that with VP 13, I was only able to get a true 10-bit preview (on my 10-bit external monitor) using the Decklink 12G card; the same gradient was still banding when output through the Windows Secondary Monitor (i.e. the Quadro), even with Project Settings at 32-bit float...
Apparently, the AMD Fury X (which I use) and AMD's newer cards support 10-bit. The setting can be enabled in the Catalyst Control Center if your display supports 10-bit.
All correct. Just because a video card can drive a 10-bit display does not mean that an app like Vegas is able to make use of that capability of the card.
Admittedly I have not had the time to verify it up to now, but what is the simplest/easiest way to do it?
Probably shoot a blue skyscape (or something similar) in 10-bit, set the Vegas Project Settings pixel format to 32-bit float, set up a Windows secondary monitor for Vegas and look for banding in the sky.
Then render the same footage to an 8-bit lossless format, put it back on the timeline below the original footage and mute/unmute the upper track for comparison.
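If no 10-bit footage is at hand, the arithmetic behind that test can be sketched with a synthetic gradient: a narrow, sky-like brightness range quantized to 8-bit breaks into far fewer, much wider steps than the same range at 10-bit. A rough numpy illustration (the brightness range and frame width are just assumptions):

```
import numpy as np

# A smooth horizontal gradient over a narrow brightness range,
# similar to a clear sky - exactly where banding shows up first.
width = 1920
gradient = np.linspace(0.55, 0.70, width)

levels_10bit = np.unique(np.round(gradient * 1023)).size   # about 150 steps
levels_8bit = np.unique(np.round(gradient * 255)).size     # about 40 steps

print("distinct steps across the frame, 10-bit:", levels_10bit)
print("distinct steps across the frame,  8-bit:", levels_8bit)
print("average band width in 8-bit: %.1f px" % (width / levels_8bit))
```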
Workstation cards are the only ones that support 10-bit output to a 10-bit capable display.
As for general computations, such as effects and compositing, there should be no difference. The workstation cards use the same GPUs as the consumer cards; the workstation GPUs are generally jumpered at the packaging stage to enable certain special features such as 10-bit display output, ECC RAM and possibly others.
10-bit video data computation can be supported via 16-bit integer, 16-bit half float (only just) and 32-bit float computation. Of those, Vegas only supports 32-bit float.
Exactly the answer I was looking for. I don't even have a 10-bit display, so I should be good with my consumer card for now.
the same gradient was still banding when output through the Windows Secondary Monitor (i.e. the Quadro), even with Project Settings at 32-bit float...
...which indicates that it will only display 8-bit that way, same as the GTX cards.
If one uses a Quadro card and sets the Vegas project properties to 32-bit floating point, they will get a 10-bit preview. I see that here with my Quadro card, and the best way to test it is to use test patterns such as the ones that can be downloaded here:
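For completeness, such a gradient test pattern can also be generated instead of downloaded. This is a small numpy sketch that writes a 10-bit horizontal ramp into a 16-bit grayscale PGM file; the file name and resolution are arbitrary placeholders:

```
import numpy as np

WIDTH, HEIGHT = 1920, 1080   # placeholder resolution

# Horizontal ramp through all 1024 ten-bit levels, scaled into a
# 16-bit container so ordinary image viewers can open the file.
ramp10 = np.floor(np.linspace(0, 1023, WIDTH)).astype(np.uint16)
frame = np.tile(ramp10 * 64, (HEIGHT, 1))

# Binary PGM (P5) with a 16-bit maxval stores samples big-endian.
with open("ramp_10bit.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n65535\n" % (WIDTH, HEIGHT))
    f.write(frame.astype(">u2").tobytes())
```

On a true 10-bit chain the ramp should look stepless; through an 8-bit path the 1920-pixel ramp breaks into visible bands roughly 7-8 pixels wide.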
No way Wolfgang - having an almost A/B comparison with my Decklink, I'm definitely not seeing the same band-free gradient with nVidia.
Piotr
PS. True, I don't have a Quadro any more - but I read in many places that the high-end Pascal GTX cards (1000 series) have the same capabilities as the Quadro in this regard.
No idea what your GTX card is doing with respect to 10-bit. I do not know your card. BUT I know what I see with my Quadro - and that is the 10-bit, band-free gradient, similar to the output of my Decklink.
And your GTX and a Quadro are different cards with different drivers too. I would not mix that up.