push the envelope, bleed on the edge.

Comments

OldSmoke wrote on 7/10/2014, 12:53 PM
I have 2x GTX 580 and both allow 32-bit. From what I gather in this article, 32-bit would mean 10 bits per color and 2 bits for padding. http://en.wikipedia.org/wiki/Color_depth.
So I have Win7 64-bit Pro, a graphics card that can display 30(32)-bit, and a monitor that can handle 10 bits per color (again, 3x10 = 30-bit). Where is the limitation now, Vegas or the OS? This assumes I use my monitor not for the preview window but as an external monitor. And the next question would be: how many bits can a modern HDTV or a Blu-ray player handle?
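
As a rough sketch of that packing, assuming the common 10-10-10-2 layout (RGB10_A2 in OpenGL terms, A2R10G10B10 in Direct3D terms; the exact field order varies by API):

```python
# Minimal sketch of the 30(32)-bit layout described above: three 10-bit
# channels packed into one 32-bit word, with the top 2 bits left as
# padding/alpha. Field order here is illustrative; it varies by API.

def pack_rgb10(r: int, g: int, b: int) -> int:
    """Pack three 10-bit channel values (0..1023) into a 32-bit word."""
    assert all(0 <= c <= 1023 for c in (r, g, b))
    return (r << 20) | (g << 10) | b  # top 2 bits stay 0 (padding)

def unpack_rgb10(word: int) -> tuple[int, int, int]:
    """Recover the three 10-bit channels from a packed 32-bit word."""
    return (word >> 20) & 0x3FF, (word >> 10) & 0x3FF, word & 0x3FF

# 10 bits per channel gives 1024 levels instead of 8-bit's 256:
assert unpack_rgb10(pack_rgb10(512, 0, 1023)) == (512, 0, 1023)
```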

Proud owner of Sony Vegas Pro 7, 8, 9, 10, 11, 12 & 13 and now Magix VP15&16.

System Spec.:
Motherboard: ASUS X299 Prime-A
RAM: G.Skill 4x8GB DDR4 2666 XMP
CPU: i7-9800x @ 4.6GHz (custom water cooling system)
GPU: 1x AMD Vega Pro Frontier Edition (water cooled)
Hard drives: System Samsung 970 Pro NVMe, AV-Projects 1TB (4x Intel P7600 512GB VROC), 4x 2.5" hot-swap bays, 1x 3.5" hot-swap bay, 1x LG Blu-ray burner
PSU: Corsair 1200W
Monitor: 2x Dell Ultrasharp U2713HM (2560x1440)

larry-peter wrote on 7/10/2014, 1:52 PM
I know several Nvidia GTX cards have the architecture for 10-bit (not sure about the 580), but when I researched this a year or so ago, the only way to actually get the 10 bits out was to modify the .inf file to force a Quadro driver install. Perhaps the newer GTX drivers will allow this.

I know a year ago the consensus was that 10-bit output on the desktop cards was crippled by the GTX drivers. If it makes a difference, most posts were talking specifically about 10-bit output from Photoshop.

Edit: I just remembered that Photoshop uses OpenGL to provide the 10-bit pipe. So in your case I would say that your 8-bit limitation is simply going to be Vegas.
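
For context, a 10-bit OpenGL pipe has to be requested from the driver when the context is created; a sketch of that request, using the pyGLFW and PyOpenGL bindings (both assumed installed), looks like this:

```python
# Sketch: asking the driver for a 10-bit-per-channel OpenGL framebuffer.
# A consumer (GTX-class) driver may silently grant only 8 bits per
# channel - which is exactly the GTX-vs-Quadro issue described above.
import glfw
from OpenGL.GL import glGetIntegerv, GL_RED_BITS

if not glfw.init():
    raise RuntimeError("GLFW failed to initialize")

glfw.window_hint(glfw.RED_BITS, 10)    # request 10 bits per channel
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)   # the two leftover padding bits

window = glfw.create_window(640, 480, "10-bit test", None, None)
glfw.make_context_current(window)

# See what the driver actually granted (legacy query; fine in a
# default/compatibility context):
print("red bits granted:", glGetIntegerv(GL_RED_BITS))
glfw.terminate()
```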
wwjd wrote on 7/10/2014, 2:46 PM
Hmmm, I wonder what other editors are capable of? If no one knows, I'll Google it later.
larry-peter wrote on 7/10/2014, 3:37 PM
This topic just made me think of something I hadn't considered in ages...
I have an AJA Kona card that I use for preview to both a large LCD and a 20" CRT monitor. In Vegas' preview device preferences, where the card's options are shown, there is a tick box for "use 10-bit encoding."

I've always thought that the Kona driver is opening a 10-bit pipe for Vegas. My LCD is not 10-bit, but I can definitely see a difference in banding on the CRT when I tick the box in a 32-bit float project. I knew I couldn't get 10-bit out of a video card from Vegas, but videoITguy's mention that BM devices can't get 10-bit from Vegas because Vegas can't provide it made me wonder... Is it just a Blackmagic issue, or am I not really getting 10-bit through the Kona?
OldSmoke wrote on 7/10/2014, 3:53 PM
"Edit: I just remembered that Photoshop uses OpenGL to provide the 10-bit pipe. So in your case I would say that your 8-bit limitation is simply going to be Vegas."

Actually, Vegas uses both OpenGL and OpenCL, but I'm not certain what part OpenGL plays.


videoITguy wrote on 7/10/2014, 5:27 PM
In Blackmagic Design and AJA products you do have hardware passing 10-bit precision, but not creating it. It's likely that with a good monitor you are able to see some improvement over not using this preference setting. This means the hardware is not your direct impediment to visual review.

However, it does suggest you will need quantitative analysis more than visual reference to judge whether your workflow steps are producing something truly useful.
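
A minimal sketch of that kind of quantitative check, assuming you can export or capture a frame of a shallow horizontal gray ramp as a 16-bit RGB file (the filename ramp_capture.png is hypothetical):

```python
# Sketch of a quantitative banding check: render a full-width horizontal
# gray ramp, capture it at the end of the chain as a 16-bit RGB file,
# then count how many distinct code values survive. An 8-bit pipe caps
# a 0..1 ramp at 256 levels; a true 10-bit pipe allows up to 1024.
import numpy as np
import imageio.v3 as iio  # assumes the imageio package is available

frame = iio.imread("ramp_capture.png")    # hypothetical 16-bit capture
gray = frame[frame.shape[0] // 2, :, 0]   # one scanline, red channel
levels = np.unique(gray).size

print(f"distinct levels: {levels}")
print("looks 8-bit" if levels <= 256 else "more than 8 bits got through")
```

If the count never rises above 256 no matter how the project and preview device are set, the extra bits are being dropped somewhere upstream of the capture point.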