8-bit rendering vs. 32-bit

Comments

vtxrocketeer wrote on 7/26/2013, 7:18 PM
As with Will, I had to click "Apply" first, too, to make the change from 8-bit to 32-bit stick. I thought I was being clever, but it wasn't long before I discovered that it was a one-way ticket...
videoITguy wrote on 7/26/2013, 7:34 PM
vtxrocketeer's post and willqen's earlier post now raise serious concerns about what build 670 is doing with a given system - be it the GPU setting, the type of video card and driver version, etc. Something is as "crazy" as ever when a person can roll back to a previous version and restore different behavior.

SCS has apparently introduced new bug complexity in the latest 670 build.
Robert Johnston wrote on 7/27/2013, 2:13 AM
@vtxrocketeer:

You can switch back to 8-bit using a workaround. Evidently you didn't read my post several blocks back?


willqen wrote on 7/27/2013, 2:32 AM
Robert Johnston is right.

I experimented using Levels with the Studio RGB to Computer RGB preset, and I can get 32-bit full range to look exactly the same (to my eyes, and with my gear) as 32-bit floating point (video).

Uncanny.

That works as a workaround. And it doesn't seem too hard to do.
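
(For reference, the Studio RGB to Computer RGB preset is, at its core, just a linear remap of the 8-bit studio range, black at 16 and white at 235, onto the full 0-255 range, which is why footage appears to gain contrast after it is applied. Below is a minimal Python sketch of that remap; the function name and the plain linear formula are my own illustration, not anything taken from Vegas itself.)

def studio_to_computer_rgb(value):
    # Remap an 8-bit studio-swing code value (black = 16, white = 235)
    # onto full-range computer RGB (black = 0, white = 255).
    return (value - 16) * 255.0 / (235 - 16)

for v in (16, 128, 235):
    print(v, "->", round(studio_to_computer_rgb(v), 1))
# prints: 16 -> 0.0, 128 -> 130.4, 235 -> 255.0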

Thanks Robert ... Much appreciated ...

Edit: it works vice versa, too. Sometimes 32-bit full range color actually looks much better than 32-bit video range. Using Levels I can bring 32-bit floating point (video) up to 32-bit full range's color quality. I don't know why yet, but I'll figure it out sooner or later. Maybe one of you could enlighten me... I mostly notice it in AVCHD media, all other parameters being the same.
vtxrocketeer wrote on 7/27/2013, 9:21 AM
You can switch back to 8-bit using a workaround. Evidently you didn't read my post several blocks back?

Quite right, Robert. I think I missed your post. Thanks! (But, man, what a kludge of a workaround.)
GlennChan wrote on 7/27/2013, 5:52 PM
Edit: it works vice versa, too. Sometimes 32-bit full range color actually looks much better than 32-bit video range. Using Levels I can bring 32-bit floating point (video) up to 32-bit full range's color quality. I don't know why yet, but I'll figure it out sooner or later. Maybe one of you could enlighten me... I mostly notice it in AVCHD media, all other parameters being the same.
That's because Vegas's preview isn't accurate in some/many situations. This is extremely unintuitive, but you need to understand it first: Vegas is not WYSIWYG. What you see isn't always what you get.

If all of the video is in studio RGB levels (which is often the case, e.g. an 8-bit project with AVCHD footage), then the Video Preview window won't be accurate. One way to get around this is to use the Windows secondary display as your preview device; in its settings, check the "studio RGB" checkbox.

http://www.glennchan.info/articles/vegas/v8color/vegas-9-levels.htm
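
To put rough numbers on that mismatch, here is a small illustrative sketch (Python with NumPy, purely hypothetical code of my own, nothing to do with Vegas internals): a studio-range gray ramp sent straight to a computer-RGB display never reaches true black (0) or true white (255), which is the flat, washed-out look the preview can take on when the studio RGB mapping isn't applied somewhere in the display path.

import numpy as np

studio_ramp = np.linspace(16, 235, 8)                 # studio-swing gray ramp
converted = (studio_ramp - 16) * 255.0 / (235 - 16)   # what a correct display path would show

print("unconverted:", np.round(studio_ramp).astype(int))  # blacks stuck at 16, whites at 235
print("converted:  ", np.round(converted).astype(int))    # blacks at 0, whites at 255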