Why can't I get 32bit pixel format to work?

Comments

rmack350 wrote on 03.04.2011 at 19:56
Another detail has been nagging at me. I know some filters don't support 32-bit processing. That might be a source of trouble here, and possibly an adaptive noise reduction filter could be poisoning the render as well.


Quoted from Vegas Help:

Applying Video Effects
<snip>
Not all video plug-ins are capable of multithreaded rendering. Plug-ins that do not support multithreaded rendering are displayed with a yellow icon in the Plug-In Manager and Plug-In Chooser windows, and with the same icon in the Video FX window.

Video plug-ins and media generators that do not support floating-point processing are indicated by a blue icon in the Plug-In Manager and Plug-In Chooser, and by the same icon in the Video FX and Media Generators windows.

eightyeightkeys wrote on 03.04.2011 at 22:53
"GenJerdan and eightyeightkeys can you:"

Vegas Pro 10c-64bit, Windows 7 Pro-64bit, i7 3.6 GHz, 12 GB RAM, Nvidia GTX470
Capture from Canon HV30, stills from Canon Powershot SX210is.
Usually 2 video tracks, one for stills, one for video + 2 or more audio tracks...maybe one text track.
Plugs on video track - Gaussian Blur 0.001 in the horizontal only + Secondary Color Corrector, Computer RGB to Studio RGB preset (that mapping is sketched below).
Resample turned off on all vids and stills.
Interlace Deflicker checked.
Very simple stuff for vacation vids only.
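
For reference, the Computer RGB to Studio RGB preset mentioned above is just a linear levels remap from full-range 0-255 into the 16-235 studio range. A minimal Python sketch of that mapping (the function name is mine, not anything from Vegas):

```python
def computer_to_studio_rgb(value):
    """Map a full-range (0-255) code value into studio range (16-235).

    Black 0 -> 16, white 255 -> 235, everything in between scaled
    linearly, which is what the Secondary Color Corrector preset
    amounts to.
    """
    return round(16 + value * (235 - 16) / 255)

assert computer_to_studio_rgb(0) == 16     # computer black -> studio black
assert computer_to_studio_rgb(255) == 235  # computer white -> studio white
print(computer_to_studio_rgb(128))         # mid grey lands at 126
```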
erikd wrote on 04.04.2011 at 05:19
Mack350

Just going from memory, but I seem to recall Spot and JR saying a long time ago that if you use an 8-bit FX filter in a 32-bit render, the final video output will not be 32-bit (it drops down to 8-bit) during that transition only. They didn't mention anything about avoiding it, or about it causing crashes or other problems.

In my case, before the memory hack was applied, my 32-bit render attempts always crashed immediately. If the crashes were related to 8-bit FX filters, I think it would be more obvious and detectable that the transition filter was the cause.

I can't thank everyone enough for chipping in on this thread and helping me work through this. The only thing I am still chewing on is Bob's comments regarding manufacturers of 10-bit cameras. He seems to be saying there are a good many 10-bit cameras out there that really shouldn't be, because their S/N ratio isn't high enough to warrant it. The fact that Bob included the 2/3" HD cameras is shocking to me.

I say shocking because I can clearly see the difference between 8-bit and 10-bit on my video monitor, and the 10-bit looks clearly better on everything. Better color depth adds an overall richness to the image that is unmistakable in my opinion. The entire image even seems like it has a higher resolution. It simply looks thicker, more HD. I'm sure Bob has very good points from the spec sheet to support the point he is making, but I'm wondering if he has also done any A/B switches to a 10-bit monitor for comparison.

Erik


farss wrote on 04.04.2011 at 06:38
"The only thing I am still chewing on are Bob's comments regarding manufacturers of 10bit cameras. He seem to be saying there were a good deal of 10bit cameras out there that really shouldn't be because there S/N ratio wasn't high enough to warrant it. The fact that Bob included the 2/3" HD cameras is shocking to me. "

No, no, no. I'm saying there are hardly ANY 10-bit cameras out there.
XDCAM HD, XDCAM EX, all 8-bit. Spend enough on the XDCAM HD line and you get 4:2:2 chroma sampling, but still 8-bit.

"I say shocking because I can clearly see the difference between 8bit and 10bit on my video monitor and the 10bit looks clearly better on everything. Better color depth adds an overall richness to the image that is unmistakable in my opinion. The entire image even seems like it has a higher resolution. It simply looks thicker, more HD. I'm sure Bob has a very good points from the spec sheet to support the point he is making but I'm wondering if he has also done any A/B switches to a 10bit monitor for comparison."

No I haven't, because I can't afford a true 10-bit monitor, and come to think of it I can't think of anyone I know who would lend me time on one. If you've got a 10-bit monitor you're way ahead of me. Which 10-bit monitor have you got, as a matter of interest?

By the way, have you read Glenn Chan's article on 32-bit float in Vegas?
http://www.glennchan.info/articles/vegas/v8color/vegas-9-levels.htm

You do need to be careful with 32bit mode.
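
The kind of trap Chan's article warns about largely comes down to range assumptions. A tiny sketch (using the standard studio-range constants; the function names are descriptive, not Vegas terminology) of how the same 8-bit luma code decodes to different float values depending on which range the decoder assumes:

```python
def decode_assuming_full_range(y8):
    """Treat 8-bit luma as full range: 0 maps to 0.0, 255 to 1.0."""
    return y8 / 255.0

def decode_assuming_studio_range(y8):
    """Treat 8-bit luma as studio range: 16 maps to 0.0, 235 to 1.0."""
    return (y8 - 16) / 219.0

print(decode_assuming_full_range(16), decode_assuming_studio_range(16))    # 0.063 vs 0.0
print(decode_assuming_full_range(235), decode_assuming_studio_range(235))  # 0.922 vs 1.0
```

If one stage of a chain assumes one convention and the next assumes the other, blacks and whites shift, which is exactly why the article urges care with 32-bit mode.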

Bob.

erikd wrote on 04.04.2011 at 07:09
Ok Bob, I got off track there. To tell the truth, I thought the high-end XDCAM HD was 10-bit, so that in itself was news to me.

My monitor is:
http://www.crutchfield.com/p_158KD40XB4/Sony-KDL-40XBR4.html?tp=161

Thanks for the link to Chan's article. I've read it a couple of times over the past couple of years, but a re-read every few months is not a bad idea. My understanding was that when the "video levels" option came out, most of the problems were done away with.

Erik
rmack350 wrote on 05.04.2011 at 05:36
Erik

I think what I was trying to get at with the filters is that projects can get quite complex. Certainly allowing 32-bit Vegas access to another GB of RAM is going to help you with rendering in 32-bit mode. You might also have found that disabling FX would get the render working.

Of course, the goal isn't just to figure out whether there's something more than memory stopping you. The goal is to render your project in 32-bit mode without any changes to the project or to your workflow.

As for relying on other people's say-so... I deal with a lot of experts in the course of my work. If I had a dollar for every time in the last 10 years that I tested what they told me and found them wrong, I could take a friend out to a five-star dinner. In another country. Lodging included. You should always test things if you want to be sure of the answer.

Your description of the visual difference between 8-bit and 10-bit displays sounds a LOT like what people were saying when they first tried 32-bit mode when it was introduced.

Rob
erikd wrote on 05.04.2011 at 06:36
"Your description of the visual difference between 8-bit and 10-bit displays sounds a LOT like what people were saying when they first tried 32-bit mode when it was introduced."

So, admittedly, I didn't get a lot of sleep last night, and pardon me if I am a little slow here, but was that a slightly disparaging remark? :) Are you suggesting that there is not much visual difference between an 8-bit and a 10-bit display? Again, sorry, but I'm not aware of the conversations you are referring to.

farss wrote on 05.04.2011 at 09:06
Erik,
I had a look at the specs for your HDTV and it is also x.v.color compatible, i.e. it's probably a wide-color-gamut display. Those generally need 10-bit RGB to cope with remapping the colorspace. This is quite a different conundrum from traditional 10-bit video. Keep in mind that if you're playing a DVD, either SD or HD, into that set, the source material is encoded at only 8 bits.
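
To make the remapping point concrete, here is a toy Python sketch. The 0.7 gain is invented, standing in for one row of a real gamut-remap matrix; the point is only that a remap computed and stored at 8 bits merges levels, while 10 bits keeps them all distinct:

```python
def distinct_after_remap(bits):
    """Count how many of the 256 input levels survive a 0.7x remap
    when the result is stored on an N-bit grid."""
    scale = (2 ** bits - 1) / 255.0
    return len({round(v * 0.7 * scale) for v in range(256)})

print(distinct_after_remap(8))   # 179 of 256 levels survive -> visible banding
print(distinct_after_remap(10))  # all 256 survive
```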

What really compounds the discussion is the difference between 10-bit Y'CbCr and 10-bit RGB. If I understand correctly some of what Glenn Chan had to say some time ago, when 8-bit Y'CbCr (what cameras record) is decoded to RGB, it ideally needs 10 bits to retain the same color resolution / space as the original.
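
That claim can be checked numerically. Below is a rough sketch using the exact BT.601 equations (my own illustration, not Glenn Chan's code or anything from Vegas): decode 8-bit studio-range Y'CbCr to RGB, store the RGB at a given bit depth, convert back, and measure the worst error in original code values:

```python
KR, KG, KB = 0.299, 0.587, 0.114  # BT.601 luma coefficients

def ycbcr_to_rgb(y, cb, cr):
    """8-bit studio-range Y'CbCr -> full-range R'G'B' floats (0-255 scale)."""
    yn = (y - 16) / 219.0
    pb = (cb - 128) / 224.0
    pr = (cr - 128) / 224.0
    r = yn + 1.402 * pr
    g = yn - (2 * KB * (1 - KB) / KG) * pb - (2 * KR * (1 - KR) / KG) * pr
    b = yn + 1.772 * pb
    return r * 255, g * 255, b * 255

def rgb_to_ycbcr(r, g, b):
    """Exact inverse of the above."""
    rn, gn, bn = r / 255.0, g / 255.0, b / 255.0
    yn = KR * rn + KG * gn + KB * bn
    return 16 + 219 * yn, 128 + 224 * (bn - yn) / 1.772, 128 + 224 * (rn - yn) / 1.402

def max_roundtrip_error(bits):
    """Worst Y'CbCr error after storing the decoded RGB at `bits` bits.
    No clipping is applied -- this isolates the precision question."""
    step = 255.0 / (2 ** bits - 1)  # spacing of the N-bit RGB grid
    worst = 0.0
    for y in range(16, 236, 4):
        for cb in range(16, 241, 4):
            for cr in range(16, 241, 4):
                rgb = ycbcr_to_rgb(y, cb, cr)
                rgb_q = [round(v / step) * step for v in rgb]  # quantise
                back = rgb_to_ycbcr(*rgb_q)
                worst = max(worst, *(abs(a - b) for a, b in zip((y, cb, cr), back)))
    return worst

print("8-bit RGB, worst error: %.3f codes" % max_roundtrip_error(8))   # just under 0.5
print("10-bit RGB, worst error: %.3f codes" % max_roundtrip_error(10)) # roughly 4x smaller
```

At 8 bits the worst error sits just under the half-code threshold where neighbouring codes start to merge, leaving no margin for further processing; at 10 bits there is comfortable headroom, which is the sense in which decoded Y'CbCr "needs" 10-bit RGB.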

Also, some of the info regarding the difference between 8-bit and 10-bit recordings is not complete. 10-bit video recording in/from cameras may not be as simple as just adding two extra bits. Things such as DPX and Cineon have different white and black points, generally to handle highlights. If I recall correctly, Cineon sets 100% white at code 685, which a naive conversion would map to a light grey rather than white in 8-bit.
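
For concreteness: the commonly cited Cineon reference points are black at code 95 and 100% diffuse white at code 685 (out of 1023), with 686-1023 reserved for highlight headroom. A small Python sketch of why the points matter when converting down to 8 bits; note this ignores Cineon's log transfer curve and only illustrates the code-point mapping:

```python
CINEON_BLACK = 95    # 10-bit code for reference black
CINEON_WHITE = 685   # 10-bit code for 100% diffuse white; 686-1023 hold highlights

def naive_8bit(code10):
    """Scale a 10-bit code straight to 8 bits, ignoring the file's own
    black/white points -- what a full-range-assuming viewer would do."""
    return round(code10 / 1023 * 255)

def remapped_8bit(code10):
    """Map the Cineon black/white points onto 0-255, clipping highlights."""
    v = (code10 - CINEON_BLACK) / (CINEON_WHITE - CINEON_BLACK)
    return round(max(0.0, min(1.0, v)) * 255)

print(naive_8bit(CINEON_WHITE))     # 171 -- "white" shows up as light grey
print(remapped_8bit(CINEON_WHITE))  # 255 -- correct once the points are honoured
```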

Bob.
rmack350 wrote on 05.04.2011 at 17:49
I was trying not to be disparaging but it was late and I was running out of time. Seems to be happening to me too much these days.

When you say that the colors look deeper and better in a 10 bit codec played on a 10 bit display, are you saying things look better because you graded them to look better? Or are you saying they look better simply by virtue of being processed, encoded, and then displayed at the higher bit depth?

The reason I ask is that converting video to a higher bit depth should not, on its own, change the look of the image. The image should stay true to the original until you deliberately make it different.

I think you are probably saying that your images look better when you run them through a 32-bit process chain (edit: of FX and transitions) rather than doing the exact same thing with an 8-bit chain. The difference I'd expect to see would be a cleaner image, but not a drastic one, although the blog post I linked to was claiming that you could preserve more highlight and shadow detail with 32-bit processing.
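
A toy illustration of the highlight point (invented gain values; real Vegas FX are far more complex than this): an 8-bit chain clamps and rounds after every effect, while a float chain only quantises once at the end:

```python
def chain_8bit(value, ops):
    """Apply each op, clamping and rounding to 0-255 after every step,
    the way an integer pipeline hands frames between plug-ins."""
    for op in ops:
        value = min(255, max(0, round(op(value))))
    return value

def chain_float(value, ops):
    """Apply the ops in floating point; intermediates may exceed 0-255."""
    v = float(value)
    for op in ops:
        v = op(v)
    return min(255, max(0, round(v)))

# Push highlights over range, then bring them back down:
ops = [lambda v: v * 1.6, lambda v: v / 1.6]

print(chain_8bit(200, ops))   # 159 -- the overshoot was clipped and is lost
print(chain_float(200, ops))  # 200 -- float kept the "illegal" intermediate
```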

Anyway, this is all beside the point. It's like the joke about how many Art Directors it takes to change a light bulb.

"Does it have to be a light bulb?"

How'd the render go? Success?

Rob