Comments

Xander wrote on 5/24/2009, 8:11 PM
For me, I get frequent out-of-memory crashes using 32-bit mode, or just plain old crashes. It is not there to speed anything up.

The more tracks of video you composite together, the more 32-bit mode should help, rendering them at a higher quality than 8-bit. Likewise, the more Video FX you apply, the more 32-bit mode should help the final quality.

To improve rendering and playback speed, the 64-bit version of Vegas should supposedly help; however, it still currently lacks some features needed to make it usable - at least in my Cineform intermediate workflow.

In AE, you get a 16bit mode, which I find to be a good compromise between quality and rendering speed.
Mikeof7 wrote on 5/24/2009, 11:40 PM
Excellent.
Thanks for the detailed response, Xander.
ingvarai wrote on 5/25/2009, 12:06 AM
...what can it do for me?
If you open Project Properties, toggle 8 / 32 bit and invoke Apply each time, you will see a significant quality improvement in the 32-bit mode. Use the preview window to see this. If you have also applied the Sony Levels FX (or other FXs too), the difference is striking.

ingvarai
Marc S wrote on 5/25/2009, 12:07 AM
I tried it in the new 9.0 trial and it crashed my system just like in 8.0. I wish they would stop advertising a feature that does not work. In AE on the other hand it can give much cleaner results on some effects.
farss wrote on 5/25/2009, 1:05 AM
If you're seeing a "significant quality improvement" in the preview window, I suspect you're doing something wrong.

Bob.
Grazie wrote on 5/25/2009, 1:42 AM
"I suspect you're doing something wrong."
Ah, go on, Bob, give him a clue? Why don't yah?

Kinda like - "If you keep hitting your thumb with a 5lb hammer, and it hurts, I suspect you are doing something wrong"

Bob, too precious . . .

Grazie
farss wrote on 5/25/2009, 2:01 AM
"Ah, go on, Bob, give him a clue? Why don't yah?"

Sure but I was hoping he'd come back and explain what benefit he thought he was getting. That's why I said "suspect" because there can be benefits. Anyways...

In 32-bit mode in Vegas 8 the levels get shifted, so the video may appear better due to an increase in saturation. That shift is incorrect - see this article by Glenn Chan. There's also an update for V9 here that explains the two 32-bit modes.

Pretty much, switching to 32-bit mode gains you very little for most video processing. It will increase render times and require more RAM.

Hope I got that right :)

Bob.
Grazie wrote on 5/25/2009, 2:28 AM
Y'see? Wasn't that bad? - Bob, you have a wealth of tech experience - just tell us!

Youse a great Bloke!

Grazie
farss wrote on 5/25/2009, 3:10 AM
"Youse a great Bloke!"

Ya not too shabby yourself you know.

Blushing Bob.

GlennChan wrote on 5/25/2009, 10:22 AM
The main benefit is that you get better-looking cross dissolves and compositing.

See
http://www.glennchan.info/articles/vegas/linlight/linlight.htm

*You don't actually need 32-bit to get better-looking cross dissolves... use the SMLuminance plugin.
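
If you want to see the difference in numbers, here's a minimal numpy sketch of a mid-dissolve blend done on gamma-encoded values versus linear light. This is illustrative only, not Vegas's actual code - the 2.222 figure matches Vegas's compositing gamma, everything else is made up for the example.

import numpy as np

GAMMA = 2.222  # Vegas's default compositing gamma

def to_linear(v):
    # Gamma-encoded values (0..1) -> linear light
    return np.power(v, GAMMA)

def to_gamma(v):
    # Linear light (0..1) -> gamma-encoded values
    return np.power(v, 1.0 / GAMMA)

a, b = 0.9, 0.1   # a bright pixel dissolving into a dark one
t = 0.5           # midpoint of the dissolve

# 8-bit style: blend the gamma-encoded values directly
gamma_blend = (1 - t) * a + t * b

# Linear-light style: convert to linear, blend, convert back
linear_blend = to_gamma((1 - t) * to_linear(a) + t * to_linear(b))

print(f"blend in gamma space:  {gamma_blend:.2f}")   # 0.50
print(f"blend in linear light: {linear_blend:.2f}")  # ~0.66

The dark pixel drags the gamma-space blend down, which is the dip you see in the middle of a dissolve; the linear-light blend behaves more like light actually mixing.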
ingvarai wrote on 5/25/2009, 12:00 PM
>If you're seeing a "significant quality improvement" in the preview window, I suspect you're doing something wrong

What am I doing wrong? I do see a dramatic change. Here:
1) Drop an AVCHD MTS file on the timeline
2) Toggle 8 / 32 bit

With 8-bit, I have the impression of a thin white, almost transparent layer over the picture. In 32-bit, this disappears and the colors look great. Like looking at scenery through a window, and then going outside to look at it directly.

I also applied the Levels FX to a clip, and the difference between 8-bit and 32-bit is striking. OK, so I am doing something wrong. The question is - what is it?

ingvarai
GlennChan wrote on 5/25/2009, 12:53 PM
It's only the levels changing.

see
http://www.glennchan.info/articles/vegas/v8color/vegas-9-levels.htm
(or the vegas 8 article if you're on Vegas 8.. http://www.glennchan.info/articles/vegas/v8color/v8color.htm )

It's not because of 32-bit image processing; it's because the 32-bit "mode" in Vegas causes the codecs to output different levels. You can do the same thing via Levels.
Though what you really need to do is make sure all your levels are proper... you have to wrangle your levels manually in Vegas.
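
Roughly, that Levels-style stretch is just this (a minimal numpy sketch; the 16/235 studio-swing limits are the usual Rec. 601/709 assumption, not numbers read out of Vegas):

import numpy as np

def studio_to_computer(rgb8):
    # Map studio swing (16-235) to full swing (0-255), clipping anything outside
    out = (rgb8.astype(np.float32) - 16.0) * 255.0 / (235.0 - 16.0)
    return np.clip(np.round(out), 0, 255).astype(np.uint8)

pixels = np.array([16, 64, 128, 200, 235], dtype=np.uint8)
print(studio_to_computer(pixels))  # [  0  56 130 214 255] - darker darks, brighter brights

That stretch is the "improvement" people see when they toggle the mode: more contrast and saturation, not higher-precision processing.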
ingvarai wrote on 5/25/2009, 1:16 PM
Glenn,
this I understand.
Just a side note, I wonder who uses 32 bit then, and why.

ingvarai
GlennChan wrote on 5/25/2009, 2:01 PM
1- for linear light / optically correct processing

2- to avoid banding artifacts if dealing with computer-generated imagery.

Real-world images have noise in them, which acts as dithering, so you don't see banding artifacts.
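
Here's a small numpy sketch of that dithering effect (illustrative only - the 5-bit quantizer is deliberately coarse so the banding shows up in the numbers):

import numpy as np

rng = np.random.default_rng(0)
ramp = np.linspace(0.25, 0.35, 10_000)   # gentle computer-generated gradient, no noise

def quantize(x):
    # Crude 5-bit quantizer, exaggerated compared to 8-bit to make the effect obvious
    return np.round(x * 31) / 31

clean = quantize(ramp)                                    # long runs of identical values = bands
noisy = quantize(ramp + rng.normal(0, 0.01, ramp.size))   # camera-style noise added first

def local_avg(x):
    # Average 100-pixel neighbourhoods, roughly what your eye does over a smooth area
    return x.reshape(-1, 100).mean(axis=1)

print("distinct levels, clean:", np.unique(clean).size)   # only a few flat bands
print("distinct levels, noisy:", np.unique(noisy).size)
print("worst local error, clean:", np.abs(local_avg(clean) - local_avg(ramp)).max())
print("worst local error, noisy:", np.abs(local_avg(noisy) - local_avg(ramp)).max())

The noisy version ends up much closer to the original gradient on average, which is why footage with sensor noise rarely shows the banding that clean CG gradients do.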
rmack350 wrote on 5/25/2009, 9:43 PM
The other thing that 32-bit processing brings to Vegas is the ability to work with 10-bit color. 8-bit processing can't support it. Whether a lot of people are using it for this is another question, but if you're going to work with 10-bit media you need better than 8-bit processing. 32-bit ought to cover the possibilities for a while.
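
To put rough numbers on it (a quick illustrative numpy sketch, nothing Vegas-specific):

import numpy as np

codes_10bit = np.arange(1024)                      # every possible 10-bit code value
as_8bit     = np.round(codes_10bit / 1023 * 255)   # what an 8-bit pipeline can keep
round_trip  = np.round(as_8bit / 255 * 1023)       # scaled back up to the 10-bit range

print("levels surviving an 8-bit path:", np.unique(as_8bit).size)   # 256 of 1024
print("worst round-trip error:", int(np.abs(round_trip - codes_10bit).max()), "10-bit steps")

Those extra levels are simply gone before any processing happens, which is why 10-bit media needs a higher-precision mode to be worth anything.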

On the topic of 32-bit mode making things look better... if all you've done is put a clip on the timeline then you shouldn't see a change when you switch between 8-bit and 32-bit. Imagine if that clip were color bars. You wouldn't want the bars changing since they're supposed to be a constant that you can use to calibrate your equipment.

Rob Mack

BrianStanding wrote on 5/26/2009, 6:19 AM
How about with color-grading? Does 32-bit help here? I've heard conflicting reports.

Floating-point works fine on my system with 8.0c, but it is SLOOOOOOW.....

If there's a significant benefit to color-grading with 32-bit, I can put up with it, though.
GlennChan wrote on 5/26/2009, 9:19 AM
"You wouldn't want the bars changing since they're supposed to be a constant that you can use to calibrate your equipment."
Yeah but... that's not how Vegas 8 worked. :/ You could throw color bars onto the timeline, and it would make a difference between 8-bit and 32-bit when rendering to some codecs (e.g. HDV, MPEG-2, AVCHD) and not others (e.g. DV).

In Vegas 9, you can avoid that by using the 32-bit (video levels) mode. Upgrading to Vegas 9 will make things easier.

"How about with color-grading? Does 32-bit help here? I've heard conflicting reports."
In Vegas 8 you have the levels silliness, so don't be confused by that. It has nothing to do with 8-bit versus 32-bit image processing. It's the 32-bit mode causing the codec behaviours to change.

2- I don't believe that there's a big difference. The easy way to find out would be to flip between the two modes.

If the compositing mode is 1.000 / linear, there will be a difference in how bezier masks look. But the 1.000 mode causes other wackiness, and I would definitely avoid it for color correction. Stick with the 2.222 mode.
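
For the curious, the bezier-mask difference is the same gamma-versus-linear math again - a toy Python sketch with illustrative numbers, not Vegas internals:

# A 50%-opacity point on a feathered mask edge: white over black
white, black, alpha = 1.0, 0.0, 0.5

edge_2222   = alpha * white + (1 - alpha) * black                              # blend of encoded values
edge_linear = (alpha * white**2.222 + (1 - alpha) * black**2.222) ** (1 / 2.222)

print(f"edge pixel, 2.222 mode:  {edge_2222:.2f}")    # 0.50
print(f"edge pixel, linear mode: {edge_linear:.2f}")  # ~0.73 - the feathered edge reads brighter/wider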