LUT FX have a huge delay when added

marcel-vossen wrote on 9/18/2018, 3:25 AM

Hi there,

In Vegas 15 I have added quite a lot of LUT presets (30-40) to the "Vegas LUT filter" FX, but whenever I use it, the program freezes for quite a long time the first time I add it. I'm waiting maybe half a minute before I can see what the LUT does; after that it functions normally and I can also pick other presets from the menu without much delay. It's very annoying because it always looks as if the program has crashed again.

Is anyone familiar with this? Is this normal? Does this really take that many resources? It doesn't seem like it should to me...

Or are there maybe other ways to add LUT presets than through this FX?

Marcel

Comments

karma17 wrote on 9/19/2018, 6:17 AM

I use the LUT FX quite a bit and can't say I've ever noticed a delay like that. When you are editing, are you set to 32-bit floating or 8-bit pixel format under Properties?

marcel-vossen wrote on 9/19/2018, 6:54 AM

I use the LUT FX quite a bit and can't say I've ever noticed a delay like that. When you are editing, are you set to 32-bit floating or 8-bit pixel format under Properties?

Hi, 8-bit is set in the project properties, but underneath the LUTs in the FX window I can see text that says 32, so I'm a bit confused. Where do I set the LUT to 8-bit then? :)


Red Prince wrote on 9/19/2018, 1:16 PM

where do I set the LUT to 8 bit then? :)

You don’t. At least not those with the .cube extension (the most common ones at this time). They are plain text files that list the LUT values with 0 being dark (black) and 1 being bright (white, or the brightest red, green, etc.). This has been the standard in computer graphics since before digital video was born.
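A .cube file really is just text: a short header followed by rows of three floating-point numbers between 0 and 1. As a rough illustration (the file name and the deliberately tolerant parsing are my own assumptions, not anything Vegas or a LUT vendor ships), a few lines of Python are enough to read one and see the normalized values:

    # Minimal sketch: read a .cube LUT and show that its entries are 0-1 floats.
    def load_cube(path):
        size, rows = None, []
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith(("#", "TITLE", "DOMAIN")):
                    continue                             # skip comments and metadata
                if line.startswith("LUT_3D_SIZE"):
                    size = int(line.split()[1])          # e.g. 33 for a 33x33x33 LUT
                elif line[0].isdigit() or line[0] in "+-.":
                    rows.append(tuple(float(v) for v in line.split()))  # one R G B triplet
        return size, rows

    size, table = load_cube("my_lut.cube")               # placeholder file name
    print(size, len(table), table[0])                    # first entry is typically (0.0, 0.0, 0.0)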

While most OFX (and other) plugins do not release their source code, they most likely work in 32 bits internally. So when you set your project properties to 8 bits, you are telling Vegas to present every channel of every pixel to every effect as an 8-bit integer. Just about every plugin then has to take the time to convert that into a 32-bit value between 0 and 1, process it internally as a 32-bit number, and then take the time to convert it back to an 8-bit integer, throwing away some of the clearly visible 32-bit finesse (well, at least visible on my monitor, which has 10 bits per channel).
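A rough sketch of that round trip (the little gamma tweak below is only a stand-in for whatever a given plugin does internally; this is an assumed illustration, not actual OFX code):

    import numpy as np

    pix8 = np.arange(256, dtype=np.uint8)                 # every possible 8-bit level the host can send
    as_float = pix8.astype(np.float32) / 255.0            # plugin converts to 0.0-1.0 floats
    graded = as_float ** (1 / 1.8)                        # stand-in for the 32-bit internal processing
    back_to_8 = np.round(graded * 255).astype(np.uint8)   # quantized back to 8 bits on the way out

    print(len(np.unique(back_to_8)))                      # fewer than 256 distinct levels survive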

Of course, whenever I point that out on this forum, I get yelled at by certain members because they know better than someone who has been computing since 1965. Oh well...


He who knows does not speak; he who speaks does not know.
                    — Lao Tze in Tao Te Ching

Can you imagine the silence if everyone only said what he knows?
                    — Karel Čapek (The guy who gave us the word “robot” in R.U.R.)

Kinvermark wrote on 9/19/2018, 3:51 PM

How do you get 10 bits out of Vegas to the 10 bit monitor? Blackmagic device? UHD or HD?

Red Prince wrote on 9/19/2018, 6:58 PM

How do you get 10 bits out of Vegas to the 10 bit monitor? Blackmagic device? UHD or HD?

No, just an NVIDIA card and a 4k BenQ monitor.

Kinvermark wrote on 9/19/2018, 7:32 PM

Is there a test chart/method to confirm that 10 bits are actually getting through from Vegas' external monitor preview? (I have a 10-bit LG UHD display and the Radeon software indicates 10 bit to that panel, but I am not so sure...)

Red Prince wrote on 9/19/2018, 9:02 PM

Gosh, there was another thread where we talked about this. It was a while ago and I do not remember the details anymore, but the answer was yes.

Kinvermark wrote on 9/20/2018, 10:52 AM

There appear to be two modes of piping 10 bits from the software to the GPU/monitor: DirectX (typically games) and OpenGL.

I was able to get a 10-bit test video to play back through VLC media player, but that same media does not play back in 10 bits from Vegas. My R9 290 card only supports 10 bit over DirectX, not OpenGL, which seems to be reserved for workstation cards from both NVIDIA and AMD.

Red Prince wrote on 9/20/2018, 2:39 PM

I have a GeForce GTX 980 M from NVIDIA. It supports OpenGL 4.5, at least according to its specifications. So, I guess, I’m in luck.

Kinvermark wrote on 9/20/2018, 3:16 PM

I used the rotating Spears & Munsil test footage from here:

https://www.avsforum.com/forum/139-display-calibration/2269338-10-bit-gradient-test-patterns.html

(FYI, The forum poster claims he has permission from the author to post the test pattern.)

I like this one because it makes banding easy to see, and you can definitely see a difference between the 8-bit and 10-bit squares if you have 10-bit output.

PS: The R9 290 also has OpenGL support, just not 10-bit OpenGL output.
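If you would rather roll your own chart than download one, a shallow gray ramp makes banding just as easy to spot. A minimal sketch (the resolution, file names and the OpenCV writer are my own choices, not part of the Spears & Munsil pattern):

    import numpy as np
    import cv2  # opencv-python, or any writer that can save 16-bit PNGs

    # A shallow gray ramp at 10-bit precision (stored in a 16-bit PNG) and the
    # same ramp truncated to 8 bits. On a true 10-bit chain the first stays
    # smooth while the second shows visible bands.
    width, height = 1920, 540
    ramp = np.tile(np.linspace(0.4, 0.6, width, dtype=np.float32), (height, 1))

    ten_bit = np.round(ramp * 1023).astype(np.uint16) * 64   # 10-bit levels scaled into the 16-bit range
    eight_bit = np.round(ramp * 255).astype(np.uint8)

    cv2.imwrite("ramp_10bit.png", ten_bit)
    cv2.imwrite("ramp_8bit.png", eight_bit)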