NeatVideo: can it use 10bit color?

megabit wrote on 8/19/2010, 6:53 AM
So, after long frustration with my nanoFlash augmenting noise (along with detail) in my EX1 files, I bit the bullet and bought the NeatVideo NR plugin for Vegas.

First impressions are OK, even though it's sooooo slow :(

BUT, in my result clips I can see occasional color banding that is not present in the original 100 Mbps 4:2:2 Long-GOP files. So I converted a particularly noisy clip to 10-bit Sony YUV and tried NeatVideo on that...

Well - it looks to me like NV cannot take advantage of the 10-bit color: in its configuration preview window, the frame is still reported as 1920x1080, 8-bit RGB.

What am I missing?

AMD TR 2990WX CPU | MSI X399 CARBON AC | 64GB RAM@XMP2933  | 2x RTX 2080Ti GPU | 4x 3TB WD Black RAID0 media drive | 3x 1TB NVMe RAID0 cache drive | SSD SATA system drive | AX1600i PSU | Decklink 12G Extreme | Samsung UHD reference monitor (calibrated)

Comments

farss wrote on 8/19/2010, 7:11 AM
1) Converting 8 bits to 10 bits makes absolutely no difference. The two LSBs will just be zero (see the sketch after point 2).

2) What generally stops you from seeing color banding is the noise. That is how banding is prevented when downsampling bit depth: you use dithering, which is a fancy way of saying adding noise. Dithering algorithms are a bit smarter than simply adding noise, but in practice noise provides the dithering.
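
A minimal numpy sketch of both points (illustrative only - the uniform-noise dither here is the crudest possible scheme; real algorithms, as noted above, are smarter):

```python
import numpy as np

rng = np.random.default_rng(0)

# Point 1: "converting" 8-bit to 10-bit is just a left shift by 2.
# Every value becomes a multiple of 4, the two new LSBs are zero,
# so no intermediate levels appear and the banding is unchanged.
src8 = np.array([16, 17, 18, 235], dtype=np.uint16)
print(src8 << 2)                   # [ 64  68  72 940] -- gaps of 4

# Point 2: quantising a smooth 10-bit ramp down to 8 bits.
ramp10 = np.linspace(0, 1023, 1920)

# Plain truncation: groups of 4 input levels collapse into one
# output level -> visible steps (banding) across a gradient.
truncated = (ramp10 // 4).astype(np.uint8)

# Dithered quantisation: add about one 8-bit LSB of noise before
# rounding. Each pixel still lands on an 8-bit level, but the local
# average preserves the 10-bit value, so the steps are broken up.
dithered = np.clip(np.round(ramp10 / 4 + rng.uniform(-0.5, 0.5, ramp10.shape)),
                   0, 255).astype(np.uint8)

print(truncated[236:244])          # a run of identical values, then a step
print(dithered[236:244])           # the same region, jittered between levels
```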

Bob.

megabit wrote on 8/19/2010, 7:26 AM
Thanks Bob.

ad 2. So what Convergent Design is planning - replacing the 10->8 bit truncation with real dithering down from 10 bits - looks like it might make a lot of sense...

ad 1. I'd think so too - but then, why the common consensus (also on this forum) that post-processing with a 10-bit intermediate (even if both input and output are just 8-bit) makes a lot of sense? Was I misled by this, or did I misunderstand again?
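
(For what it's worth, the usual reasoning behind that consensus is about rounding between chained operations rather than the source gaining information - a toy sketch:)

```python
import numpy as np

# An 8-bit source and two chained operations that should cancel:
# darken to 50%, then brighten 2x (think: two filters in a chain).
src = np.arange(256, dtype=np.float64)

# 8-bit intermediate: truncate back to integer codes between the
# filters, as an 8-bit buffer forces you to.
mid8 = np.floor(src * 0.5)
out8 = mid8 * 2.0
print(len(np.unique(out8)))        # 128 -- half the levels are gone

# Higher-precision intermediate: quantise only once, at the end.
out_hp = np.round(src * 0.5 * 2.0)
print(len(np.unique(out_hp)))      # 256 -- nothing lost
```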

:)

Cheers,

Piotr

PS. I've now done with my Moldflow projects for some time, so would like to talk to you on Skype - please let me know when you have a while to spare...

farss wrote on 8/19/2010, 7:50 AM
ad 1. I think this has something to do with the Y'CbCr -> RGB conversion inside Vegas.
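
A rough sketch of where that would bite, assuming the standard BT.709 limited-range matrix and a rounded 8-bit RGB hand-off to the plugin (which is what NV's preview seems to report):

```python
import numpy as np

def ycbcr709_to_rgb8(y10, cb10, cr10):
    """BT.709 limited-range 10-bit Y'CbCr codes -> rounded 8-bit RGB."""
    y, cb, cr = y10 / 4.0, cb10 / 4.0, cr10 / 4.0   # 10-bit -> 8-bit scale
    r = 1.1644 * (y - 16) + 1.7927 * (cr - 128)
    g = 1.1644 * (y - 16) - 0.2133 * (cb - 128) - 0.5329 * (cr - 128)
    b = 1.1644 * (y - 16) + 2.1124 * (cb - 128)
    return np.clip(np.round([r, g, b]), 0, 255).astype(np.uint8)

# Four adjacent 10-bit luma codes, neutral chroma:
for y10 in (400, 401, 402, 403):
    print(y10, ycbcr709_to_rgb8(y10, 512, 512))
# Three of the four land on the same 8-bit grey: whatever precision
# the source had is gone as soon as the pipeline rounds to 8-bit RGB.
```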

Bob.

megabit wrote on 8/19/2010, 10:09 AM
OK - but do you think it would work with full 10 bits if given true 10-bit video? This is an academic question, I know - but still...

Also, do you think setting the project to 32-bit floating point could help avoid the banding?

Jøran Toresen wrote on 8/19/2010, 12:08 PM
Piotr, try the NeatVideo forum:

http://www.neatvideo.com/nvforum/

The author (Vlad) is very helpful.

Jøran

farss wrote on 8/19/2010, 2:27 PM
The question is moot, as Vegas doesn't have a 10-bit integer mode.
The only way to get more precision is to use 32-bit float, so the question to ask is whether NeatVideo works in Vegas's 32-bit float modes.
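
For what it's worth, a quick sanity check of why float is enough, assuming the chain really does run in float once the project is set to 32-bit: float32's 24-bit mantissa represents all 1024 10-bit levels exactly.

```python
import numpy as np

# 10-bit code values carried as normalised 32-bit floats, the way a
# float pipeline would hold them. The 24-bit mantissa of float32
# represents every one of the 1024 levels exactly, so nothing is
# forced onto an 8-bit grid until the final render.
codes10 = np.arange(1024, dtype=np.float32) / np.float32(1023.0)
recovered = np.round(codes10 * 1023.0).astype(np.int64)
print(np.array_equal(recovered, np.arange(1024)))   # True
```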

Bob.

megabit wrote on 8/20/2010, 2:18 AM
Well, I simply set my project to 32-bit and rendered out my test clip with NV... I *think* it does look better, but if there are differences, they are extremely subtle.

Will ask NeatVideo on their forum; thanks guys!
