8bit Dither

Streamworks Audio wrote on 18.11.2008 at 17:16
I was mucking about with some images the other day that I created for our website, and I decided to 'index' them, i.e. convert them to 8-bit (256 colors), and in the process of course there was some color banding in the gradients. I then tried again using a dithering algorithm (Floyd-Steinberg, in GIMP), and this (while not perfect) was a vast improvement over reducing the bit depth without it.
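For reference, the error-diffusion idea behind Floyd-Steinberg can be sketched in a few lines. This is a deliberately simplified 1-D illustration (the real algorithm diffuses each pixel's rounding error to four 2-D neighbours with weights 7/16, 3/16, 5/16, 1/16; here it all goes to the next pixel), and the function names are just for this sketch:

```python
# Simplified 1-D error diffusion (the core idea behind Floyd-Steinberg).
# Real Floyd-Steinberg spreads each pixel's rounding error over four 2-D
# neighbours (7/16, 3/16, 5/16, 1/16); here it all goes to the next pixel.

def quantize(value, levels):
    """Snap an 8-bit value (0-255) to the nearest of `levels` evenly spaced levels."""
    step = 255 / (levels - 1)
    return round(round(value / step) * step)

def dither_1d(pixels, levels):
    """Quantize a row of pixels, carrying each rounding error into the next pixel."""
    out, error = [], 0.0
    for p in pixels:
        target = p + error                      # compensate for the error so far
        q = quantize(min(max(target, 0), 255), levels)
        error = target - q                      # residual carried forward
        out.append(q)
    return out

ramp = list(range(0, 256, 4))                   # smooth 64-pixel gradient
banded = [quantize(p, 4) for p in ramp]         # plain rounding: 4 wide bands
dithered = dither_1d(ramp, 4)                   # same 4 levels, but they alternate

print("banded:  ", banded[:12])
print("dithered:", dithered[:12])
```

The banded version collapses into four flat runs, while the dithered one flickers between adjacent levels so the average brightness of each region is preserved - the same noise-for-banding trade the thread goes on to discuss for video.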

So what I am wondering is: when we export to MPEG-2 or AVC etc., which are 8-bit (well, MPEG-2 can go up to 10-bit), are we required to put our faith in the encoder to perform decent dithering on the 24-bit RGB source that Vegas spits out? Is there some sort of output dithering that can be applied in Vegas before it reaches the encoder?

96% of our source is screen captures, which are 24-bit or sometimes 32-bit (with the alpha channel) - so the concern for us is to have optimal reduction and dithering.

Curious...

Thanks,
Chris

Chris Hawkins
Managing Director
Streamworks Audio

Comments

TheHappyFriar wrote on 18.11.2008 at 18:24
The 8-bit conversion GIMP is doing isn't the same as 8-bit MPEG encoding. You did an 8-bit image in GIMP; that's 256 colors total. 8-bit video is 256 shades of red, 256 shades of green & 256 shades of blue. A LOT more headroom in there. It's impossible NOT to have banding with 8-bit anything with slow color changes. Odds are your images are 256 shades per RGB channel (3 channels × 8 bits = 24-bit; 32-bit = RGB + 8-bit alpha). They're already dithered.
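To put numbers on the distinction (a trivial sketch, just spelling out the arithmetic above):

```python
# 8 bits as a palette index vs. 8 bits per channel - the same "8-bit" label
# covers two wildly different colour counts.

indexed_colors = 2 ** 8               # GIMP "Index" mode: one 8-bit palette index per pixel
truecolor = (2 ** 8) ** 3             # 8 bits each of R, G, B: "24-bit"

print(indexed_colors)                 # 256
print(truecolor)                      # 16777216 (the "16.7 million colours")
```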
Streamworks Audio wrote on 18.11.2008 at 19:05
GIMP is 8 bits per channel... thus it was a 24-bit (per pixel) image, which allowed for 16.7 million colors. Reducing to 8 bits (per pixel) allows for 256 possible colors. So in conclusion, the GIMP image was downsampled from 16.7 million colors (24 bits per pixel) to 256 colors (8 bits per pixel) - and after doing this I started to wonder about 8-bit (per channel, YUV) MPEG-2....

Here is a good example of what I am talking about...

http://www.avsforum.com/avs-vb/showthread.php?t=1011359

Here the discussion is about different algorithms (mainly the use of MS's Xscaler) to reduce the bit depth. In the example, the images show a conversion from a 16-bit-per-channel TIFF (48-bit RGB) to 8 bits per channel (24-bit RGB).

So if we are working with graphics (screen captures)... okay, maybe not as high as 48-bit (which I am sure most animations are done at, if not more!) - but 32-bit, and Vegas runs at 24 bits (8 bits per channel), then we have to trust that the bit depth conversion is a good one, as I cannot see any settings for this. Vegas 8 does of course offer 32 bits per channel (a 128-bit image if you include the alpha channel!) - however, that could be a tad overkill, no? Not to mention the time needed to render it! And even if we did have 128 bits, we would still have to feed that back into, say, MPEG-2 or AVC, which are still only 24-bit (8 per channel... YUV).

Also how about newer cameras that are now recording at 10bit (per channel)? Can Vegas represent that? One would have to go up to 32bit per channel.

Just curious ;-)
GlennChan wrote on 18.11.2008 at 19:48
In some cases Vegas does do dither and in other cases it doesn't.

e.g. if you have a 32-bit project with 1.000 compositing gamma, try adding an 8-bit-only transition (most 3rd-party transitions will be 8-bit). Vegas will remove gamma correction, send that as *8-bit* to the filter, then add gamma correction back in. This usually results in unacceptable banding artifacts, even after dithering. You probably need 12+ bits going into and out of the plugin for this approach to look acceptable. Whereas if Vegas always sent 8-bit plugins gamma-corrected values, you wouldn't have this problem, and the plugin would still be able to do linear-light processing (which is what the SMLuminance plugin does). So due to questionable design, it's possible to get Vegas to introduce banding artifacts into your video.

It seems that in most other cases Vegas won't add dithering (e.g. rendering). But I've not tried every combination of everything.
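Glenn's linear-light round trip can be sketched numerically. This is an illustration only - it uses a plain 2.2 power curve rather than whatever transfer function Vegas actually applies, so the exact numbers are assumptions; the point is how few 8-bit linear codes are left to represent the shadows:

```python
# Why sending *linear-light* values through an 8-bit plugin bands the shadows:
# gamma-encoded 8-bit video spends its codes perceptually, but in linear light
# the same 8 bits bunch up in the highlights. (Plain 2.2 power curve assumed;
# this is not Vegas's actual pipeline, just the arithmetic of the problem.)

GAMMA = 2.2

def to_linear(v):                       # v in 0..1, gamma-encoded
    return v ** GAMMA

def to_gamma(v):
    return v ** (1.0 / GAMMA)

def roundtrip(code):
    """Gamma 8-bit code -> quantize in 8-bit linear -> back to gamma 8-bit."""
    linear8 = round(to_linear(code / 255.0) * 255) / 255.0   # the lossy step
    return round(to_gamma(linear8) * 255)

shadow_codes = range(64)                                     # darkest quarter
survivors = {roundtrip(c) for c in shadow_codes}
print(f"{len(shadow_codes)} shadow codes in -> {len(survivors)} distinct codes out")
```

Only a fraction of the 64 shadow codes survive the round trip; the rest collapse onto shared values, which is exactly where the banding shows up.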

2- Most Y'CbCr formats use a 16-235 range for Y' and 16-240 for Cb and Cr, and on top of that Y'CbCr has a much larger volume than R'G'B' (the bits are spread out much further apart). So you'll get some rounding error going between Y'CbCr and R'G'B'.
(There are different flavours of Y'CbCr, but we'll not get into that.)
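The rounding error in point 2 is easy to demonstrate on a grey ramp, where only Y' matters. This sketch uses the standard studio-range mapping Y' = 16 + 219·(code/255); it is illustrative arithmetic, not Vegas's actual conversion code:

```python
# Full-range 8-bit R'G'B' grey squeezed into studio-range Y' (16-235) and back:
# 256 input codes must share 220 output codes, so some greys collide.

def gray_to_y(c):                       # studio-range luma for a grey pixel
    return round(16 + 219 * c / 255)

def y_to_gray(y):
    return round((y - 16) * 255 / 219)

codes = range(256)
y_values = {gray_to_y(c) for c in codes}
shifted = sum(1 for c in codes if y_to_gray(gray_to_y(c)) != c)

print(f"{len(y_values)} distinct Y' codes for 256 greys; "
      f"{shifted} greys come back off by one")
```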

3- For video, you generally don't worry about banding artifacts too much since noise will effectively dither your image. And any video camera will add some noise into your image. You do have to watch out for computer generated stuff... e.g. any 8-bit media generator in Vegas, or other CG sources. It's possible that you'd be able to see banding artifacts. Make sure that it's not your monitor that is introducing banding artifacts (or noise from dithering).

If you're making images for the web, try a higher bit depth or use JPEGs to get rid of banding.
farss wrote on 18.11.2008 at 20:04
"Also how about newer cameras that are now recording at 10bit (per channel)?"

Nothing new about cameras recording 10-bit. Vegas 8.0 does appear to read the 10-bit values from at least Digital Betacam correctly into its 32-bit FP pipeline and output 10-bit values. Whether it does that for 10-bit HD I don't know. Unfortunately Vegas will not read any of the 10-bit log codecs.

Bob.
Streamworks Audio wrote on 18.11.2008 at 20:08
Hi Glenn,

Thanks for that! I am SURE my camera introduces noise (crappy DV cam... but that's only for home family stuff ;-)), and I am aware that noise is used intentionally to reduce banding artifacts.

As I mentioned, our products are all computer generated... often we will incorporate images (PNGs etc.) and then sometimes add Vegas's media-generator content on top. Our last video had a black-and-white gradient background during the chapter titles, and the banding was just too much.

Of course the content is created at 24-bit (RGB), and I am a little unsure how the encoders do it... sometimes it almost appears as if they are indexing the images like it's 8 bits total! Each encoder will produce a different result. I have not yet moved to 32-bit projects... still working in 8-bit projects. I am assuming 8-bit MPEG-2 (or AVC, whatever) would mean 8-bit Y, 8-bit U and 8-bit V?

Thanks,
Chris
GlennChan wrote on 18.11.2008 at 20:22
Vegas should be able to handle 10-bit in/out via SDI. I don't have HD-SDI hardware to test though.
Vegas doesn't seem to do 10-bit in/out through most file formats. Quicktime and AVI are limited to 8 bits. No DPX. Except for the 32-bit SonyYUV codec via AVI.

2- MPEG2 encoders may generate banding?
I know Apple's H.264 encoder does.

"I am assuming the 8bit MPEG 2 (or AVC whatever) would mean 8bit Y 8bit U and 8bit V?"
Yes. (But really, Y'CbCr is kind of the right term.)
farss wrote on 18.11.2008 at 20:43
I haven't tried this myself, but it would seem that one could simulate dithering by compositing in just the right amount of noise. Then again, perhaps the real trick is to add the noise at just the right part of the image, i.e. where the values make the transition from one value to the next. Maybe I'm looking at this too simplistically; proper dithering would be similar to the halftoning used in printing, I think.
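Bob's idea is essentially how dithering is done in audio: noise on the order of one quantisation step is added before rounding. A rough sketch (the triangular TPDF noise shape and the ±1-step amplitude are conventional choices borrowed from audio dithering, not anything Vegas exposes):

```python
# Add noise scaled to the quantisation step *before* quantising, so pixels
# near a band boundary flip between the two adjacent levels instead of all
# snapping the same way. Triangular (TPDF) noise of +/- one step is the
# classic audio-dither recipe; the amount here is an assumption, not a
# Vegas setting.

import random
random.seed(1)                                   # reproducible sketch

STEP = 64                                        # 4 coarse levels over 0-255

def quantize(v):
    return max(0, min(255, round(v / STEP) * STEP))

def tpdf_noise():                                # triangular noise in (-STEP, STEP)
    return (random.random() - random.random()) * STEP

ramp = [i * 255 / 1023 for i in range(1024)]     # smooth gradient
banded = [quantize(v) for v in ramp]
dithered = [quantize(v + tpdf_noise()) for v in ramp]

def runs(xs):                                    # count of flat segments
    return 1 + sum(1 for a, b in zip(xs, xs[1:]) if a != b)

print(f"banded: {runs(banded)} flat runs; dithered: {runs(dithered)} runs")
print(f"mean error: banded {sum(banded)/1024 - sum(ramp)/1024:+.2f}, "
      f"dithered {sum(dithered)/1024 - sum(ramp)/1024:+.2f}")
```

The trade-off Chris ran into is visible here too: the dithered output preserves the average level of the gradient, but only by carrying noise everywhere, and at coarse step sizes that noise can be objectionable in its own right.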

Bob.
Streamworks Audio wrote on 18.11.2008 at 22:32
I have used noise before - but like you say, it has to be the right amount to dither... and sometimes that is too much noise :-(
Streamworks Audio wrote on 19.11.2008 at 00:49
Here is a quick test I did: a simple black-to-white gradient generated by Vegas (project is 8-bit, 24-bit RGB).

I used Apple's MPEG-4 codec for the test as I know it is bad with banding.

Here are the images...
http://streamworksaudio.com/dither.png

The QuickTime player on the right shows the gradient rendered to MPEG-4 (unlimited bit rate). In the player on the left I have applied some noise to the generated clip (clip level)... you can see how the noise does indeed help with the lack of bits in Apple's encoder, which (without the noise) leads to banding.

So I guess it comes down to the encoder that is used, and how well it deals with banding.

Just my 2 cents ;-)
TheHappyFriar wrote on 19.11.2008 at 01:59
I'm still betting you're not using 16-bit-per-channel images. 24/32-bit images are 8-bit color space; you'd need at least 48-bit images. 16.7 million colors have been around since the mid-'90s - my SVGA with a Trident 2D card supported 16.7 million colors back then.
Streamworks Audio wrote on 19.11.2008 at 03:39
Like I said, I used the Media Generator to create the gradients, and as Glenn already pointed out, these generators are 8 bits per channel.

I am not working with external images on this (I only referred to GIMP as work I was doing when I got to thinking about this topic). I know that GIMP is limited to 8 bits per channel... and as I do not use Photoshop or have any other graphics application that can do 16-bit-per-channel images, of course I am not using 16-bit-per-channel images. Again, these are 8-bit-per-channel images generated by Vegas's Media Generator.

Maybe I should get CinePaint and try to make a 16-bit-per-channel gradient image and import it into Vegas... but to use all that data the Vegas project would have to be 32 bits per channel. Otherwise it would have to downsample the image to 8 bits per channel, and that would not be beneficial.

Streamworks Audio wrote on 22.04.2009 at 08:15
Digging this one back up from the dead....

Question... I have now moved to a Sony AVCHD cam. It can record using the x.v.Color gamut. When doing so, are these files created with a 10-bit AVC codec or not? When imported into Vegas, in order to see the full x.v.Color gamut, do I need to have the project set to 32-bit FP mode?

Cheers,
Chris
Christian de Godzinsky wrote on 22.04.2009 at 08:25
Hi,

Count me in - I would like to know exactly the same as Chris above!!

How is x.v.Color handled from an SR12 imported to the Vegas timeline and rendered out to different HD formats? Or are the two lower bits just discarded? I have not found any definitive info about this...

Christian
