Here are some more questions about luminance levels when encoding. I have been looking at some Blu-ray source video, as well as a lot of DVDs, when I load them in AviSynth - and the luminance levels for all of the sources are 0-255, not 16-235. What I am wondering, just for the sake of knowing: are these videos encoded at those levels, and does the player then adjust the luminance on playback (i.e. move it into the legal 16-235 range)?
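For reference, here is roughly how I am checking the levels - a minimal sketch, assuming a .d2v project made with DGIndex (the filenames are just placeholders):

LoadPlugin("DGDecode.dll")
MPEG2Source("movie.d2v")   # decodes to YV12 as stored, no level conversion
Histogram("levels")        # luma outside 16-235 shows up in the shaded ends of the Y histogram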
I am moving some of my DVDs to my Apple TV, and I found that a lot of the encodes I did were very dark. Upon further inspection of my workflow, I found that DGDecode and ffdshow (or even the Windows 7 built-in AVC decoder) would output 0-255, which is not a problem for computer playback. The problem is that the Apple TV applies its own correction to the video - so if 0 was already black, the Apple TV would play that at -16. I confirmed this by encoding a SMPTE colorbar test (generated by AviSynth). I should add that if the Apple TV is set to RGB High, the values seem to be 'shifted', whereas RGB Low seems to clip the black levels.
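In case anyone wants to reproduce this, the test clip was along these lines - a sketch; ColorBars outputs RGB bars, and AviSynth's default Rec.601 RGB-to-YUV conversion puts them at legal 16-235 luma, so any shift the player applies is easy to spot:

ColorBars(720, 480)        # SMPTE bars, RGB32
ConvertToYV12()            # Rec.601 conversion scales RGB 0-255 into Y 16-235
Trim(0, 299)               # keep about 10 seconds at 29.97 fps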
So to adjust, I have had to correct the luminance before encoding... but the histogram shows color banding when I do (although it is hard to see by eye), which leads me to think that the source files are in fact encoded at 0-255.
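The correction itself is just a straight range compression - a sketch, assuming the decoder really is handing back full-range luma:

ColorYUV(levels="PC->TV")  # compresses luma 0-255 into 16-235 (and chroma into 16-240)

# or, with explicit endpoints:
# Levels(0, 1.0, 255, 16, 235, coring=false)

Squeezing 256 steps into 220 in 8 bits necessarily leaves periodic gaps, so some combing in the histogram is expected after any remap like this.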
I would like to know more about this, as for the most part with home movies and such I have set the levels in Vegas to 16-235 before encoding... if players are adjusting the levels on playback, then that would make my sources too bright.
Curious...
Cheers!
Chris