How to show Studio RGB levels in VLC Player?

S35 wrote on 2/7/2015, 6:44 PM
Hello all,

I had a very interesting read in the recent thread:

http://www.sonycreativesoftware.com/forums/ShowMessage.asp?MessageID=917779&Replies=32

This addresses the same issue I've always experienced but never understood. In Vegas 13, my video looks a certain way in the preview window, but plays back with more contrast in VLC player (an HD MPEG-2 file). However, I noticed that under "Preview Device Preferences" in Vegas, one can select "Adjust levels from Studio RGB to Computer RGB", and then, when entering fullscreen playback in Vegas, the same contrasty video appears. (Interestingly, VLC will also display these RGB levels for DNxHD, whereas Windows Media Player will display Studio RGB levels for DNxHD [the same as Vegas' preview window].)

So is there a way to get VLC player to play back MPEG-2 video at Studio RGB levels so it looks the same as Vegas' preview window?

Thanks very much for any advice!

~Adam

Comments

musicvid10 wrote on 2/7/2015, 9:27 PM
Yes.
Edit in Vegas.
Prior to rendering, use a Computer->Studio RGB Filter on the output.
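
In case it helps to see the arithmetic, that filter is just a linear remap of 0-255 into 16-235. A rough Python sketch of the math (not anything Vegas literally runs internally):

    def computer_to_studio(v):
        """Remap a full-range value (0-255) into studio range (16-235)."""
        return round(16 + v * (235 - 16) / 255)

    # Full-range black and white land on studio black and white:
    print(computer_to_studio(0), computer_to_studio(255))  # 16 235
    print(computer_to_studio(128))                         # 126 (mid grey)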

wwaag wrote on 2/8/2015, 2:00 PM
Rather than VLC, try MPC-HC (Media Player Classic - Home Cinema). Using shaders (under playback options), you can change levels during playback (Computer to Studio RGB or vice versa). For example, I usually render to an intermediate at full scale (0-255). If I want to see how this will look on my TV, I apply the 0-255 to 16-235 shader and "play" the file to my TV over HDMI. Conversely, you can apply the 16-235 to 0-255 shader for files already rendered to Studio RGB for playback on your computer monitor. I find these conversion filters very convenient, with no need for a separate render to whatever levels you want. There are also other shader options which can be quite useful.
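
The 16-235 to 0-255 shader is essentially the inverse remap with the result clamped; roughly this, in Python terms (a sketch of the arithmetic, not the actual HLSL shader MPC-HC ships):

    def studio_to_computer(v):
        """Expand studio range (16-235) to full range (0-255), clamping anything outside."""
        out = round((v - 16) * 255 / (235 - 16))
        return max(0, min(255, out))

    print(studio_to_computer(16), studio_to_computer(235))  # 0 255
    print(studio_to_computer(8))    # 0   -- sub-black is clipped
    print(studio_to_computer(245))  # 255 -- super-white is clipped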

videoITguy wrote on 2/8/2015, 2:34 PM
While VLC is an incredible open-source player, it does have severe limitations. The nice thing about the alternatives is that they are really patterned after the broad distribution of TV hardware; that is, you get to modify the viewing environment.

Again, some will say this is the 'Never the same color twice' (aka NTSC) problem, promulgated since the early days of broadcasting and still continuing through the huge customized settings menus in newer LCD screens. Yet I think this is really the way to go: let viewers adjust and customize to their desired experience. Hence, whether it's a software player or the newest Blu-ray hardware, the issues really are universal for the viewer of the video.

S35 wrote on 2/8/2015, 6:38 PM
Thank you so much everyone for the helpful responses!! Thanks wwaag for the info about Media Player Classic-Home Cinema... I will give it a try.

I just did a bunch of testing, and here is what I discovered:

1. When I render a 1080p MainConcept MPEG-2 or AVC file direct from Vegas, the file will play back with Computer RGB levels (bloomy, clipped whites) in WMP and VLC, which looks EQUALLY ugly on both my HDTV over HDMI and my old CRT computer monitor.

2. However, with Vegas set to 32-Bit (Full Range), Compositing Gamma set to 2.222 (video), and Display Transform set to OFF, Vegas will render these files correctly, and they will play back correctly in both WMP and VLC player!! The key, though, is to set it to 32-Bit FULL RANGE -- if you just set it to 32-Bit Video Levels, it will render with the same nasty clipped whites.

musicvid10 wrote on 2/8/2015, 6:44 PM
Using the Computer RGB -> Studio RGB levels plugin on the output will do the same thing.

What you are seeing is a collateral effect of 32-bit float projects, but your source and output are both 8 bit, so there is no advantage, although your render times "may" be longer.

S35 wrote on 2/9/2015, 7:14 AM
Thanks musicvid10, that's a good point. Of course, the disadvantage of using the Levels (or Color Corrector) plugin to convert to Studio RGB before render is you're doubling the effect, so in a media player that happens to already display the correct values, the video will appear more washed out than intended.
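
To put rough numbers on that doubling (a toy Python sketch, assuming the usual 0-255 to 16-235 remap): footage that is already at studio levels, contracted a second time, drifts even further from the rails, which is exactly the washed-out look.

    def contract(v):
        """Computer RGB (0-255) to Studio RGB (16-235)."""
        return round(16 + v * 219 / 255)

    # One contraction: full-range black/white become studio black/white.
    print(contract(0), contract(255))    # 16 235
    # Contracting material that is already at studio levels doubles the effect:
    print(contract(16), contract(235))   # 30 218 -- visibly washed out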

As a side note, a file rendered at 8 bit to Lagarith RGB AVI will play correctly in both WMP and VLC, and converting that to an H.264 mp4 via Freemake Video Converter will also produce a file with correct levels.

musicvid10 wrote on 2/9/2015, 8:47 AM
Not exactly.
Lagarith RGB is an RGB codec, and is always decoded at RGB levels, per the original standard. So you're mixing apples and oranges by wanting to compare to mp4/h264.

Players don't "happen" to decode fullrange YUV "correctly." Either the levels are correct, or it plays back clipped, or there is a pesky flag that tells a "few" players the levels are incorrect. Your converter is probably doing the latter, not filtering.

The fullrange VUI flag (and the older yuvj420) is an obscure, optional switch that was first resurrected by Apple for decoding iPhone video (100% of which is shot at incorrect levels) in its QuickTime player. Flag support is not a requirement for either encoding or decoding (Vegas doesn't use it), player support is still very spotty and unpredictable (I tested it in ffmpeg, Flash, and QuickTime), and therein lies the REAL danger of "doubling" up a correction (see the link at the end). I see this all the time with 2nd- and 3rd-generation renders that folks ask me if I can "fix." Of course, by then the damage is done; one simply can't put the toothpaste back in the tube.
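
For what it's worth, you can check whether a file carries that flag with ffprobe, which reports the stream's color_range as tv, pc, or unknown; a small Python wrapper, with a made-up file name, might look like this (assumes ffprobe is on your PATH):

    import subprocess

    def get_color_range(path):
        """Ask ffprobe for the video stream's color_range flag ('tv', 'pc', or 'unknown')."""
        out = subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "v:0",
             "-show_entries", "stream=color_range",
             "-of", "default=noprint_wrappers=1:nokey=1", path],
            capture_output=True, text=True, check=True)
        return out.stdout.strip() or "unknown"

    print(get_color_range("render.mp4"))  # hypothetical file; 'pc' means flagged full range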

Every time some genius comes up with another kludge to mitigate bad levels in software or hardware, the opportunity (and incentive) to misuse it is raised by another power of two. Due largely to that insidious flag and the ridiculous "dynamic contrast" controls on video cards, I have abandoned updating the tutorial that has been watched by over 30,000 people, yet I still teach the principles of correct leveling just the same as five years ago when it was first published.

The ONLY bulletproof, deliverable method of producing video for consumers is 16-235 flagged BT 709. We supply the levels, the renderer supplies the flags and compression.
Much less common is 0-255 wrapped RGB (WMV is the main one here, along with a few illegitimate stepchildren.)

The biggest problem with multiple layers of levels contraction/expansion is that every generation loses bit depth, and it is quite possible to end up with a 4-bits-per-channel video image by just the third generation of redundant error, reducing the effective output to only a few thousand colors!
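
That erosion is easy to see in a toy example: every contract/expand round trip on 8-bit values collapses distinct levels that rounding can never bring back. A quick Python illustration (it won't reproduce the exact 4-bit figure, but the direction of the loss is the point):

    def contract(v):   # 0-255 -> 16-235
        return round(16 + v * 219 / 255)

    def expand(v):     # 16-235 -> 0-255, clamped
        return max(0, min(255, round((v - 16) * 255 / 219)))

    levels = set(range(256))   # a full 8-bit gradient: 256 distinct values
    for generation in range(1, 4):
        levels = {expand(contract(v)) for v in levels}
        print(f"after generation {generation}: {len(levels)} distinct levels")
    # Each round trip throws levels away; none of them ever come back.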

That's why it is essential for anyone in a production setting to get the levels right the first time, and not count on a bunch of unknown, possibly destructive potions to get it right on the road to home playback.

But with so much unmitigated fullrange YUV being delivered, flaky flag support, the dynamic-contrast "toy," and god only knows what else, the chances of anyone actually getting it right are pretty slim these days. I've felt the futility of trying to explain it again just these past few weeks, getting stymied at every turn by the hobbyist/gamer mentality, which essentially says, "Better video through alchemy."

You, at least, seem to have a somewhat clearer head about this, and I thank you for taking the time to listen.


http://www.sonycreativesoftware.com/forums/ShowMessage.asp?ForumID=4&MessageID=918017

S35 wrote on 2/9/2015, 2:49 PM
Thanks very much musicvid10 for taking the time to share that helpful info! You actually saved me from making a potentially big mistake in rendering for DVD output.

So, in your opinion, did I evaluate scenario 1 and 2 below correctly:

My project is made up of mostly Cineform AVI files, but also some pre-rendered DNxHD 444 .mov and Lagarith RGB .avi effects shots. The effects shots, which contain extremely bright whites, look "correct" when rendered to MPEG-2 or AVC from Vegas @ 32 bit Full Range; however, the rest of the Cineform files will still display a contrast bump in media players when rendered @ 32 bit Full Range, so there's really no advantage as I had previously thought.

1. So basically, if I'm rendering for output to DVD, I can just render straight into MPEG 2 @ 8 bit or 32 bit (Video Levels), forget about how VLC player shows it, and know it will look right once put on DVD.


2. However, for rendering for Internet delivery, the mp4 file would look clipped and contrasty in media players if I render the project @ 8 bit, and if I render @ 32 bit Full Range, only the effects shots will look right. But if I render first to Lagarith RGB (to get both the Cineform files and the effects shots into the same format) and then render the Lagarith file into an mp4 file @ 32 bit Full Range (or just use Freemake to make the mp4), I'll get the correct look for direct playback from software media players on the internet.

EDIT: THE FOLLOWING INFO IS NOW OBSOLETE. PLEASE SEE MY NEXT POST.

(BTW, after 7 hours of testing, I discovered I need to set my DVD MPEG-2 render settings to High Profile, 4:2:2, AND check 2-pass. If you only use some of those settings, such as 4:2:0 with 2-pass, or 4:2:2 with 1-pass, or just leave everything at default, you may experience SEVERE blocky dropouts when rendering killer shots that contain huge moving soft gradients: in my case, blurry moving smoke, and a glowing angel with moving soft light rays, disappearing in a flash of white light that eclipses the entire frame before the image returns to nothing but the blurry moving smoke.

Also note that in the System tab, you should set the Video Buffer Size to 232 KB; at some point between Vegas 8 and Vegas 13 that setting was changed to 23 (a digit went missing!). In addition, the "Align sequence headers to packets" checkbox is left unchecked in Vegas 13, even though the manual still recommends having it checked.)

Thanks again for helping me to understand things a bit better... I'm sure you'll be able to tell I still don't quite get it all... :-)



musicvid10 wrote on 2/9/2015, 3:31 PM
We don't know anything about the levels contained in your various intermediates.
However, I do know a bit about the codecs, and neither Cineform nor DNX444 will ever show a shift in a 32 bit float project, for the simple reason that they are 10 bit codecs. The Cineform was likely leveled for you when you created it.

A DVD is 8 bit, 4:2:0, so your project, edit environment, and mpeg-2 render should maintain that pipeline from door to door. The result of not doing so is unpredictability. Use the scopes, levels filter as needed, the DVD Architect templates, and resist the inclination to overthink. Best

S35 wrote on 2/9/2015, 4:01 PM
You are correct. After more testing, I came to the shocking discovery that "2-pass" rendering was actually the simple fix for the issue I previously mentioned. Why this was not immediately apparent is that VLC player would produce horrible dropouts on EVERY MPEG-2 clip, even the ones that had rendered without ACTUAL dropouts. However, testing the rendered clips again in Vegas showed the issue had been solved with 2-pass MPEG-2 rendering. Time to get a new media player!! :-)

EDIT: I have now switched to Media Player Classic Home Cinema, and it doesn't glitch on mpeg 2.

Thanks for the extra info. I was under the assumption that DNxHD 444, though a 10 bit codec, was stuck in an 8 bit wrapper (.mov) in its free VFW version...

musicvid10 wrote on 2/9/2015, 5:47 PM
MOV is not VFW. And despite its extension, it does not decode with qt32lib.
In addition to 2-pass, set min bitrate at 2,000,000, not 192,000.
You will be pleased.
S35 wrote on 2/16/2015, 6:59 AM
Thanks! I'm making the final render today, so I'll try that.

~Adam