WYSIWYG Preview vs Rendered File - Possible Fix?

ZigoH wrote on 6/5/2020, 3:26 PM

Hi, I noticed some time ago that my rendered videos didn't look exactly like the preview monitor; they looked more contrasty. I was aware that Vegas displays clips in a different levels range than other NLEs like Premiere, but I didn't remember Vegas changing levels on rendered files. So I went to my old laptop to verify this, and indeed the "problem" was present in previous versions of Vegas, and it wasn't a fault in my system either, because that was a different computer. I've also read here that it can be fixed with a levels conversion on the output, which actually works, but for me it's a pain in the a**. I usually color correct every single clip individually in my projects, so it's important to have a preview window that shows more accurately how the video will actually look. After experimenting, I've come up with a recipe that seems to work... It would be great if anyone could confirm that this works on their system.
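For reference, the levels mismatch being described comes down to simple per-channel math: studio/limited range video places black at code 16 and white at code 235, while a computer display expects 0 and 255. A minimal sketch of that studio-to-full expansion (the assumption here is standard 8-bit Rec.709 ranges; the function name is illustrative, not a Vegas API):

```python
import numpy as np

def studio_to_full(img):
    """Expand 8-bit studio/limited range [16, 235] to full range [0, 255].

    Values outside the studio range are clipped. A "Studio RGB to
    Computer RGB" levels preset applies this same mapping per channel.
    """
    x = img.astype(np.float64)
    y = (x - 16.0) * 255.0 / 219.0      # 219 = 235 - 16 usable codes
    return np.clip(np.round(y), 0, 255).astype(np.uint8)

# Studio black (16) maps to 0, studio white (235) maps to 255:
frame = np.array([16, 126, 235], dtype=np.uint8)
print(studio_to_full(frame))
```

Skipping (or doubling up) this conversion somewhere between preview and render is exactly what makes footage look washed out or, as described above, more contrasty than expected.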

I know 32-bit takes longer to process, but for me it's worth it if I can work with a preview that looks like the output will.

Please confirm!


PS. Just to clarify... this is true for 8-bit footage. I don't have 10-bit or HDR clips to test.


Comments

Musicvid wrote on 6/5/2020, 7:13 PM

It doesn't actually work; it's an anomaly in the transformation matrix, and far worse damage, including shadow noise, is introduced by quantizing back to 8-bit integer. Here is more information than you'll ever need on the topic, in the first post of this thread.
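The quantization damage mentioned here is easy to demonstrate with the levels math alone. A hypothetical round trip (not Vegas's actual internal pipeline): squeeze full-range 8-bit codes into studio range, then expand back, and count how many of the original 256 shades survive.

```python
import numpy as np

codes = np.arange(256, dtype=np.float64)

# Compress full range [0, 255] into studio range [16, 235] ...
studio = np.round(codes * 219.0 / 255.0) + 16.0
# ... then expand back to full range, clipping and re-rounding.
restored = np.clip(np.round((studio - 16.0) * 255.0 / 219.0), 0, 255)

# Only 220 of the original 256 code values survive the round trip,
# so neighboring shades collapse together (banding / posterization).
print(len(np.unique(restored)))
```

In 8-bit integer there are only 256 codes to begin with, so every extra levels round trip permanently merges some of them; doing the work in float and quantizing once at the end avoids part of that, which is why where the conversion happens matters as much as whether it happens.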

https://www.vegascreativesoftware.info/us/forum/plea-to-new-users-please-do-not-set-32-bit-pixel-format--114750/

A levels filter is the right way to go -- this new tool using Marco's presets may interest you.

https://www.vegascreativesoftware.info/us/forum/new-free-video-levels-tool-for-ms-platinum--120709/

For 98% of all users, I call pixel format the Hogwarts Filter, because of its mystical internet powers -- for almost everything. Said to cure HPV warts, too.

Marco. wrote on 6/5/2020, 7:20 PM

Unless the source is log/ACES-driven, or there is a real need to preserve 10-bit+ data from input to output, a floating-point full-level project is the very last thing I'd recommend for anyone who isn't a color management expert.

Proper use of limited and full range levels in 8-bit projects is usually all it takes.
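In the normalized 0-1 terms most levels plugins use, the full-to-studio squeeze Marco is pointing at is just two output parameters: output low 16/255 and output high 235/255. A sketch of what such a filter computes (generic levels math, not Vegas's exact UI or parameter names):

```python
# Normalized levels parameters for a full-to-studio squeeze
# (illustrative values; check your plugin's own presets).
OUTPUT_LOW = 16 / 255    # ~0.0627, where black lands
OUTPUT_HIGH = 235 / 255  # ~0.9216, where white lands

def apply_levels(v, lo=OUTPUT_LOW, hi=OUTPUT_HIGH):
    """Map a normalized 0-1 pixel value into the [lo, hi] output range."""
    return lo + v * (hi - lo)

print(apply_levels(0.0))  # black -> ~0.0627
print(apply_levels(1.0))  # white -> ~0.9216
```

Swapping the two parameter sets (low 0/high 1 on input, 16/255 and 235/255 on output, or the inverse) is what the common "Computer RGB to Studio RGB" and "Studio RGB to Computer RGB" presets amount to.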