Vegas "preview auto" levels shift

johnmeyer wrote on 3/17/2014, 4:25 PM
I just spent three hours of my life figuring this out, so I thought I'd share it. Perhaps this has been discussed before, but if so, I can't remember it.

The problem is that the Vegas preview window changes video levels, and does so differently on different tracks. This happens only in certain preview modes, not in others.

I have absolutely no idea why this is happening, but I now know how to avoid the problem. I did this in Vegas 10, but I have confirmed the problem (and the solution) in other versions (although not in Vegas 12, because I don't own that).

Here's the "reduced case" that lets me create the problem:

1. I put a short clip of 60p 1920x1080 video from my Sony CX-700 camera on the timeline in a new project and then rendered it to AVI using the "uncompressed" template (the problem shows up with any codec, but this eliminated codec issues from the test).

2. I put this AVI on a second track, directly above the original video.

3. I opened the Videoscopes with Studio RGB checked, the other box unchecked, and the "Waveform" display selected.

4. I then muted and then un-muted the top track while watching the scopes.

When I did this, the entire waveform shifted up by about 3-4 levels when I viewed the uncompressed copy of the original. This shift should not happen.
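If you want to confirm that the two files really do carry the same levels (i.e., that the shift lives in Vegas's preview pipeline and not in the media), a quick check along these lines works. This is just a rough sketch, not part of my original test; it assumes ffmpeg is on your PATH, both clips are 1920x1080, and the file names are placeholders:

```python
# Compare mean luma of the first frame of two clips; a level shift
# in the media itself would show up as a constant offset in the means.
# Assumes ffmpeg is installed; file names below are placeholders.
import subprocess
import numpy as np

W, H = 1920, 1080

def first_frame_luma(path):
    """Decode frame 0 to 8-bit grayscale raw bytes via ffmpeg."""
    raw = subprocess.run(
        ["ffmpeg", "-v", "error", "-i", path, "-frames:v", "1",
         "-f", "rawvideo", "-pix_fmt", "gray", "-"],
        capture_output=True, check=True).stdout
    return np.frombuffer(raw, dtype=np.uint8)[: W * H].reshape(H, W)

orig = first_frame_luma("original.mts")        # placeholder
copy = first_frame_luma("uncompressed.avi")    # placeholder

print("mean luma, original:", orig.mean())
print("mean luma, copy:    ", copy.mean())
print("offset:             ", copy.mean() - orig.mean())
```

If the offset here is near zero while the scopes show a shift, the shift is being introduced by the preview pipeline.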

What I describe above is my final, "reduced" test case. The original workflow was far more complicated and involved frameserving and denoising. I couldn't understand why I was getting this level shift in that workflow, and it makes even less sense in this simple, reduced case.

There was nothing different about the tracks. I even duplicated the original track, deleted the original video, and placed the uncompressed version on the duplicate. The level shift persisted.

I was doing this in Vegas 10, but when I tried it in Vegas 8, I had no problem.

Hmmm .....

Well, to make a long story short, I finally figured out why Vegas 8 gave me no level shift (which is what should happen) and Vegas 10 did shift the levels:

It was my preview settings.

When Vegas (any version I've tried) is set to "Preview Auto," you will get the level shift on uncompressed video. It also happens with video rendered using the DV AVI codec (which also involves down-rezzing to 720x480), Cineform, HuffYUV, etc.

If you change the setting to Preview Full, the problem goes away.

I tested all the other preview settings: at least one setting in each of the Draft and Preview groups shows this same problem, while none of the Good or Best settings do (full results in a later post below).

I have no idea why this happens, but I think it probably explains why so many people reported different results in those long, tortured threads exploring why we saw level shifts when comparing uploaded material to the original. Don't misunderstand what I'm saying: in those threads there were most definitely other reasons for the level shifts. However, figuring out those shifts sometimes involved putting results back on the timeline and then muting and un-muting tracks while watching the Vegas waveform scope. In those cases, this preview-setting anomaly will definitely cause a problem.

For those who are interested (Sony??), here is my test case, created in Vegas 10:

Levels Test Case

It contains the veg file, the original 1920x1080 mts file from my camera (a short clip resulting from accidentally pressing the record button), and an AVC file rendered from Vegas. Don't worry about the codec used (AVC); it makes no difference to the test. Open the veg file and set the preview mode to Preview-Auto. Watch the Waveform monitor in the scopes. As you switch between the two files, you will see the levels shift up and down; this is easiest to see at the bottom right of the levels display. Switch to Best-Full, repeat the test, and notice that there is no shift in levels (although there is a slight spread because of changes made by the codec).

I hope this helps someone. Perhaps someone from Sony can explain why this is going on.

P.S. Here's the original, and here's what the scope looks like with the level shift:

[two waveform screenshots omitted]

Comments

larry-peter wrote on 3/17/2014, 5:53 PM
I don't recall it being discussed before, and you may have hit upon the reason some have difficulties with level shifts.

I don't consider it to be a problem, personally. The reason for the lower preview resolutions (half and quarter) is to let Vegas process only a portion of the picture information and give faster timeline playback. If the meters saw a full-quality image in a lower preview mode, you'd be forcing Vegas to process the entire frame and losing the speed you gained with the half-resolution preview.

Some plugins, like Convolution Kernel and Sharpen, will show levels above 235 and below 16 in lower preview modes even if you have a Levels plugin applied correctly for a Best/Full preview. When you think about what's happening, it makes sense. If, as I assume, Vegas is feeding the plugin a half- or quarter-resolution image, it's processing big pixel chunks and the meters are showing the result of that.
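To see how big the overshoot can be, here's a toy model: just a standard sharpen kernel applied to a perfectly legal 16-235 edge. This is my own illustration, not Vegas's actual plugin code:

```python
# A sharpening convolution on a legal (16-235) edge overshoots the
# studio range badly before any clipping; toy model, not Vegas code.
import numpy as np
from scipy.signal import convolve2d

# A hard vertical edge from 16 to 235, all values legal.
img = np.full((8, 16), 16.0)
img[:, 8:] = 235.0

# Classic 3x3 sharpen kernel (identity plus Laplacian).
sharpen = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)

out = convolve2d(img, sharpen, mode="same", boundary="symm")
print("min:", out.min(), "max:", out.max())
# min: -203.0  max: 454.0 -- far outside 16-235 until clipped
```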

I only trust anything at best/full.

NormanPCN wrote on 3/17/2014, 6:21 PM
[I]If, as I assume, Vegas is feeding the plugin a half or quarter resolution image, it's processing big pixel chunks and the meters are showing the result of that.[/I]

Very true. Vegas playback is full, half, or quarter resolution, and all processing is done at that resolution. If set to Auto, what you get depends on the size of your preview window. Vegas always tells you what it is processing in the Preview window status display: project size, current preview render size, and preview window size. Half-size playback has 1/4 the pixels of full, so Vegas is interpolating down from full to half/quarter.
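Here's a toy model of why interpolating down can move what the scope reads. I don't know which filter Vegas actually uses; a simple 2x2 box average is enough to show the effect:

```python
# A 2x2 box-average downscale lifts one-pixel-wide dark detail toward
# its brighter neighbors; a stand-in for whatever filter Vegas uses.
import numpy as np

# Flat gray frame at 128 with a one-pixel-wide dark line at 16.
full = np.full((8, 8), 128.0)
full[:, 3] = 16.0

# Half resolution: average each 2x2 block of pixels.
half = full.reshape(4, 2, 4, 2).mean(axis=(1, 3))

print("full-res minimum:", full.min())   # 16.0
print("half-res minimum:", half.min())   # 72.0 -- the floor rose
```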

All that said, I downloaded the file and did not see the jump shown in the supplied pictures. I tried Half and Full, Preview and Best. My bottom dark line did change between slightly thinner and thicker.

VP12 build 770.
johnmeyer wrote on 3/17/2014, 6:26 PM
Your results with V12 are good news: it sounds like the problem may be fixed in the current Vegas version.

Finally, perhaps, a reason to upgrade ...
larry-peter wrote on 3/17/2014, 6:54 PM
I'm interested in why levels displayed on the waveform meter are important in any mode other than Best/Full. In any other mode, you are not getting a reading that reflects the levels of your actual footage. If you have chosen Half or Quarter (or Auto), you're only looking at a fraction of the pixels in the image. In Draft or Preview, your deinterlacing settings aren't applied, and that could also cause a difference.

And as far as I know, what Vegas is doing to the image in Good mode is a mystery. What's happening after the codec releases it to Vegas that makes it easier to preview even in Good/Full? Could it be handled differently for different codecs? Best/Full is the only way I see for a degree of certainty in making QC decisions.

Am I missing something?
johnmeyer wrote on 3/17/2014, 7:12 PM
I completely agree that Best is, well, the best way to get full fidelity on the video. However, I always thought the various settings below Best simply degraded the image spatially, throwing away half the lines of video and using other "easy" tricks to reduce how much information must be pumped to the display. I never suspected that any of this would introduce a level shift, so it never occurred to me to switch the preview back.

Also, if you experiment with all sixteen preview modes, you will find that the level shift is not entirely predictable from one mode to the next. For the record, here is what I found (with Vegas 10):

Preview Mode      Levels Shift?
===============================
Draft-Auto        Yes
Draft-Full        Yes
Draft-Half        Yes
Draft-Quarter     No (go figure ...)
Preview-Auto      Yes
Preview-Full      No
Preview-Half      Yes
Preview-Quarter   Yes
Good-Auto         No
Good-Full         No
Good-Half         No
Good-Quarter      No
Best-Auto         No
Best-Full         No
Best-Half         No
Best-Quarter      No

As to why I would "want" to preview at a lower resolution, the answer is simple: we all use lower resolution when we want smoother previews that play at full frame rate. It is visually obvious how the image has been degraded when you do this, and that's the price we pay for higher-fps playback. However, speaking for myself, I never thought I would have to return to Best-Full to get a reasonably accurate waveform display, and I certainly didn't expect a level shift. The shift on other clips was actually larger than on this test clip, but I wanted to use the shortest clip possible to make the test package easy to upload and download.

musicvid10 wrote on 3/17/2014, 7:25 PM
I first discovered this when I was developing my grayscale image, mentioned here:
http://www.sonycreativesoftware.com/forums/ShowMessage.asp?ForumID=4&MessageID=776044

It's a byproduct of rasterization.
Not used anything but Best/Full when scoping since, except when the mind slips . . .
;?)
johnmeyer wrote on 3/17/2014, 7:31 PM
[I]I first discovered this when I was developing my grayscale image[/I] If anyone would have known about this, it would be you. I guess I should have gone back and read all your posts ...
musicvid10 wrote on 3/17/2014, 7:38 PM
Hehe, most of my posts are wrong, full of hot air, or merely ridiculous. Wouldn't expect anyone to dig the remaining shards of truth out of them . . .

My discovery while testing the grayscale was one of those, "Wait a minute. A few seconds ago, it said 128" moments.
;?)
johnmeyer wrote on 3/17/2014, 9:14 PM
[I]My discovery while testing the grayscale was one of those, "Wait a minute. A few seconds ago, it said 128" moments.[/I] That is precisely what I felt.
larry-peter wrote on 3/17/2014, 10:34 PM
When I first discovered a difference in levels between preview modes, it was with the Convolution Kernel plugin I mentioned earlier. I had done some color correction, added the convolution plugin in Best/Full, and then a Color Curves plugin to tame the edge enhancement that took my levels outside the 16-235 range. When I went into Preview mode, my levels were out of range again.

My thinking was that in Preview/Half I was feeding the plugin a lower-resolution image, causing a wider edge effect than at full resolution. What I'm getting at is that because plugins respond differently at different resolutions, I concluded that levels were never going to be accurate in the lower preview modes, so I never looked back.

The results you got from your test don't seem to follow my logic, but even if the level shift you experienced were fixed, I'm not sure Vegas could maintain accurate metering in anything other than Best/Full when plugins are in the chain.
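To make that concrete, here's a toy version of the scenario (my own sketch, standing in for the Convolution Kernel plus Color Curves chain; nothing here is Vegas's actual processing): tame the sharpen overshoot at full resolution, then run the identical chain on a half-resolution copy of the same edge:

```python
# Sharpen overshoot tamed at full resolution (a simple gain around
# mid-gray stands in for the Color Curves correction) goes illegal
# again when the same chain runs on a half-resolution image.
import numpy as np
from scipy.signal import convolve2d

sharpen = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)

def chain(img, gain=0.7):
    """Sharpen, then scale around mid-gray to pull levels back in."""
    out = convolve2d(img, sharpen, mode="same", boundary="symm")
    return 125.5 + (out - 125.5) * gain

# A soft edge ramping from 16 up to 235 over several pixels.
row = np.interp(np.arange(16), [0, 4, 11, 15], [16, 16, 235, 235])
full = np.tile(row, (8, 1))
half = full[:, ::2]            # naive half-res: the ramp steepens

print("full range: %.1f to %.1f" % (chain(full).min(), chain(full).max()))
# -> about 27 to 224: inside 16-235, so the meters look fine
print("half range: %.1f to %.1f" % (chain(half).min(), chain(half).max()))
# -> about 5 to 224: the low end is out of range again
```

Same footage, same plugin chain, same correction; only the processing resolution changed.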
johnmeyer wrote on 3/17/2014, 11:53 PM
Yes, I too thought that plugins or something else was involved, especially because my initial problem came from a pretty complicated project. It was only after I'd killed many hours failing to get anything to render without a level shift that I decided to simplify the problem and figure out what was going on. In the end, I had nothing but two clean tracks and two events, with no FX, no pan/crop, no track motion, and no compositing.

Absolutely pristine.

The fact that at least one person has reported no problem when loading the project in Vegas 12 seems to indicate that Sony eventually saw the problem and fixed it. I hope that is the case.
farss wrote on 3/18/2014, 12:23 AM
This is one of the reasons I'm so averse to using anything other than a waveform monitor when it comes to measuring levels. As you reduce preview resolution and pipeline quality, high-frequency components from detail enhancement get smoothed out. With a parade or histogram this is hard to see, whereas with a waveform monitor it's relatively easy to detect what's going on.
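A rough numpy sketch of the difference (my illustration, not a real scope): thin dark detail involves so few pixels that it's a barely visible sliver on a histogram, but a waveform draws each column's excursion at full strength, so the floor visibly jumps when smoothing lifts it:

```python
# Why a waveform trace reveals smoothed-out detail more readily than
# a histogram; illustration only, not a real scope implementation.
import numpy as np

h, w = 64, 64
frame = np.full((h, w), 128.0)
frame[:, ::16] = 16.0           # sparse one-pixel-wide dark lines

# Crude horizontal smoothing, standing in for a reduced-quality
# preview pipeline that averages away high-frequency detail.
smoothed = (np.roll(frame, 1, axis=1) + frame
            + np.roll(frame, -1, axis=1)) / 3

# Waveform-style reading: the lowest level the trace touches.
print("waveform floor before:", frame.min())      # 16.0
print("waveform floor after: ", smoothed.min())   # ~90.7 -- obvious

# Histogram-style reading: only ~6% of pixels ever moved, a sliver
# that is easy to miss on a histogram or parade display.
print("pixels below 64 before:", (frame < 64).mean())     # 0.0625
print("pixels below 64 after: ", (smoothed < 64).mean())  # 0.0
```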

Bob.