Why is Vegas external preview so bad?

Comments

Sebaz wrote on 5/20/2010, 3:23 PM
Have you tried sending it to someone else's 1080p television? Although from what you described it sounds like Vegas isn't sending out a properly configured signal.

I don't have anyone who can come over with an HDTV, but it doesn't matter, because like I said, my main monitor is native 1080p, and this double-field effect shows on it too, looking exactly the same as on the Sony 720p TV.
R0cky wrote on 5/21/2010, 11:44 AM
Sending this to my external SD CRT pro monitor I get the double images - it is not displaying properly on my interlaced monitor. I'm using a Canopus ADVC-300 to drive the monitor. I have never been able to get interlaced footage to preview correctly.

If I change my preview to my 2nd 1600x1200 DVI monitor I also get the double images. Checking "deinterlace" just gets rid of the scan lines; I still get the double images.

Rocky
Sebaz wrote on 5/21/2010, 1:34 PM
I have a 720p television as well. It is a Panasonic plasma. Like all 720p televisions, it can only display 720p. If it receives a 1080p or 1080i signal, the television converts it to 720p. If it receives a 1080i signal, it must both downconvert and deinterlace it.

Also, I would like to reiterate this because it also serves to debunk the theory that this problem might be caused by the TV set being 720p. When I send content that originated as interlaced, and I send it as a pure interlaced signal to the TV set, the set does the down-sampling and deinterlacing perfectly fine. Like I said, when I connect my AVCHD camcorders through HDMI to the set, the set is getting a 1920x1080 60i signal, and it displays it perfectly. Also, I normally have my Blu-ray players set to 1080p, so the player does the deinterlacing itself; however, for this purpose I set them both to 1080i and played interlaced-sourced content, and it also looks perfect.

So this is clearly not a matter of the TV set, and I'm positive that if I bought a native 1080p set, Vegas' output would look exactly the same. What I wrote above, and the fact that the problem also shows when I choose my 1080p monitor, make me certain of this.
GlennChan wrote on 5/21/2010, 2:43 PM
Sending this to my external SD CRT pro monitor I get the double images - it is not displaying properly on my interlaced monitor. I'm using a Canopus ADVC-300 to drive the monitor. I have never been able to get interlaced footage to preview correctly.
That *is* correct. You *want* to see both fields without de-interlacing on a CRT.

De-interlacing introduces artifacts (always). Ideally, you want to preview without anything that introduces artifacts. Unfortunately, with most non-CRT displays, you have to de-interlace for moving video.
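To make that concrete, here's a rough numpy sketch (my own toy illustration, not anything a real display actually runs) of the simplest de-interlace, "bob": each field gets line-doubled back to full height, so half of every output frame is interpolated rather than real picture.

```python
import numpy as np

def bob_deinterlace(frame):
    """Split a weaved frame into its two fields and line-double each
    back to full height. Every second output line is a guess, which
    is where the softness/artifacts come from."""
    top = frame[0::2]       # even lines = top field
    bottom = frame[1::2]    # odd lines = bottom field
    return np.repeat(top, 2, axis=0), np.repeat(bottom, 2, axis=0)

frame = np.random.rand(1080, 1920)   # stand-in for a 1080i frame
f1, f2 = bob_deinterlace(frame)
print(f1.shape, f2.shape)            # both (1080, 1920), but only 540 real lines each
```

Fancier (motion-adaptive) de-interlacers guess better, but they are still guessing.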

Also, I would like to reiterate this because it also serves to debunk the theory that this problem might be caused by the TV set being 720p.
That's no way to monitor your signal anyways because scaling from 1080-->720 will introduce artifacts.
http://www.glennchan.info/broadcast-monitors/scaling-artifacts/scaling-artifacts.htm
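If you want a feel for why, here's a crude numpy sketch (mine, not how any real TV scaler works): 1080 to 720 lines is a 3:2 reduction, so every output line blends neighbouring source lines, and in a weaved interlaced frame those neighbours come from two different moments in time.

```python
import numpy as np

def naive_downscale_1080_to_720(frame):
    """Box-style 3-into-2 vertical scale: each group of 3 source lines
    becomes 2 output lines. On a weaved interlaced frame, adjacent
    source lines belong to different fields (different instants), so
    the blend smears motion across fields -- a scaling artifact."""
    h, w = frame.shape
    assert h == 1080
    groups = frame.reshape(360, 3, w)   # 360 groups of 3 consecutive lines
    out = np.empty((720, w))
    out[0::2] = (2 * groups[:, 0] + groups[:, 1]) / 3   # weighted toward line 0
    out[1::2] = (groups[:, 1] + 2 * groups[:, 2]) / 3   # weighted toward line 2
    return out
```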

If you are doing SD work, get a CRT TV to preview on.
JJKizak wrote on 5/21/2010, 4:08 PM
Is there anyone who is selling CRTs anymore?
JJK
Sebaz wrote on 5/21/2010, 4:08 PM
Also, I would like to reiterate this because it also serves to debunk the theory that this problem might be caused by the TV set being 720p.

That may be so, but I would say that the poor quality with which Vegas displays interlaced video via external preview is far worse for monitoring an edit than a correctly sent signal that my TV then downscales. Besides, I don't know the technical details, but my TV has an excellent downscaler. Like I said above, when I bought it there were several native 1080p sets on the sales floor, and this 720p set looked better than most of them, both in color quality and in sharpness, without the artificial, edge-enhanced kind of sharpness. I've had chances to buy native 1080p sets for even a few hundred less than what I paid for this one back in 2008, but I've been so happy with it that I didn't see the point. I'll wait until I can buy a top-of-the-line Sony, because then I'd see a difference worth paying for.
farss wrote on 5/21/2010, 4:37 PM
"That may be so, but I would say that the poor quality in which Vegas displays interlaced video using the external preview is far worse to monitor the editing than if it were sending the signal correctly but my TV was downscaling"

Finally you get to the core of the problem: Vegas's scaling for External Preview. You can turn that off. I'd suggest you try that and see what difference, if any, it makes.

Bob.
Rainer wrote on 5/21/2010, 5:38 PM
I think Sebaz means "compared to even the current most basic consumer editing systems", and he's got every right to ask, because it is.
Sebaz wrote on 5/21/2010, 6:24 PM
Finally you get to the core of the problem: Vegas's scaling for External Preview. You can turn that off. I'd suggest you try that and see what difference, if any, it makes.

No, Vegas is not scaling down to 720p. Vegas outputs either 1080i or 1080p, depending on which mode I have selected in the ATI control panel. As long as I leave Vegas' external preview set to "Current settings" it will not scale anything. I did try, however, selecting 720p in Vegas while having the card set to 1080p or 1080i, and it made no difference. It changes the card's output to 720p, but it still shows the double-field effect.
farss wrote on 5/21/2010, 6:38 PM
Yes, Vegas does something different from other NLEs and even hardware boxes, although some hardware boxes do the same as Vegas, which caused us no end of grief with a large order from the Australian Broadcasting Corporation.

It's pretty common, when sending interlaced video, to send only one field to the display while the stream is paused. The reason for this is simply to avoid having monitors flickering all over the place. Trust me, if you're sitting in master control in front of a wall of flickering monitors, you would rightly be screaming.

Vegas does not do this: when you hit pause, Vegas still sends both fields to the display, and the display shows both fields. When you actually play the video out, the eye of course merges the fields. Comparing what happens when you pause playback in Vegas with what it does during playback is going to lead you up the garden path.

This behavior could also confound any display that does a reasonable job of de-interlacing. It takes more than a frame's worth of fields to do a motion-adaptive de-interlace.
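If it helps, here's the difference in sketch form, assuming numpy and nothing about Vegas's internals (frames here are plain 2-D arrays, even lines = top field):

```python
import numpy as np

def pause_single_field(frame):
    """What most hardware does on pause: keep one field and line-double it.
    No inter-field motion on screen, so nothing flickers or doubles."""
    return np.repeat(frame[0::2], 2, axis=0)

def pause_weave(frame):
    """What Vegas appears to do: show the whole weaved frame, i.e. both
    fields at once. Any motion between the fields shows up as doubled
    edges -- the "double images" described above."""
    return frame   # the weaved frame already contains both fields
```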


Bob.
Sebaz wrote on 5/21/2010, 7:18 PM
Bob, I know what you're talking about when you say "It's pretty common, when sending interlaced video, to send only one field to the display while the stream is paused." However, this is not the case here. There is an option in Edius and in Neo to choose between frame or field for pause, and it works accordingly. If you choose "field", then when the cursor is stopped it will only show one field, to avoid the crazy back-and-forth between the two fields.

This is, however, not the same thing. Vegas does some kind of ridiculous processing in the external preview module, where it actually sends the two fields at the same time instead of one and then the other. The double-field effect that I show in those photos with the cursor stopped is the same when it's in motion: you see double contours on everything that moves. It looks slightly different if 1080i is selected in the card instead of 1080p. In 1080p the scan lines are more evident, while in 1080i they are smoother, but you can still see the double fields.

I don't know if there's some fancy $1000+ device that will take this mess that Vegas puts out and turn it into a decent interlaced signal, but if there is, I don't care. I don't even expect Vegas to have a bob deinterlacer, or any deinterlacer of any kind. I only expect Vegas to output a real 1080i signal, the same as my two Blu-ray players, my two AVCHD cameras and the Time Warner Cable piece-of-junk cable box, all of which send a 1080i signal to the TV set that looks perfect to me.

What I think is that Sony should urgently make changes in that module so that it works as it should and sends out a real 1080i signal.
Former user wrote on 5/21/2010, 8:03 PM
I don't have HD monitoring, but I see what you are talking about in Vegas.

But if I open the file in VLC, I see the exact same thing. The curtains have the ghosting in VLC as well.

Don't know what that means, but Vegas is showing the same thing as VLC.

Dave T2
farss wrote on 5/21/2010, 8:30 PM
I can confirm that Vegas seems to bypass DirectX. Changes in my nVidia control panel make no difference to Vegas's output to the Secondary Display; the only thing that affects it is changes to Vegas's own Secondary Display control. Within that, changing the de-interlace, ICM profiles etc. does change the outcome. VLC, on the other hand, is affected by the nVidia control panel.

I can confirm that Vegas does send interlaced video correctly down FireWire. It also sends interlaced SD down a BMD Decklink card to an SD SDI monitor correctly, as checked on a rather expensive Sony CRT monitor. I now have access to a system with a Decklink Extreme card in it and can hook up a 17" Panasonic BT-LH1760 monitor, one of the few monitors that correctly displays interlaced HD. That's going to take me a few days, though. If that works correctly, and I have no doubt it will, then I think I can say with a good level of confidence that a BMD Intensity HDMI card will do the same thing: feed interlaced video to your HDTV one field at a time.

Bob.
Sebaz wrote on 5/21/2010, 10:22 PM
Thanks, Bob, it would be good to know, or at least have a notion, that the Intensity card might display real interlaced video.

You can change some of the picture-quality settings for the signal Vegas outputs in the graphics card control panel, but ironically not where you change those settings for video content; rather, where you normally change them for the Windows desktop. At least it works that way in the ATI control panel. Still, I remember that back when I had an nVidia card, Vegas' external preview looked just as bad for interlaced content.
farss wrote on 5/22/2010, 4:53 AM
The nVidia control panel has a sort of hidden extra for "video". I assume that means for video overlays.
I'm still wanting to know if you've tried using Vegas's own control panel. The interlace, scaling and ICM controls in there certainly change the outcome on the secondary display monitor.

Bob.
Sebaz wrote on 5/22/2010, 10:02 AM
I'm still wanting to know if you've tried using Vegas's own control panel. The interlace, scaling and ICM controls in there certainly change the outcome on the secondary display monitor.

I did; in fact I tried each of the different ICM profiles, even though ICM profiles really should have no influence on how fields are displayed, and none of them did anything at all for this problem. I also tried selecting custom resolutions, but there's a limit to what you can select. For example, if you set the card to 60 Hz and select 1920x1080 (29) or (30), as soon as you click the external preview icon the arrow starts flashing very fast and you get no preview on the TV set. Same thing if the card is set to 30 Hz and you select a custom resolution in Vegas at 60 Hz.

Now, even though in the ATI control panel I have a choice of "29i Hz" and "30i Hz", selecting 29 still shows 30i Hz in that list, as if you can't select it even though it's available. I don't even know what it means; I don't know if it's supposed to be 29.97 interlaced, or why there's both a 29 and a 30 Hz choice. Vegas always shows it as (29) even if 30 Hz is selected in the card. I did select 1920x1080 (30) as a custom setting in the Vegas control panel, and it flashes both screens to black for a split second and then goes back to normal, except that it seems to really send a 30 Hz signal to the TV set. Still, the quality of the interlaced preview is just as bad: movement looks blurry, a hybrid between interlaced and progressive video, caused by displaying both fields at the same time.
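My best guess on those numbers, for what it's worth (just my assumption, not anything from ATI's documentation): "29" is shorthand for the NTSC-derived rate 30000/1001 ≈ 29.97 frames per second, "30" is an even 30.00, and 1080i carries twice that many fields:

```python
from fractions import Fraction

ntsc_frame_rate = Fraction(30000, 1001)   # ~29.970 fps, likely the "29" entry
even_frame_rate = Fraction(30, 1)         # exactly 30.000 fps, the "30" entry
ntsc_field_rate = 2 * ntsc_frame_rate     # ~59.940 fields/s for 1080i
print(float(ntsc_frame_rate), float(ntsc_field_rate))  # 29.97002997... 59.94005994...
```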
craftech wrote on 5/22/2010, 11:51 AM
Is there anyone who is selling CRTs anymore?

------------
JVC still sells them.

Here is a place in Secaucus, NJ that sells them. I bought my 15 inch model from them although they don't seem to have them listed anymore. Grazie has the same one. Great monitor.

John

UPDATE: Apparently JVC is phasing them out.
farss wrote on 5/22/2010, 2:31 PM
" even thought ICM profiles really should have no influence in how fields are displayed, but none of them did anything at all for this problem"

Well I'd be pretty surprised too if an ICM profile made any difference to the interlacing.
What about the de-interlace controls in the same control panel?
What about the scaling control in the same control panel?

Bob.
Sebaz wrote on 5/22/2010, 6:36 PM
What about the de-interlace controls in the same control panel?

Also checked those out. Applying deinterlacing makes the motion a little more jumpy, so it's better to leave it off.

As for scaling, I turned it off and on, and it doesn't do anything for this problem.
R0cky wrote on 5/24/2010, 10:57 AM
Glenn, I think I miscommunicated. I did not mean a computer CRT; I meant my Sony video monitor intended for video work. Are you saying that should not show interlaced video correctly?
GlennChan wrote on 5/24/2010, 4:30 PM
A CRT (TV or broadcast monitor receiving a video signal) does and should show interlacing correctly. If you can see annoying interlace flicker... e.g. when you pause on a frame of motion and you are seeing both fields jump around like crazy... that is correct.

You need to make sure stuff like that isn't on your final product. You want your monitoring setup to expose that flaw, not hide it.
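If you'd rather catch it programmatically than by eye, a crude numpy sketch (my own, with an arbitrary metric) is to measure how much each line disagrees with its vertical neighbours; a weaved frame with motion between its fields scores high:

```python
import numpy as np

def combing_score(frame):
    """Rough interlace-combing metric: mean disagreement between each
    line and the average of its two vertical neighbours. Motion between
    weaved fields pushes this up; static or progressive content stays low."""
    mid = frame[1:-1]
    neighbour_avg = (frame[:-2] + frame[2:]) / 2
    return float(np.mean(np.abs(mid - neighbour_avg)))
```

A paused grab that scores much higher than a single line-doubled field is exactly the flicker you'd want your monitoring to expose.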