Why is Vegas external preview so bad?

Sebaz wrote on 5/19/2010, 2:41 PM
I believe that any good NLE should have a flawless external monitor preview without relying on expensive third party cards, as long as the computer has a reasonably powerful graphics card with a second HDMI or DVI port, because the capability is certainly there. In Edius, for example, you can choose to use DirectDraw Overlay in the preferences, and using the card's Theater Mode you can have a great preview of the footage being played in the NLE, including interlaced footage. It's not a perfect solution, because it forces Windows into basic mode, showing the hideous basic interface, but at least it's possible.

In Vegas, however, previewing interlaced footage accurately is not possible with the secondary monitor preview that Sony claims to offer in the “Video Editing” section of the Vegas Pro 9 page on Sony's website:

Enhanced Video Monitoring

Well, either this is a case of seriously misleading advertising, or I'm doing something terribly wrong, but I don't think I am, because I have tried several different things to preview interlaced footage properly on my TV set, which is connected to the second DVI output of my ATI HIS Radeon 4850 graphics card, and nothing works. Sure, it displays the footage, and if it were natively progressive footage I wouldn't have a problem with it, but it's 29.97 fps interlaced footage, and I hate seeing it as progressive when it isn't. It's as annoying as seeing widescreen movies in 4:3 pan & scan like in the old days.

I would really like to find out the cause of this. Perhaps there's nothing that can be done unless SCS fixes it, but perhaps somebody who knows about the inner workings of Vegas, graphics cards and video rendering engines in Windows can figure something out. I have tried several different things and nothing seems to work. No matter what deinterlacing method is applied in the project settings, or whether deinterlacing is selected in the preview device options, Vegas still displays both fields at the same time on the secondary display, while it displays only one in the monitor window within Vegas' GUI. I'll show you what I'm talking about. In footage I shot last weekend I found a frame that illustrates this well: I was lifting the tripod with the camera, moving it from one place to another, so it's very shaky footage. In this frame you see big black curtains, partially open, so they offer good contrast to show the problem. I uploaded a few photos rather than screenshots to show properly what I'm talking about. In all of these the preview quality is set to Best (Full).

Note: when I first posted this message I used image tags, but the images would not display, so I changed them to links.

This is the photo I took of the computer monitor, showing the preview area in the Vegas GUI:

http://imgur.com/T0DXb.jpg

And this is the photo I took of the TV set, which is connected to the second output of the graphics card (HIS ATI Radeon 4850, 1 GB of RAM), with deinterlacing unchecked in the preview device options and the card set to 1920x1080 at 60 Hz in the ATI control panel:

http://imgur.com/8KNTU.jpg

As you can see, it shows the two fields at the same time (see the ghost effect in the curtains). This wouldn't be a problem if, during playback, it showed the fields properly, or at least deinterlaced properly with bob or some other deinterlacing method, but it doesn't. It shows interlaced video in a weird way that makes it look progressive. You might say, why don't you check the deinterlace option in the preview device settings? Well, I did, and here's what the frame looks like:

http://imgur.com/zg8h1.jpg

Some of you may think this happens because the card is set to 60 Hz (meaning 1080p), but that output is normally set that way, and I play all sorts of interlaced content from the computer on the TV set and it displays properly. For example, I watch recorded TV shows from Media Center, all of which come from an interlaced broadcast signal, and some of which are interlaced all the way from the cameras to the final product, such as Saturday Night Live. I also play clips from my AVCHD cameras, which are mostly interlaced, from Media Center, Media Player Classic, Windows Media Player 12 or Splash Lite, and they also display properly on my TV set.

Now, if I switch the signal for the TV set from 60 to 30 Hz in the ATI control panel, the playback in Vegas is a bit smoother, but it's still not interlaced. Playback still makes interlaced video look like weird progressive video. Here's 30 Hz selected in the ATI control panel without “apply deinterlace filter” selected:

http://imgur.com/V0WC3.jpg

And here's 30 Hz with the “apply deinterlace filter” option selected:

http://imgur.com/GUGXh.jpg

As you can see, it barely makes any difference. No matter whether the signal selected in the graphics card control panel is 1080p or 1080i, Vegas still shows the two fields at the same time. What the supposed deinterlacing filter does is just remove the comb effect. It seems to me that Vegas' external preview module is very poorly written, and the deinterlacing filter is just a toy that does nothing more than apply deinterlacing to a signal that is already going out as progressive with both fields shown at the same time. As you can see, Vegas shows the two fields at the same time at both 60 and 30 Hz, with or without the deinterlacing filter applied.

To make things even more interesting, Vegas can show a decent preview that looks kind of like the original, but only if you convert the footage to 59.94 fps progressive and set the project to the same frame rate. When it's played back, it looks the same to the eye as 1080i footage played with the proper field order, and it doesn't show the double field problem, whether the graphics card is set to 30 Hz or 60 Hz. Here's a photo of the TV set with the footage already converted to 59.94 fps progressive and the graphics card set to 30 Hz for the TV set (it looks the same at 60 Hz):

http://imgur.com/ceXAp.jpg

As you can see, the double field effect is gone.

Just in case you wonder, setting the deinterlace method in the project settings has absolutely no influence on this; all three choices look exactly the same on the second monitor. Also, this is not a matter of monitor brand or connection type. That's very easy to test, because if you change the display adapter from 2 to 1 in the Preview Device preferences, making the video preview show full screen on your main monitor, it looks just the same as it does on the TV set, with the double field effect.

I also found out, after I took the photos, that the only way to make the double field effect disappear on the second monitor is to set the preview quality to one of the Draft modes, but the only one that makes sense to select is Draft (Full), which is not really full but half size, and it effectively removes one field. So when the cursor is stopped it may not show the double field effect, but during playback it still shows the interlaced footage as progressive, obviously because one field is gone.

So I wish I knew exactly what the problem is. On one hand, Vegas doesn't seem to connect to any DirectX video rendering engine such as Overlay, VMR 7, VMR 9 or EVR, because any video that goes through those engines can have its picture properties tweaked with the ATI control panel, in the Avivo Video section. See for yourself if you have the Catalyst Control Panel: select the All Settings tab in the Avivo Video section and try to change brightness, contrast, tint, saturation, or any of the other controls below them, including the deinterlacing method. They change the video when it's being played from any regular player, but they don't change a thing in the video signal that Vegas sends to the second monitor when external monitor preview is selected. To change the picture settings for that, you have to go to the Color tab in the Desktop Properties section after selecting the second monitor and choosing Properties.

So the way Vegas sends a signal to the secondary monitor, it seems to me, is absolutely non-compliant with DirectX. There are several video players out there, and some are better than others, but one thing they all have in common is that you can always change the picture settings of the video being played with the settings in the Avivo Video section of the ATI control panel (as long as “Use application settings” is unchecked, of course), while you can't do that with Vegas.
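
Just to illustrate what I mean by "connecting to a rendering engine": a typical DirectShow player builds its playback graph around one of those renderers, which is exactly what lets the driver apply the Avivo processing to it. Something like this rough sketch, using VMR 9 (my own example, obviously not Vegas code, and the file path is just a placeholder):

```cpp
#include <dshow.h>   // DirectShow interfaces; link with strmiids.lib
#include <cstdio>

// Rough sketch of a player that renders a clip through the Video Mixing
// Renderer 9, so the GPU driver (Avivo on ATI cards) does the mixing and
// de-interlacing and the Catalyst picture controls actually take effect.
int main() {
    CoInitialize(NULL);

    IGraphBuilder* graph = NULL;
    CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                     IID_IGraphBuilder, reinterpret_cast<void**>(&graph));

    // Add VMR9 up front so the graph builder uses it as the video renderer.
    IBaseFilter* vmr9 = NULL;
    CoCreateInstance(CLSID_VideoMixingRenderer9, NULL, CLSCTX_INPROC_SERVER,
                     IID_IBaseFilter, reinterpret_cast<void**>(&vmr9));
    graph->AddFilter(vmr9, L"VMR9");

    // Let DirectShow pick the demuxer/decoder filters and wire them to the VMR9.
    // The path below is just a placeholder for whatever clip you want to test.
    if (SUCCEEDED(graph->RenderFile(L"C:\\clips\\example.m2ts", NULL))) {
        IMediaControl* control = NULL;
        graph->QueryInterface(IID_IMediaControl, reinterpret_cast<void**>(&control));
        control->Run();      // playback now goes through the GPU's video pipeline
        std::getchar();      // play until a key is pressed
        control->Stop();
        control->Release();
    } else {
        std::printf("Could not build a graph for that file.\n");
    }

    vmr9->Release();
    graph->Release();
    CoUninitialize();
    return 0;
}
```

Any player built like that responds to the Avivo Video sliders. The Vegas external preview doesn't, which is why I suspect it never goes through any of these renderers at all.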

But this wouldn't be a terrible problem if Vegas sent a video signal to the external monitor with the fields in the right order, instead of displaying both at the same time, which causes a weird effect that is neither interlaced nor progressive video. It looks more like blurry progressive video.

Besides showing that Vegas has a very poor monitoring engine for interlaced video, this makes me wonder whether the drivers in Vegas for the other preview devices have the same poor quality. I would consider spending $190 on an Intensity Pro card if I knew for sure that it would give me perfect interlaced preview, but I don't know for sure and nobody seems to. I don't want to pay for it and the shipping and then waste more money on shipping because I have to return it.

If anybody wants to test this, I recompressed a small part of the footage as AVCHD; you can drop it on your Vegas timeline, compare, and tell me whether you see the same things I see or something different. The file is only 11 MB and it's here: http://www.mediafire.com/?dk3jyxjetm0

So, can anybody provide any insight on this? Think of a workaround, a hack, anything? At a minimum, I think Sony should remove that paragraph from the Vegas information page, because it's very misleading. To say "Next-generation monitoring tools allow full screen timeline playback to LCD and CRT secondary displays via component or DVI connections, with support for scaling, de-interlacing, and color profiles." when it clearly is not like that is pure misleading advertising, and laughable, unless they add in the same paragraph that the monitoring tools only work for progressive video such as 24p or 30p.

Comments

Jay Gladwell wrote on 5/19/2010, 3:13 PM

Sorry, but I have never had, nor am I having at this time, a problem previewing interlaced footage on an external monitor. I've been using Vegas since version 3.0 and it works perfectly for me.

Maybe I'm not understanding fully what your problem is.


farss wrote on 5/19/2010, 3:39 PM
I've seen what I think is the same problem with preview over firewire.
Save me a lot of typing and tell me, is your TV/Monitor 1920x1080 capable?

Bob.
Sebaz wrote on 5/19/2010, 3:56 PM
Bob, my TV is a Sony that is natively 720p, but it's capable of taking a 1080i or 1080p signal, at both 60 and 24 Hz, and displaying it with better picture quality than some native 1080p sets. I know that because when I bought it at the now defunct Circuit City they had several 1080p sets on display, some even cheaper than mine, and this Sony looked better than most of them, not only the cheap brands like Vizio and Sharp but also the supposedly top brands like Samsung. In fact, that day I found it hard to even see a difference between this model and another Sony that was native 1080p but about $300 more expensive, which is why I went with this one.

However, despite being 720p, like I mentioned, the same TV displays interlaced-source content perfectly, such as TV shows from Media Center or my own interlaced AVCHD footage. Besides, while the TV is 720p, the main monitor I recently got is a Samsung that is native 1080p, and when I choose it in Vegas in the preview device preferences, the double field problem shows up exactly the same.
Yoyodyne wrote on 5/19/2010, 4:24 PM
I downloaded the clip and popped it into an HDV 1080-60i project. I think I'm also seeing what you're seeing, if I understand your issue properly. FYI, I'm going out 1920x1080 via DVI to a 32-inch Toshiba Regza LCD monitor. The native resolution of the monitor is 720x1280.

I see a double curtain in the "preview to external monitor" window but a single curtain in the "preview" window. The clip looks smooth when played back in Windows Media Player, and a bit "jumpier" when played back in Vegas, either via the preview window or the secondary monitor.

Did a render to Cineform for the heck of it. It looks the same in Vegas but is super jumpy in Windows Media Player; looks like a field order problem. Not sure this is helping, but it's kind of interesting.

Wish I could give more info but I try to avoid working with interlaced media and just don't have a lot of experience with it.
Yoyodyne wrote on 5/19/2010, 4:35 PM
oh man, now you got me curious....

So I did a WMV render of the native m2ts file, and it looked pretty bad. I'm kind of surprised; I've had great luck with Vegas' render engine. It seems this file and Vegas do not get along. What camera did this come from, or have I missed that info in the post above?

and just to add more info:

WMP seems to be fine with the interlacing in the native file, but when it's rendered to WMV from Vegas, WMP shows a lot of interlacing artifacts. Looks pretty bad. I'm also getting the "double curtain" in the bad-looking Vegas WMV render. This file seems to be a real torture test for Vegas.
Sebaz wrote on 5/19/2010, 5:02 PM
Yoyodyne, it really has nothing to do with the file itself. I can see the same problem in files originating from two different AVCHD cameras, a Panasonic HMC40 and a Canon HF100, and also in HDV files from a Sony HVRHD1000U. I can even see it with 1080i MPEG2 files that are the exact signal from the local affiliates of the big networks, recorded in Media Center and not recompressed because my tuner doesn't have that capability: if I convert those files to a transport stream without any recompression and throw them in Vegas, the double field effect is still there. The file I uploaded is just a simple example that shows the problem well; I could even grab my camera, record 5 seconds of me moving it around, and you would see it.

The reason you might be seeing weird behavior from WMV files is that I don't think WMV is a format intended for interlaced video.
Yoyodyne wrote on 5/19/2010, 5:46 PM
I guess by "file" I mean the interlaced footage from the camera.

I agree that WMV is a "computer" format, but I thought it would be smarter about how it deals with interlacing. I don't remember these issues when working with standard-def stuff. Of course, it's been a while...
farss wrote on 5/19/2010, 8:19 PM
So if I read you correctly, the actual DISPLAY in the TV is 1280x720.

What I believe is happening is that Vegas is scaling your 1080i footage with no de-interlacing method being used, so what you are seeing is not native interlace artifacts at all; you're getting the typical 'dog's teeth' that come from interlace aliasing when scaling.

I don't see this on my 1920x1080 monitor, and Jay probably doesn't see it on his monitor either, for the same reason. I do see this problem on my 4:3 CRT if Vegas letterboxes 16:9. If you were to connect via HDMI, then Vegas would feed the full 1080 raster to the monitor (which would actually then be a TV) and you would not have this problem. With the DVI connection, Windows and hence Vegas knows the display is only 720, so Vegas does the scaling.
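
To illustrate the idea (just a rough sketch of mine, not what Vegas actually does internally): if you scale a weaved interlaced frame as if it were progressive, every output line mixes samples from two different moments in time, whereas scaling each field on its own avoids that.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// A frame of luma stored as rows; even rows are the top field, odd rows the
// bottom field, captured roughly 1/60 s apart in 1080i60 material.
struct Frame {
    int width;
    std::vector<std::vector<uint8_t>> lines;
};

// Naive "progressive" downscale (e.g. 1080 -> 720 lines): each output row
// blends neighbouring source rows, i.e. samples from BOTH fields, which is
// what smears moving edges into dog's teeth and ghosted double images.
Frame scaleNaive(const Frame& src, int dstHeight) {
    Frame dst{src.width, {}};
    const int srcHeight = static_cast<int>(src.lines.size());
    for (int y = 0; y < dstHeight; ++y) {
        double pos = y * (srcHeight - 1.0) / (dstHeight - 1.0);
        int y0 = static_cast<int>(pos);
        int y1 = std::min(y0 + 1, srcHeight - 1);
        double f = pos - y0;
        std::vector<uint8_t> row(src.width);
        for (int x = 0; x < src.width; ++x)
            row[x] = static_cast<uint8_t>((1.0 - f) * src.lines[y0][x] + f * src.lines[y1][x]);
        dst.lines.push_back(row);
    }
    return dst;
}

// Field-aware downscale: split into fields, scale each field separately, then
// re-interleave, so no output line ever mixes two different moments in time.
Frame scaleFieldAware(const Frame& src, int dstHeight) {
    Frame top{src.width, {}}, bottom{src.width, {}};
    for (size_t y = 0; y < src.lines.size(); ++y)
        (y % 2 == 0 ? top : bottom).lines.push_back(src.lines[y]);
    Frame topScaled = scaleNaive(top, dstHeight / 2);       // safe within one field
    Frame bottomScaled = scaleNaive(bottom, dstHeight / 2);
    Frame dst{src.width, {}};
    for (int y = 0; y < dstHeight / 2; ++y) {
        dst.lines.push_back(topScaled.lines[y]);
        dst.lines.push_back(bottomScaled.lines[y]);
    }
    return dst;
}
```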

Hope this all makes sense.

Bob.
Sebaz wrote on 5/19/2010, 9:00 PM
No, Bob, the TV set might be 720p, but the signal being sent to it is 1920x1080 at either 60 Hz or 30 Hz (normally 60 Hz). I have it set up that way in the ATI control panel. Like I said, while the TV set is 720p, it displays interlaced content perfectly from the computer, my video cameras, the HD cable box, the Blu-ray player (when I play interlaced content on it), etc., all sending a 1080i or 1080p signal, and this kind of thing doesn't happen with anything else.

"If you were to connect via HDMI then Vegas would feed the full raster 1080 to the monitor (which would actually then be a TV) and you would not have this problem. With the DVI connection Windows and hence Vegas knows the display is only 720 so Vegas does the scaling"

No, that's not correct. The graphics card has two DVI outputs, one of which is going to the TV using a cable that is DVI on one end and HDMI on the other so it can connect to the TV set. While the computer may know that the TV set is natively 720p, it still sends a full 1080p signal to it per my choice. Besides, like I said earlier, it doesn't matter if the second monitor is 720p because when I choose the main monitor from the preview device preferences in Vegas and click on the external preview button, the double field effect still shows, and the main monitor is a native 1080p monitor. To me it's clearly a bug in the external preview module in Vegas.
Grazie wrote on 5/19/2010, 10:15 PM
I just wanted to throw this in, at the risk of being criticised for going off-message here: I have got the audio SYNC>0 issue back.

If I attempt to increase the Sync option I get an immediate hang in VP9e. It WAS cured prior to "e", but it is now back. I can repro this at the drop of a hat.

I'm running PAL SD out to a JVC 720 line ExtMon.

Grazie
GlennChan wrote on 5/19/2010, 10:59 PM
1- You will always have problems converting interlaced to progressive perfectly... because that isn't possible. But if you aren't so picky, then high quality de-interlacing will look pretty good.

There are various forms of low quality de-interlacing found in TVs... that may be something to watch out for.

2- One way of doing it would be to get an SDI hardware card and output the signal to a broadcast monitor. If it's a CRT, then it will handle interlacing perfectly.
If it is an LCD, then hopefully it has a good de-interlacing chip. Or you can send the signal to a piece of hardware that does the deinterlacing (but some of those introduce a lot of delay because they go for really high quality).

Of course, some people expect more than what they paid for...

3- Vegas does not have high quality de-interlacing. It only does very basic de-interlacing, blend or interpolate (see the rough sketch at the end of this post).

If there were a high quality de-interlacing algorithm built into Vegas, it would suck up CPU cycles and slow down the preview. But it would give you better looking video when it is in motion, and it would help you spot field order issues. Or... you can preview your video properly: play it back on a system the end user would use. E.g. if you are making a DVD, watch your DVD in a DVD player hooked up to a TV (or a broadcast monitor if you have one).

3b- If you choose the Interpolate de-interlacing mode, I think you can get Vegas to show only one of the fields??? (I can't remember if you can do this for the Windows secondary display.)

4- For Vegas to show both fields at once is correct. CRTs will show both fields. Some LCDs will show both fields; some LCDs just discard one of the fields and show the other.

5- (I may be wrong here.)
When you have an image on the secondary Windows display, Windows probably treats it as a progressive image being shown on that display and tells the TV that it is sending it a progressive image. So the TV will not try to apply its own de-interlacing to its input.

Maybe it would be an improvement if Vegas could send the image to the TV in a way that lets the TV do the de-interlacing (hopefully it does not have an awful deinterlacer). Maybe it would work, maybe it wouldn't.

----
But bottom line...
The cheap monitoring solutions will probably always have flaws that can't be overcome. Accurate video is a niche that hasn't benefited from the economies of scale of the larger consumer market... because the consumer market manufacturers aren't interested in accurate video. There are solutions for more accurate monitoring, you just have to pay for them. Remember that editing systems used to cost six figures.
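
For anyone wondering what I mean by "basic" de-interlacing, here is a rough sketch of my own (not Vegas source code) of the simple approaches being discussed: weave shows both fields at once, blend averages them, and interpolate keeps one field and rebuilds the other from its neighbours.

```cpp
#include <cstdint>
#include <vector>

// One frame of luma stored as rows. Even rows belong to the top field, odd
// rows to the bottom field; the two fields were captured at different times.
using Row   = std::vector<uint8_t>;
using Frame = std::vector<Row>;

// Weave: just show both fields interleaved in one frame. Perfect on static
// shots; on motion you get the comb / "double curtain" look.
Frame weave(const Frame& interlaced) {
    return interlaced;  // nothing to do, the fields stay interleaved
}

// Blend: average each bottom-field line with the top-field line above it.
// Removes the comb but leaves a ghosted, blurry double image on motion.
Frame blend(const Frame& interlaced) {
    Frame out = interlaced;
    for (size_t y = 1; y < interlaced.size(); y += 2)
        for (size_t x = 0; x < interlaced[y].size(); ++x)
            out[y][x] = static_cast<uint8_t>((interlaced[y][x] + interlaced[y - 1][x]) / 2);
    return out;
}

// Interpolate: throw the bottom field away and rebuild its lines from the
// top-field lines around them. Half the vertical detail, but no ghosting.
Frame interpolate(const Frame& interlaced) {
    Frame out = interlaced;
    for (size_t y = 1; y < interlaced.size(); y += 2) {
        size_t above = y - 1;
        size_t below = (y + 1 < interlaced.size()) ? y + 1 : y - 1;
        for (size_t x = 0; x < interlaced[y].size(); ++x)
            out[y][x] = static_cast<uint8_t>((interlaced[above][x] + interlaced[below][x]) / 2);
    }
    return out;
}
```

High quality de-interlacers (motion adaptive, motion compensated) do far more work per pixel than any of these, which is exactly why they would slow the preview down.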
farss wrote on 5/19/2010, 11:02 PM
"Besides, like I said earlier, it doesn't matter if the second monitor is 720p because when I choose the main monitor from the preview device preferences in Vegas and click on the external preview button, the double field effect still shows, and the main monitor is a native 1080p monitor."

Sorry, I missed that bit. Kind of blows my theory out of the water then. When I get a chance I'll go and check again how it looks. I see no dog's teeth, and the de-interlace box is ticked; however, if I recall correctly, ticking the box or not made no difference that I could see, and it surely should.

Bob.
GlennChan wrote on 5/19/2010, 11:22 PM
If you make your project 30p and set de-interlacing on with interpolate fields, then you can force Vegas to do basic de-interlacing on the footage. I wouldn't recommend this, though.
(I believe the Render As settings override your project settings when you render, so what you see may not be the same as what you render.)
farss wrote on 5/19/2010, 11:28 PM
The discussion is about how Vegas feeds video to the external preview monitor, not how it does de-interlacing internally. Vegas would appear, in theory at least, to use the GPU, not the CPU, to do the de-interlacing for the external monitor.

Bob.
willqen wrote on 5/19/2010, 11:50 PM
Sorry to have to tell you this, but I get beautiful playback on my 1080p 42" LG TV that I use as my main monitor, and also on a crappy e-machines 1440x990 LCD I use as a secondary monitor only for convenience, not for quality checks.

For hardware I have a Q6600 (2.4 GHz) with 8 GB DDR2 RAM, an ordinary hard drive (500 GB), an NVIDIA GeForce 9500 GT video card, and oh, I run Win7 64-bit.

I run interlaced video almost all the time, usually (unless I start stacking tracks) with no problems on playback, either as SD DV or 1080i HDV. I do occasionally get to edit AVCHD in progressive, and it does look marginally better in the preview, but the difference is subtle.

The video clip I downloaded from your link looks great. No artifacts, especially no double windows, or anything like that.

I set my project properties to 1080 HD 1920x1080 (I didn't use the HDV settings, as those are 1440x1080 and your clip shows as 1920x1080), upper field first, square pixels, 32-bit video, and blend fields.

Good Luck,

Will
Yoyodyne wrote on 5/20/2010, 12:10 AM
Glenn, I liked your "interlacing is evil" comment better :)

Interesting discussion here; it's great to read all the responses. I confess I don't quite understand why there are interlaced HD cameras. I know there were a few CRT HD sets in the early days (I had one), but it seems that everything has gone LCD/plasma. From what I understand, all modern HD sets are capable of dealing with an interlaced signal, some well and some badly, but they can only display a progressive image.

It seems to me that interlacing is just a can of worms in HD. I know that there are tons of very high end HD cameras and edit suites that can deal with interlaced HD, but... why? Was it a bandwidth thing? Some kind of HD/NTSC legacy compromise? I know that networks have gone with either a 720p or 1080i standard for HD. The 720p makes sense, but the 1080i I don't get. Why send a signal that no HD set can display natively? Aren't you delivering a signal that is at the mercy of the television's de-interlacing chip?

Sorry if this is a bit off topic.
farss wrote on 5/20/2010, 12:37 AM
I downloaded your sample file, opened it in a conforming project and looked at it on my 24" monitor. With the De-Interlace box ticked in Vegas's own external preview display controls, it does pretty much what I'd expect: there are large grey borders down both sides of the black curtains where the camera moves quickly past them. Both fields contain the same grey.
Turning off the de-interlace check box, I see both fields, i.e. fine lines of grey from one field to the left of the curtains and fine lines of grey to the right of them.
I can see nothing unexpected.

I did notice another check box labelled "Scale output to fit display". I wonder what happens when you check and uncheck this box?

Bob.
willqen wrote on 5/20/2010, 12:46 AM
Yoyodyne,

Spot (Douglas Spotted Eagle) explains this in his Full HD book. It's a great read.

If I have this right (I need to read the book at least 3 more times), 1080i is not only 1440x1080 but also anamorphic: video that would be 1920 square pixels wide is squeezed into a narrower 1440 pixels to save space for broadcast, then "un-squeezed" or rescanned by the TV set to regular square-pixel 1920x1080, with, of course, the requisite quality loss.
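
Just to put numbers on the squeeze (my own back-of-the-envelope check for the HDV case, not something from the book): the stored 1440 pixels carry a 4:3 pixel aspect ratio, so on playback each pixel is stretched to 1.333x its width and you land back on a 1920-wide picture.

```cpp
#include <iostream>

int main() {
    const int storedWidth     = 1440;       // pixels actually encoded per line (HDV 1080i)
    const double pixelAspect  = 4.0 / 3.0;  // each stored pixel is displayed 1.333x wider
    const double displayWidth = storedWidth * pixelAspect;
    std::cout << "Display width: " << displayWidth << " px\n";  // prints 1920
    return 0;
}
```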

Hopefully I got this right,

Will
Sebaz wrote on 5/20/2010, 7:08 AM
1- You will always have problems converting interlaced to progressive perfectly... because that isn't possible. But if you aren't so picky, then high quality de-interlacing will look pretty good.

OK, but this is not a problem with the deinterlacing quality of my TV set (or any other, for that matter), because other than with Vegas' output, the deinterlacing when the TV receives a 1080i signal is perfect. I never see any comb effect on anything, but I still see smooth motion that gives me the illusion of seeing 60 frames per second when I'm actually seeing 60 half frames per second. That is the basic goal of interlaced video, correct? When I watch TV using the cable box, which outputs 1080i, and I'm watching a program that comes from interlaced HD cameras, let's say the news for example, the TV set has to do the deinterlacing because it's receiving a 1080i signal, and it looks perfect. Same thing when I connect my AVCHD cameras to it, using either component cables or HDMI: the footage looks perfect, giving me the illusion of watching 60 full frames per second.

And when the deinterlacing is done by the graphics card instead, it also looks perfect. I keep both outputs of the graphics card at 1920x1080 at 60 Hz, so the TV set receives a signal that is already progressive and doesn't do the deinterlacing itself. Which means that if I play interlaced content from the computer to the TV set in this mode, the card has to do the deinterlacing before sending it to the TV set, or I would see the comb effect. In fact, if I select "Weave" in the deinterlacing section of the Avivo video settings, the TV set shows the comb effect, because no deinterlacing is being done by either the card or the TV set.
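
In other words, who ends up doing the deinterlacing depends only on what the card is set to output and which deinterlacing mode is picked in Avivo. A little sketch of that logic (my own illustration with made-up names, nothing more):

```cpp
#include <cstdio>

enum class OutputSignal   { Progressive1080p, Interlaced1080i };
enum class AvivoDeintMode { Weave, Adaptive };  // Weave = fields left as-is, no real de-interlacing

// Rough model of the chain I described: 1080i source -> graphics card -> TV set.
const char* whoDeinterlaces(OutputSignal out, AvivoDeintMode mode) {
    if (out == OutputSignal::Interlaced1080i)
        return "the TV set (it receives 1080i and runs its own de-interlacer)";
    if (mode == AvivoDeintMode::Weave)
        return "nobody -- the comb effect reaches the screen";
    return "the graphics card, before the 1080p signal ever leaves it";
}

int main() {
    std::printf("1080p out, adaptive: %s\n", whoDeinterlaces(OutputSignal::Progressive1080p, AvivoDeintMode::Adaptive));
    std::printf("1080p out, weave:    %s\n", whoDeinterlaces(OutputSignal::Progressive1080p, AvivoDeintMode::Weave));
    std::printf("1080i out:           %s\n", whoDeinterlaces(OutputSignal::Interlaced1080i,  AvivoDeintMode::Adaptive));
    return 0;
}
```

And with Vegas' external preview, none of those cases apply, because the signal never seems to reach the card's video pipeline in the first place.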

My point is that Vegas doesn't route the video signal to the card properly, according to the DirectX guidelines. If it did, I would be able to adjust picture quality and deinterlacing mode with the Avivo Video parameters, but I can't; this happens only with Vegas, while I can with any other video player.

I don't think that any deinterlacing-specific equipment, cheap or expensive, would help at all here, because Vegas is not outputting its video signal in a proper way. If it were, then when I set the graphics card to output 1080i to the TV set and Vegas plays back an interlaced video signal, the TV set would perform the deinterlacing like it does for any other interlaced source when it's receiving a 1080i signal. But it doesn't. It still shows the comb effect! That makes it crystal clear to me that Vegas is definitely not sending the signal to the card the way it's supposed to.

You say that Vegas doesn't have high quality deinterlacing and it only does basic deinterlacing. To me it doesn't even need to do any deinterlacing. If it routed the signal properly to the graphics card, then the card would handle deinterlacing, or in any case, the TV set would handle it, but it would be done, and it's not.

5- (I may be wrong here.)

No, you're right. If the TV set is receiving a 1080p signal, it's not going to do any deinterlacing, because it assumes the deinterlacing was already done in the card. Which brings me back to my point: Vegas is not routing the video to the card properly so that the card does the deinterlacing or, if the output is set to 1080i, so that the TV set does it.
Marco. wrote on 5/20/2010, 8:47 AM
I output your file to a secondary Windows display which does 1920x1080, with deinterlacing unchecked in the external preview preferences.

The video looks exactly the same as it does in the internal preview. It's clean.

Marco
Sebaz wrote on 5/20/2010, 9:20 AM
I output your file to a secondary Windows display which does 1920x1080, with deinterlacing unchecked in the external preview preferences.

Then why, if I select display adapter No. 1, which is a native 1080p monitor, does it show the same double field effect that shows on the TV set? I don't think it's a matter of the monitor's or TV's native resolution; if it were, I would have the same problem with all the videos I play from video players, yet this exact same clip, played to the TV set with Windows Media Player or Splash Lite, doesn't show the double field effect at all.
richard-amirault wrote on 5/20/2010, 11:00 AM
I don't think that any deinterlacing specific equipment, cheap or expensive, would help at all here, because Vegas is not outputting its video signal in a proper way..... (snip).... That makes crystal clear to me that Vegas is definitely not sending the signal to the card the way it's supposed to.(snip) If it routed the signal properly to the graphics card, then the card would handle deinterlacing, or in any case, the TV set would handle it, but it would be done, and it's not.

Maybe I'm missing something here (not the first time) but for YEARS I've read here that any graphics card will do for Vegas because it does not use the capabilities of the card, but sends the video out direct.

To me it seems like you're complaining because you want Vegas to use the capabilities of the card, and it doesn't.
Sebaz wrote on 5/20/2010, 12:25 PM
Maybe I'm missing something here (not the first time) but for YEARS I've read here that any graphics card will do for Vegas because it does not use the capabilities of the card, but sends the video out direct.

It's been clear forever that Vegas doesn't use GPU acceleration; however, my goal is to find out why it sends out a signal in such an improper way and whether there's a way to fix it or work around it.
craftech wrote on 5/20/2010, 12:58 PM
I have a 720p television as well. It is a Panasonic plasma. Like all 720p televisions, 720p is the only resolution the set can display. If it receives a 1080p or 1080i signal, the television converts it to 720p. If it receives a 1080i signal, it must downconvert and also deinterlace the signal.

Some do a good job of this and others deteriorate the signal badly depending upon the quality of the internal processing chip and other factors.

Have you tried sending it to someone else's 1080p television? Although from what you described it sounds like Vegas isn't sending out a properly configured signal.

John