I believe that any good NLE should offer a flawless external monitor preview without relying on expensive third-party cards, as long as the computer has a reasonably powerful graphics card with a second HDMI or DVI port, because the capability is certainly there. In Edius, for example, you can choose DirectDraw Overlay in the preferences, and with the card's Theater Mode you get a great preview of the footage playing in the NLE, including interlaced footage. It's not a perfect solution, because it forces Windows into basic mode and the hideous basic interface, but at least it's possible.
In Vegas, however, accurate preview of interlaced footage on the secondary monitor is not possible, despite what Sony claims in the "Video Editing" section of the Vegas Pro 9 page on its website:
Enhanced Video Monitoring
"Next-generation monitoring tools allow full screen timeline playback to LCD and CRT secondary displays via component or DVI connections, with support for scaling, de-interlacing, and color profiles."
Well, either this is a case of seriously misleading advertising, or I'm doing something terribly wrong, but I don't think I am, because I have tried several different things to preview interlaced footage properly on my TV set, which is connected to the second DVI output of my HIS ATI Radeon 4850 graphics card, and nothing works. Sure, it displays the footage, and if it were natively progressive footage I wouldn't have a problem with it, but it's 29.97 fps interlaced footage, and I hate seeing it as progressive when it isn't. It's as annoying as watching widescreen movies in 4:3 pan & scan like in the old days.
I would really like to find out what's causing this. Perhaps nothing can be done unless SCS fixes it, but perhaps somebody who knows about the inner workings of Vegas, graphics cards and Windows video rendering can figure something out. I have tried several different things and nothing works. No matter what deinterlace method is set in the project settings, or whether deinterlacing is enabled in the preview device options, Vegas still displays both fields at the same time on the secondary display, while it displays only one field in the preview window inside Vegas' own GUI. Let me show you what I'm talking about. In footage I shot last weekend I found a frame that illustrates this well: I was lifting the tripod with the camera to move it from one place to another, so the footage is very shaky. The frame shows big black curtains, partially open, which gives good contrast for seeing the problem. I uploaded photos of the screens rather than screenshots to show properly what I'm talking about. In all of them the preview quality is set to Best and Full.
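In case "both fields at the same time" sounds abstract, here is a little sketch in Python/numpy (made-up frame data, nothing to do with Vegas' internals) of what an interlaced frame actually is: two fields, captured roughly 1/59.94 s apart, woven together line by line. Showing that woven frame as a single picture is exactly what produces the ghosting in the photos below.

import numpy as np

H, W = 1080, 1920
# two fields, captured about 1/59.94 s apart (hypothetical random data)
field_a = np.random.randint(0, 256, (H // 2, W), dtype=np.uint8)  # upper field, time t
field_b = np.random.randint(0, 256, (H // 2, W), dtype=np.uint8)  # lower field, time t + 1/59.94

# weave them into one 29.97i frame: even lines from one field, odd lines from the other
frame = np.empty((H, W), dtype=np.uint8)
frame[0::2] = field_a
frame[1::2] = field_b
# anything that moved between the two fields now appears twice in "frame",
# slightly offset -- that's the double image you see in the curtains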
Note: when I first posted this message I used image tags, but the images would not display, so I changed them to links.
This is the photo I took of the computer monitor, showing the preview area in the Vegas GUI:
http://imgur.com/T0DXb.jpg
And this is the photo I took of the TV set, which is connected to the second output of the graphics card (HIS ATI Radeon 4850, 1 GB of RAM), with deinterlacing unchecked in the preview device options and the card set to 1920x1080 at 60 Hz in the ATI control panel:
http://imgur.com/8KNTU.jpg
As you can see, it shows the two fields at the same time (note the ghosting in the curtains). This wouldn't be a problem if, during playback, it showed the fields in sequence, or at least deinterlaced properly with bob or some other method, but it doesn't. It shows interlaced video in a weird way that makes it look progressive (there's a rough sketch of what I mean by bob right after the next photo). You might ask: why not check the deinterlace option in the preview device settings? Well, I did, and here's what the frame looks like:
http://imgur.com/zg8h1.jpg
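For comparison, this is roughly what I mean by bob deinterlacing; just a sketch in numpy under my own assumptions (grayscale frame, top field first), not a claim about how Vegas or the ATI driver implement it. Each field is pulled out and line-doubled into its own full frame, so every displayed picture contains only one moment in time, at 59.94 pictures per second.

import numpy as np

def bob_deinterlace(frame, top_field_first=True):
    # split one woven interlaced frame into two line-doubled progressive frames
    top, bottom = frame[0::2], frame[1::2]
    first, second = (top, bottom) if top_field_first else (bottom, top)
    return np.repeat(first, 2, axis=0), np.repeat(second, 2, axis=0)

Something along these lines is what every decent player I mention below appears to do when it gets interlaced material.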
Some of you may think this happens because the card is set to 60 Hz (meaning 1080p), but that output is normally set that way, and I play all sorts of interlaced content from the computer on this TV set and it displays properly. For example, I watch recorded TV shows from Media Center, all of which come from an interlaced broadcast signal, and some of which are interlaced all the way from the cameras to the final product, such as Saturday Night Live. I also play clips from my AVCHD cameras, which are mostly interlaced, from Media Center, Media Player Classic, Windows Media Player 12 or Splash Lite, and they also display properly on my TV set.
Now, if I switch the signal for the TV set from 60 to 30 Hz in the ATI control panel, playback in Vegas is a bit smoother, but it's still not interlaced; it still makes interlaced video look like weird progressive video. Here's 30 Hz selected in the ATI control panel without "apply deinterlace filter" checked:
http://imgur.com/V0WC3.jpg
And here's 30 Hz with "apply deinterlace filter" checked:
http://imgur.com/GUGXh.jpg
As you can see, it barely makes any difference. Whether the signal selected in the graphics card control panel is 1080p or 1080i, Vegas still shows the two fields at the same time. All the supposed deinterlacing filter does is remove the comb effect. It seems to me that Vegas' external preview module is poorly written and the deinterlacing filter is a toy: it merely applies deinterlacing to a signal that already arrives as progressive frames containing both fields at once. Either way, at 60 or 30 Hz, with or without the deinterlacing filter, Vegas shows the two fields at the same time.
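My guess, and it's only a guess, is that the "apply deinterlace filter" option simply blends the two fields of the already-woven frame, something like the sketch below, which is why the comb teeth disappear but the picture still mixes two moments in time, exactly the soft double exposure in the photo above.

import numpy as np

def blend_deinterlace(frame):
    # average the two fields of an already-woven frame (my guess at what the
    # filter does, not documented behaviour), then line-double back to full height
    top = frame[0::2].astype(np.uint16)
    bottom = frame[1::2].astype(np.uint16)
    blended = ((top + bottom) // 2).astype(np.uint8)
    return np.repeat(blended, 2, axis=0)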
To make things even more interesting, Vegas can show a decent preview that looks much like the original, but only if you convert the footage to 59.94 fps progressive and set the project to the same frame rate. When it's played back it looks, to the eye, the same as 1080i footage played with the proper field order, and it doesn't show the double-field problem, whether the graphics card is set to 30 Hz or 60 Hz. Here's a photo of the TV set with the footage converted to 59.94 fps progressive and the graphics card set to 30 Hz for the TV set (it looks the same at 60 Hz):
http://imgur.com/ceXAp.jpg
As you can see, the double field effect is gone.
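The numbers explain why the 59.94p conversion works: 29.97 interlaced frames per second carry 2 x 29.97 = 59.94 fields per second, so rendering to 59.94 fps progressive just promotes each field to its own frame and no picture ever contains two moments in time. A toy version of that conversion (again only a sketch; "clip" is a hypothetical iterable of woven frames, top field first assumed):

import numpy as np

def to_5994p(clip, top_field_first=True):
    # 29.97i in, 59.94p out: every field becomes a full line-doubled frame
    for frame in clip:
        fields = (frame[0::2], frame[1::2])
        if not top_field_first:
            fields = fields[::-1]
        for field in fields:
            yield np.repeat(field, 2, axis=0)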
Just in case you wonder, the deinterlace method in the project settings has no influence on this at all: all three choices look exactly the same on the second monitor. It's also not a matter of monitor brand or connection type. That's easy to test: if you change the display adapter from 2 to 1 in the Preview Device preferences, so that the video preview shows full screen on your main monitor, it looks just the same as on the TV set, double-field effect and all.
I also found out, after I took the photos, that the only way to make the double-field effect disappear on the second monitor is to set the preview quality to one of the Draft modes. The only one that makes sense is Draft Full, which is not really full but half size, and it effectively removes one field: with the cursor stopped the double-field effect is gone, but during playback it still shows the interlaced footage as progressive, obviously because one field has been thrown away.
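That behaviour would make sense if Draft Full simply drops every other line before sending the picture out, which amounts to keeping one field and discarding the other; just my guess at what happens, sketched below.

def draft_half(frame):
    # half vertical resolution by keeping only the even lines, i.e. one field;
    # the second moment in time is simply gone, so no ghosting, but no interlacing either
    return frame[0::2]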
So I wish I knew exactly what the problem is. For one thing, Vegas doesn't seem to use any of the DirectX video rendering engines such as Overlay, VMR 7, VMR 9 or EVR, because any video that goes through those engines can have its picture properties tweaked in the Avivo Video section of the ATI control panel. See for yourself if you have the Catalyst Control Center: select the All Settings tab in the Avivo Video section and try to change brightness, contrast, tint, saturation or any of the other controls there, including the deinterlacing method. They affect video played from any regular player, but they don't change a thing in the signal Vegas sends to the second monitor when external monitor preview is enabled. To change the picture settings for that output, you have to select the second monitor in the Desktop Properties section, choose Properties, and use its Color tab.
So the way Vegas sends its signal to the secondary monitor is, it seems to me, not going through DirectX video rendering at all. There are plenty of video players out there, some better than others, but one thing they all have in common is that you can change the picture settings of the video being played from the Avivo Video section of the ATI control panel (as long as "Use application settings" is unchecked, of course), while you can't do that with Vegas.
But even that wouldn't be a terrible problem if Vegas sent the external monitor a signal with the fields in the right order, instead of displaying both at the same time, which causes a weird effect that is neither interlaced nor progressive video. It's more like blurry progressive video, the result of showing the two fields at once.
Besides showing that the monitoring engine for interlaced video is very poor, this makes me wonder whether Vegas' drivers for the other preview devices are of the same quality. I would consider spending $190 on an Intensity Pro card if I knew for sure it would give me a perfect interlaced preview, but I don't, and nobody seems to know. I don't want to pay for the card and the shipping and then waste more money on shipping to return it.
If anybody wants to test this, I recompressed a small part of the footage as AVCHD; drop it on your Vegas timeline, compare, and tell me whether you see the same things I see or something different. The file is only 11 MB and it's here: http://www.mediafire.com/?dk3jyxjetm0
So, can anybody offer any insight on this? A workaround, a hack, anything? At minimum, I think Sony should remove that paragraph from the Vegas information page, because it's very misleading. To say "Next-generation monitoring tools allow full screen timeline playback to LCD and CRT secondary displays via component or DVI connections, with support for scaling, de-interlacing, and color profiles" when it clearly isn't like that is pure misleading advertising, and laughable, unless they add in the same paragraph that the monitoring tools only work for progressive video such as 24p or 30p.