Windows secondary display as display device

Sebaz wrote on 12/14/2008, 9:53 AM
Despite Vegas' many flaws, one thing I give it high marks for is the ability to use the secondary monitor as a display device. This allows me to connect my second DVI port to my TV set using a long DVI-to-HDMI cable. As far as I know, neither FCP nor Premiere has that, and it's a great way to check levels, color correction, and any other filter without having to spend money on more expensive monitoring solutions.

One question I have regarding this: even while editing 1080i, my output on the secondary display is never true 60i; it looks more like 30p, but choppy. I think Vegas strips out one field even if you're using a Full preview setting. It's still useful for previewing purposes, but I would prefer to see the footage at full 1080i.
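To see why dropping a field reads as low-quality 30p, here is a minimal Python/numpy sketch (an editor's illustration; the interleaved frame layout and names are assumptions, not Vegas internals):

    import numpy as np

    # A hypothetical 1080i frame buffer: even rows carry the top field,
    # odd rows the bottom field, captured 1/60 s apart.
    frame = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)

    top_field = frame[0::2]     # 540 rows sampled at time t
    bottom_field = frame[1::2]  # 540 rows sampled at time t + 1/60 s

    # True 60i display presents both fields: 60 motion samples/second.
    # A field-dropping preview keeps one field and line-doubles it:
    # 30 motion samples/second at half the vertical detail.
    preview = np.repeat(top_field, 2, axis=0)  # 540 -> 1080 rows

    assert preview.shape == frame.shape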

Is anybody else experiencing the same issue? My graphics card is a piece of junk, an eVGA 7950 GT, which has terrible performance when it comes to HD, or even SD, since video stutters quite often. I tried enabling and disabling "Apply deinterlace filter" in the settings, but the output looks exactly the same.

Comments

TheHappyFriar wrote on 12/14/2008, 10:05 AM
You're right: the Preview quality setting only does half resolution. Good does full resolution, but it will be slower.

Your GPU won't help with this either, since the frames come straight out of Vegas. To change the quality via the video card's control panel, you'll need to edit the settings for monitor 2, not the TV-out settings; those are for the S-Video/RCA hookups.
Sebaz wrote on 12/14/2008, 10:22 AM
You're right: the Preview quality setting only does half resolution. Good does full resolution, but it will be slower.

No, I wasn't talking about the Preview quality as opposed to Good or Full. Even if I set it to Best and Full, Vegas still takes one field away and playback is choppy.

As for the quality of the second monitor, I adjust it using the Nvidia control panel, and whether I set it to 1080i or 1080p makes no difference as far as this issue goes.
TheHappyFriar wrote on 12/14/2008, 10:34 AM
I don't think it's Vegas... I've hooked my ATI up to my parents' 720p HD TV and it always previews fine (no dropping of fields). It could very well be the video card or the TV.
Sebaz wrote on 12/14/2008, 10:50 AM
I don't think it's Vegas... I've hooked my ATI up to my parents' 720p HD TV and it always previews fine (no dropping of fields). It could very well be the video card or the TV.

I may try an ATI at some point in the future. I'm very disappointed in Nvidia chips after spending $200 on this card two years ago, only for it to turn out to be a piece of junk. I bought it because it came with HDCP and I thought I would need that for whenever I got a Blu-ray burner, but also because I figured that having HDCP meant it was built for smooth HD playback; otherwise the HDCP would be pointless. However, HD playback is terrible on this card: it plays a few frames normally, then drops one field for about two seconds, then goes back to normal, then back to dropping fields, and so on. A total piece of junk. Next time I buy a card it will be an ATI.
farss wrote on 12/14/2008, 1:08 PM
Your video stuttering problem is not the fault of the video card. If your PC and NLE cannot feed it frames fast enough, of course it's going to stutter.
The other problem I see is the refresh rate of the video card.

If you want a feed to an external monitor and don't want to use HD-SDI, then the BMD Intensity card, which outputs HDMI, might suit your needs.

The nVidia Quadro FX series cards are pretty much the standard for graphic artists and compositors. I've got three systems with them and I'm very happy with them. The top-of-the-line ones are expensive.

Bob.
Sebaz wrote on 12/14/2008, 2:39 PM
Your video stuttering problem is not the fault of the video card. If your PC and NLE cannot feed it frames fast enough, of course it's going to stutter.

I have a 2.66 GHz quad core with 4 GB of RAM on an Intel D975BXB2 motherboard. Hardly the fastest system these days, but more than enough to feed frames to the graphics card fast enough. The refresh rate is set automatically by the Nvidia control panel when I select 1080p or 1080i for the secondary monitor.
farss wrote on 12/14/2008, 3:01 PM
Vegas will drop frames for many reasons.
For example, if you're zoomed in on the timeline (T/L), the redrawing as the T/L scrolls can cause a frame or two to drop.
Vegas uses pretty much none of the hardware on the video card, so a better video card is very unlikely to fix your problems.
Getting rid of any background processes stealing CPU cycles could help (see the sketch below). Turning off the video scopes' real-time update and zooming the T/L right out are the kinds of things that make a difference, from what I've noticed.
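As a quick way to spot such processes, something like the following Python sketch works (an editor's illustration using the third-party psutil package, not anything built into Vegas; the 5% threshold is an arbitrary assumption):

    import time
    import psutil  # third-party: pip install psutil

    # First call primes each process's CPU counter; the second call,
    # after a delay, returns utilization over that interval.
    for p in psutil.process_iter():
        try:
            p.cpu_percent(None)
        except psutil.Error:
            pass
    time.sleep(1.0)

    hogs = []
    for p in psutil.process_iter(['name']):
        try:
            pct = p.cpu_percent(None)
        except psutil.Error:
            continue
        if pct > 5.0:  # arbitrary cutoff for "stealing cycles"
            hogs.append((pct, p.info['name']))

    # Worst offenders first: candidates to shut down while previewing.
    for pct, name in sorted(hogs, reverse=True):
        print(f"{pct:5.1f}%  {name}")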

Bob.
TheHappyFriar wrote on 12/14/2008, 3:06 PM
I was doing HD playback on my AMD XP 1800 with Win2k and an ATI 9600 AIW. Something's messed up with your system.

Some GPUs only play back HD well if it's in specific codecs, i.e. WMV or QuickTime. Sometimes the system just can't push the frames through for some reason: swap file, antivirus, etc. Normally, if the swap file is on a separate drive from the HD video, it's not an issue. I can play back HD MPEG-2s from my camera with no issues. I've got a slower system than you, an AMD Phenom 2.3 GHz, and it still runs fine, so it's some piece of software or a setting on your system slowing you down.
Sebaz wrote on 12/14/2008, 4:04 PM
It's not the system, it's the card itself that's a piece of junk. Some time ago I searched online and found several reports from people who experienced the same thing. They switched to another model in the 8 series and the problem was gone, but I'm not going to reward Nvidia's or eVGA's pathetic hardware quality by giving them more money, even if they have another model that's good. I'll switch to an ATI next time.

But going back to the issue I posted about, I'm not sure the video card is causing that field stripping when editing in Vegas. While my video card is a piece of junk, when I play HD video from other players, whether it's Windows Media Player, GOM, or PowerDVD, it doesn't play back as badly as it does from Vegas.
farss wrote on 12/14/2008, 6:25 PM
I'll try again.

Vegas has to decode every frame of video using the CPU, not the GPU. It's doing more work than WMP etc. because it's also updating the timeline: the playhead, thumbnails, VU meters, and waveforms. Plus, it will not use any of the GPU's power to do that. As far as I know it possibly doesn't even use any of the enhanced instructions in the latest CPUs to speed things along. Of course, if you've got any audio FX it has to compute them as well. And if your audio's sample rate doesn't match your project's sample rate, it has to resample the audio too (see the sketch below).
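To put a number on that last point, here is a rough Python sketch of the extra work a sample-rate mismatch creates, using scipy's polyphase resampler purely as a stand-in; Vegas' internal resampler isn't documented here, so the library choice and figures are illustrative assumptions:

    import numpy as np
    from scipy.signal import resample_poly

    # One minute of stereo audio recorded at 44.1 kHz...
    src_rate, dst_rate = 44_100, 48_000
    audio = np.random.randn(src_rate * 60, 2)

    # ...dropped into a 48 kHz project: every output sample must be
    # recomputed by a polyphase filter. 48000/44100 reduces to 160/147.
    resampled = resample_poly(audio, up=160, down=147, axis=0)

    print(resampled.shape[0])  # ~2,880,000 new samples, all CPU work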

On top of all that, Vegas uses the very old VfW (Video for Windows) interface, so no hardware accelerator will work with Vegas either.

So whether your video card costs $100 or it's a $5K nVidia CUDA system, it'll make no difference to Vegas. The one and only thing that'll make Vegas go faster is CPU speed and, depending on what type of video you work with, a big RAID array. The latter applies to any NLE; the rest, more and more, applies pretty much only to Vegas.
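To give a sense of when the RAID matters, here is a back-of-the-envelope calculation in Python (an editor's illustration; the bit rates are approximate ballpark figures, not from this thread):

    # Rough data rates, to show why disk speed matters for some
    # formats and not others.

    hdv_mbit_per_sec = 25                  # HDV: ~25 Mbit/s MPEG-2 stream
    hdv_mb_per_sec = hdv_mbit_per_sec / 8  # ~3 MB/s: any single drive copes

    # Uncompressed 8-bit 4:2:2 1080: 1920 x 1080 px, 2 bytes/px, ~30 fps
    uncomp_mb_per_sec = 1920 * 1080 * 2 * 30 / 1e6  # ~124 MB/s

    print(f"HDV:          ~{hdv_mb_per_sec:.0f} MB/s")
    print(f"Uncompressed: ~{uncomp_mb_per_sec:.0f} MB/s")  # RAID territory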

Bob.
kairosmatt wrote on 12/14/2008, 7:02 PM
Bob,

Thanks for the tips about the T/L zoom and real-time scopes affecting playback. I'll be trying that out tomorrow. Learn something new every time I check in here...

kairosmatt