Here's what I just don't get. On my system (and let's ignore whether it's right, wrong, or plagued with every virus known to man), when I have Vegas play out vanilla DV, no FX, nothing requiring rendering, why does the preview quality make a difference to the frame rate?
Assuming I'm monitoring at native res with Simulate Device Aspect, then OK, somewhere some pixel remapping has to go on to adjust for the non-square DV pixels, yet it makes not a tad of difference whether that's set on or off.
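Just to make that remapping concrete, here's a rough back-of-the-envelope. The frame size and pixel aspect ratio are the standard NTSC DV numbers; the rounding is just for illustration, and nothing here is a claim about how Vegas actually does it:

```python
# Rough arithmetic for what "Simulate Device Aspect" implies for NTSC DV.
frame_w, frame_h = 720, 480       # NTSC DV frame in storage pixels
par = 0.9091                      # non-square pixel aspect ratio for NTSC DV

display_w = round(frame_w * par)  # width once remapped to square display pixels
print(f"{frame_w}x{frame_h} DV displays as roughly {display_w}x{frame_h}")
# -> 720x480 DV displays as roughly 655x480
```

So even at "native" res, something has to squeeze 720 columns into about 655, which is why I'd expect the toggle to matter at least a little.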
But here's where it gets really weird. If I reduce the size of the preview window, the frame rate goes UP! Doing that should mean Vegas has to do more work, since there's no longer a one-to-one correspondence between the source pixels and those being displayed.
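For what it's worth, this is the kind of resample I imagine has to happen once the window isn't 1:1 with the source. It's a bare nearest-neighbour sketch, purely to show where the per-pixel work sits, not a claim about Vegas internals:

```python
# Sketch of a nearest-neighbour scale from a source frame to a preview window.
def nearest_neighbour_scale(src, src_w, src_h, dst_w, dst_h):
    """src is a flat, row-major list of pixels; returns a scaled copy."""
    dst = []
    for y in range(dst_h):
        sy = y * src_h // dst_h          # map output row back to a source row
        for x in range(dst_w):
            sx = x * src_w // dst_w      # map output column back to a source column
            dst.append(src[sy * src_w + sx])
    return dst

# The loop does one lookup per *output* pixel, e.g. a half-size preview of a
# 720x480 frame touches 360 * 240 = 86,400 output pixels per frame.
frame = [0] * (720 * 480)
half = nearest_neighbour_scale(frame, 720, 480, 360, 240)
print(len(half))  # 86400
```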
Not really a big issue for me, but I'm now getting the sort of clients who like to watch, and this even happens with an external monitor. I'm not asking for anything to be fixed, just trying to see why it is so.