Is there any particular reason why Vegas' WMV options won't let me render an interlaced file? I mean as far as I know WMV supports interlaced mode, and Vegas supports interlaced for other codecs... just curious why it's not an actual option.
While the Windows Media Encoder has a deinterlace option, that is a preprocessing option only. I may be wrong, but I believe all WMV files are progressive.
I am using an IO Data AVeL LinkPlayer 2, which plays WMV HD. My television doesn't support 720p; for HD it only accepts 1080i. I have rendered HD videos to 1080i WMV from Vegas and played them back on the LP2 (as far as I know it isn't deinterlacing, but I could be wrong). I don't remember all the settings, but there was a checkbox that essentially said to maintain the source file's properties. It's also important to get the PAR (pixel aspect ratio) correct. For the LinkPlayer, the bitrate needed to stay at about 8 Mbps max.
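As a side note on why the PAR matters: many HD formats store non-square pixels, so the stored frame size alone doesn't determine the displayed shape. A minimal arithmetic sketch (illustrative only, not actual Vegas settings; the 1440x1080 / 4:3-PAR figures are the common HDV example, not taken from the post above):

```python
def display_aspect_ratio(width, height, par):
    """Display aspect ratio = storage aspect ratio * pixel aspect ratio."""
    return (width / height) * par

# Example: 1440x1080 HDV uses a PAR of 4/3, so it displays as 16:9.
dar = display_aspect_ratio(1440, 1080, 4 / 3)
print(round(dar, 4))  # 1.7778, i.e. 16:9
```

If the PAR is left at 1.0 for such material, the player shows a squeezed 4:3-ish image, which is the kind of error the checkbox and PAR settings above are meant to avoid.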
But now I've quit using WMV because the LinkPlayer2 handles playback of .m2t files (and does a nice job of it).
I have the Xoro 8500 here; based on its Sigma 8620 chip, this player can play back .m2t files and MPEG program streams, but also WMV-HD and DivX-HD.
DivX-HD has some advantages over WMV-HD: rendering is much faster (about 4x realtime on a P4 3.2 GHz, versus about 20x realtime with WMV-HD), and DivX-HD can be kept interlaced, which is not possible with WMV-HD when rendering from the Vegas timeline. So from that perspective I like DivX-HD much more.
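To make the speed claim concrete: "4x realtime" here means the encode takes roughly four times the clip's duration, and "20x realtime" twenty times. A trivial sketch with a hypothetical 10-minute clip (the clip length is my example, not from the post):

```python
def encode_minutes(clip_minutes, realtime_factor):
    """Estimated encode time when encoding runs at factor-x realtime (slower than realtime)."""
    return clip_minutes * realtime_factor

clip = 10  # hypothetical 10-minute clip
print(encode_minutes(clip, 4))   # DivX-HD: 40 minutes
print(encode_minutes(clip, 20))  # WMV-HD: 200 minutes
```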
Adjusting the DivX 6.x codec in Vegas to get correct results is a little tricky. Screenshots showing how it can be done (without optimizing the codec) can be found here (the text is in German, but there are full screenshots for 1080 50i; adjust that to 1080 60i if you need it):