I've been shooting 1080i with my FX1 cameras for a few years now, and making SD DVDs. I don't know if there is such a thing as a generic "best" way for all purposes. However, Vegas does a reasonable job of downconverting. I edit at 1080i resolution and then render out a 720x480 (widescreen) MPEG2 in Vegas; this works OK and is the most convenient option.
There are several different ways to convert 1080i to standard-def DVD (which could be 480i or 480p). If you want to tweak the details, like the nature of the motion blur on the downconverted material, you can export the 1080i from Vegas using the DebugMode FrameServer and then do the downconversion in AviSynth. Here is a script I have used to do this: http://www.bealecorner.com/trv900/DVD/DownconvertHDV-1080i-to-480i.avs.txt
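In case that link ever goes dead, here is a bare-bones sketch of the same general idea (this is not the actual script at that URL, and the frameserved file name below is just a placeholder): an interlace-aware resize that scales each field separately so the two fields never get mixed together.

AviSource("frameserve.avi")  # signpost AVI written by the DebugMode FrameServer (placeholder name)
AssumeTFF()                  # HDV 1080i is top-field-first
SeparateFields()             # split into fields so the resize never blends them
LanczosResize(720, 240)      # 720x240 per field = 720x480 per frame (anamorphic widescreen)
Weave()                      # re-interleave the fields into 480i frames

A careful script would also handle the small field-position offset during the resize and any color-space conversion, so treat this as a starting point rather than a finished recipe.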
I don't currently do it this way, simply because for most of my projects the Vegas built-in downconvert is good enough, easier, and significantly faster.
I posted this before but it did not show up so here goes again:
Your camera will almost certainly allow you to capture as widescreen DV, which is one option. But the most common workflow, and the best one I think, is to finish your project in HDV and then render out to the "DV Widescreen for DVDA" template and take the rendered file to DVDA.
Ulf, where will your DVD be shown? If it's computer display only, you'll likely want to convert 60i to 30p for the best-looking result. If you're going to be showing mostly on televisions, are they CRTs, or LCDs/other potentially progressive displays?
There are a few tricks in either direction, which you've probably realized already.
Spot's point is a very good one: you can get better picture quality by optimizing your process for the intended display.
In my case my customers generally end up doing "all of the above" with the DVD: for example, they watch it on their parents' old interlaced CRT TV, on their own TV (which may or may not be an LCD flat panel), and on their laptop and desktop PCs. So it can be hard to know how to optimize. My strategy has been to leave material that was shot interlaced in that format, on the theory that a good progressive display should have a good deinterlacer, and if it doesn't, that's not my fault; they should buy a better one :-). However, it's true that progressive material compresses better in the MPEG process, other things being equal, and that can be significant if you've got two hours or more to put on a single DVD.
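If someone does decide to target progressive displays and takes the frameserver/AviSynth route mentioned earlier, a very rough sketch of the 60i-to-30p step might look like the lines below (purely an illustration on my part; the file name is a placeholder, and a dedicated deinterlacer plugin will do a noticeably better job than the built-in bob):

AviSource("frameserve.avi")  # placeholder name for the frameserved clip
AssumeTFF()                  # HDV 1080i is top-field-first
Bob()                        # turn each field into a full frame (60p)
SelectEven()                 # keep every other frame -> 30p
LanczosResize(720, 480)      # scale the progressive frames to anamorphic SD

The point is just that the deinterlace happens before the resize, so the SD MPEG2 encode gets clean progressive frames to work with.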
At a high level: keep your project properties at the HDV level; set quality to Best (you will be downsizing later); and set the deinterlace method to either Blend or Interpolate (Interpolate for fast-moving footage, otherwise Blend).
Check your project's audio tab; the out-of-the-box default, at least for me, has always been 16-bit, 44.1 kHz. Set that to 48 kHz and check the "start all projects with this setting" box so you don't have to mess with it again.
When editing is complete, render to MPEG2 and select the DVDA DVD Widescreen template at Best quality; then render the AC3 separately, as normal.
If you downconvert during capture from the camera instead of in Vegas (I have the FX1; not sure if the FX7 has this option), you lose all the extra detail that HDV capture provides for zooming.
Edited: by the way, a general Vegas MPEG2 encoding comment: I've been using an 8 Mbps constant bit rate exclusively where possible (for projects of less than an hour or so of footage). Phenomenal results.
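Rough math on why that bit rate limits you to about an hour (back-of-the-envelope, approximate numbers): a single-layer DVD holds about 4.38 GiB, roughly 37,600 megabits. At 8 Mbps for video plus roughly 0.2 Mbps for AC3 audio, that's about 37,600 / 8.2 ≈ 76 minutes of program, so a fixed 8 Mbps only works out for projects of an hour or so; anything longer needs a lower (typically variable) bit rate.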
When I decided I needed a new NLE this past summer, especially for HDV work, I tried the demos of EDIUS 4 and Vegas 7, and I ended up buying Vegas 7. But I did do a 1080i HDV project in both of them and downconverted with their respective tools to widescreen DVD. I found that the EDIUS 4 downconversion still looked like interlaced video to me, but the Vegas 7 downconversion, using the DVDA template, looked as if it had been deinterlaced in the process. Is there a default I need to change to keep it from doing that? I haven't actually done another HDV project since (I shoot mostly widescreen DV), but I will in the future, and I'm curious whether there's a simple setting I missed. One thing I did notice in the project properties: I think deinterlace was set to frame blend by default, but since I didn't tell it to deinterlace, I didn't think it would have done so. Is that what happened?