Analogue broadcasting around the world is regarded as broadcast quality where you receive a strong, uncorrupted signal (typically through free space) that can be digitally represented at 720x480 (576 for PAL), with 4:2:2 color subsampling and no spatial or temporal compression.
These days you could be looking at an MPEG-2 2Mbps transport stream, or even a 1Mbps MPEG-4 or 850kbps Windows Media Series 9 public trial.
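To put those transport-stream bitrates in perspective, here's a rough back-of-the-envelope calculation of the uncompressed data rate implied by the 720x480, 4:2:2, no-compression definition above. The 8-bit depth and 29.97fps NTSC frame rate are my assumptions, not from the thread:

```python
# Rough uncompressed bitrate for 720x480, 4:2:2 SD video.
# Assumptions (not stated above): 8 bits per sample, NTSC 29.97 fps.
width, height = 720, 480
bits_per_pixel = 16          # 4:2:2 averages 2 samples/pixel at 8 bits each
fps = 30000 / 1001           # NTSC frame rate, ~29.97

bits_per_second = width * height * bits_per_pixel * fps
mbps = bits_per_second / 1e6
print(f"Uncompressed 4:2:2 SD: {mbps:.0f} Mbps")        # roughly 166 Mbps

# Compare against a 2 Mbps MPEG-2 transport stream:
ratio = mbps / 2
print(f"Compression ratio vs 2 Mbps: about {ratio:.0f}:1")
```

So a 2Mbps MPEG-2 stream is throwing away well over 98% of the raw data, which is why encoder quality and source-signal quality matter so much.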
Broadcast quality (or higher) lives within the studio. Some productions need lightweight cameras in quantity or have low budgets, so MiniDV can even be the compromise studio/outside-broadcast format.
Digital technology has led to many not-so-obvious short-cuts in TV production.
It has also led to many historical events being recorded in HDTV/ATSC formats, in part because the move is mandated by FCC regulations, and in part to create the best possible archive format for the future to delve into.
Thanks for the info, but what I basically want to know is whether there would be a big noticeable difference in quality in my own production between MiniDV and something I captured. I suppose the real answer is - it depends...
"It depends" is probably the best answer possible. I've got TimeWarner cable, and last night I tried capturing "Simpsons" on Fox 33 and "Star Trek Enterprise" on WPNY using the A/V outs on my VCR going into a Sony DVMC-DA2, then FireWire into VidCap. Both looked fine on the TV. However, while the "Simpsons" came out crystal clear even after MPEG encoding, "Star Trek" was very grainy, dark, and pixellated even as DV AVI. Apparently there is a wide range of signal quality that may or may not show up on the TV but can influence the capture process.