In this report, a researcher claims that Vista will reduce the video quality on playback devices that don't comply with HDCP. That includes those $600 video cards you have that output video over component or DVI.
The articles just mentioned BD/HD DVD discs, not video on the hard drive, so perhaps it could still play HD with no problem (my Win2K box with an ATI 9600 plays HD just fine). It doesn't say anywhere that the producer has an option; it's an "our way or the highway" kind of thing (like the PS3 component/HDMI quality "issues").
Monitors are higher res than HD allows (at least mine is... 1600x1200), so there's no reason not to have the highest quality.
But if users can't tell the difference between 720p and 1080p, then why do people care about 1080p? Because people can tell the difference. :)
I don't know who'd want to watch video on their PC anyway... 30-50" TVs vs. 17-20" monitors. :? (Plus interlaced looks like crap no matter what you try; progressive ALWAYS looks better.)
The two tokens that come into play are the ICT (Image Constraint Token) and the DOT (Digital Only Token). As I understand it there are a variety of token possibilities, but the ICT is the one that down-rezzes output to 960x540; the DOT, on the other hand, can deny playback altogether.
Studios are expected to implement the tokens gradually as SD analog hardware is phased out.
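Just to make the token logic concrete, here's a minimal sketch in Python of the decision a compliant player would be making. Every name here (allowed_output, protected_digital, etc.) is invented for illustration; the real AACS/Vista implementation isn't public.

```python
# Hypothetical sketch of the ICT/DOT logic described above.
# All names are invented for illustration; the actual
# AACS/Vista implementation is not public.

FULL_HD = (1920, 1080)
CONSTRAINED = (960, 540)  # what the ICT down-rezzes unprotected output to

def allowed_output(ict, dot, protected_digital):
    """Return the resolution a compliant player may send to an output,
    or None if playback over that output must be denied entirely."""
    if protected_digital:
        return FULL_HD      # HDCP-protected path: tokens impose no constraint
    if dot:
        return None         # Digital Only Token: analog/unprotected output denied
    if ict:
        return CONSTRAINED  # Image Constraint Token: down-rez to 960x540
    return FULL_HD          # no tokens set: full quality even over analog

# e.g. an ICT-flagged disc played over component (analog) video:
print(allowed_output(ict=True, dot=False, protected_digital=False))  # (960, 540)
```

The key point is that the constraint is per-output: the same disc would play at full resolution over an HDCP link but at quarter resolution (960x540) over component.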