OT; VISTA and MS issues-----NYTimes

apit34356 wrote on 3/8/2008, 5:13 PM
In the NYTimes, an article about whether Vista is "ready" for general public use as promoted.
----------------------------------------------------------------------------------------------------------------
They Criticized Vista. And They Should Know.
By RANDALL STROSS
Published: March 9, 2008

ONE year after the birth of Windows Vista, why do so many Windows XP users still decline to “upgrade”?

Microsoft says high prices have been the deterrent. Last month, the company trimmed prices on retail packages of Vista, trying to entice consumers to overcome their reluctance. In the United States, an XP user can now buy Vista Home Premium for $129.95, instead of $159.95.

An alternative theory, however, is that Vista’s reputation precedes it. XP users have heard too many chilling stories from relatives and friends about Vista upgrades that have gone badly. The graphics chip that couldn’t handle Vista’s whizzy special effects. The long delays as it loaded. The applications that ran at slower speeds. The printers, scanners and other hardware peripherals, which work dandily with XP, that lacked the necessary software, the drivers, to work well with Vista.

Can someone tell me again, why is switching XP for Vista an “upgrade”?

Here’s one story of a Vista upgrade early last year that did not go well. Jon, let’s call him, (bear with me — I’ll reveal his full identity later) upgrades two XP machines to Vista. Then he discovers that his printer, regular scanner and film scanner lack Vista drivers. He has to stick with XP on one machine just so he can continue to use the peripherals.

Did Jon simply have bad luck? Apparently not. When another person, Steven, hears about Jon’s woes, he says drivers are missing in every category — “this is the same across the whole ecosystem.”

Then there’s Mike, who buys a laptop that has a reassuring “Windows Vista Capable” logo affixed. He thinks that he will be able to run Vista in all of its glory, as well as favorite Microsoft programs like Movie Maker. His report: “I personally got burned.” His new laptop — logo or no logo — lacks the necessary graphics chip and can run neither his favorite video-editing software nor anything but a hobbled version of Vista. “I now have a $2,100 e-mail machine,” he says.

It turns out that Mike is clearly not a naïf. He’s Mike Nash, a Microsoft vice president who oversees Windows product management. And Jon, who is dismayed to learn that the drivers he needs don’t exist? That’s Jon A. Shirley, a Microsoft board member and former president and chief operating officer. And Steven, who reports that missing drivers are anything but exceptional, is in a good position to know: he’s Steven Sinofsky, the company’s senior vice president responsible for Windows.

Their remarks come from a stream of internal communications at Microsoft in February 2007, after Vista had been released as a supposedly finished product and customers were paying full retail price. Between the nonexistent drivers and PCs mislabeled as being ready for Vista when they really were not, Vista instantly acquired a reputation at birth: Does Not Play Well With Others.

We usually do not have the opportunity to overhear Microsoft’s most senior executives vent their personal frustrations with Windows. But a lawsuit filed against Microsoft in March 2007 in United States District Court in Seattle has pried loose a packet of internal company documents. The plaintiffs, Dianne Kelley and Kenneth Hansen, bought PCs in late 2006, before Vista’s release, and contend that Microsoft’s “Windows Vista Capable” stickers were misleading when affixed to machines that turned out to be incapable of running the versions of Vista that offered the features Microsoft was marketing as distinctive Vista benefits.

Last month, Judge Marsha A. Pechman granted class-action status to the suit, which is scheduled to go to trial in October. (Microsoft last week appealed the certification decision.)

Anyone who bought a PC that Microsoft labeled “Windows Vista Capable” without also declaring “Premium Capable” is now a party in the suit. The judge also unsealed a cache of 200 e-mail messages and internal reports, covering Microsoft’s discussions of how best to market Vista, beginning in 2005 and extending beyond its introduction in January 2007. The documents incidentally include those accounts of frustrated Vista users in Microsoft’s executive suites.

Today, Microsoft boasts that there are twice as many drivers available for Vista as there were at its introduction, but performance and graphics problems remain. (When I tried last week to contact Mr. Shirley and the others about their most recent experiences with Vista, David Bowermaster, a Microsoft spokesman, said that no one named in the e-mail messages could be made available for comment because of the continuing lawsuit.)

The messages were released in a jumble, but when rearranged into chronological order, they show a tragedy in three acts.

Act 1: In 2005, Microsoft plans to say that only PCs that are properly equipped to handle the heavy graphics demands of Vista are “Vista Ready.”

Act 2: In early 2006, Microsoft decides to drop the graphics-related hardware requirement in order to avoid hurting Windows XP sales on low-end machines while Vista is readied. (A customer could reasonably conclude that Microsoft is saying, Buy Now, Upgrade Later.) A semantic adjustment is made: Instead of saying that a PC is “Vista Ready,” which might convey the idea that, well, it is ready to run Vista, a PC will be described as “Vista Capable,” which supposedly signals that no promises are made about which version of Vista will actually work.

The decision to drop the original hardware requirements is accompanied by considerable internal protest. The minimum hardware configuration was set so low that “even a piece of junk will qualify,” Anantha Kancherla, a Microsoft program manager, said in an internal e-mail message among those recently unsealed, adding, “It will be a complete tragedy if we allowed it.”

Act 3: In 2007, Vista is released in multiple versions, including “Home Basic,” which lacks Vista’s distinctive graphics. This placed Microsoft’s partners in an embarrassing position. Dell, which gave Microsoft a postmortem report that was also included among court documents, dryly remarked: “Customers did not understand what ‘Capable’ meant and expected more than could/would be delivered.”

All was foretold. In February 2006, after Microsoft abandoned its plan to reserve the Vista Capable label for only the more powerful PCs, its own staff tried to avert the coming deluge of customer complaints about underpowered machines. “It would be a lot less costly to do the right thing for the customer now,” said Robin Leonard, a Microsoft sales manager, in an e-mail message sent to her superiors, “than to spend dollars on the back end trying to fix the problem.”

Now that Microsoft faces a certified class action, a judge may be the one who oversees the fix. In the meantime, where does Microsoft go to buy back its lost credibility?
----------------------------------------------------------------------------------------------------------------

Of course, this is not MS's first time facing a major legal action over its conduct.

Comments

TheHappyFriar wrote on 3/8/2008, 5:37 PM
it's false advertising, just like this post. I thought there was some kind of "underhanded dealing" with MS & VISA, the credit card company. :D (you misspelled vista in the thread subject AND the first line you typed out).

anyway... MS will need to do what Nintendo did when the first batch of NES's were faulty: replace them free of charge. That means either a) buying people new machines or b) paying people back for their OS purchase so it's free & then people won't need to complain about a free OS.

That's how to get the reputation to be good. But that won't happen & that's why there's a class action. :(
apit34356 wrote on 3/8/2008, 5:59 PM
I think it's fair to question whether Vista is as stable an OS as they suggest. The number of internal documents revealing problems keeps increasing, according to the legal reporters covering the MS cases. In summary, maybe VEGAS8's strange issues could be Vista hardware and OS management issues. ( wishful thinking ;-) )
DGates wrote on 3/8/2008, 7:30 PM
This is why the best salespeople that Apple has all work for Microsoft.
Coursedesign wrote on 3/8/2008, 8:00 PM
...including the top chief who oversaw the development of Vista, Jim Allchin, who said he would buy OS X over what became Vista any day.

Let it be noted that Vista's problems came about in spite of Jim Allchin who is a very competent and caring guy.

What happened was that Dick Cheney, er, Steve Ballmer didn't let Jim run the Vista development without interference from junior bureaucrats who were extremely eager to splash their own pee on every lamp post in Vista. End result: 45 ways to turn off a laptop, the infamous UAC that is second only to waterboarding when it comes to unnecessary torture of users (without increasing security, because people just press buttons to bypass it in desperation), and the rest that is the biggest OS disaster in history.

Even Microsoft doesn't think Vista is fixable. They are going full tilt on its successor, Windows 7, and there is reason to believe that they will release W7 as early as next year to save their revenues.

farss wrote on 3/8/2008, 8:41 PM
Some slightly less selective reporting of the story here:
http://www.tgdaily.com/content/view/36279/118/

In summary a lot of the issues were due to Intel's inability to develop the fast graphics chips that they'd promised would be available. M$ seem to have been left with a few unpalatable alternatives.

The thing I find curious is that in the end, although M$ botched this in so many ways, much of it was not of their making. I've still got serviceable hardware that I can barely get to run under Win2K, let alone XP, due to lack of drivers. Linux is a driver disaster area as well. Many apps still will not run under Leopard, and Apple's browser of choice has got so many security flaws Paypal are warning people off it.

The one long-term hope may lie in a research project being run by M$. As they rightly say, every OS today is built on the same underlying core concepts. Pretty well since the dawn of computing we haven't changed the guts of how an OS works. There's some coverage of Singularity here. This is purely a research project, but maybe M$ are finally going to write an OS that gets around what has always been their problem: backwards compatibility. Once you decide to make a clean start, life gets much easier. That's in part why Apple do have some success; forcing users to replace hardware and ditching support for legacy apps gives the developers much more room to work in.

Bob.
Coursedesign wrote on 3/8/2008, 9:36 PM
In summary a lot of the issues were due to Intel's inability to develop the fast graphics chips that they'd promised would be available. M$ seem to have been left with a few unpalatable alternatives.

Fast graphics chips have always been available. You probably mean that MS was expecting Intel's integrated graphics chip prices to drop faster than they actually did, and they built a business around this expectation. This business may end up costing Microsoft hundreds of millions of dollars, but probably worse for them, a major loss of credibility. "Oh, sure I crashed your car real good but Johnny said he'd have it fixed by now. I can't help it if Johnny doesn't keep his word. That's not my fault!"

Apple's browser of choice has got so many security flaws Paypal are warning people off it.

Hoo-hoo-hoo! Since when did Paypal become an authority on computer security?

Just in case you live in ignorant bliss, Paypal has had 5-day outages where no transactions could be made at all. No one could sell anywhere in the world, no merchants could access their money anywhere in the world, and Paypal was saying through the end and beyond that there were only some isolated problems for a few customers, totally denying everything until they got busted by some good detective work.

Their infrastructure is not remotely befitting a major transaction processor. Incompetence coupled with "cheapness" in hardware and software design (without redundancy) has made their downtime orders of magnitude worse than others.

I run both Safari and Firefox on OS X, but I use Firefox primarily for web apps that don't support Safari well (such as Yahoo Mail). Safari on Mac has lots of advantages, although a few missing features compared to FF.

Security flaws? Bull. People have talked about social engineering attacks, but those are not unique to any particular browser.

I'll take Safari over IE anytime when it comes to security.

apit34356 wrote on 3/8/2008, 10:28 PM
The Acid3 test run on browsers has produced some interesting results: IE7=12, Safari=40 and Firefox=50 out of 100. To test your browser: http://acid3.acidtests.org/

Here is the reference article:
-----------------------------------------------------------------------------------------------------------------
Ian Hickson, the Google employee tasked with creating the next generation of acid test, has completed his work, which is now available for public consumption at its new home, acidtests.org. Unlike the first acid test, which focused on the box model, and the second acid test, which covered a broad variety of basic HTML and CSS features, Acid3 covers 100 of the nooks and crannies of HTTP, HTML, CSS, ECMAScript, SVG and XML, all through the medium of DOM scripting, a critical requirement for any modern web application. Ian Hickson is also the primary author of the HTML5 specification, which started life as a spec. called ‘Web Apps 1.0’, and as such has lots of application‐related features such as client‐side storage and enhanced forms. Ian wrote 64 of the tests, with the remaining 36 being submitted by both browser vendors and interested web developers.

Work started on the new acid test almost as soon as the IE developer team posted notification that IE8 passes Acid2. As was widely criticised around the ’net recently, it was revealed Internet Explorer 8 would now only pass the test if the server was modified to output a special HTTP header. It is not known to css3.info at this time whether the header would be required for IE8 to achieve compliance in the new test.
-----------------------------------------------------------------------------------------------------------------
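Acid3's scoring model is simple: 100 independent pass/fail subtests, and the score is the number that pass, with a crashing subtest counting as a fail. A minimal sketch of that kind of scoring harness (the checks below are hypothetical stand-ins; the real Acid3 subtests exercise DOM scripting, CSS, SVG, etc. inside a browser):

```python
# Toy illustration of an Acid3-style scoring harness: run a list of
# independent pass/fail checks and report how many passed out of the total.
# These checks are made-up stand-ins, not the actual Acid3 subtests.

def run_suite(checks):
    """Run each check, counting a pass when it returns True and a fail
    when it returns False or raises (a crashing subtest scores zero)."""
    passed = 0
    for check in checks:
        try:
            if check():
                passed += 1
        except Exception:
            pass  # a throwing subtest simply doesn't add to the score
    return passed, len(checks)

# Hypothetical subtests standing in for browser feature checks.
checks = [
    lambda: sorted([3, 1, 2]) == [1, 2, 3],  # passes
    lambda: "a,b".split(",") == ["a", "b"],  # passes
    lambda: int("not a number"),             # raises -> counts as a fail
]

score, total = run_suite(checks)
print(f"{score}/{total}")  # prints "2/3"
```

This is why a browser's Acid3 result is quoted as "N out of 100": it is literally a pass count, not a weighted grade.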
MH_Stevens wrote on 3/8/2008, 11:26 PM
In my opinion, the problem with Vista is the same problem every Windows version has had since the beginning. It is no more than a graphical representation of DOS. DOS has not evolved. Until Bill Gates throws his baby away we will never see a real advance in OS design.
blink3times wrote on 3/9/2008, 6:42 AM
In my opinion, the problem with Vista is the same problem every Windows version has had since the beginning. It is no more than a graphical representation of DOS. DOS has not evolved.

I must say, I'm running Vista 64 and am quite happy with it. I find it both stable and in general quicker than XP. Now that may be that my XP was the 32 bit version and therefore I'm comparing apples and oranges. But having said all of that.... I think you nailed it pretty good MH_Stevens.

There are only so many graphical versions of DOS that you can put out before they start looking all the same. I think XP was the apex of this program set and from here on in it will be little more than down hill. This whole windows thing is beginning to look like the same movie over again only with a refreshed sound track so to speak. I think "Windows" has run its course and it's time to develop something COMPLETELY different..... or at least it's time to stop ripping off the public with re-runs in a different colored box.
Terje wrote on 3/9/2008, 8:43 AM
It is no more than a graphical representation of DOS. DOS has not evolved.

This is 100% incorrect. There is no DOS in Windows, and there hasn't been since Windows ME. Windows NT, which is the "grandparent" of Vista, was re-written from scratch with no DOS anywhere to be found. In fact, Vista runs DOS programs and Win16 programs in the same way that a Mac or a Linux box runs DOS: under emulation.
blink3times wrote on 3/9/2008, 9:03 AM
This is 100% incorrect. There is no DOS in windows,

He did not mean in a physical sense.... or at least I didn't take it that way anyway. Windows may not use the physical DOS structure anymore, but it still operates on the DOS based principle and has many of the same constraints.

The physical DOS actually started dying in Win98 with the deletion of himem.sys, but if you look, even in XP there are still remnants of DOS (config.sys and autoexec.bat files, for example).
Terje wrote on 3/9/2008, 4:52 PM
Windows may not use the physical DOS structure anymore, but it still operates on the DOS based principle and has many of the same constraints.

No, it doesn't actually. Windows has none of the constraints of DOS, and there is not a single shared "principle" among them. In fact, Windows and DOS have nothing at all in common except for the fact that they were created by the same company. Windows Vista actually shares a lot more with Mac OSX and other Unix variations than it does with DOS. As you might know, Mac OSX is a variation of Unix.

Windows NT was written completely from scratch, based more on the OS/2 code base than anything else, but not sharing all that much with OS/2 either. Every single remnant of DOS was purged for the new OS, but a simulation engine was added so that old DOS applications could be run. Very similar to what was available for the Apple at the time. Pure simulation. The Windows NT DOS simulator was terrible and hardly ran any DOS software at all. It's the same simulator that is in Vista, but it is improved (not that it matters now).

In even XP there are still remnants of DOS (config.sys and autoexec.bat files, for example).

Again, these are not at all remnants of DOS since Windows XP doesn't use these files at all. If you never run DOS software (the command line in XP is not DOS and it doesn't use autoexec.bat or config.sys) you can safely delete these files. The only reason the files are there is so that the DOS simulator can find them. Again, it is exactly the same with a DOS simulator on Mac or Linux, they also need config.sys and autoexec.bat somewhere, but you would never claim that they are based on DOS or have any DOS heritage in them.

There are lots of things that are problematic with both Windows NT/XP and Vista, but none of them are related to DOS.
TheHappyFriar wrote on 3/9/2008, 5:52 PM
After Win2k, DOS support was dropped. Win2k & older could boot into 100% DOS if you wanted. WinME/XP just ran an emulator/command line. The Win2k disc provided all the DOS drivers if you wanted them (just like 98). I would sometimes run DOS programs under the DOS boot, just for kicks. After Win2k, "command prompt" booting was still the GUI with a window running the command line, so if something was wrong with your video settings or the GUI, you needed to do a repair; you couldn't fix it in DOS/command prompt. Linux fixed things easier (dual boot).

The DOS emulator in OS/2 ran DOS programs better than any Windows version ever did. Go figure. :)

anyway... the DOS emulator sucks. It doesn't do long filenames. It isn't case sensitive. The only people who use it are "power users" and it's no better than DOS 6.22. It's worse: doskey doesn't work & it doesn't give you any option to quickly "scroll" through previous commands you've entered. The game Quake did that back in 1996, and has retained that feature in every future version of the engine, only improving on it.

MS will never drop obsolete stuff because a) that would make 90% of what they have obsolete & b) why else use Windows? If I only wanted to run the latest version of Photoshop & that's it, I'd have a Mac & wouldn't bother updating/upgrading. If I wanted to run the latest version of Photoshop, keep my copy of Office '98, use my EPP scanner & only replace parts as they break, I'd run Windows on a PC. I wouldn't be happy shelling out ~$5k every three years for all new hardware/software because something new & cool came about. Again, look at the Mac users: the people who bought the dual-CPU G5s with 4GB RAM got screwed a week or so later when it was "Intel is the future, G5's suck!".
blink3times wrote on 3/9/2008, 6:00 PM
This is 100% incorrect. There is no DOS in windows, and there hasn't been since WIndows ME.

http://www.infoworld.com/articles/op/xml/00/10/16/001016oplivingston.html

They started removing direct methods of controlling DOS in Windows 98.... even more so in ME, but it's still there, and even in XP there are still bits and pieces of it.
Terje wrote on 3/9/2008, 8:05 PM
They started removing direct methods of controlling DOS in Windows 98.... even more so in ME, but it's still there, and even in XP there are still bits and pieces of it.

Let me try to explain.

DOS -> Windows 3.x -> Windows 95 -> Windows 98 -> Windows ME

These are all operating systems based on DOS; DOS underlies everything here.

OS/2 -> Windows NT -> Windows 2000 -> Windows XP -> Windows Vista

These operating systems are all written from scratch and underneath these there is no DOS whatsoever. None. There is no more DOS in Windows 2000/XP/Vista than there is in Mac OSX.

The only relationship between Windows 98/ME and XP/Vista is that they have a very similar user interface.
apit34356 wrote on 3/9/2008, 9:10 PM
Well, NT was designed to be a multi-tasking file server in the beginning, with very limited drivers to control resources and overhead. But that changed in Windows 2000: more common drivers, more compatibility with MS 98 & 95 apps, etc., partly because of the demands of the Office products. MS changed its view of the Internet and networking after studying the M. browser's massive success. This major refocus blended a few well-defined ideas for future MS OSs into a mess. Also, in fairness, MS was having a hard time selling the new OSs to the big corporations and government departments that were using Windows 95 and did not want to invest major money buying new apps after being sweet-talked into Win95. So MS did its normal redesign-by-committee thing with the dead-end thinkers ----------.
blink3times wrote on 3/9/2008, 9:30 PM
Again Terje that is not QUITE accurate.

The NT based systems don't require DOS as a base start up... that is the difference with them, but DOS is still there. Previous OS's COULD NOT operate without DOS because they were DOS based.... NT systems are not. They don't depend on DOS the way the DOS based systems do... but that does not mean that DOS is not there and is not used.

Compare the document I presented above with the following document:

http://www.infocellar.com/winxp/DOS-with-XP.htm

Pay close attention to the line:

Unlike Win95 and 98,

In other words, Windows ME is lumped into the same category as XP... and yet the above document shows you how to hack into the now somewhat buried DOS subsystem.

Admittedly, as the Windows systems progressed over the years, they began to break away from the DOS dependency (they had no choice, mainly because of the 640K memory barrier). But this doesn't mean that DOS is gone... it's just no longer a mandatory requirement for operation, and it's no longer in any kind of usable form for the operator (in the sense that we knew it, anyway). We now know it as a DOS emulator.
craftech wrote on 3/9/2008, 9:56 PM
VDMs (Virtual DOS Machines) are present in all 32-bit versions of Windows. They are there to emulate DOS mainly for programs and peripherals such as SoundBlaster, etc., but cannot usually do direct access, and the implementation is often poor. The NT family (such as W2000, XP, Vista) doesn't use them in running the API, only when running any DOS-based or Windows 3.x programs (search for ntvdm.exe). NT 64-bit architecture doesn't contain that file because no processor supports real mode under a 64-bit NT OS, thereby eliminating the 16-bit DOS and 16-bit Windows compatibility subsystem contained in all the 32-bit versions of W2000, XP, and Vista.
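The 32-bit/64-bit split described above comes down to a one-line rule: NTVDM, and with it 16-bit DOS/Win16 support, ships only on 32-bit NT-family Windows. A small sketch of that rule as a pure function (the architecture strings are common values reported by Python's `platform.machine()`; treat the exact set as an assumption):

```python
import platform

# 16-bit (DOS / Win16) support via ntvdm.exe exists only on 32-bit
# NT-family Windows; 64-bit editions dropped the compatibility
# subsystem along with real-mode support.
X86_32BIT = {"x86", "i386", "i486", "i586", "i686"}  # assumed arch strings

def ntvdm_available(machine: str) -> bool:
    """Return True if the architecture string implies a 32-bit Windows
    build that still ships the NTVDM 16-bit compatibility subsystem."""
    return machine.lower() in X86_32BIT

print(ntvdm_available("x86"))    # True:  32-bit Windows keeps NTVDM
print(ntvdm_available("AMD64"))  # False: 64-bit Windows dropped it
```

On a real machine you would feed it `platform.machine()`; the point is only that the presence of ntvdm.exe is decided by architecture, not by Windows version.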

Wikipedia does a decent job (for a change) in outlining the major points of criticism with Windows Vista.
But essentially Windows Vista, which was touted as the latest answer to the security vulnerabilities of previous Windows OSs, was alas another hoax designed to protect Microsoft and its partners, not the public. If Microsoft wanted to protect the public it would never have integrated its browser or its e-mail client into the operating system. Or it would have removed them from integration long ago. Rather than do that, they issued hundreds of security patches, because integration serves them, not you.

John
nedski wrote on 3/10/2008, 12:06 AM
Well, this revealing of Microsoft's and the PC makers' mistakes in marketing Vista as an UPGRADE to PCs being sold with Windows XP is probably well deserved.

HOWEVER, what about PCs sold with Vista already installed???
How many of these PCs are problematic? Are Dell, HP, Toshiba, Gateway and others selling PCs with Vista installed that don't work?

I for one do not blame Microsoft for not having drivers for every piece of hardware. It is up to the hardware makers to have drivers ready for Vista. I think all hardware makers should have Vista drivers for their current, not discontinued, products. The development of Vista was not a secret, was it?

I also believe that all software makers should have made their current products Vista compatible. Of course, that includes Sony Creative Software!

These two scenarios are vastly different and the difference should always be emphasized by critics.

These kinds of problems with upgrading operating systems are why I always advocate installing a new OS "clean" and not trying to "upgrade" over the previous OS!
DrLumen wrote on 3/10/2008, 12:48 AM
from nedski
---
I for one do not blame Microsoft for not having drivers for every piece of hardware. It is up to the hardware makers to have drivers ready for Vista. I think all hardware makers should have Vista drivers for their current, not discontinued, products. The development of Vista was not a secret, was it?
---

From other articles I have read about the chocolate mess known as Vista, the OEMs were doing development on Vista drivers based on the betas and specs M$ had released for the drivers. Then at the last minute, M$ changed parts of the WDM to make room for changes to MP11 and a different DRM structure. This broke many of the drivers that were in development, effectively burning the OEMs with development costs that could not be recovered and making them start from scratch. I don't believe it's the OEMs' fault because M$ can't make up their (collectively lost) mind.

FWIW, I think all the software makers should have released linux ports.


apit34356 wrote on 3/10/2008, 1:20 AM
"I for one do not blame Microsoft for not having drivers for every piece of hardware. It is up to the hardware makers to have drivers ready for Vista. I think all hardware makers should have Vista drivers for their current, not discontinued, products. The development of Vista was not a secret, was it?" MS was tightly controlling driver certification for Vista because of security issues concerning the new forms of digital rights management (DRM) in the operating system, specifically the Protected Video Path (PVP), which involves technologies such as High-bandwidth Digital Content Protection (HDCP) and the Image Constraint Token (ICT).

-------------------------(The following from Wikipedia, link posted by Craftech)--
The Protected Video Path mandates that encryption must be used whenever content marked as "protected" will travel over a link where it might be intercepted. This is called a User-Accessible Bus (UAB). Additionally, all devices that come into contact with premium content (such as graphics cards) have to be certified by Microsoft. Before playback starts, all the devices involved are checked using a Hardware Functionality Scan (HFS) to verify if they are genuine and have not been tampered with. Devices are required to switch off or artificially degrade the quality of any signal outputs that are not protected by HDCP. Additionally, Microsoft maintains a global revocation list for devices that have been compromised. This list is distributed to PCs over the Internet using normal update mechanisms. The only effect on a revoked driver's functionality is that high-level protected content will not play; all other functionality, including low-definition playback, is retained.

Driver signing requirement (from Wikipedia, link posted by Craftech)

64-bit versions of Windows Vista only allow signed drivers to be installed in kernel mode, and this feature cannot be easily overridden by system administrators.[8][9] In order for a driver to be signed, a developer may have to pay Microsoft a sum of money for the driver to be tested by Microsoft's WHQL Testing.[10] If the driver successfully passes WHQL testing, Microsoft then issues a digital signature that Windows can use to verify the authenticity of the driver before allowing it to be loaded. Alternatively, if WHQL testing is not required, the developer must purchase a "Software Publisher Certificate"[11] with which to sign the driver. While this has been praised as a security feature, it has also been criticized for reducing Vista's compatibility with older hardware (as sometimes, as in the case of VMware Server, the manufacturer of the hardware won't bother releasing a new, signed driver) and for disallowing experimentation from the hobbyist community.[12] There has also been criticism that this requirement might exist not because of security, but to enforce DRM policies, especially the Protected Video Path.[13]

Microsoft maintains that the signing requirement is only to "identify the author/creator of a piece of software or code so that the author/creator can be approached in the event a reliability issue, vulnerability, or malware is discovered. Signing is not designed to confirm the “intent” of signed code (i.e. good or bad), or whether exploitable bugs or malicious code is present."[14] The required Authenticode certificates for signing Vista drivers are expensive and out of reach for small developers, usually about $400-500/year (from Verisign).

Unsigned drivers can be installed through the use of tools included with Vista [15], but doing so requires use of an elevated command prompt and command-line tools, making it difficult for normal users to understand or small business administrators to implement. Microsoft has closed this workaround with hotfix KB932596,[16] which is included in Service Pack 1.
--------------------------------------------------------------------------

So, in summary, MS had tight control over the 3rd-party driver development process for Vista approval. Plus, just in case you did not know, MS writes most of its own drivers, or "borrows" basic designs ;-)
Paul Mead wrote on 3/10/2008, 6:50 AM
MS was having a hard time selling the new OSs to the big corporations and government departments that were using Windows 95 and did not want to invest major money buying new apps after being sweet-talked into Win95. So MS did its normal redesign-by-committee thing with the dead-end thinkers.

That isn't totally accurate. Anybody who was looking beyond their own navel knew that Windows NT (eventually renamed to XP) was going to be Microsoft's OS of the future. I, along with many others in the industry, started learning NT in the early 90s, long before all the Windows 95 hype. NT was no kneejerk redesign -- it had been on the drawing board since the late 80s. MS knew that they wouldn't be able to carry on with some DOS based hack that had some windows layered on top so they came up with a new OS redesigned from the ground up. The problem is that OS was such a resource pig that there was no way that a typical consumer PC at that time could effectively run it, so they bided their time selling it to power users until the hw caught up with the sw.

NT is really based more on a never-released DEC (Digital Equipment Corporation) OS called Mica. There is a long history behind that OS, but, in a nutshell, when DEC canceled the project the longtime architect, Dave Cutler, who was living in Seattle at the time, basically took the ideas and the development team and just transferred to Microsoft. (There are rumors that DEC threatened lawsuits when they discovered verbatim Mica code in NT -- it is said some sweet deals came about due to that discovery.) Since we are doing Wikipedia references, you can learn more about Dave and his exploits here.

Btw, some people find the Microsoft OS name to have a curious relationship to Cutler's previous, and quite successful, OS named VMS:

V++ = W
M++ = N
S++ = T
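That relationship is just "shift every letter one place forward in the alphabet," which is easy to check mechanically. A minimal Python sketch (the function name is mine):

```python
def shift_name(name: str, offset: int = 1) -> str:
    """Shift each letter of an all-caps name by `offset` places in the alphabet."""
    return "".join(chr(ord(c) + offset) for c in name)

print(shift_name("VMS"))      # -> WNT
print(shift_name("WNT", -1))  # -> VMS
```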
Paul Mead wrote on 3/10/2008, 8:56 AM
I forgot that I still have this article on Cutler. OS trivia buffs may find it interesting. It accurately describes the birthing of Microsoft's flagship product. Note that the Ultrix product mentioned in this article died an unceremonious death a few years after Cutler left DEC. DEC itself ceased to exist soon after, when it was bought by Compaq. The follow-on product to Ultrix, called DEC OSF, did fairly well, but was jettisoned when HP bought up the remnants of DEC with its purchase of Compaq.

The Brain Behind NT
Microsoft's David Cutler
UNIXWorld February 1993

David Cutler once walked into a high-level meeting at Digital Equipment Corp. wearing a T-shirt that read, "You're Screwing Me!" The company, true to the shirt's word, eventually did.

Today, he's trying to screw DEC and, in passing, the UNIX industry. Here's why.

Cutler, now Microsoft Corp.'s director of Windows NT Development, will never forget the October day in 1988 when he brought an edict from DEC's Maynard, Mass., headquarters (or "Mecca," as he sarcastically calls it) to the 180 people in DEC West's Seattle offices announcing the cancellation of their three-year-old project. It was the worst day of his life.

Cutler, the head of the development team for the successful VMS operating system, was working on an advanced operating system for DEC's Prism line of computers. The new operating system was intended to replace both VMS and Ultrix, DEC's version of UNIX, but DEC canceled the Prism line because it could obsolete the company's proprietary products. "Three years of dedicated work went down the drain," he says ruefully. And he's still mad.

The cancellation of Prism (about which DEC spokespeople refuse to speak) clearly hurt Cutler's pride. He had spent (wasted) three years of his life working on the project. So Cutler, who is known for his Marine drill instructor approach to management, told the staff the bad news and then gave them a furlough, a one-month paid vacation, without getting approval from higher-ups.

He spent the month weighing his options. Cutler had thought about quitting DEC before. In 1978, he interviewed with Intel Corp. because he wanted to move to the West Coast. A year later, he resigned from DEC to start a compiler company, but a group of senior-level executives convinced him to reconsider. He stayed under the condition that he would be able to form his own engineering facility outside of New England: DEC West. But although he enjoyed the autonomy of being away from the "bullshit" in Maynard, DEC's cancellation of Prism was too much for him.

After Prism was killed, some venture capitalists wanted to give Cutler money for his own company. He considered the offers, but he didn't want to run a company, he wanted to develop operating systems. News leaked to Microsoft executives that Cutler wanted out of DEC.

DEC's Loss, Microsoft's Gain

The timing couldn't have been better for Microsoft, which was struggling through its advanced operating system strategy. Its own next-generation operating system, OS/2 1.0, had started shipping in December 1987, but it was a flop. Why did it fail? For one thing, the product was co-developed by IBM Corp. and Microsoft, and the two companies fought through the entire project.

Microsoft needed a plan for the future. The problem was that as processors became faster, users wanted to run more advanced applications, and many power users were turning to UNIX for multitasking solutions. DOS, which was Microsoft's nearly 20-year-old technology, couldn't handle robust multitasking applications.

Microsoft Chairman Bill Gates talked with Cutler. He had been courting him to build a new portable operating system since August 1988. Cutler decided to join Microsoft because the company's wide distribution meant that any operating system he helped design was likely to afford him far more recognition than would a DEC-only operating system. This was Cutler's chance to fulfill his goal of writing the Next Great Operating System.

And to get back at DEC.

He moved the 10 blocks to Microsoft. A group of 16 disgruntled DEC consulting engineers joined him. Consulting engineers are considered DEC's most talented software engineers. Eventually, about 40 more DEC programmers signed on with the software giant.

"If there's one thing I want to do, it's to beat DEC," says Cutler. "I think the greatest way to do that is for them to have to buy and sell an operating system that they could have gotten for free if they had not driven me out."

Cutler says he still feels a sense of betrayal, and with betrayal comes mistrust, especially of what he calls the "weather vane" marketers at DEC who blocked his vision for developing great operating systems. People who have worked with Cutler say he wants (needs) NT to be successful so he can get back at the doubters. Is he confident? "Windows NT could very easily sell more copies in its first year of existence than the sum total of all UNIX systems ever sold," says Cutler. He adds that he doesn't have a gripe with UNIX itself, but with the people who adamantly support it. He disdainfully calls them "religious bigots."

UNIX isn't going to die, he predicts, but NT will "seriously cut into UNIX sales." He explains that his confidence stems from knowing that NT is not a half-baked operating system. NT is not just two or three years old, he believes, but actually the product of many years of DEC-funded research.

He's No Nerd (He Says)

Unlike Rich Kid Gates, Cutler's father was a janitor and his mother worked in various civil service jobs. Cutler says they didn't have much money. As a high-school student, the broad-shouldered Cutler was an avid athlete who earned 15 letters. His athletic prowess helped him get a scholarship to attend Michigan's Olivet College, where he was quarterback of the football team and point guard of the basketball team. He graduated with a major in mathematics and a minor in physics.

After college he worked for E.I. du Pont de Nemours and Co. in Wilmington, Del., maintaining a Univac 1108. In August 1971, Cutler decided he wanted to work for a computer company, not be a computer administrator.

He says he went to work for DEC because at the time it was "very freewheeling," and that "people worked night and day at DEC to implement their own ideas."

At DEC he was the principal designer and implementer of RSX-11M, a popular real-time system that ran on DEC's PDP-11 series. Then he was the key designer for VAXELN, an embedded system toolkit for MicroVAX processors. "It was far too far ahead of its time for Digital to fathom," says Cutler, unable to resist a jab at his prior employer. "It was never sold aggressively for fear of cannibalizing VMS sales."

Although Cutler has worked on many operating systems, he claims he's no "intellectual nerd." Responding to a direct question, he says he can beat Gates in a wrestling match (which probably isn't saying a lot). He's known as an avid skier who enjoys heli-skiing. Is he married? "No," he says, "that is a mistake you only make twice." The 50-year-old prefers work to recreation. He worked 12 hours a day, seven days a week, for 18 months on the RSX-11M project. "There is code in RSX-11M that is authored by me dated Christmas Day," he says. "The guy is very intense," says a former DEC West engineer, "and he runs a very tight ship."

And he can be emotional. Just before the second prerelease of NT, a Microsoft engineer broke a kernel debugger. Cutler wanted to know what had been done wrong. Someone replied that a technical flaw in the code had been "corrected." Cutler's reaction? He put his fist through the wall. About two months later, at his 50th birthday party, the NT team presented him with the "hole" in a picture frame.

"Once I accept an assignment, I finish it. Nothing gets in the way save for total cancellation of the project," he says. What kind of a boss is he? "People that don't do their job get run over," he explains.

Should the UNIX world be scared of this guy? Yes.

UNIX? Blah!

Cutler believes NT will easily outsell UNIX partly because, he claims, NT is written from scratch by programmers who understand complex operating systems.

Says Michael Goulde, editor of the Open Information Systems newsletter: "One condition Cutler had placed on coming to Microsoft was that he could build a real operating system, not a run-time environment like DOS, and not, in his words, 'an 80s operating system built with 70s technology like OS/2.' The way momentum builds among developers is interesting. They are so frustrated with what they have to work with today that anything Microsoft gives them to improve their lot is a gift from heaven. No questions asked." Cutler is known as an anti-UNIX bigot, a reputation he's trying unsuccessfully to shake.

"I don't hate UNIX. I do, however, think the quality of UNIX is poor." He says UNIX has improved in reliability, but it has an inherently weak command interface. "Because UNIX is used in universities, people fall in love with it because they don't know anything else."

"In general, [UNIX] has been hacked on by a plethora of graduate students for many years," he explains. "When done this way, things don't tend to be reliable or well-implemented. I'm sure the designers of UNIX would like to have a clean slate and start over."

The Final Irony

Yes, David Cutler is serious. He's still smarting from DEC's cancellation of Prism. "I don't believe I will ever get over it," he says. And he's rectifying his bitterness by trying to make a great operating system. A better operating system than UNIX. "There are lots of ways to build operating systems," he says. "UNIX is just one of those ways. People have built lots of other systems that were just as good, if not better."

"UNIX is akin to a religion to some," he says. "If things aren't done like they are in UNIX, then they must be bad. Sorry, I don't believe in this religion."

But in some ways, Cutler does have religious convictions. He's a guy with strong beliefs about operating systems. He wants to write the most advanced, cleanest system ever. He also wants it to rule the world.
Terje wrote on 3/10/2008, 10:29 AM
Well, NT was designed to be a multi-tasking file server in the beginning, with very limited drivers to control resource overhead. But that changed in Windows 2000.

Not so. Windows NT workstation was a capable product, and it worked very well on the desktop. It was designed entirely as a desktop OS as opposed to Windows NT server which was tuned as a server OS. There were a full set of office apps for Windows NT, and as of 3.51 it ran Windows applications, even 16 bit windows applications, a lot better than any other operating system, including the then 16 bit versions of regular Windows.

MS was having a hard time selling the new OSs to the big corps and government depts that were using Windows 95 and did not want to invest major money buying new apps after being sweet talked into Win95.

The corps were never "sweet talked" into Windows 95 over NT/2000. Microsoft made it absolutely clear from the get-go that Windows NT (later 2000) was the way to go and that Windows 95/98/ME was a stop-gap measure.

So MS did its normal re-design thing by committee

What re-design? Windows 2000 was Windows NT with the Windows 95/98 UI. There was no real re-design. The one major re-design of Windows NT came between Windows NT 3.51 and Windows NT 4.0, when Microsoft essentially abandoned the micro-kernel approach in favor of a more monolithic system. Before then, drivers (including the display drivers) generally ran in user mode and not in supervisor mode. This was done to make NT stable. Windows NT 3.1 was, by far, the most stable OS Microsoft ever released. 3.51 was also very good. In 2000 things started going wrong and it has been downhill since. Not as bad as Windows 95/98/ME, but not as good as it should have been.

The experience with the current version of Windows has led Microsoft, sensibly, back onto the micro-kernel path. We can only hope.