OT: Degunking my Windows XP

Comments

johnmeyer wrote on 7/29/2008, 11:28 AM
The issue is not whether NTFS-formatted disks become fragmented, because they DO become fragmented. I agree 100%.

Instead, the issue is whether there is any measurable performance improvement from defragmenting. People seem to get all worried when they see the visual representation of the storage pattern on their disks, but that graphic doesn't necessarily translate into performance.

For the last five years, every time this issue has come up, I have searched -- so far, in vain -- for an independent testing agency, not affiliated IN ANY WAY with a vendor of disk defrag software, to publish the results of a study showing how much performance could be improved by defragmenting a disk drive that was fragmented as the result of normal usage.

I phrase things this way because I do not doubt that a disk could be brought to its knees if you wrote a utility to create a pathological case, with a file's sectors scattered and aligned in such a way that the stagger would require two full revolutions to reach the next sector, etc.

Also, I am not interested in performance gains of a few percent. I have no doubt that the performance of a heavily fragmented drive might indeed be a few percent slower. However, I think that most people who recommend defragmenting are expecting 10, 20, 30 or more percent improvement. I believe this because most of the time when people in these threads are advised to defragment, it is usually to cure a huge slowdown in their PC, where it is operating at less than half (or worse) its former speed, or is dropping frames during capture, or something else that exhibits a drastic reduction in performance.

Finally, being an engineer, I always run before/after tests, and have not once been able to measure ANY performance change after defragmenting. I measure performance by doing read/write tests, and also by simply opening applications and using a stopwatch (re-booting after each series of tests so as not to be fooled by disk caching).
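For anyone who wants to run the same kind of before/after comparison, here is a minimal sketch of the read/write half of that test (Python, with a hypothetical test-file path; run it once before defragmenting and once after, rebooting in between so the second run isn't just measuring the disk cache):

```python
import os
import time

TEST_FILE = r"D:\defrag_speed_test.bin"   # hypothetical path on the drive under test
SIZE_MB = 1024                            # total data to write and read back, in MB
CHUNK = 1024 * 1024                       # work in 1 MB chunks

def timed_write():
    buf = os.urandom(CHUNK)
    start = time.time()
    with open(TEST_FILE, "wb") as f:
        for _ in range(SIZE_MB):
            f.write(buf)
        f.flush()
        os.fsync(f.fileno())              # force the data out to the platters
    return SIZE_MB / (time.time() - start)

def timed_read():
    start = time.time()
    with open(TEST_FILE, "rb") as f:
        while f.read(CHUNK):
            pass
    return SIZE_MB / (time.time() - start)

if __name__ == "__main__":
    print("sequential write: %.1f MB/sec" % timed_write())
    print("sequential read:  %.1f MB/sec" % timed_read())
    os.remove(TEST_FILE)
```

Note that a freshly written test file lands in whatever free space the volume has, so this mostly measures the drive and the free-space layout rather than the fragmentation of your existing files; the stopwatch-and-application-launch test covers that side.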

But, as always, I am totally open to new information if someone can find that independent test which shows dramatic improvement resulting from defragmentation.

For the record, I have a computer that is almost six years old. I haven't run disk defrag in years. My D: drive (my first data drive) reports 16% total fragmentation and 32% file fragmentation; my E: drive shows nothing but red in the Windows defrag report and says that I have 32% total fragmentation and 65% file fragmentation.
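Incidentally, those percentages are also available from XP's command-line defragmenter in analysis-only mode, so you can log them without opening the GUI. A quick sketch (assuming defrag.exe is on the PATH, as it is on a stock XP install):

```python
import subprocess

# XP's defrag.exe accepts -a (analyze only) and -v (verbose report)
for volume in ("D:", "E:"):
    result = subprocess.run(["defrag", volume, "-a", "-v"],
                            capture_output=True, text=True)
    print(result.stdout)
```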

This PC is a single CPU, single core, single thread 2.8 GHz P4 with cheap IDE drives. My applications all take less than five seconds to load; it boots from turn on to first app use in about 35-40 seconds, and I have yet to drop a frame in almost six years (except for my analog capture card, but that's a whole different story).

Chienworks wrote on 7/29/2008, 1:28 PM
To add a tiny bit to John's excellent analysis ...

A defragmented drive can indeed run faster, if your primary application is to copy entire files sequentially from one drive to another. A defragmented drive may accomplish this particular task measurably faster than a fragmented one, but not by much. But how often is this a task that you perform? And when you do copy files from one drive to another, is the process so time consuming that you are primarily concerned with the speed of the transfer? No? I didn't think so.

In real world media applications the drive heads are bouncing all over, accessing parts of different files as they are needed. On a pristinely defragmented drive the head will be bouncing like crazy merely because the amount of data is so large that the drive can't cache all of one file before it needs to move to the next. This isn't any different from reading a fragmented drive.

Also in real world media applications the drive is capable of reading highly fragmented data faster than the application can process it. It doesn't even matter how many files are involved in your current project. It's an interesting spectrum because if there are fewer files then the drive has less data to handle and can run faster than the application. If there are more files then the application gets bogged down correspondingly more and the drive still keeps ahead. Fragmentation plays so little role in the real world scenarios that it is ignorable.

What is an issue is that a typical defragmentation process keeps the drive running at full capacity for sometimes hours on end, moving nearly every bit of data around, reading and writing continuously. This is a far more grueling and abusive task to subject the drive to than using it in a fragmented state. On top of that, every one of those writes is one more chance for the data or the disc format to be corrupted.
Steven Myers wrote on 7/29/2008, 2:18 PM
This is going to be hard, but I'm going to do it. This time I really mean it. I think I've really hit rock bottom.
No more defrag.
farss wrote on 7/29/2008, 2:36 PM
I'm not so certain about all of this. Jeff makes a good point.
I've always kind of agreed that defragging is a waste of time, and in fact, when handling a number of video and audio streams, a fragmented (i.e. interleaved) set of files would be better than a set of contiguous files.
What we seem to focus on is only the physical head movement, and not what the OS has to do to work out where the data is before it knows where to move the heads. Every fragment means more entries in a table. If the table itself becomes large and fragmented, that's more work for the OS. The impact of this might not show up that easily. In some scenarios, when the CPU doesn't have much else to do and has plenty of RAM on hand, it might not make any noticeable difference; start pushing the OS with lots of tasks gobbling up cycles and memory and it could be a different story.

Bob.
johnmeyer wrote on 7/29/2008, 3:15 PM
Does anyone have something against before/after measurements? Just do the measurements and report back here the improvements. All the posts about how disks access files are good, and most of them are accurate, but the only thing that matters is whether we can substantially improve the performance of our computers. If we can, then I'll defrag every day.
jabloomf1230 wrote on 7/29/2008, 7:17 PM
Other than defragging making each file's clusters contiguous on the hard disk, it does one other important thing: it tries to fill in the fastest part of the HD with your files. If you run any disk diagnostic software, you will see that the read and write speeds on the "lower" sectors (the outer tracks) are usually about twice as fast as those at the high MB end. It's a matter of geometry, and another reason why SSDs will be the norm within a year or two.
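If you want to see that difference on your own hardware, here is a rough sketch that reads a sample at the start and at the end of a physical disk. It needs Administrator rights; the \\.\PhysicalDrive1 device name and drive size are placeholders you would adjust, and only the first pass is meaningful because later passes may hit the cache:

```python
import time

DEVICE = r"\\.\PhysicalDrive1"      # placeholder: pick a data disk, not the system disk
DISK_BYTES = 600 * 10**9            # set this a little BELOW the drive's real capacity
CHUNK = 1024 * 1024                 # 1 MB sector-aligned reads
SAMPLE_MB = 64

def read_speed(offset):
    with open(DEVICE, "rb", buffering=0) as disk:
        disk.seek(offset)
        start = time.time()
        for _ in range(SAMPLE_MB):
            disk.read(CHUNK)
        return SAMPLE_MB / (time.time() - start)

outer = read_speed(0)                                          # low LBAs = outer tracks
inner = read_speed((DISK_BYTES // CHUNK - SAMPLE_MB) * CHUNK)  # high LBAs = inner tracks
print("outer tracks: %.1f MB/sec" % outer)
print("inner tracks: %.1f MB/sec" % inner)
```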
Bobpin wrote on 7/30/2008, 3:01 AM
My PC has also been getting slower and I thought about a clean install of XP, but before I did, I completely uninstalled Zone Alarm Pro, which I had been updating for a few years, then installed it again, and my PC seems to be its old self again. So maybe just updating each year builds up rubbish.

Time will tell.

Bob
johnmeyer wrote on 7/30/2008, 9:50 AM
> My PC has also been getting slower and I thought about a clean install of XP, but before I did, I completely uninstalled Zone Alarm Pro, which I had been updating for a few years, then installed it again, and my PC seems to be its old self again. So maybe just updating each year builds up rubbish.

I haven't dealt directly with this particular utility, but if uninstalling/reinstalling an app provides a noticeable performance improvement, you have to ask yourself why. After all, the same program is back on the PC and doing the same thing. I can think of only two reasons: First, you reinstalled an OLDER version, so something in one of the updates caused the slowdown. However, if you re-installed AND then updated, this won't be the case. That leaves the second possibility, which I already mentioned in previous posts, namely that this particular application creates thousands of files somewhere and doesn't delete them. You wouldn't believe how many ill-behaved applications don't clean up after themselves. You actually have to find the cache they use and delete the files yourself.

This includes, BTW, our good friend Vegas (and most other Sony apps). When installed, they create a setup directory/folder, put all the installation files in that folder, install the application, but then never delete the useless and unneeded setup files. I sent a letter to the development team years ago, back when I thought they might actually listen to me (but they don't, which is OK, my wife doesn't either), suggesting that they should delete these files. So, if you go to your Program Files folder and find a bundle of Sony Setup folders, all that stuff is just taking up space on your hard drive that could be used for something else, and slowing down your computer (although probably not by any measurable amount) because it adds to the disk directory flotsam.
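If you're curious how much room that kind of leftover installer debris actually takes, a few lines will total up every folder under Program Files and list the biggest ones. A sketch; adjust the path for your install drive, and the folder names you see will depend on what you've installed:

```python
import os

PROGRAM_FILES = r"C:\Program Files"   # adjust for your install drive

def folder_size(path):
    total = 0
    for root, _, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass                   # skip files we can't read
    return total

sizes = []
for entry in os.listdir(PROGRAM_FILES):
    full = os.path.join(PROGRAM_FILES, entry)
    if os.path.isdir(full):
        sizes.append((folder_size(full), entry))

for size, name in sorted(sizes, reverse=True)[:20]:
    print("%8.1f MB  %s" % (size / 2**20, name))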

Jeff9329 wrote on 7/30/2008, 11:36 AM
> Does anyone have something against before/after measurements? Just do the measurements and report back here the improvements. All the posts about how disks access files are good, and most of them are accurate, but the only thing that matters is whether we can substantially improve the performance of our computers. If we can, then I'll defrag every day.

I guess that in my last post I forgot to mention that I have extensively tested my systems for at least the last 4 years. I kinda (wrongly) assumed most of us, being geekish, would test our systems' performance regularly to see:
1. What improvement upgrades made, like a new HD or video card.
2. Whether our system's performance is degrading.

I generally use Passmark software for measuring system benchmarks because it gives a complete picture of the whole system's performance. It also saves each test to a dated data file that you can pull up as a chronological performance-history graph.
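Even without Passmark, the history-graph idea is easy to home-brew: append each month's numbers to a dated CSV and chart it in a spreadsheet later. A minimal sketch (the file name is made up, and the MB/sec figures you feed it would come from whatever disk test you run):

```python
import csv
import datetime
import os

LOG_FILE = "perf_history.csv"   # hypothetical log file, kept next to your test scripts

def append_result(read_mb_s, write_mb_s):
    first_run = not os.path.exists(LOG_FILE)
    with open(LOG_FILE, "a", newline="") as f:
        writer = csv.writer(f)
        if first_run:
            writer.writerow(["date", "read_MB_per_sec", "write_MB_per_sec"])
        writer.writerow([datetime.date.today().isoformat(),
                         round(read_mb_s, 1), round(write_mb_s, 1)])

# e.g. after running this month's disk test:
append_result(98.4, 91.7)   # placeholder numbers, for illustration only
```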

My basic observations on HD performance are:
1. The first test of a new disk is always the highest result you will ever get.
2. If you measure performance every month, the HD performance is slightly lower each month.
3. After a defrag, you will almost always gain significant performance. I say almost always because I have had an occasional poor defrag where the performance increase is negligible. On the other hand, I occasionally get a really good defrag where there is a noticeable boost.

I will provide some improvement figures when I get back to my office.
johnmeyer wrote on 7/30/2008, 12:36 PM
Jeff: Good post and very useful. I look forward to seeing your results.

As for everyone testing, I think very few people test. When it comes to performance, I can understand that. However, what is astonishing is that many people will do a render that takes a day or longer to complete, doing something they haven't done before, and then post here that the render came out all wrong. Why didn't they take five seconds of footage and encode it first, to see whether conversion to progressive, or up-res'ing or down-res'ing, or IVTC or 24p to 60i, or whatever "non-standard" thing they were doing was going to work well? That's what puzzles me. Also, people constantly ask things like what bitrate gives the best results. The answer, of course, is that "it depends." I've given lots of generic, blanket answers, but the only way to know is to test a few seconds of footage at various bitrates, put them on a DVD+RW, and view them on a TV monitor to see what looks good.
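To make the bitrate example concrete: encode the same ten seconds of footage at several bitrates and compare them before committing to the full render. Inside Vegas you would just render a short loop region a few times; the sketch below does the same thing with ffmpeg on the command line, purely as an illustration (the file name and timecode are placeholders):

```python
import subprocess

SOURCE = "master.avi"                 # hypothetical source clip
START, DURATION = "00:05:00", "10"    # grab 10 seconds from a busy part of the timeline

for kbps in (4000, 6000, 8000):
    out = "test_%dk.mpg" % kbps
    subprocess.run([
        "ffmpeg", "-ss", START, "-t", DURATION, "-i", SOURCE,
        "-c:v", "mpeg2video", "-b:v", "%dk" % kbps,
        "-y", out,
    ], check=True)
    print("wrote", out)
```

Burn the resulting clips to a DVD+RW and view them on the TV monitor, exactly as described above.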

So, I totally agree: everyone on this forum should always be doing their own tests, but I don't think that is the case.
CorTed wrote on 7/30/2008, 2:01 PM
John, I agree with you; I am a firm believer in testing. However, I also think that this is the power of this forum.
I may want to spend some time testing the various bit rates, or I can ask on this forum to see if someone has already done these tests and save myself from wasting more time on it.
will-3 wrote on 8/12/2008, 11:29 AM
Wow! What a thread.

Watch Craigslist... buy a couple of old XP machines... only buy ones that come with the actual recovery disk so it's legal. Don't pay more than $100-$150 per machine. Don't buy one slower than 1.8 GHz if you can help it. The more RAM and hard disk, the better.

Use your main machine only for your main software... maybe only video, audio & graphics. Start off with Zone Alarm and AVG. Don't give ZA permission to let anything on the net other than the programs that absolutely must have it. It is interesting to see which programs are accessing the net behind your back... so this will be an education.

Use the first Craigslist machine for your other regular apps.

Use the second Craigslist machine as a test bed for any programs you are evaluating... reinstall the OS as needed on this machine. You may put your main software on here too to test plug-ins and add-ons... etc. That's up to you.

Anyway... it takes more desk space... but you can minimize that with a KVM switch... or by running XP Remote Access and doing away with the keyboards and monitors for the 2nd & 3rd machines... but I'm not sure if XP Remote Access will let computers 2 & 3 get to your main computer... so that may not be a good idea.

Anyway... just one of many alternatives :)


Coursedesign wrote on 8/12/2008, 11:57 AM
It seems that every software manufacturer today feels compelled to suck up your resources with a resident program that checks for updates like that's all you bought your PC to do.

That's one thing I like not having to deal with in OS X. Apple Update checks for updates to all installed apps, so there's only one OS task doing it for all of them.

Windows does it for things like third party printer drivers, etc., so why couldn't it do it for all installed 3rd party applications too?

Simonm wrote on 8/12/2008, 3:33 PM
Coupla thoughts:

1. I didn't notice anyone comment (might have missed it) on the OP's machine only having 1 GB of RAM. If possible, adding another GB will make a HUGE difference in speed (memory access is several orders of magnitude faster than disk).

Adding more memory is usually the biggest single performance upgrade you can do.

2. On disks, defragging, etc. If you really want better performance, the best way is RAID (with a decent controller). This is simply because of the disk's bandwidth. Ignoring the burst transfer rate (RAM to RAM, so meaningless), the continuous transfer rate is limited by the head channel, and these are typically not very fast, compared to, say, LTO or SDLT tape. Using RAID enables parallel reads/writes, so your bandwidth increases with the number of disks in the system. It can also improve seek times, but it depends on how the array is configured.

I'm afraid my experience is with SCSI hardware. I haven't tried the current generation of SATA RAID controllers, but they do make the technology affordable for home and small business use now. If I was doing bigger projects, I'd seriously consider setting up some sort of RAID 5 system based on SATA hardware. With a half decent controller (and the SCSI ones used to vary hugely) you should get a fourfold performance improvement, plus the ability to rebuild in the event of single-disk hardware failure (but be aware that this takes *ages* usually).
Coursedesign wrote on 8/12/2008, 5:08 PM
Another way (that is also far less expensive) is to get a very fast system drive.

Not a 10k rpm or 15k rpm drive, but for example the WD 640 GB 6400AKS ($84.99 from Newegg) which gives me a solid throughput of 100+ MB/sec in both read and write.

Even my 10K Raptors can't keep up with this drive.

Shergar wrote on 8/13/2008, 1:43 AM
It's totally free. It cleans up all temporary files, removes the junk from the registry, and lets you remove all those startup programs that sneaked in while you weren't looking.
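If you want to see those sneaked-in startup programs for yourself before letting any cleanup tool loose on them, most of them live in the registry Run keys (msconfig shows them too). A small sketch using Python's built-in winreg module:

```python
import winreg

def list_run_entries(root, root_name):
    path = r"Software\Microsoft\Windows\CurrentVersion\Run"
    try:
        key = winreg.OpenKey(root, path)
    except OSError:
        return
    print("[%s\\...\\Run]" % root_name)
    i = 0
    while True:
        try:
            name, value, _ = winreg.EnumValue(key, i)   # raises OSError when done
        except OSError:
            break
        print("  %s: %s" % (name, value))
        i += 1
    winreg.CloseKey(key)

list_run_entries(winreg.HKEY_CURRENT_USER, "HKCU")
list_run_entries(winreg.HKEY_LOCAL_MACHINE, "HKLM")
```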

I'd also recommend Spybot - it sits in memory and won't let anything touch your registry without your say-so.
Simonm wrote on 8/13/2008, 6:00 AM
[i]Another way (that is also far less expensive) is to get a very fast system drive.[/i]

I agree, but you'll see huge benefits if you get TWO fast drives and put them on different buses. Arrange your work so that the input comes off one drive and the output goes to the other.

It effectively doubles your transfer rates, at which point you'll probably find the motherboard I/O can't cope!
JohnnyRoy wrote on 8/13/2008, 7:27 AM
> ...the WD 640 GB 6400AKS ($84.99 from Newegg) which gives me a solid throughput of 100+ MB/sec in both read and write.
>
> Even my 10K Raptors can't keep up with this drive.

OK. I've got a 10K Raptor for my C: drive and I'm running out of space. Are you saying that if I buy the WD 640 GB 6400AKS it will outperform the drive I currently have?

I couldn't find any 6400AKS but I did find the 6400AAKS. Is this the one you mean?

~jr
Coursedesign wrote on 8/13/2008, 8:09 AM
JR, that should be "6400AAKS" as you suspected.

This drive outperforms the 10K Raptors for read and write of large blocks (which is what video editors usually deal with), thanks to the much higher bit density.

The 10K Raptors are faster for random access of small blocks though, such as in DBMS lookups.

Jeff9329 wrote on 8/13/2008, 8:14 AM
Go here for actual drive test results:

http://www.storagereview.com/php/benchmark/bench_sort.php

I agree that a fast system drive is a great speed boost. You should also never fill more than 25% of its capacity if you want top performance.

The Raptor 300 is clearly the best current performer.

I have updated each time from the 74 to the 150 to the 300s and have seen significant improvements.

I have about 10 Raptor 150s lying around if you want to try one/some. They are SATA of course.

Edit: The above link is not pointing correctly, but you can figure it out.

They tested the newer 750GB version of the WD 640GB drive. Here are some results:

Low Level Suite 4.0 results:

Western Digital VelociRaptor WD3000GLFS (300 GB SATA):
  Average Random Access Time (Read/Write): 6.8 ms / 7.8 ms
  Transfer Rate (Max/Min): 127.0 / 86.2 MB/sec

Western Digital Caviar Black WD1001FALS (1000 GB SATA):
  Average Random Access Time (Read/Write): 12.2 ms / 13.2 ms
  Transfer Rate (Max/Min): 111.0 / 60.4 MB/sec

Samsung Spinpoint F1 with NCQ, HD103UJ (1000 GB SATA):
  Average Random Access Time (Read/Write): 13.6 ms / 14.6 ms
  Transfer Rate (Max/Min): 109.0 / 59.6 MB/sec

Western Digital Caviar SE16 WD7500AAKS with NCQ (750 GB SATA):
  Average Random Access Time (Read/Write): 13.7 ms / 15.0 ms
  Transfer Rate (Max/Min): 97.0 / 54.4 MB/sec

A better link:
http://www.storagereview.com/php/benchmark/suite_v4.php?typeID=10&testbedID=4&osID=6&raidconfigID=1&numDrives=1&devID_0=366&devID_1=368&devID_2=361&devID_3=350&devCnt=4
Steven Myers wrote on 8/13/2008, 9:05 AM
"That's one thing I like not having to deal with in OS X. Apple Update checks for updates to all installed apps"

Disclaimer: On my Mac, I'm running the 10.4.11 version of the operating system. On that version, it's called Software Update, not Apple Update.

That version checks for and installs only Apple software updates, such as FCP, the OS itself, and the various toys (e.g., iTunes) that come with the OS. (Wow, the iTunes updates take forever!)
Anyway, I have to do Peak and After Effects myself.
GlennChan wrote on 8/13/2008, 9:26 AM
REGISTRY CLEANERS: I've found them to do more harm than good. I've seen them hose computers but I've never seen them fix a computer.

2- I've mostly found that turning off background processes doesn't really make a performance difference unless they are heavyweight (e.g. distributed computing, virus scanning).

3- Viruses: Watch out for USB keys if you take them to a copy shop... they will likely come back with a virus. And there are some really nasty viruses out there that the commercial antivirus programs won't get rid of.
Coursedesign wrote on 8/13/2008, 9:47 AM
> Go here for actual drive test results:

I ran Decklink Speed Test on one of my four Raptors and a WD6400AAKS mounted side-by-side, using the same disk controller inside the machine.

Doesn't get any more actual than that. :O)

As I tried to explain, the Decklink test measures reading and writing of large blocks, not random access, which is ruled primarily by rotational speed.

I don't think the WD 750GB drive is a two-platter design, so it won't have the bit density that makes the new 640GB drive such a screamer.

WD is supposed to release a three-platter 1TB version with the same bit density soon, I'm looking forward to that.

JohnnyRoy wrote on 8/13/2008, 10:04 AM
> This drive outperforms the 10K Raptors for read and write of large blocks (which is what video editors usually deal with), thanks to the much higher bit density.

SOLD! ;-) (thanks)

~jr