Comments

johnmeyer wrote on 11/18/2005, 7:40 AM
The Hitachi deal at CompUSA is exactly the same price per GB, and you get a 160 GB drive (i.e., it is $40).

160 GB $40 drive
Infinite5ths wrote on 11/18/2005, 8:36 AM
Is Hitachi a serious contender in the HDD market?? I have only purchased WD drives for years - no problems. I'd also consider a Maxtor, and (with a little more hesitancy) perhaps a Seagate.

How long has Hitachi been in the market? Have they been making drives for the big brand names before now?

Perhaps I just missed something...
--
Mike
riredale wrote on 11/18/2005, 8:53 AM
Could be wrong, but I think Hitachi took over the IBM hard drive products a couple of years ago.

I'd be happy with any of the major drive makers. Competition is fierce, and they all have good products.
John_Cline wrote on 11/18/2005, 9:06 AM
Hitachi bought IBM's hard drive division several years ago. IBM (and now Hitachi) have been responsible for most, if not all, of the breakthrough developments that have resulted in the enormous and inexpensive drives that we all enjoy today. So, yes, Hitachi is a serious contender. Seagate is also a serious contender, and their drives come with a 5-year warranty. I have had nothing but trouble with Maxtor drives and some trouble with WD drives in the past.

These days, I use Hitachi and Seagate drives exclusively and have had no issues whatsoever.

John
Infinite5ths wrote on 11/18/2005, 9:16 AM
Verrrrry interesting.

The only WD drive that has ever died on me was the 4-year-old 20GB drive that I used to burn 500+ CDs (550MB/CD) in a couple of months. It slowly let me know that it was ailing, and I replaced it before anything was lost.

I figured that asking 275GB of transfer from a 4-year-old 20GB drive over the course of 2 months was probably about the max one could/should ask of any HDD. Am I wrong?
--
Mike
Chienworks wrote on 11/18/2005, 10:07 AM
WDs are great and I use nothing but.

275GB from a drive is just a drop. I probably read at least a couple TBs from each of my drives every week. While you might think you drove that drive into the ground on that job, that 275GB was probably only a tiny fraction of what you had read from that drive over the years.
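Just to put rough numbers on that, here's a quick back-of-the-envelope sketch (Python, purely illustrative, using the 550MB-per-CD figure above and my couple-of-TBs-a-week guess):

# Back-of-the-envelope numbers from this thread: 500+ CDs at ~550 MB each
# over two months, versus reading a couple of TB from each drive per week.
cds_burned = 500
mb_per_cd = 550

burn_job_gb = cds_burned * mb_per_cd / 1024.0   # roughly 270 GB total
weekly_read_gb = 2 * 1024                       # "a couple TBs" per week
four_years_gb = weekly_read_gb * 52 * 4         # about 4 years at that pace

print("CD-burning job:     %.0f GB" % burn_job_gb)
print("4 years of reading: %.0f GB" % four_years_gb)
print("Job as a fraction:  %.2f%%" % (100.0 * burn_job_gb / four_years_gb))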
Infinite5ths wrote on 11/18/2005, 10:21 AM
Yup...I agree. I'm religious about defragmenting, running full-disk backups (using Norton Ghost - Copy Drive), full-system virus scans and so forth. I have been working with 44.1/16 audio for several years also.

Now, here is another related question: Drive size is proportional to 'minimum transfer before failure', correct? In other words, a 300GB HDD will withstand a LOT more total data transfer over time than a 20GB drive - without failing.

I assume this is the case, otherwise bigger drives would fail much more frequently under heavy use. But it would be nice to have this theory confirmed. [Perhaps 'minimum transfer before failure' is related to TRANSFER or ROTATION SPEED instead of disk size??]
--
Mike
Chienworks wrote on 11/18/2005, 10:24 AM
Defragmenting is moot these days. Drives are fast enough that you don't need to defragment them. Defragmenting is now only unnecessary extra wear & tear. All you're doing is wearing the drives out faster and risking your data every time you defragment.
Infinite5ths wrote on 11/18/2005, 10:30 AM
Yes, but I thought defrag's were necessary for two reasons:

1) Data that is fragmented must be pieced together from multiple locations - This takes extra time for ANY drive, regardless of its speed.

2) Piecing together fragments requires extra spinning (hence the increased read/write times), which results in more wear/tear on the drive over time. In theory, occasional defragmenting should be LESS taxing than constant additional spinning.

Am I wrong?

Note: I also added another question to my last post.
--
Mike
busterkeaton wrote on 11/18/2005, 11:50 AM
IBM had one bad run of disks and a lot of bad press due to quality control issues at one plant. They then sold the business to Hitachi and, as far as I know, the drives have been very solid under Hitachi.

StorageReview.com is a good resource for hard drive info.

No problems for me on recent Maxtor drives.
kentwolf wrote on 11/18/2005, 12:58 PM
>>...Defragmenting is moot these days...

Time for the monthly defrag debate... :)
johnmeyer wrote on 11/18/2005, 3:39 PM
I have been part of the defrag debates in the past. As a result of those debates, I emailed Brian Livingston, whom many of you may remember as a columnist for InfoWorld who now writes a newsletter. A frequent contributor is Woody Leonhard, whom many of you may recognize from his many books and his MS Office macros.

In my email I made the same statement that I've made here many times, namely that I have never seen even ONE scientific study, not done by a defrag vendor, that showed improved performance on a modern computer after defragmentation.

As a result of my email, they both then made this a major part of their latest newsletter. I will quote one line from Woody's article:

"Nowadays, defragmenting almost always rates as a colossal waste of time."

Brian attempted a counterpoint, just for the fun of it, but the only reference he could find, after two months of looking, was an old PC Magazine test:

PC Mag Defrag

This test did indeed show performance improvements after defragmentation. But wait, there's more to the story. Some defrag programs made a MUCH bigger improvement than others on some tasks, and did much worse on others. But if defragmentation were the only thing at work, then, unless the products were all faulty, the performance after running any of them should be identical. Defrag is defrag. Once it is done, the file is contiguous. Why would there be any difference in performance at all??

Of course, there are other things at work here. If you put the files on the outside of the disk, they will read about 2x faster (on most disks) than if you put them on the inside of the disk. In addition, large directories (i.e., lots of files) can slow down the initial directory read, even with NTFS. Once the directory has been read, subsequent access can be far faster (because the directory is cached in memory), and if you don't reboot between each test, you invalidate the results.

Thus, my guess is that what really happens is that when you defrag, your files get moved to a different part of the disk, and depending on where they end up compared to where they started, they will read faster or slower. Some defrag programs do this by design, moving the more frequently-accessed files to the faster (outside) parts of the disk.
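If you want to see the inside-vs-outside difference on your own drive, here is a rough timing sketch (purely illustrative; the device path and capacity below are assumptions you'd change for your own system, and reading the raw disk needs administrator/root rights):

# Time a sequential read at the start (outer tracks) and near the end
# (inner tracks) of a drive. DEVICE and DISK_SIZE are assumptions.
import os
import time

DEVICE = r"\\.\PhysicalDrive1"   # e.g. "/dev/sdb" on Linux -- assumption
DISK_SIZE = 300 * 10**9          # set to your drive's capacity in bytes
CHUNK = 1024 * 1024              # read in 1 MB pieces
TOTAL = 256 * 1024 * 1024        # read 256 MB per test

def read_speed(offset):
    """Read TOTAL bytes sequentially starting at offset; return MB/s."""
    fd = os.open(DEVICE, os.O_RDONLY | getattr(os, "O_BINARY", 0))
    try:
        os.lseek(fd, offset, os.SEEK_SET)
        start = time.time()
        done = 0
        while done < TOTAL:
            data = os.read(fd, CHUNK)
            if not data:
                break
            done += len(data)
        return done / (1024.0 * 1024.0) / (time.time() - start)
    finally:
        os.close(fd)

print("outer tracks: %.1f MB/s" % read_speed(0))
print("inner tracks: %.1f MB/s" % read_speed(DISK_SIZE - TOTAL))

On most drives the first number comes out noticeably higher than the second, which is the placement effect described above.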

Bottom line, as always: The wear and tear on your disk while defragging is considerable, and it takes a TON of time on a big hard disk. The link above is the only independent performance test anyone has found, and its results are "self-invalidating," meaning that they fail to show that the improvement came from defragmentation rather than from directory caching or from moving files to a faster area of the disk (something that IS real, and worthwhile doing).

Since some defrag programs DO move frequently used files to a faster part of the disk, this is the one benefit of some defrag programs, and once every year or so it is probably worth "defragging" simply to get your most often-used programs moved to the fastest part of your disk. But don't assume that any performance improvement is caused by removing "fragmentation."

Other than that, defrag is a complete, utter, waste of time.

Chienworks wrote on 11/18/2005, 4:03 PM
Mike,

1) Data that is fragmented must be pieced together from multiple locations - This takes extra time for ANY drive, regardless of its speed.

Yes, and the data must be pieced together even when the drive is defragged. The heads don't always read the sectors in order, and the heads must move from track to track. True, this may be a tiny bit faster on a defragged disk, but not much. Consider also that if you are accessing two or more files at once, which is almost always the case, the head will be skipping around accessing the different files. So much for any benefit of defragging.

2) Piecing together fragments requires extra spinning (hence the increased read/write times), which results in more wear/tear on the drive over time. In theory, occasional defragmenting should be LESS taxing than constant additional spinning.

See above. It's going to happen anyway. A single defragmenting session will probably read and write most of the data on the drive, will do so as fast as the drive can be made to function, and will do so for a very long period of time. Microsoft's defrag routine seems to move all the data 2 or 3 times! On the other hand, even with an extremely fragmented drive, reading a good number of video tracks simultaneously probably isn't going to tax the drive to its limits, and it's only a read operation. Note that writes are always performed as consecutively as possible, so fragmentation affects writing less than reading.

Defragmenting also puts your data in jeopardy. If anything messes up during a defrag process the data that is currently being moved is lost, possibly corrupting an entire large file. That isn't fun. Why take the risk?

Best method to defragment if you're really hot about doing it? Copy all your files from one drive to another freshly formatted one. This is a very rapid and safe process and doesn't tax the drives badly at all.
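For what it's worth, here's a minimal sketch of that copy approach (the folder paths are just placeholders for a folder on the fragmented drive and one on the freshly formatted drive):

# Copy a tree to a freshly formatted drive; each file gets written out
# whole, so the copies land in contiguous clusters. Paths are placeholders.
import shutil

SRC = "D:\\Video"   # folder on the fragmented source drive -- placeholder
DST = "E:\\Video"   # folder on the freshly formatted drive -- placeholder

shutil.copytree(SRC, DST)
print("Copied %s -> %s" % (SRC, DST))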

Defragging is evil! ;)
Harold Brown wrote on 11/18/2005, 4:35 PM
I used to defrag all the time and now I never defrag. I have never seen a difference in performance. For me, if I cannot see it then it is not worth it. Example: I had my Vegas files on a drive that was 2 years old and never defragged. I copied all of them to a brand new hard drive. I ran both (original and copy) in Vegas to see what I could do as far as quality of playback. I have 40 minutes of finished video from hours of captured tape. Bottom line, not one bit of difference.
Both drives: 250GB WD, 8MB cache, 7200rpm.
No defrag for me.
DrLumen wrote on 11/18/2005, 9:33 PM
Some of the points made about not defragging hard drives do seem to make sense. However, as it goes against everything I've ever experienced with hard drives, and this is the first time I have ever seen anything negative about defragging, I thought I would do some checking.

I had that PC Mag issue and tried the utility on a 286 system with a 20MB (yes, MB) Seagate hard drive. The drive and system did get faster, and the drive quieted down a lot. About a year or so later, MS invented defragging and put it in Win95... (long story with lots of laundry). Anyway, they (eventually) licensed a defrag technology from Intel, so I started there and then checked the web/support sites of Hitachi, AMD, IBM, etc. The one from IBM below struck me as particularly interesting.

Granted, I could not find any definitive scientific studies to prove or disprove defragging, but I found plenty of sources that recommend it. Maybe they liken it to the study of a toothpick. It's a tool they know works, so why spend time and money studying/analyzing it... <shrugs>

http://www.intel.com/personal/wireless/battery/battery_uses.htm#drive
http://www.intel.com/support/videocapture/isvr3/sb/cs-011863.htm
-----------
http://www.microsoft.com/technet/prodtechnol/winxppro/reskit/c28621675.mspx

Although FAT and NTFS are designed to make storage faster and more efficient when you save files, these file systems take longer to read and write fragmented files than unfragmented files. When the files on a disk become badly fragmented, performance noticeably suffers because the disk heads must move to different tracks on the disk to locate all the clusters of the file.

Defragmentation tools fix this problem by moving the files into contiguous clusters on the disk. Reducing fragmentation reduces the amount of mechanical movement required to locate all clusters of a file, which improves hard disk performance.
-----------
http://www.microsoft.com/technet/archive/win98/reskit/part2/wrkc10.mspx
Disk Defragmenter

The Disk Defragmenter (also called a disk optimizer) is used to defragment information on a disk. Windows 98 monitors applications that you launch and creates a log file for each application in the \Windows\Applog directory. Disk Defragmenter uses the log files to arrange program files in the order they are accessed when the program starts, causing the program to start more quickly.
-----------
http://www.hitachigst.com/hdd/support/qcheck.htm

http://www.amd.com/us-en/assets/content_type/DownloadableAssets/Rpt_v1.pdf
-----------
http://www.ibm.com/search/?lv=c&o=10&en=utf&v=14&lang=en&cc=us&q=defrag+benefits&x=0&y=0

http://www.redbooks.ibm.com/redbooks/SG245287/wwhelp/wwhimpl/java/html/wwhelp.htm

9.14.8 Use disk defragmentation tools regularly

Over time, files become fragmented in non-contiguous clusters across disks, and system performance suffers as the disk head jumps between tracks to seek and re-assemble them when they are required.

Disk defragmentation tools work to ensure all file fragments on a system are brought into contiguous areas on the disk, improving disk I/O performance. Regularly running a disk defragmentation tool on a server is a relatively easy way to yield impressive system performance improvements.

Tip: We recommend you defragment your drives every night if possible.

Most defragmentation tools work the fastest and achieve the best results when they have plenty of free disk space to work with. This provides another good reason to monitor and manage disk space usage on production servers.

In addition, try to run defragmentation tools when the server is least busy and if possible, during scheduled system downtime. Defragmentation of the maximum number of files and directories will be achieved if carried out while applications and users that typically keep files open are not accessing the server.

Windows Server 2003 and Windows 2000 Server include a basic disk defragmentation tool. While it offers good defragmentation features, it offers little in the way of automated running - each defragmentation process must be initiated manually or via external scripts or scheduling tools.

A number of high-quality third-party disk defragmentation tools exist for the Windows operating system, including tailorable scheduling, reporting and central management functionality. The cost and performance benefit of using such tools is normally quickly realized when compared to the ongoing operational costs and effort of defragmenting disks manually, or not at all.
-----------

Sorry for the long post...

Just my $.02

----
edited for typos


craftech wrote on 11/19/2005, 7:47 AM
When I first started participating on these forums, "defragging" was the Foley's Kidney Cure for all the problems people were having with the software. Fortunately, that has all but disappeared from the suggestions offered here.

John