defrag drive C?

LReavis wrote on 3/7/2009, 11:20 AM
I've read that it is a waste of time to defrag disks, but after seeing my system get stuck often during boot, slow opening of programs, and occasional crashes, I used Windows defrag on Drive C. However, it still showed many fragmented files.

So I used some long power leads and a long SATA cable to hook up my boot disk to another computer so that it wouldn't have any open files that would foil the defrag process. THAT really did defrag my disk. After several days, I have not had any mysterious boot problems or crashes and programs definitely open faster.

If you also have maybe 100 programs installed and have used the same Windows installation for a long time (I've used this one for about 18 months), and have uninstalled and re-installed lots of stuff (Vegas 7, various versions of DVD Architect, media players, DivX converters, etc. etc.), you might want to do the same.

Comments

xberk wrote on 3/7/2009, 1:03 PM
Clever. I believe a defrag can help in a case like you described, but some folks think a defrag weekly or monthly is effective. I doubt that. A reformat of the drive and reinstalling everything works best for speeding things up.

I wonder -- would a disc image backup, nuking the drive, and then restoring the disc image work better than a defrag as you described -- and which takes longer?


Chienworks wrote on 3/7/2009, 1:37 PM
The only way a defrag would help in that situation is if you had some drive sectors going marginally bad. The only real fix for that is to replace the drive. If a defrag helped drive C then replace it ... soon!
kentwolf wrote on 3/7/2009, 2:21 PM
>>...would a disc image backup, nuke the drive and then restore the disc image work better than a defrag as you described...

No.

If you have a fragmented drive, you have a fragmented disk image backup.

When you restore the disk image, you also restore the fragmented state.
rmack350 wrote on 3/7/2009, 4:15 PM
That's why it's called a disk image. However, I just looked at a backup made with Casper XP (kind of a Ghost-like program) and the copy is totally unfragmented. So the basic premise is right as long as the copy isn't truly a disc image.

Rob Mack
John_Cline wrote on 3/7/2009, 9:46 PM
Acronis True Image has two options. A "standard" disc image copies all the files individually; a restore from this backup will be defragmented. The other option is a sector-by-sector image, which when restored will be identical to the original drive, and whatever fragmentation was on the drive initially will be there after it is restored.
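The distinction can be sketched with a toy model (a hypothetical illustration; the block layout and file names are made up, not Acronis internals):

```python
# Toy model of a disk: a list of (file_id, chunk_no) blocks in disk order.
# Files "A" and "B" are interleaved, i.e. both are fragmented.
disk = [("A", 0), ("B", 0), ("A", 1), ("B", 1), ("A", 2)]

# Sector-by-sector image: a byte-for-byte copy, so the restored drive
# has exactly the original (fragmented) layout.
sector_restore = list(disk)

# File-level ("standard") image: files are copied one at a time, so on
# restore each file's chunks are written out contiguously.
file_restore = sorted(disk)

print(sector_restore == disk)  # True - fragmentation preserved
print(file_restore)            # A's chunks together, then B's
```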
LReavis wrote on 3/8/2009, 2:58 PM
Thanks for the tip that my drive may be faulty. Just for the record, I never allow any file to save to Drive C (no pagefile, no My Documents, no email; I use VMware's virtual machine running Linux to get on the web, and it's on one of my work drives), but I constantly need to update many programs . . .

- with one exception: Occasionally (but rarely), I do use Super, and I have not figured out how to edit its config files to save to one of my 9 work disks that are habitually connected to my machine. Does anyone know how?

by the way, I too once kept a clean Windows install and put Drive C back to the way it was before my programs were installed. Unfortunately, through the years I have added so many programs - many of which are old Adobe programs that require all sorts of tedious patches (such as the now-defunct PageMaker, which I still use to update a book that is our main source of income) - that I can no longer afford the week that re-installing all the programs requires. But I do heartily recommend the clean-Windows reinstall (I use the free BartPE running DriveImageXML).
Skuzzy wrote on 3/9/2009, 2:04 PM
You can place the page file on drive C. Just make sure you do not allow Windows to dynamically allocate it. The first thing to do is to fix the size of the pagefile as soon as you can after Windows is installed.

That prevents any fragmentation due to it being grown and shrunk all the time by Windows.
Chienworks wrote on 3/9/2009, 6:06 PM
I set mine to 1.5GB. When it was under 1GB Windows was constantly complaining that it needed more page file space. However, I didn't do this to avoid fragmentation. Fragmentation is very much a non-issue, and you waste way more time thinking about it than your computer spends dealing with it. You'd be much better off completely ignoring fragmentation and never thinking about it or doing anything about it ever again. You'll save far more time by not thinking about it than you'll ever save by defragmenting, not even counting the fact that defragmenting itself takes a huge amount of time that is totally wasted.

The reason I set mine to a fixed size is because whenever Windows resized it, my whole system would hang for up to 30 seconds while the resizing was happening, and that I found annoying.
Former user wrote on 3/9/2009, 7:34 PM
Chienworks and I have always disagreed on this. I find that my computers improve after defragging my system drive. I don't do it often, but every couple of months or so and I do see an improvement in performance.

Dave T2
MSmart wrote on 3/9/2009, 9:59 PM
>> - with one exception: Occasionally (but rarely), I do use Super, and I have not figured out how to edit its config files to save to one of my 9 work disks that are habitually connected to my machine. Does anyone know how?
Right click anywhere in the SUPER application and choose Output File Saving Management - or Ctrl-T. Choose folder, click SAVE Changes.
Jeff9329 wrote on 3/10/2009, 7:56 AM
Thank goodness for SSDs.

No defrag needed, ever.

FYI - The X25-M is down to $390USD, still a little high, but getting there.
LReavis wrote on 3/10/2009, 2:11 PM
"Right click anywhere in the SUPER application . . ."

actually, I had right-clicked up on the brown top bar where it says:

"(Right-Click for Menu)"

and all I got were two options: "Minimize," and "Close"

Thanks so much for clueing me in to the fact that there is a more extensive menu when right-clicking BELOW the brown bar; I luv this forum!
DGates wrote on 3/10/2009, 4:46 PM
I defrag about once every 6 months.
farss wrote on 3/10/2009, 6:26 PM
I'm not so certain about such sweeping generalisations.
A few days ago (thanks in part to this thread) I decided to dig deeper into this PC. It turns out that running a defrag of C: did make a significant difference to performance, but for a pretty arcane reason.
I must have started a defrag ages ago that was interrupted, so it kept trying to run. Problem was, Kaspersky had decided that dfrgntfs.exe was a threat and stopped it. Kaspersky's logs were filled with nearly 1 million events from this, and no doubt this was gobbling up significant system resources.
The problem went away when I simply ran the defrag of C: to completion.

I've also flown the "don't defrag, it's a waste of time" argument with some pretty serious systems people, and my kite was shot down pretty quickly. They pretty much all agree that defragging a volume containing large files, e.g. video, is a waste of time. Defragging system volumes can be another matter if done using good tools that optimise file location, and they had the data to back up their claims.
Then again, these guys are looking after large corporate data centres, so I'd tend to discount what they're saying to some extent, as most of us don't have massive databases and a zillion applications running.

Maybe that changes for those running Media Manager and using a lot of the Vegas default file locations, which are on C: .

Bob.
FilmingPhotoGuy wrote on 3/11/2009, 4:36 AM
How can defrag be a waste of time?

When you run defrag, click on "report". There it will list all files that are fragmented. Let's say a file is fragmented into 3,500 pieces. If you access that file the hard drive has to collect ALL those 3,500 pieces and join them in memory. That's got to take time, no matter how fast your PC is.
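As a rough back-of-the-envelope sketch (the seek and latency figures below are assumptions for a typical 2009-era 7200 rpm drive, not measurements):

```python
# Extra mechanical overhead for reading one file split into 3,500 pieces,
# assuming every fragment costs one seek plus rotational latency.
fragments = 3500
avg_seek_ms = 9.0         # assumed average seek time
avg_rot_latency_ms = 4.2  # half a revolution at 7200 rpm

extra_ms = fragments * (avg_seek_ms + avg_rot_latency_ms)
print(f"extra overhead: ~{extra_ms / 1000:.0f} seconds")  # ~46 seconds
```

In practice fragments are often near each other so the real penalty is smaller, but the order of magnitude shows why a badly fragmented file can feel slow.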

When defragging, the green "unmovable file" is your swapfile. If this file is fragmented it too will slow the PC down. You can remove the file by setting your "Virtual memory" to none and rebooting. Then do a defrag, then set your virtual memory back again.

If you're running XP 32-bit and you have 3GB of physical RAM then setting your pagefile (virtual memory) to anything is a waste of time, apparently. I found this out reading the thread on the HD Render Test where the pagefile was deleted and it made no difference to the render times. This is because XP32 can only address 3GB of RAM.

Once you have defragged you need only do it once a month.

- Craig


farss wrote on 3/11/2009, 4:52 AM
"If you're running XP 32-bit and you have 3GB of physical RAM then setting your pagefile (virtual memory) to anything is a waste of time, apparently. I found this out reading the thread on the HD Render Test where the pagefile was deleted and it made no difference to the render times. This is because XP32 can only address 3GB of RAM."

XP32 can address 4GB of RAM.
The problem is that some of that is used up by hardware and the RAM on your video card. Buy a video card with more RAM on it and you lose even more addressable memory.
Applications running under XP32 can only use 2GB of RAM anyway.
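That point can be put in rough numbers (the reserved amount below is an assumption; how much address space the chipset and video card actually claim varies by machine):

```python
# 32-bit Windows without PAE exposes a 4 GB physical address space.
ADDRESS_SPACE_GB = 4.0
# Assumed reservation for memory-mapped I/O: chipset plus a 512 MB video card.
mmio_reserved_gb = 0.75

usable_ram_gb = ADDRESS_SPACE_GB - mmio_reserved_gb
print(f"RAM visible to Windows: ~{usable_ram_gb:.2f} GB")  # ~3.25 GB
```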

What any of this has to do with having a pagefile escapes me. What happens if you were to run two instances of Vegas, or Vegas and another app, or heck, even the OS? If you have no page file and physical RAM needs to be paged, then things don't run slower, they hit a brick wall and stop running.

Bob.
LReavis wrote on 3/11/2009, 11:21 AM
I'm pretty sure that a pagefile sometimes is needed. I once saw the PF Usage, as monitored in Windows Task Manager, standing at 3.35GB while rendering a Vegas project (go to this thread:

http://www.sonycreativesoftware.com/forums/ShowMessage.asp?ForumID=4&MessageID=621117

and search for "eye-popping")

I'm pretty sure this was after I installed 4GB of RAM. If I had had no PF, where could Vegas have gotten that much memory space?
FilmingPhotoGuy wrote on 3/11/2009, 1:46 PM
The techies say that XP32 can only address 3.xx GB of RAM. The pagefile only gets used if you run out of physical RAM. So if you have 4GB of RAM then you would never use the pagefile and you might as well delete it. XP64 and Vista64 can address up to ???? anybody know?

-Craig
Terje wrote on 3/14/2009, 11:33 AM
I'll try to make this short. Conclusion first: for most systems defragging is a waste of time, for some it can have a negative impact, and for a few it can be good. When NTFS first came out it was meant for business and for servers. NTFS therefore fragmented files on purpose, and defragmentation was not possible and was considered harmful. Why?

Imagine a computer with a lot of different software running simultaneously, each program accessing its own files. On a multitasking system this means that the OS lets each app run for a fraction of a second; it is then preempted and thrown to the back of the queue. Before it gets another shot at the CPU, ALL the other software is given some time. Most software accesses the file system.

So, consider two of them. One has all its data at the beginning of the drive, the other at the end of the drive. Each fraction of a second, with these two accessing the drive, the read head has to move from one end of the drive to the other. This is slow and it makes for a lot of wear and tear.

Now imagine the data in question being fragmented and spread across the drive in a random manner. How much would the drive head move each time? We don't know exactly for each access, but we know how much overall. On average the read head would move across HALF the disk.

So, with disk-based work and multitasking we'd see HALF the wear and tear and TWICE the performance. (Amazingly theoretical)
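The averages above can be checked with a quick Monte Carlo sketch (a hypothetical simulation treating the head position as a number from 0 to 1; for fully random targets the average step actually comes out near a third of the stroke rather than half, but the comparison with the worst case still holds):

```python
import random

random.seed(42)
N = 100_000

# Worst case: two programs with data at opposite ends of the drive.
# Every switch between them moves the head the full stroke.
alternating = 1.0

# Fragmented case: successive requests land at uniform random positions.
pos = random.random()
total = 0.0
for _ in range(N):
    target = random.random()
    total += abs(target - pos)
    pos = target
avg_random = total / N

print(f"alternating ends: {alternating:.2f} of full stroke per request")
print(f"random targets:   {avg_random:.2f} of full stroke per request")
```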

Is this true on a PC today? No. Intelligent disk caches and moderate multitasking reduce the problem of data clustering to a non-issue for most. But then again, fragmentation and the performance issues resulting from it are also a non-issue, no matter what the sales guy says.

Keep the number of files on your system drive low, don't fill the disk up with illicit and steamy AVIs, and you'll be fine.

For video dudes and dudettes, defragmenting the video drive(s) regularly, and not multitasking software that accesses those drives too much, is probably also a good idea.
Terje wrote on 3/14/2009, 5:49 PM
>> The pagefile only gets used if you run out of physical RAM.

This is not at all true, not by a long shot. The pagefile is used all the time by Windows and all modern operating systems. The system pages out things that have not been used for a while, whether you are low on RAM or not. Paging out takes a lot of time and resources, so the OS will not wait until it is needed to do so. Code, DLLs, etc. are paged out quite quickly. This is why, no matter how much RAM you have, it is not recommended to run with no page file. I know some people do these days, but I would not recommend it.

As a rule of thumb you should allocate twice the RAM size for your page file (a fixed size gives the best performance) if you have below 1GB of RAM. If you have more than 1GB of RAM you should allocate at least the same amount for your page file.

As a systems person I will always recommend you allocate twice your RAM, no matter how much you have, for your page file. Generally you'll have a more stable system. If your system starts failing and removing the page file helps, that would, in my opinion, be a strong indicator that there is something wrong with your disk drive or your disk controller.
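The rule of thumb above can be written out as a small helper (a hypothetical sketch of the rule itself, not a real systems tool):

```python
def recommended_pagefile_mb(ram_mb):
    """Rule of thumb: 2x RAM at or below 1 GB of RAM, 1x RAM above."""
    return 2 * ram_mb if ram_mb <= 1024 else ram_mb

print(recommended_pagefile_mb(512))    # 1024 - a 512 MB machine gets 2x
print(recommended_pagefile_mb(12288))  # 12288 - a 12 GB machine gets 1x
```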
Jøran Toresen wrote on 3/14/2009, 6:09 PM
Terje, I recently purchased a new Intel i7 PC with 12 GB RAM (Vista Ultimate 64). Should I allocate 24 GB to my pagefile?

Jøran Toresen

Terje wrote on 3/15/2009, 5:38 AM
If I were using the PC for a DB server, yes, I would allocate 24GB for the pagefile. You should allocate 12GB. As I was saying above, if you have 1GB or less, go for 2x; with more, go for 1x. That is a good rule of thumb. Going forward, as RAM becomes more plentiful and in demand, I would probably change that to 2x RAM up to 2 or 4GB, and 1x RAM above that. Seems excessive, but the way Windows uses the PF, 1x RAM is the minimum.
blink3times wrote on 3/15/2009, 5:50 AM
I've been running with 8 gigs of RAM and no page file for quite a long time with no issues whatsoever. Editing is smooth and fast and I have no issues at all with multitasking... even with rendering in the background.

Interesting to note.... the Windows 7 beta defaulted to NO page file when installed on my system.
FilmingPhotoGuy wrote on 3/16/2009, 2:48 PM
So Terje, if my files are fragmented thousands of times you suggest not to defrag? I totally disagree!

You make these statements as if you know what you are talking about. However, I do agree that some users who defrag their drives until all files are neatly packed together on one side "may" not be speeding up disk access at all, so "that" is a waste of time. But you need to defrag to keep files contiguous.

-Craig