defrag large m2t files?

ushere wrote on 6/14/2008, 10:27 PM
in passing,

can defragging 'damage' m2t files?

by 'damage' i mean create 'errors' that could lead to vegas throwing a wobbly?

i have in the past defragged 300gb capture drives with numerous avi's on them with no problems at all. my present question is in relation to a 500gb drive that has an assortment of m2t's and avi's scattered all over the place..... along with areas left by deleted files...

leslie

Comments

johnmeyer wrote on 6/14/2008, 11:43 PM
First of all, don't bother to defrag. It is a complete waste of time. You won't gain any performance, and you needlessly thrash the disk drive around, waste energy, and tie up the computer when you could otherwise be doing productive work.

And no, defragging makes no difference whatsoever to the file itself. If your drive has a problem, bad things will happen whether you defrag or not; either way, defragging makes zero difference to the content of the file.
ushere wrote on 6/15/2008, 3:46 AM
thanks jm,

so you think my 500gb drive that's got a variety of files (m2t's, avi's, etc.,) of various lengths scattered (and i mean really scattered by now!) over it wouldn't be better off for defragging?

my understanding is pretty basic really, but i would have thought that with files being recorded, deleted, recorded, etc., my drive would be really heavily fragmented - thus causing new files to be written 'higgledy-piggledy' in 'gaps' all over the place.

you're saying that this doesn't matter nowadays? i'm using single (not raid) seagate sata drives @7.6k, so presumably, even for hdv there's plenty of head room for writing without problems?

i don't doubt what you're saying, i'm just trying to understand why i always had to defrag drives on my old avid / emc / and disk recorder before starting new captures (at least that's what they recommended...)

leslie
farss wrote on 6/15/2008, 4:06 AM
I'd also like to hear an explanation of why it doesn't achieve anything. I don't think the standard Windoz defrag is worth much, but a while back I did try Perfect Disk and it did seem to help. This program does something Defrag doesn't: it defragments the empty space.

I'd also suspect that those here running large numbers of audio tracks would benefit from having things as ordered as possible. Although converting as many audio tracks as possible to polyphonic wave files might be a bigger help.

No doubt it achieves less than nothing for lightly fragmented disks with small numbers of large files, but the system I'm typing this on has over 35,000 files on it, and it does get a bit speedier after a defrag. The defrag does take forever, so when I get around to it I kick it off on a Sunday night, which gives it 20 hours to run before I need the machine again.

Bob.
Chienworks wrote on 6/15/2008, 4:14 AM
The single biggest reason that defragging doesn't matter in a multimedia editing session is that more than one file is usually being accessed simultaneously. When might this happen? Well, how about when you add a piece of background music behind the video? Or when you crossfade? Or when you superimpose one video on top of another? Or when you mix audio tracks? Or when you .... well, basically whenever you edit/preview/render almost any situation. What happens when more than one file is accessed? Well, heads are bouncing all over the drive to find the parts of the different files. No amount of defragging is going to help this situation in the slightest. And yet ... the drive keeps up with Vegas' demands quite nicely in all but the most extreme cases. Now isn't that interesting?

So, if the drive can keep multiple file streams from all over going simultaneously, what does it matter if one file is broken up in pieces?
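To put rough numbers on "keeps up quite nicely", here's a back-of-the-envelope budget. The bitrates, chunk size, and seek time below are assumptions for illustration, not measurements:

```python
# Rough duty-cycle budget (assumed figures): two HDV streams read in
# alternating chunks from one drive, with a long seek at every switch.
HDV_RATE  = 3.2e6   # bytes/sec per m2t stream (~25 Mbit/s HDV)
CHUNK     = 1.0e6   # bytes read per request (assumption)
SEEK      = 0.015   # sec per long seek between the two files (assumption)
XFER_RATE = 60e6    # bytes/sec sustained transfer, typical 2008 SATA drive

seeks_per_sec = 2 * HDV_RATE / CHUNK              # one seek per chunk switch
busy = seeks_per_sec * SEEK + 2 * HDV_RATE / XFER_RATE
print(f"{seeks_per_sec:.1f} long seeks/sec; drive busy {busy:.0%} of the time")
```

Even with the head swinging between two files on every read, the drive in this sketch is only busy about a fifth of the time, which is why it copes.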
farss wrote on 6/15/2008, 5:16 AM
One file isn't going to matter, but what about 100s of them?
I'm not questioning that it doesn't make enough difference to matter in most situations, but to my mind that's very different from saying it doesn't make any difference.
Thing is, too, when Vegas slows to a crawl it's hard to know the why of it. I've often found the CPUs barely raising a sweat and assumed it's because the disks cannot keep up. On the other hand, even with 100 tracks of audio the degree of fragmentation probably isn't going to have much impact on how hard the disk has to work. The proximity of the files, and where they are on the disk, probably has more impact than anything.

Bob.
Himanshu wrote on 6/15/2008, 1:27 PM
People have their opinions, but defragging is a regular part of my routine, and it works well for me. It will not damage your files while moving them around, assuming that the hard drive itself isn't damaged and that the process isn't interrupted in a way that causes it to abort abnormally.

Whether your hard disk has to seek different locations on the drive because of user demands is another issue... if you are demanding that one file be read, it's best to have that file defragmented for quicker access. Windows fragments files and requires (allows?) the user to defrag manually. Several OSes (think UNIX) have strategies built into the file system to try harder to avoid fragmentation, and some even run background processes to keep the file system in a good state. Defragging when Windows thinks it's necessary is a good idea in my opinion. You don't have to do it while you are doing other tasks (in fact, don't!)... do it when you leave for the day or over the weekend.
johnmeyer wrote on 6/15/2008, 2:05 PM
People have their opinions,

I don't have opinions, just facts.

Defragging was VERY important back in the 1980s with MFM drives and slow head seek times, and interleave values that could be changed (the stagger between logically adjacent sectors on a track). It makes almost no difference in modern drives with modern electronics, on-board cache memory, and fast head mechanisms.

This issue has been discussed on this forum dozens of times, and in those threads where I have participated, I always issue this challenge (and I would be HAPPY if someone could actually prove me wrong by answering the challenge). Here it is:

Find one well-done test, not done by or sponsored by a defrag software vendor, that shows any non-trivial improvement in performance immediately after doing a disk defrag.

Many people report that the computer "feels" better after defragging, but I think this is due to two things: 1. The placebo effect (you think you've done something good, so you perceive an improvement). 2. Other things are also done during defragging which in some cases might actually improve things.

In particular, many people defrag as part of doing a Windows or third-party maintenance protocol. These very often delete temporary files and also check for and repair lost and cross-linked clusters. In the case of Internet temporary files, and similar caches (like those left behind by Flash), you can have tens of thousands of files on your drive. Those files dramatically increase the time it takes to find each piece of each file in the disk directory and can significantly slow disk access.

To do a useful defrag test, you would have to use a good disk testing tool. You would have to disable all background processes. You would have to do the test in a way that eliminated disk caching from the performance measurements. It is not a simple thing to do correctly.

I did see such a test about seven years ago, sponsored by PC Magazine, back when they still had some actual technical expertise (not any more). It showed only the slightest improvement in performance, and most of this would never actually be realized outside of laboratory conditions.

But, if anyone can find a test that proves me wrong, I'd love to see it and will be more than willing to admit I'm wrong.

In the meantime, every time you defrag, realize that you are grinding away at that hard drive and tying it up when you could instead be doing useful work. Those are two things of which I am 100% certain.
johnmeyer wrote on 6/15/2008, 2:24 PM
Some interesting reading:

Typical Test From Disk Defrag Vendor

Old, Often-Quoted Test Showing Disk Defrag Improvements

The second one was done by an outfit that no longer appears to be in business.

Now, here's a test that supports my position:

23 Ways To Speed WinXP Without Defrag
Chienworks wrote on 6/15/2008, 2:58 PM
I'll also add that every time you defrag you are also moving most of the data on your drive in a dangerous and unnecessary way, often multiple times.
farss wrote on 6/15/2008, 3:50 PM
Probably for what we do the optimal solution is a highly fragmented disk!
Say you've got 50 tracks of video and audio. The ideal is to have them physically interleaved in small chunks. That would minimise head movement. You can get this to some extent on the audio side by using polyphonic wave files.

Bob
riredale wrote on 6/15/2008, 3:52 PM
I'd come down somewhere in the middle. Fragmentation still slows things down, but drives are so much faster nowadays that you don't often see the effect.

As for "proof" of the slowdown, I don't have any hard benchmarks to fall back on, but I do know that the Diskkeeper defragmentation program can check this instantly. Just now I clicked on my C drive, and it showed me this. What it means is that, for those files that are currently fragmented, the read time is about 8 minutes. Defragging those files would reduce the read time to about 5.5 minutes. The bottom bar graphs show that since the fragmented files are only a small part of the total contents of the C drive, defragging would gain about a 2% improvment in overall read time, from about 104 down to 102 minutes (top graph is in seconds, bottom in minutes).
Chienworks wrote on 6/15/2008, 4:40 PM
I still say that the fastest, safest, and most complete defrag method, should you decide you really must do it, is to copy all the files from the drive over to another empty or almost completely empty drive. Way faster than any defrag program by miles (kilometers?) and hours because each block is copied once and only once, no shuffling going on, and writes are to a separate drive from the one being read. Way safer because if anything goes wrong at all with the process, the files are still intact on the original drive. Way more complete because every single file gets written contiguously one by one to the new drive.

Only two downsides: you need a free empty hard drive, and it's unnecessary anyway.
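If you do go that route on a Windows box, even a trivial script will do it. This is just a sketch - the drive letters are placeholders, and you'd want to verify the copy before touching the original:

```python
import os
import shutil

SRC, DST = "D:\\", "E:\\media"   # hypothetical source drive and empty target

for root, dirs, files in os.walk(SRC):
    rel = os.path.relpath(root, SRC)
    os.makedirs(os.path.join(DST, rel), exist_ok=True)
    for name in files:
        # each file is read once and written once, contiguously, to the
        # fresh drive - no block shuffling as with an in-place defrag
        shutil.copy2(os.path.join(root, name),
                     os.path.join(DST, rel, name))
```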
johnmeyer wrote on 6/15/2008, 4:45 PM
riredale,

Good information.

I've tried to find test information today, just to see if I could back up my claims, or else find something that supports doing defrag. I still haven't found anything that strongly supports either position. However, the one thing that emerged from my search is that while defrag is certainly going to make some improvement, it sure doesn't look like it is going to be noticeable for most people, in most circumstances. A few percent at most, and that's just the disk read or write portion of whatever the total time required for a given operation might be. I can't imagine it would be an improvement that you could even measure when doing a render. As for playback speed, where we are all trying to get full playback in the preview window, the slight improvement -- if any -- is swamped by all the other issues that impact playback (e.g., the speed of various Vegas fX).
Terje wrote on 6/16/2008, 3:13 AM
I don't have any hard benchmarks to fall back on, but I do know that the Diskkeeper defragmentation program can check this instantly.

The problem is that Diskkeeper does its calculation based on the assumption that you will be able to read the entire file, start to end, in more or less one go. In a multi-processing system that will never happen. The OS will suspend whatever task you are running in order to run the next task, and that will move the read head of the disk.

When you access small fragments of your file at random times, rather than the entire file in one go, the best thing (mathematically) is that the file is distributed randomly across the disk in fragments that are identical in size to the chunk that is read off the disk each time. NTFS basically does this for you. This is why NTFS is designed not to try to keep your files un-fragmented.

When you de-fragment your drive you remove the ideal random nature of where your files are, and you stuff them into single locations. Let's say, for example, that there are two tasks (TA and TB) running on your PC, one with a file at the beginning of the disk, the other with a file at the end of the disk (worst case). Now, please remember that there are always other tasks running on Windows; nothing you can do about it. So, back to our worst case: your PC is running, one file on each "end" of the disk. Imagine the two pieces of software have equal requirements for disk access. What happens?

Well, the reads are going to alternate between the two pieces of software; that is the nature of pre-emptive multitasking. So, TA gets to read a bit, TB gets to read a bit, and then it is back to TA... Each time, the read head has to travel all the way from one end of the disk to the other. That becomes a lot of head movement.

Imagine the alternative. The files are spread randomly around the disk. How much must the read head travel now? Again, mathematically, on average, half the disk. In other words, if you de-fragment the files, you risk decreasing the performance of your drive, even though Diskkeeper will tell you that the drive is faster.

In the scenario above, after you had de-fragmented the two files to each end of the drive, Diskkeeper would tell you that performance was up significantly, while in reality you have increased the seek time of your drive by 50%.
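That scenario is easy to sanity-check with a toy simulation (a sketch, not a benchmark - disk positions are idealized as 0.0 to 1.0). The random-scatter average actually comes out closer to a third of a full stroke than a half, which only strengthens the point:

```python
import random
random.seed(0)
N = 100_000  # alternating reads

# defragged worst case: TA's file at position 0.0, TB's at 1.0,
# so every switch between tasks is a full end-to-end stroke
defragged_avg = 1.0

# fragmented: each read lands at a random position on the disk
pos = travel = 0.0
for _ in range(N):
    nxt = random.random()
    travel += abs(pos - nxt)
    pos = nxt

print("avg travel, files at opposite ends:", defragged_avg)       # 1.00
print("avg travel, randomly scattered    : %.2f" % (travel / N))  # ~0.33
```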

Don't de-fragment. It has no positive effect, and it increases the wear and tear on your drive. If you want faster disks, set up a couple of them in a RAID0 configuration.
megabit wrote on 6/16/2008, 4:16 AM
I find this thread all the more interesting now that I have just added lots of storage to my system (I now have 5 SATA drives inside - one for the system and apps files, and 4 in two striped RAIDs - plus one RAID1 for constant backup of mission-critical files, in the form of the WD My Book Studio II 2TB hooked up through eSATA).

I personally think that the "placebo effect" of having done something "good" (plus the nice look of your HDD estate that defragging apps like DiskKeeper provide) is probably well over 50% of what there is to it, given modern drives and Windows' pre-emptive multitasking nature. I think that at least as important as defragging is spreading the file structure onto several physically separate disk drives - keeping your system/apps on one, the swap file on another, and perhaps your source clips (mostly read) on a different drive than Vegas temp/render output (mostly written), etc.

Has anyone got a proven "best scenario" for this? Because looking at it from the user's comfort viewpoint, having everything on drive C: would be most desirable.


riredale wrote on 6/16/2008, 7:20 AM
Don't discount the placebo effect. We all know a clean car just drives better, don't we? And oxygen-free copper wire makes our loudspeakers sound clearer?
Keyan wrote on 6/16/2008, 10:25 AM
What would probably be more useful for media applications is a tool like JKDefrag that, when launched with the proper switch from the command line, will reorder all files based on NAME. Since the sort name of a file actually includes its full folder path, it will put all of the files in the same folders into the same area of the hard disk. In theory, if you have all of your source material under the same set of root folders (i.e. C:\Editing\Source), it would put everything very close together and decrease the seek time and the amount of head movement required to find any given file or a collection of source files.

Most defragmentation engines do more than just plop all of the files together; they will also put system and other commonly used files on the higher-speed portions of the drive (near the outer edge of the platter, where the data moves past the head faster because linear speed increases with radius), which gives a greater increase in performance and decreases the perceived lag - most of the system files end up close together as well.
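That outer-edge point is just v = omega x r. With made-up but plausible radii for a 3.5" platter at 7,200 rpm:

```python
import math

rpm = 7200
omega = rpm / 60 * 2 * math.pi        # angular speed, rad/s
for label, r_mm in [("inner track", 20), ("outer track", 46)]:
    v = omega * (r_mm / 1000)         # linear speed under the head, m/s
    print(f"{label}: {v:4.1f} m/s")
# at the same bit density, the outer edge streams data about 2.3x faster
```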

Saying that defragmentation is completely useless isn't completely fair - say you have a file that is broken up into 5 fragments; even on a modern drive with a 5ms average seek time, it is going to take 25ms to find all of that file, vs. 5ms if it were in one contiguous block. For media creation it may not be that big of a deal, since the bottleneck is usually the rendering time through the CPU, but for other tasks (such as OS boot) it can make a small difference when the OS is caching a bunch of small driver files and the like into memory.
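For a large capture file, though, those extra seeks vanish into the transfer time. A quick sketch with assumed figures:

```python
seek_ms, fragments = 5.0, 5
extra_seek_ms = seek_ms * (fragments - 1)       # 20 ms of extra seeking
file_mb, rate_mb_per_s = 1000, 60               # 1 GB m2t, 60 MB/s drive
transfer_ms = file_mb / rate_mb_per_s * 1000    # ~16,700 ms just to read it
print(f"seek overhead: {extra_seek_ms / transfer_ms:.2%}")  # ~0.12%
```

Thousands of small files at boot are a different story - there, the seeks are most of the cost.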

IMO, if you can afford it, having OS, Source, and Final data all on physically separate drives is the ideal.
Terje wrote on 6/16/2008, 2:02 PM
Since the name of a file is actually the full folder name, it will put all of your files in the same folders into the same area of the hard disk. In theory if you have all of your source material under the same set of root folders (i.e. C:\Editing\Source) it would put everything very close together and decrease the seek time and the amount of head movement required to find any given file or a collection of source files.

Actually, no, it probably wouldn't. It would probably make everything slower, or at best not change much at all. Is this counter-intuitive? Not if you consider that this is not DOS we are talking about. I'll try to repeat what I said above, but this time be a little more specific. Let's say you have two active processes running on your PC during a particular span of time. The processes are Vegas Pro and Microsoft's memory manager (MMM - don't know the exact name). Vegas is concerned with editing video; MMM deals with handling your memory, swapping things in and out as it sees fit. Please note, you have absolutely no control over when MMM runs; the only way to deal with that is to shut the process down, and the only way you can do that is to shut your computer down.

Now, let's say you have de-fragmented your drive, and you have two movie files and an audio file in your project, all neatly close together. By chance, these files are on the opposite side of the disk from where the swap file is, and also far away from the files in the Windows directory.

So, Vegas is rendering. It reads your video files. It renders one second, then it is pre-empted. That means that for a short period of time Vegas is not running. Time is given to the MMM process. The MMM process decides to swap out some data from memory to make room for your render. What happens?

Vegas reads tiny part of file from beginning of disk.
MMM writes file to end of disk
Vegas reads tiny part of file from beginning of disk
MMM reads DLL from Windows directory at the end of the disk
Vegas reads tiny part of file from beginning of disk
MMM writes some stuff to the swap file again

And so on and so forth. In other words, there is absolutely no advantage to you that Vegas has its files all bundled together, the drive head is always going to be traversing large parts of the disk. What would have happened if the movie files were not de-fragmented?

Vegas reads tiny part of file from beginning of disk.
MMM writes file to end of disk
Vegas reads tiny part of file from just next to the swap file, meaning head traveled less than if it was de-fragmented
MMM reads DLL from Windows directory at the end of the disk
Vegas reads tiny part of file from middle of disk, again head travels less
MMM writes some stuff to the swap file again
and so on and so on

In other words, if you run an operating system where your application is the only thing running, and it is the only thing touching the disk and it has to read big files, then sure, having the files close together on the disk makes sense.

In a multi-processing system where processes are pre-empted at arbitrary intervals and control is given to other processes that need to get data from other places on the disk, the most efficient way of storing data on the disk, leading to the optimum performance, is through some semi-random, fragmented scheme. De-fragmenting is typically not only a useless exercise, it can have a detrimental effect on performance of many applications.

having OS, Source, and Final data all on physically separate drives is the ideal

Maybe. If you have multiple identical disks and a good SATA/SCSI RAID controller, your best bet, performance-wise, is probably some sort of striped RAID array. Ultimate performance perhaps from RAID-0, but it's not overly secure. Use backup.

Oh, and in case you wondered, no, you can not prevent the pre-empting of your primary task, so this holds true for Windows no matter what (unless you are on 95/98/ME).
farss wrote on 6/16/2008, 2:56 PM
What you're saying sounds right for system disks; however, I hope not too many of us are putting our media files on the same physical disk as our OS. So although the OS might pre-emptively stop running Vegas, whatever it starts running instead shouldn't be moving the heads on the media drives.
Also, we can to some extent control what other processes the OS is running.
Not that any of that changes a more fundamental problem: unless you've only got one track of A/V in your project, then as you play it out the heads have to jump between the files your project is reading anyway. The consequence of that is the same - a fragmented disk would almost certainly be faster than one that wasn't; in fact, having all the files you were playing back interleaved would give the best result.

Bob.
Chienworks wrote on 6/16/2008, 3:13 PM
I kinda doubt that a fragmented disk would be noticeably better than a defragged one. There might be *some* rare cases in which the next needed data happens to be close to the last read data. This is unlikely on a defragged disc, but also pretty much just as unlikely on a fragmented one. About the only difference is that a defragged disk is pretty much guaranteed to be inefficient, whereas a fragmented one might just get lucky now and then.

Same thing with interleaving. Unless the interleaving just happened to match what Vegas was looking for while playing your project it won't be any better than the fragmented crapshoot.

But, (questionably dubious) efficiency aside, my contention is still that the defragmenting process itself is a waste of time, extra unnecessary wear and tear on the drive, and puts your data in jeopardy. Don't do it.
blink3times wrote on 6/16/2008, 5:27 PM
Well.... interestingly enough, in the troubleshooting guides of BOTH Avid Liquid and Pinnacle Studio, they list "fragmented drives" as a possible cause of random crashes, and they strongly suggest defragging as a possible repair.

I'm not sure if I believe that heavily in defragging but I certainly don't believe it hurts anything and I choose to do it at least once..... and that of course is just before I do a disk image. Then from that point on it's simply a matter of recalling the disk image when you want a fresh drive again.... (which I do every couple of weeks)
farss wrote on 6/16/2008, 5:40 PM
In defense of both Avid Liquid and Pinnacle, I've had some truly odd suggestions as to how to fix Vegas problems - made all the more troubling when they finally admitted it was a known bug.

Support people in general like to give the complainant something to do, or to create the appearance of having tried something. Just human nature, I guess, but the support people I work with drive me nuts at times. They can waste a week trying to "fix" a problem in my code, causing the client all manner of grief, before they escalate the problem to me.

Bob.
Himanshu wrote on 6/16/2008, 5:44 PM
The C: drive which contains Windows and other apps is what I usually defrag. My media is spread out over several drives, which I don't defrag much (or at all), partly because I know the contents of those drives will keep changing, and partly because Windows doesn't report that they need a defrag. Here's a Microsoft KB article, "Features of the Windows Vista hard disk defragmentation utility", for anyone interested.

Here's MS's KB article, How to perform common troubleshooting steps for Windows Media Player 11, which states, in part,

blink3times wrote on 6/16/2008, 6:13 PM
"If you have video playback issues, verify the following:

Again, this is pretty much what they're getting at in the Avid Liquid and Pinnacle troubleshooting guides..... Makes sense to me.