defrag large m2t files?

Comments

Terje wrote on 6/16/2008, 8:25 PM
In the troubleshooting guides of BOTH Avid Liquid and Pinnacle Studio, they list "fragmented drives" as a possible cause of random crashes.

That's just their way of saying: "Please try to do some random stuff to occupy your time, then re-boot your computer, and if the Gods of Software are willing, the bug in our software that crops up at random intervals will not crop up for a while. Fingers crossed".
Terje wrote on 6/16/2008, 8:36 PM
Most hard disks can send video data from the hard disk to the processor in a timely manner for smooth playback to occur.

Again, another way to make you occupy your time while support does something more important. It is particularly interesting that this comes from Microsoft, given that Microsoft wrote NTFS and recommends it for their own systems. NTFS fragments files on purpose, and for a long time Microsoft strongly recommended against de-fragmenting drives, stating that NTFS volumes were not to be de-fragmented (they were correct).

Any drive put into a PC after 1990 can keep up with video bandwidth, no problem. Typical drives today have a transfer rate of 1 Gbit/s or better, that is 1,000 Mbit/s. HDV is what, 25 Mbit/s? This means that your HD could, in theory, feed 40 (yes, that is forty) simultaneous media players showing HDV. Obviously you are not going to get that kind of sustained transfer rate, but at the same time, you are also not going to get 1/40th of it unless you are doing some really disk-bound stuff while you are playing your HD movies.
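To put rough numbers on it, here is a trivial sketch in C (the figures are the same round, back-of-envelope numbers as above, not measurements):

```c
/* Back-of-envelope only: round figures from the paragraph above, not measurements. */
#include <stdio.h>

int main(void)
{
    const double drive_mbit_s = 1000.0;  /* optimistic sustained transfer rate */
    const double hdv_mbit_s   = 25.0;    /* HDV stream bitrate */

    printf("Theoretical simultaneous HDV streams: %.0f\n",
           drive_mbit_s / hdv_mbit_s);            /* prints 40 */
    printf("Drive bandwidth used by one stream:   %.1f%%\n",
           100.0 * hdv_mbit_s / drive_mbit_s);    /* prints 2.5% */
    return 0;
}
```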

So, to summarize: if your HD is a bottleneck when it comes to feeding your media player or your video application, you really need to look at your workflow; you are doing something you probably shouldn't be.

Oh, finally, in a time-sharing system like Windows, de-fragmenting your drive is going to give you a few percent better performance. Do you really think it matters whether your HD can serve your 25 Mbit/s stream at 800 Mbit/s or at 1,000 Mbit/s? It doesn't.

Hopefully a bit of logic works? Any questions?
blink3times wrote on 6/17/2008, 1:55 AM
"That's just their way of saying: "Please try to do some random stuff to occupy your time, then re-boot your computer"

That could be true, the thought did cross my mind..... but then that would simply be a guess with no basis to it. Notwithstanding.... Microsoft is saying it as well.
Terje wrote on 6/17/2008, 2:51 AM
Notwithstanding.... Microsoft is saying it as well.

Support is support no matter what. Microsoft dev explained long ago that NTFS drives didn't need to be de-fragmented, and I will always trust dev to know more than support. Support always tells you to do dumb stuff that you really have no need to do. Trust me, I just dealt with Microsoft support after downloading Vista 64 and trying to install it on my wife's computer.

For some bizarre reason Microsoft has decided to sell a downloadable version of Vista 64 that basically is not possible to install. Their dev guys know that they f#cked up; their support people will tell you to try to download the software again just in case it got messed up in the download process. Given that the software is downloaded by a tool that not only downloads it but also verifies that the download is good, that is impossible. On the other hand, you spending another 15 hours downloading gets you off their back and lets them close the open ticket for a while.

I work for a large computer company, and we sell software to help people with exactly this kind of thing. One of my customers, a large, well-known company, was struggling with customer satisfaction issues. They looked at everything and decided that the main problem was their support department and the fact that support tickets stayed open for days and even weeks without resolution. They warmed the ears of the support manager, and he invested in our very excellent software to help him.

After implementation he reduced the average open time for a trouble ticket from several days to less than two hours. Accolades all around: he was commended by his superiors and he praised our company for the fantastic software that helped him close tickets faster. Our software, among other things, automatically gave the support staff a set of procedures they could walk customers through to further identify the issues.

Unsurprisingly, significantly reducing the time to close a trouble ticket didn't fix our customer's main problem, which was customer satisfaction. Why? Because their product still sucked, and the fact that first-line support was able to move the issue off their plates and onto second-line support faster didn't actually mean that things worked for the customer.
megabit wrote on 6/17/2008, 3:56 AM
I think the opinion that defragging is bad, and that NTFS is actually more efficient when (heavily) fragmented, is saying too much. If that were so, why would MS include an "even better" defrag tool in their newest OS?

Having said that, I must admit I've always wondered how well applications like Diskeeper (which I own and use) go together not so much with Windows' own defraggers, but with the other "intelligent" tools included in Windows XP and Vista. I mean the mechanism that organizes files and directories on your disk based on the way you're actually using them (I've read in multiple places that XP/Vista will "learn", over some time of use after installation, how the user is actually accessing the HDD(s), and optimize their layout accordingly).

I even submitted this question to Diskeeper tech support, but no conclusive answer was ever given. However, the learning mechanism (in addition to the usual prefetching algorithms) must indeed be there, because what I have noticed many times is Windows actually slowing down during the first several hours after having Diskeeper do its full job, including boot-time MFT and directory consolidation.

blink3times wrote on 6/17/2008, 4:13 AM
Terje:

I have NOOOO idea what "downloadable version of Vista 64" has to do with this but (and please don't take offense at this) people much smarter than you and I (the writers and Engineers of the software) are stating that fragged hard drives DO cause problems. These troubleshooting guides are not invented by some secretary or phone handler on the support lines. They originate from the Engineers themselves (who BTW are simple humans and do make the odd mistake on things like "downloadable version of Vista 64").

I'm terribly sorry if I offend, but I think I would much rather believe an Engineer who wrote the software as opposed to a Vegas user/software vendor.

An additional word on support....
They may "tell you to do dumb stuff that you really have no need to do", but then try to imagine the type of person they deal with most of the time. I sure WOULD NOT want to be a support phone handler!!! There could be a million and one things wrong in any one of these situations, including some unique problem with the complainant's machine, and an operator is supposed to diagnose all of this in a phone call from the other side of the world??? Meanwhile most of the callers they get can't even figure out how to turn the machine on without help. You will also not find an Engineer sitting on one of these support phones.... that costs money, and I doubt you will get too many Engineers with the patience to explain, over and over again, what a power button looks like and how to use it. These are simple phone handlers who have been given a general script to follow and a SMALL bag of tricks, nothing more.
blink3times wrote on 6/17/2008, 4:33 AM
"I mean the mechanism of organizing files and directories on your disk based on the way you're actually using them "

Interesting you bring that up. I don't know about any new Norton versions (I don't use Norton anymore) but the old Norton Speed Disk used to arrange your files by frequency of use. The files used most often were placed on the inner portions of the disk (where the diameter is smaller and the heads don't have to travel as far), while the infrequent files were placed on the outer portion of the disk (larger diameter.... more travel time).
farss wrote on 6/17/2008, 5:32 AM
All I can say is any software that has problems because a disk is fragmented is in very serious trouble. I'm not that much involved in low-level coding, but it'd take a fair effort to write code that can even work out IF a disk is fragmented, let alone crash because of it. The OS hides all that from an application; it's one of the primary functions of an OS.
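To make that concrete, here is a minimal sketch in C (the file name is hypothetical): the application only ever asks for bytes from a file handle, and nothing in the code can tell whether the underlying clusters are contiguous. The only thing fragmentation could possibly change is how long the calls take.

```c
/* Minimal sketch of an application's view of reading a capture file.
 * The file name is hypothetical. Nothing here exposes whether the
 * underlying clusters are contiguous or scattered; the filesystem
 * resolves that. Fragmentation can only change how long fread() takes. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    FILE *f = fopen("capture.m2t", "rb");   /* hypothetical HDV capture file */
    if (!f) { perror("fopen"); return EXIT_FAILURE; }

    unsigned char buf[64 * 1024];
    size_t n, total = 0;
    while ((n = fread(buf, 1, sizeof buf, f)) > 0)
        total += n;                         /* consume the stream */

    fclose(f);
    printf("Read %zu bytes without knowing anything about the on-disk layout\n", total);
    return 0;
}
```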

Bob.
blink3times wrote on 6/17/2008, 5:44 AM
It hasn't got a lot to do with the software.... it's more a matter of retrieving the info in a speedy and efficient manner. Like a filing cabinet full of hundreds of files.... if it's in order then it will take you seconds to find what you're looking for. But if the "B's" are where the "A's" should be and the "D's" are where the "F's" should be.....etc, then it doesn't matter how efficient the seeker is, it will take more time than needed.

Granted, today's machines are MUCH faster than when defragging first came out, and most files on a HDD just aren't that big (faster seek times and small files IMO make defragging an asset as opposed to a necessity)..... EXCEPT for video files. Some of my uncompressed avi's are a terabyte in size.... that's a lot of info, and the easier you can make it for your machine..... probably the better.
Terje wrote on 6/17/2008, 6:00 AM
I have NOOOO idea what "downloadable version of Vista 64" has to do with this

Just a general competence issue. Once you leave the dev organization it goes downhill really fast.

People much smarter than you and I (the writers and Engineers of the software) are stating that fragged hard drives DO cause problems

Almost correct, but only almost. People trying to sell a particular piece of software are telling us we should buy it. The engineers who designed and developed the NTFS file system in the first place are telling us that it is not only a waste of time and effort but that it also adds additional wear and tear on your drive. In addition to that, people who have no investment in the issue, that is people who work for neither the Diskeeper corporation nor Microsoft, generally agree that it is a waste of time.

On the other hand you have the support staff of companies with faulty software telling us that a fragmented disk may make their software crash. The interesting thing about that is that when a disk is fragmented it can only have one (and even that is a dubious claim) effect on the software from Pinnacle and others, namely that disk access becomes slower. So what Pinnacle is saying is that their software is so brittle that if it runs on a disk that is 10% slower than spec their software will crash irrecoverably. That is such an absurd statement that even you should be able to see through it, Blink. It is conflict-avoidance bullshit, and they know it. There is not even a theoretical possibility that slow disks can make your software crash unless you have specifically designed the software in such a way that it crashes on slow disks. And no, that is not an admission that defragmentation will slow disk access.

I'm terribly sorry if I offend but I think I would much rather believe an Engineer that wrote the software

Good, then believe the engineers who designed and wrote NTFS. They stated from the get-go that de-fragmentation was a waste of time. An engineer who claims he wrote software in such a way that it crashes if a disk gets 10% slower needs to be summarily executed so that the world is not plagued by his incompetence any more (please take that as it was meant, with a bit of humor). You see, an engineer who claims this is full of shit. Nobody is dumb enough to design their software to crash because a disk is slow.

try to imagine the type of person they deal with most of the time. I sure WOULD NOT want to be a support phone handler!!!

I have been, but it is not for me. I have an extremely low tolerance for stupidity (ref my comment about the summary execution of dumb engineers above).
Terje wrote on 6/17/2008, 6:07 AM
All I can say is any software that has problems because a disk is fragmented is in very serious trouble

I used to write code for a living, but it became too much of a hassle; now I write code only for fun. I have written commercial software in Pascal (Delphi), C, C++, Java and a few other, more esoteric languages. I have interfaced with most of the commercial file systems out there, and also all of the main relational database systems.

For a while I moved into management. If one of my developers came to me back then and claimed that his application code crashed due to a fragmented file system, I would have fired him on the spot. It is a ridiculous claim. Now, if he claimed that his code could not complete its task in a timely manner due to fragmentation of files, I would accept that. In fact, my dev team once spent two months trying to find out how we could overcome performance problems that were related to fragmentation; not fragmentation of the disk, though, but memory fragmentation. We fixed the problem at the cost of a 20% increase in memory usage, but that was deemed acceptable.

But, if any of the developers on that crash team had claimed that the software crashed due to this memory fragmentation, again I would have fired him on the spot. No mercy. It is an idiotic claim.

Oh, and if you wonder how we got around the memory fragmentation problem, it wasn't too hard: we hacked the operating system and re-wrote malloc(...) for them, re-writing it to allocate in bigger, fixed-size chunks. The operating system vendor in question later incorporated our version of malloc(...) into their operating system.
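For anyone curious what "bigger, fixed-size chunks" looks like in practice, here is a minimal, generic sketch of the idea (the sizes are made up and this is not the original code): because every allocation is the same size and is carved out of large slabs, blocks can be freed and reused in any order without ever fragmenting the pool.

```c
/* Minimal fixed-size block pool: a generic sketch of the idea described
 * above, with arbitrary sizes (not taken from the original code). */
#include <stdio.h>
#include <stdlib.h>

#define BLOCK_SIZE      4096    /* every allocation gets exactly this much */
#define BLOCKS_PER_SLAB 1024    /* one large request to the real malloc()  */

typedef union block {
    union block  *next;                 /* free-list link while unused */
    unsigned char payload[BLOCK_SIZE];
} block_t;

static block_t *free_list = NULL;

static int pool_grow(void)
{
    block_t *slab = malloc(sizeof(block_t) * BLOCKS_PER_SLAB);
    if (!slab) return -1;
    for (size_t i = 0; i < BLOCKS_PER_SLAB; i++) {  /* thread slab onto free list */
        slab[i].next = free_list;
        free_list = &slab[i];
    }
    return 0;
}

void *pool_alloc(void)
{
    if (!free_list && pool_grow() != 0) return NULL;
    block_t *b = free_list;
    free_list = b->next;
    return b;
}

void pool_free(void *p)
{
    block_t *b = p;
    b->next = free_list;                /* constant time, order independent */
    free_list = b;
}

int main(void)
{
    void *a = pool_alloc(), *b = pool_alloc(), *c = pool_alloc();
    pool_free(b);                       /* free in arbitrary order ...   */
    void *d = pool_alloc();             /* ... and reuse leaves no holes */
    printf("freed block reused: %s\n", d == b ? "yes" : "no");
    pool_free(a); pool_free(c); pool_free(d);
    return 0;
}
```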
Terje wrote on 6/17/2008, 6:23 AM
It hasn't got a lot to do with the software

It has everything to do with the software. Any software that is so brittle it can't handle slow disk access is written by developers who are so amazingly incompetent that you should not take their advice on anything at all. In fact, this is the most interesting part of this: if Pinnacle is saying that a fragmented disk can crash their software, they are basically saying that their software is of astonishingly bad quality. So bad, in fact, that anyone actually buying it should have their head examined.

But if the "B's" are where the "A's" should be and the "D's" are where the "F's" should be

Well, your analogy is absurd, but that's OK. Your analogy would be correct if the file allocation tables were inaccurate; they are not. Let me improve the analogy a little. It would be like an office where everything was filed correctly, but oddly. It would be like having a client's file stored in many places. On the first floor you would be able to find his address. On the second floor, correctly filed, they would have his case file. On the third floor they had his dependents, and so on. If you needed all the information on the client you had to go to five floors and pick up parts of his file, each of them stored properly though.

Now, imagine you are a lawyer and you have all your client files stored like this. Why, nobody knows, but that is the way it is. Your secretary (Mrs Vegas), who has been with you for twenty years, knows this, and she finds your files for you no problem. It isn't always fast, but she'll get them. Now she's retiring and you are looking for a new one. You hire a temp (Ms Pinnacle), and one day, after having asked her to find the file on Mr Zarkosy, you find her on the fourth floor, hole in the head and a gun in her hand. Suicide.

You wonder about it but figure she had personal problems. So, you get another one from the same agency. A week later he is asked to get the Zarkosy file. This time you find him and two of your colleagues dead on the 7th floor. Murder-suicide.

This time you really wonder, so you call the temp agency support line. You are on with them for 10 hours; they tell you to check if your windows are leaking air, to check if the railing on the fourth-floor staircase is polished properly, and so on. Eventually they get to your filing system and you tell them how it works. They look into it for a while and tell you they'll get back to you in two days.

If they later included a note with each of their temps saying something like "Please don't have a fragmented filing system; our temps are known to go suicidal or even homicidal if your filing system is fragmented", would you get another temp from them?

A company blaming file fragmentation for their crashes is just lazy and incompetent.
farss wrote on 6/17/2008, 6:34 AM
Fragmentation has nothing to do with order.
I too have at times large uncompressed video files. I'm moving into even larger stuff now, 4TB rendered output. How do you think RAID works? It fragments the files!
If your entire video consists of 1 file, then and only then would having it as an unfragmented sequential file make sense. What happens when I have 4 files being played back at once?
The answer is that the optimal solution is to interleave those 4 files. That's kind of what RAID does: bits of the file from one disk, other bits from the other disk (a quick sketch of the mapping follows below).
Also factor in NCQ. The disk system doesn't even do the operations in the order the OS asks for them. Instead it sorts them to minimise head travel.
The whole question of optimising disk systems is pretty complex. I used to follow a forum where this was discussed and explained in huge detail. The mathematical modelling lost me; it makes anything in this game look trivial.
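For anyone curious, the striping mentioned above is simple enough to sketch. Here is a minimal example with made-up geometry (4 member disks, 64 KB stripe units): consecutive chunks of a single "contiguous" file deliberately land on alternating disks.

```c
/* Minimal sketch of RAID 0 address mapping. The geometry is made up
 * (4 member disks, 64 KB stripe units); real arrays vary. The point is
 * that consecutive stripe units of one file are spread across all disks. */
#include <stdio.h>

#define DISKS        4
#define STRIPE_BYTES (64 * 1024)

int main(void)
{
    for (long long offset = 0; offset < 8LL * STRIPE_BYTES; offset += STRIPE_BYTES) {
        long long unit = offset / STRIPE_BYTES;  /* which stripe unit of the file */
        int disk       = (int)(unit % DISKS);    /* round-robin across members    */
        long long row  = unit / DISKS;           /* position within that disk     */
        printf("file offset %4lld KB -> disk %d, stripe row %lld\n",
               offset / 1024, disk, row);
    }
    return 0;
}
```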

Having said all that I do have one contrary thought :)

Defragging your disk will improve productivity. Wash your car and it will go faster, trust me. If you believe this, it will happen, that's because we aren't machines, we're human. We feel better about it, we perform better. So if you feel better having defragged your disk, go for it. The chances of disaster are pretty remote, it costs next to nothing. I do it sometimes. I know it does nothing but I feel better for it, I've done all that I can do.

Bob.
Terje wrote on 6/17/2008, 10:03 AM
If your entire video consists of 1 file, then and only then would having it as an unfragmented sequential file make sense. What happens when I have 4 files being played back at once?

And remember, even then only when you are sure that your video playback will not be pre-empted by another process that needs disk access. Once you have another process that hits the disk and thereby moves the read/write head, you have lost the advantage of the contiguous file. I don't know of any scenario where this will not happen (that your video playback is pre-empted, that is).

Defragging your disk will improve productivity. Wash your car and it will go faster

and we can add: be a nice boy and Santa will give you nice presents, pray to any of the more popular deities and you will win the lottery, wear a magnetic wristband and you will significantly reduce your chance of cancer...

As with any kind of snake oil, it's all good stuff. Oh, and I too own Diskeeper. Why not.
blink3times wrote on 6/17/2008, 3:17 PM
"Well, your analogy is absurd"

You take the analogy much too literally. Fragmented files are just that (bits and pieces of file all over the HDD).... hence the term FRAGMENTED.... My analogy was meant to suggest completeness as opposed to some kind of numerical order. (I would say, however, that your example is just plain out to lunch..... and then some.)

The fragmentation has little to do with the program and everything to do with the way files are written to a disk, how often the disk is written to and erased from....etc. The program in question... be it Vegas or otherwise... therefore has little control over how it writes to the drive. It is unreasonable to expect a program to work at its maximum level of efficiency when the variables are as wide as the number of operators that exist.

This whole thing is really quite simple and does not need to be blown as far out of proportion as you are making it. Heads have to move across a series of platters. The more (and the farther) the heads have to move in order to retrieve an entire file... the longer it takes. It's really not rocket science here. Granted, we're talking about milliseconds here.... but when you talk about enough of them....
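To put rough numbers on the head movement, here is a quick back-of-envelope (the drive figures are assumptions typical of a 7200 rpm drive of that era, not measurements): a 25 Mbit/s stream broken into pessimistically small 1 MB fragments costs a few extra seeks per second, which works out to a few percent of the drive's time.

```c
/* Back-of-envelope for the head-movement overhead. All figures are
 * assumptions (typical 2008-era 7200 rpm drive), not measurements:
 * ~9 ms average seek plus ~4 ms rotational latency per extra fragment,
 * with the file pessimistically broken into 1 MB fragments. */
#include <stdio.h>

int main(void)
{
    const double stream_mbit_s = 25.0;   /* HDV bitrate             */
    const double fragment_mb   = 1.0;    /* assumed fragment size   */
    const double seek_ms       = 9.0;    /* assumed average seek    */
    const double rotation_ms   = 4.17;   /* half a rev at 7200 rpm  */

    double stream_mbyte_s = stream_mbit_s / 8.0;           /* ~3.1 MB/s */
    double seeks_per_sec  = stream_mbyte_s / fragment_mb;  /* ~3.1 /s   */
    double overhead_ms    = seeks_per_sec * (seek_ms + rotation_ms);

    printf("Extra seeks per second of video: %.1f\n", seeks_per_sec);
    printf("Head-movement overhead: %.0f ms per second (%.1f%% of the drive's time)\n",
           overhead_ms, overhead_ms / 10.0);
    return 0;
}
```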

What's incompetent (IMO) is when the manual says... "To avoid such and such problem(s) you should defrag......" and the operator then thumbs his nose at the manual and in turn complains about the support line not being any help.
farss wrote on 6/17/2008, 4:09 PM
Only one point I don't really see.
Sure, the player can be pre-empted, but with dedicated drives why/how would the OS move the heads when it's not accessing that drive? I ask because if this is true I'd like to know how to stop it. I'm thinking of spending a bundle on a Fibre Channel external RAID box (I need over 1 GB/sec STR) and I sure as heck don't want the OS moving the heads around on that when it doesn't need to.

Bob.
blink3times wrote on 6/17/2008, 4:23 PM
That's a good bloody question and I have often wondered that myself. Files get fragmented because the OS writes the file to the first available empty slot on the disk. If that spot is too small then the head moves on to find another empty spot on the disk for the rest of the file. So you can see that constant writing... erasing.... writing.... erasing can fragment a drive pretty good. (Your file is not erased, BTW, when you delete it.... the table is simply marked as clear for that part of the disk and the next file is written over it.)
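A toy model of that write/delete/write pattern (made-up sizes and a simple first-fit allocator, not the real NTFS logic) shows the effect: the hole left by a deleted file forces a later, larger file to be split into more than one fragment.

```c
/* Toy first-fit allocator over a tiny "disk" of 16 clusters. This is a
 * made-up model, not how NTFS actually allocates, but it shows how the
 * holes left by deleted files split a later, larger file into fragments. */
#include <stdio.h>

#define CLUSTERS 16
static char disk[CLUSTERS];             /* 0 = free, otherwise a file label */

static void write_file(char label, int clusters)
{
    int fragments = 0;
    for (int i = 0; i < CLUSTERS && clusters > 0; i++) {
        if (disk[i] == 0) {                               /* first free cluster */
            if (i == 0 || disk[i - 1] != label) fragments++;
            disk[i] = label;
            clusters--;
        }
    }
    printf("file %c written in %d fragment(s): ", label, fragments);
    for (int i = 0; i < CLUSTERS; i++) putchar(disk[i] ? disk[i] : '.');
    putchar('\n');
}

static void delete_file(char label)
{
    for (int i = 0; i < CLUSTERS; i++)   /* "delete" = just mark clusters free */
        if (disk[i] == label) disk[i] = 0;
}

int main(void)
{
    write_file('A', 4);   /* A fills the front of the disk                  */
    write_file('B', 4);   /* B follows contiguously                         */
    delete_file('A');     /* deleting A only frees its clusters             */
    write_file('C', 8);   /* C doesn't fit the hole, so it gets 2 fragments */
    return 0;
}
```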

But if you start with an empty drive in the first place then there are no slots, so to speak, on the disk.... just one long empty disk. Now logic would dictate that the file would be written as non-fragmented.... and that's what I thought. But I have a dedicated drive (1TB) for my uncompressed avi's, and when I view it with Diskeeper, it sometimes shows the entire file as being fragmented. I can't figure that out.

But I'm sure Terje will have an answer for us. ;)

As for moving heads when the drive isn't being accessed... I think Vista has the opposite problem... the drives stop spinning when not in use far too often... which can be a bit of a pain when accessing every 15 minutes or so.
Himanshu wrote on 6/17/2008, 8:09 PM
Not sure if anyone clicked through on the link to MS's article on defragmentation in Windows Vista, but here's the summary from that link I posted:




So whether one wants it or not, by default Vista is often defragging your drive. This thread is getting way too long...and I am not trying to convince anyone to start defragging their drives...so see you all in another thread! :)
Terje wrote on 6/18/2008, 12:50 PM
Only one point I don't really see.

If you have a dedicated video drive, it probably would not. If you do not have a dedicated drive, the application that pre-empted your video app would probably move the read head.

For the general population this isn't a relevant problem, though: most consumers have only one disk, and in that case the read head will typically move when an app is pre-empted (if the other app uses the disk, and most apps do).
Terje wrote on 6/18/2008, 12:52 PM
But I'm sure Terje will have an answer for us. ;)

I do, and I have given it a number of times already. NTFS fragments files on purpose. Can you guess why? Do you think the NTFS developers know what they are doing?

Edit: As of Windows XP, this behavior is apparently no longer standard in Windows. That is interesting, and probably not a good idea for a general-purpose computer, but I digress. Windows NT has been going downhill since 3.5, but that is another matter.
Terje wrote on 6/18/2008, 12:56 PM
It is unreasonable to expect a program to work at its maximum level of efficiency when the variables are as wide as the number of operators that exist.

Given that no software, outside of specialist software, gets access to files by going directly to the physical structure of the disk, the only possible difference for an application is that disk access is faster or slower. No other factors.

A piece of software so brittle that it commits suicide when a disk is a little slower than expected should be discarded immediately as useless crap.

What's incompetent (IMO) is when the manual says... "To avoid such and such problem(s) you should defrag......

No, the incompetence would be in writing software that dies if the disk is a little slower than they expected. That's absurd in the extreme. Also, given that the average hard disk can feed, at its max capacity, 40 simultaneous video streams to an application, the idea that hard drive performance has any impact on this at all is insane.