Comments

Former user wrote on 9/18/2014, 8:07 AM
Bob,

This is still your opinion. And I am fine with that, but it should be prefaced that way.

Every link I posted backs my generalisation.
musicvid10 wrote on 9/18/2014, 8:28 AM
It occurred to me that anyone participating in this thread, including moi, probably has some OCD, though not necessarily about the same things.
I respect John Meyer and DaveT2 immensely, and there are several decades of combined experience among us, and plenty of things for us to agree on.

I don't base my reluctance to defrag on efficiency or wear or any of that; I've had entire media drives that weren't backed up get corrupted after a defrag, and am loath to repeat the experience.

Out of curiosity, I checked my old 32-bit laptop, now beginning its sixth year of daily use for everything from editing to entertainment to big databases. It's never been defragged, and the MS utility says it doesn't need it.
farss wrote on 9/18/2014, 8:29 AM
[I]"This is still your opinion."[/I]

No it is not my opinion. I have backed up my claims with fact and logic.

Facts:
1) Video files are large, many GB. No other common use of a PC involves such large files.
2) Non-linear editing can involve reading more than one such file at once.
3) Rendering involves reading at least one large file and writing a file of possibly equal size.

Quoting sources such as Microsoft who speak only about average users does nothing to support your case. Time and again I read high level IT people qualify statements with "unless you're a video guy". What we do using many very large files and the way they're accessed by our applications is unique.

Bob.

Grazie wrote on 9/18/2014, 8:40 AM
[I]"What we do using many very large files and the way they're accessed by our applications is unique."[/I] I should hope so too! 'bout time for a better mousetrap, methinks?

Grazie

set wrote on 9/18/2014, 9:08 AM
Wow... long discussion on this...

Anyway, regarding this, my computer builder warned me NOT to defrag an SSD.

musicvid10 wrote on 9/18/2014, 9:35 AM
What we need are drives that don't wear out or generate heat, and that access data as from a hologram, not from 2-dimensional platters or substrates.
riredale wrote on 9/18/2014, 10:40 AM
Hey, wait! Let me jump into the pool too! Looks like fun!

Anyway, my two cents' worth:

(1) Hard drives are moving devices. They will eventually wear out.

(2) Some drives are considered "heavy duty" when compared to others, presumably so that they will last longer in an environment where the heads are constantly seeking (i.e. servers).

(3) Heat speeds up the degradation process, so proper cooling makes a difference.

(4) Some drives run a LOT hotter than others. I had a Seagate a decade ago that one could fry an egg on. BTW, it later died.

(5) For PC activities that are CPU bound, hard drive data transfer issues are not very relevant.

(6) For activities that are I/O bound, the idea is to store the data in a way that requires very little head movement, since head movement takes a very long time, relatively speaking. So a needed file that is fragmented all over the place will take far longer to retrieve than if it were in one piece (see the sketch after this list).

(7) Our TiVo box in the living room is defragging all night long. You can hear it. It's doing this for a reason. Similarly, modern operating systems do a certain amount of automatic defragging for the same reason.

(8) The trend is toward SSD system drives. They work on an entirely different principle and don't need any defragging, ever.
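
A back-of-the-envelope sketch of point (6), for the curious. Every number below (seek time, transfer rate, fragment counts) is an illustrative assumption for a typical 7200 rpm drive, not a measurement:

[code]
# Rough model of an I/O-bound read: pure transfer time plus
# one head seek per fragment. All figures are assumptions.
SEEK_MS = 12.0          # average seek + rotational latency, ms
TRANSFER_MB_S = 120.0   # sustained sequential transfer, MB/s
FILE_MB = 10_000        # a 10 GB media file

def read_time_s(fragments: int) -> float:
    transfer = FILE_MB / TRANSFER_MB_S
    seeks = fragments * SEEK_MS / 1000.0
    return transfer + seeks

for frags in (1, 100, 10_000):
    print(f"{frags:>6} fragments: {read_time_s(frags):7.1f} s")

# ~83 s contiguous, ~85 s at 100 fragments, ~203 s at 10,000:
# seek overhead only dominates at extreme fragment counts.
[/code]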



One final note: If I should ever go back to grad school and take a lab class, I want John Meyer as my lab partner.
musicvid10 wrote on 9/18/2014, 10:45 AM
I'd love for someone to demonstrate that drive seek times significantly affect encoding times for compressed HD video. That's the elephant in the room, and it seems to be growing . . .
jerald wrote on 9/18/2014, 11:01 AM
(can't believe I'm doing this...... but....)
I don't have time to 'demonstrate' that drive seek times affect encoding times (significantly or otherwise), but logic tells me that it depends on whether one's media drive is defragged. :-) :-)
Chienworks wrote on 9/18/2014, 11:04 AM
"not from 2-dimensional platters"

Well, we're already partway beyond that. WD's "perpendicular recording" technology stacks the bits up to 25 deep into the platter. I'd call that at least 2.5-dimensional. I think this item is one of those things Arthur C. Clarke had in mind when he said "any sufficiently advanced technology is indistinguishable from magic".
Chienworks wrote on 9/18/2014, 11:06 AM
Jerald, yes, it can depend on the defragged state. If you have more than one stream you're reading, which is quite common in an NLE environment, then a completely defragged drive is the slowest possible configuration for reading. It forces the head to bounce back and forth between the streams as much as possible.

Now, true, there aren't many randomly fragmented arrangements that improve on this much, but the point is that any state other than completely defragged cannot be slower, and may just possibly be faster.
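
To put rough numbers on that bouncing, here is the same sort of toy model as above; the chunk size, seek time, and transfer rate are all assumptions:

[code]
# Two contiguous files read in parallel: one seek each time the
# head switches streams, no matter how well defragged the drive.
SEEK_MS = 12.0          # assumed seek + rotational latency, ms
TRANSFER_MB_S = 120.0   # assumed sequential transfer, MB/s
FILE_MB = 10_000        # two 10 GB streams
CHUNK_MB = 1.0          # data read before switching streams

transfer = 2 * FILE_MB / TRANSFER_MB_S        # pure data movement, s
switches = 2 * FILE_MB / CHUNK_MB             # number of seeks
total = transfer + switches * SEEK_MS / 1000.0
print(f"{total:.0f} s with seeks vs {transfer:.0f} s pure transfer")
# ~407 s vs ~167 s: with small chunks the seeks dominate, which
# is why players and NLEs read ahead in large buffers.
[/code]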
jerald wrote on 9/18/2014, 11:42 AM
"If you have more than one stream you're reading"

Bingo. My main contention from the beginning of my participation in this topic (including prior thread) was about this point.

Would you agree that continuously reading a 10GB file, and nothing else from that drive except that file, would be faster if the drive is defragged?

One simple example of this would be copying a 10GB file from a non-system drive (with no other concurrent activity on that drive) to another drive.

I contend that read-rate would be highest in the case of reading a file that is entirely contiguous on the drive (i.e. no file fragmentation), as long as there is no other concurrent drive activity.
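
For anyone who wants to measure it rather than argue it, a minimal timing sketch (the path and chunk size are placeholders; run it on a cold cache, e.g. right after a reboot, so the OS file cache doesn't inflate the number):

[code]
import time

PATH = "E:/media/test_10GB.mxf"   # hypothetical test file
CHUNK = 8 * 1024 * 1024           # 8 MB per read

start = time.perf_counter()
total = 0
with open(PATH, "rb") as f:
    while block := f.read(CHUNK):  # sequential read, start to end
        total += len(block)
elapsed = time.perf_counter() - start
print(f"{total / 2**20:.0f} MB in {elapsed:.1f} s = "
      f"{total / 2**20 / elapsed:.0f} MB/s")
[/code]

Running it against the same file before and after defragging the drive would settle the single-stream question.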

j
Chienworks wrote on 9/18/2014, 11:48 AM
Yes, I would agree with that. What I disagree with is that it would be enough faster to be worth the effort, or even noticeable in everyday operations.

My biggest concern with defragging is that it puts data in jeopardy. I have no interest in taking the chance of having my data erased off the drive with the *assumption* that it was copied elsewhere successfully. That concern FAR outweighs any potential benefit or lack thereof.
musicvid10 wrote on 9/18/2014, 12:26 PM
Well, at least a few people are speculating about the right thing.
Eagerly awaiting the results of your tests.
jerald wrote on 9/18/2014, 1:50 PM
"My biggest concern with defragging is that it puts data in jeopardy."

Thanks, Chienworks. I agree that one should do all they can to protect their data integrity.

Another highly respected contributor's observation: Win7 [...] does do light defragging by default.

Apparently Microsoft doesn't share your concern that data is at significant risk for corruption during the defragmentation that Windows does.

I'm not presuming that I'm a valid authority. I'm just someone who has some historic knowledge of defragging and a heart to be helpful.

Normally, in best-practice defragging algorithms, sectors (data) are moved first, without changing the drive's file-to-sector index. This means that a failure during the actual sector-copy portion of the defragging algorithm will be harmless to the data. Only after the sector is copied and trusted is the disk's file-to-sector index changed to point to the new data.
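
The same write-the-data-first, update-the-index-last ordering can be sketched at the application level; this is just an illustration of the principle in Python, not how Windows' defragmenter is actually implemented:

[code]
import os

def safe_move(src: str, dst: str, chunk: int = 1 << 20) -> None:
    """Copy the data first; update the 'index' (the name) last.
    A crash before os.replace() leaves the original intact."""
    tmp = dst + ".part"
    with open(src, "rb") as fin, open(tmp, "wb") as fout:
        while block := fin.read(chunk):   # 1. copy the data
            fout.write(block)
        fout.flush()
        os.fsync(fout.fileno())           # 2. force it onto the disk
    os.replace(tmp, dst)                  # 3. atomic "index" update
    os.remove(src)                        # 4. release the original last
[/code]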

If Microsoft trusts their algorithm so much that they make it the default configuration in Windows, and since I have never heard any computer-industry pundit proclaim a grave danger of data corruption due to Windows defragging by default, then I trust it as well (as long as I have the original source media files on a non-defragged drive).

My own tests, historically, have shown me that there are *many* technical factors in video compositing and rendering speed, one of which, in certain situations, is file fragmentation.

One thing I won't do is proclaim that defragging is or isn't crucial for everyone, or, that defragging is never or always helpful. This topic is not suited for absolute, simple rules.

Another possibly helpful point: it is possible to configure Windows to defrag selected drives only. One could keep their original media files on a drive that is never defragged.

I would never advise anyone to run some untrusted defrag utility on their only copy of important data.
j

jerald wrote on 9/18/2014, 2:26 PM
"What i disagree with is that it would be enough faster to be worth the effort, or even noticeable in every day operations."

Thanks, Chienworks. If I understand your statement, then we may agree, at least somewhat.

I'm not sure what your intended definition of "everyday operations" is.

My inputs on this topic do refer to video rendering, specifically, and they are most relevant to video rendering in situations in which source media data rates, and/or project complexity, are sufficiently great that I would be concerned about optimization.
j
farss wrote on 9/18/2014, 2:46 PM
[I]" If Microsoft trusts their algorithm so much that they make it the default configuration, then I trust it as well (at least to the point that I don't worry about it as long as I have a backup copy of the source media)."[/I]

Not so, according to KB 312067. Windows' own defragmentation code can cause the loss of restore points.

"When you run Disk Defragmenter on a volume with shadow copies activated, all or some of your shadow copies may be lost, starting with the oldest shadow copies. "

Bob.

johnmeyer wrote on 9/18/2014, 3:02 PM
Further on data loss during defrag:

Winternals Defragmentation, Recovery, and Administration Field Guide (a published book on the subject)

This is a complete book, published by someone who has spent a lot of time researching the subject. Several people keep asking for "authorities" on the subject. I'd say that between Bob's link to Microsoft and this passage in the book, there is ample reason to be concerned. I linked to the page that describes why you might lose data.

The problem of a corrupted disk causing problems during defrag should concern Vegas users because of all the crashes reported in these forums over the past four years. When an application crashes -- and especially when the entire computer crashes (e.g., a BSOD) -- it is almost certain to create cross-linked or lost sectors. While the article I just linked to is concerned with the more serious issue of bad sectors, I can see how the same mechanism might cause problems any time the file table gets corrupted.

Regardless of how it happens, this indicates that if your drive isn't perfectly healthy, defrag would increase your chances of losing data before you had a chance to back it up.

[edit]Here is another angle on the same subject:

Hard Disk Defragmentation - Does it improve the recovery yield?

This is from another disk recovery vendor. Their point is that if your disk should become corrupt, the worst thing you could do would be to defrag it. While most of us would not do that if we suspected a problem, many of you are recommending that you let Windows simply defrag in the background. Thus, if you have a disk that has started to develop a problem -- many of which are not immediately apparent -- the background defrag could start up on its own, and all heck could break loose.

This is also another reason to never let background processes run ...
VMP wrote on 9/18/2014, 3:06 PM
So is it better to disable the auto defrag in Windows 7?

I see that it was only enabled on the C drive, which is an SSD. I have now disabled the auto schedule on all drives.

VMP
OldSmoke wrote on 9/18/2014, 3:22 PM
I thought it used to be that you do a chkdsk /f or ScanDisk before defragmenting? Has that changed? I don't defrag anymore since I use SSDs for project files and a RAID setup for archive. I may now and then open a project that is on the RAID drive, but only to have a look at it and maybe re-render a section of it. As such, my RAID archive as well as my project drive don't get fragmented so much that I would need to defrag. However, my work laptop, which had a mechanical drive for over 4 years (I replaced it with an SSD last year), did benefit from a defrag, but simply because it was both a system drive and a storage drive which I used to test software; lots of installing and uninstalling was done.

Generalizing the matter of defragmenting is certainly wrong, but to say a heavily fragmented video source drive is as good as a defragmented one, or even better... hmmm.

I think for a video project drive or archive drive to get fragmented, you would need to be a news reporter who constantly has new footage coming and going, or someone who works on more than one project at a time from the same drive, erasing content and putting on new material.

jerald wrote on 9/18/2014, 3:28 PM
"Not so according to KB 312067. Windows own defragmentation code can cause the loss of restore points."

Thanks, farss, for your input. Does this mean that Microsoft no longer configures Windows to defragment by default? Are shadow copies related to media files? Should you keep original media files safe on a non-defragged drive?
j
jerald wrote on 9/18/2014, 3:32 PM
"Several people keep asking for "authorities" on the subject. I'd say that between Bob's link to Microsoft, and this passage in the book, there is ample reason to be concerned."

Thanks, johnmeyer, for this info. Good point.
Chienworks wrote on 9/18/2014, 4:07 PM
It's not just faulty drives to be concerned with, either. Just the fact that defrag operates by making a new copy of the file, then erasing the old, often several times over (!), is the concern. Just a normal copy is not guaranteed to be exactly the same. Write errors are very rare, but they do occur. Is every single sector that is written compared bit-for-bit with the one that was read?

Think about it ... for just regular storage of your files, would you copy from file A to file B, erase file A, copy B to C, erase B, copy C to D, erase C, then store D while having completely thrown away the original copy? Would you do that every day? Why?
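
For what it's worth, anyone who shares that worry can bolt a verify pass onto any copy; a minimal sketch (the file names are placeholders) that refuses to trust the copy until the checksums match:

[code]
import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

# After copying A to B, keep A until the digests agree:
if sha256_of("A.mxf") == sha256_of("B.mxf"):   # placeholder names
    print("copy verified; safe to delete the original")
else:
    print("mismatch; keep the original!")
[/code]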
johnmeyer wrote on 9/18/2014, 4:27 PM
"Just a normal copy is not guaranteed to be exactly the same. Write errors are very rare, but they do occur. Is every single sector that is written compared bit-for-bit with the one that was read?"

Kelly, I think you may be going a little overboard on that one. Digital storage systems have all sorts of error checking and also error correction. If bits really got lost during the course of normal computer usage -- even once in a blue moon -- I don't think computers would work very well, and we'd all be doing something else with our lives.

So, during normal operation, I don't think disk drives lose bits. However, when they start failing, they do lose bits, and that was the point of my previous post and its caution about using defrag. (OldSmoke's suggestion about always doing a ScanDisk/CHKDSK before defrag is an excellent one.)

I learned the hard way, back in 1986, about bad media losing files. It was the only file I have ever lost (thanks to backup and practicing safe computing). It was the only copy of my first and only book (the user guide for our software). I was saving it to a 5.25" floppy, and suddenly, it was gone.

Someone told me about this amazing utility Peter Norton had written, and using that, I was able to retrieve about 75% of my work, sector by sector, over a period of three hours.

Since then, I have learned everything I can about how to make sure that data doesn't disappear.