I had an external drive, which I decided to defrag. Defraggler would not work on the external drive, a USB drive connected to my computer, which is a laptop. So, I went looking for a way to defrag my external drive and found your site. Someone had asked that question, and I followed your instructions to right-click the drive and select defrag from the menu, and Windows defrag would defrag the drive. The external drive had only 10% of the drive’s size in data, but if I remember right, defrag does not distinguish between data and empty portions of a hard drive.
I set defrag to work using another 3rd party program called “No Sleep”, which simulates a person moving the cursor, so the computer doesn’t shut down. Then I went to bed. When I awoke the computer was hung up, and I had to back out of defrag. When I looked at my external drive almost all of the data was gone.
The external drive was a backup of sorts, and it was not itself backed up. I hadn’t gotten around to that. So, hundreds of hours of work was lost. I started by going back on the web and found an article that warned that Windows defrag often hangs up on large drives. My external drive was a terabyte in size. The lack of warning for this on your site led to the loss of my data.
I have 8 recovery programs, all third party, which I rarely use. Six of them could not recover anything. The seventh recovered everything. So, I’m whole. But you need to add a warning in your instructions that Windows defrag has problems with large drives … and maybe small ones. I have never used Windows defrag since I was alerted to the existence of third-party programs that do a better job of defragging.
I’m sorry you had such a time of it, and I’m very relieved you were able to recover your data.
Unfortunately, there’s really no warning for me to add, other than what I already repeat ad nauseam: if there’s only one copy, it’s not backed up. Your data was already at risk: the drive could have failed at any point and it would all be gone.
In fact, I suspect that something like that was bound to happen. Defragging a drive should never cause data loss, unless there’s a pre-existing underlying problem.
Defragging should work
Both Windows’ own defragging tool, as well as the third-party tool Defraggler, should have worked on your external drive.
That Defraggler couldn’t see the drive at all leads me to believe there was already a problem with the drive or the USB interface to it. If Windows could see it, then Defraggler should’ve been able to see it.
Further, Microsoft’s own defrag should have worked without incident.
I have defragged large external disks without incident many times, and there’s nothing about your situation that leads me to believe that it shouldn’t have worked.
A defrag can cause data loss if….
What I strongly suspect is that something else caused Defraggler to fail to work for you and ultimately caused the data loss you describe.
Perhaps a hardware issue; perhaps the USB interface on the drive; perhaps the drive itself – there’s no real way to know.
For all I know, it could even have been due to this “No Sleep” program you’ve mentioned that I’ve never heard of. (Though, assuming it’s as simple as it claims to be, and is not malware itself, it’s unlikely it would be a problem.)
But, yes, defragging is a disk-intensive operation, meaning it drives the hard disk heavily. It makes sense that hardware failures or other marginal issues would be more likely to appear when the disk is under the stress of heavy use.
There’s a bigger problem
Here’s what I see as an even bigger problem.
The fact that you were facing data loss at all if this drive failed means to me that the data was not backed up – it existed ONLY on the drive in question.
That’s a recipe for disaster.
I’m sure you’re aware that drives can fail – catastrophically – without warning, regardless of what you happen to be doing with or to them. It doesn’t have to be a defrag, or a chkdsk, or any other kind of heavy use. Drives fail for a variety of reasons, occasionally without any warning or odd behavior at all.
And sometimes, when they die, they die hard, destroying all the data that they contain.
You were very fortunate to recover your data at all.
My recommendation to avoid data loss
My very strong recommendation is my most basic, and my most frequent: make sure you’re backing up. Don’t put it off. Automate it if you can, so you’re not the weakest link in the chain.
Think about your hard drive suddenly and permanently disappearing. Would you lose anything you cared about? That’s the data-loss litmus test. If the answer is “yes”, you’re not sufficiently backed up.
Then, only after everything is backed up, I would run CHKDSK /R to scan the surface for errors, as well as check the file system for issues. If Defraggler still doesn’t see the drive, then there’s an issue that, in my opinion, needs to get resolved before I’d feel good about using that drive at all.
I’d also consider turning sleep mode off completely, so you don’t have to rely on yet another tool to keep the machine awake. (I could be wrong, but I believe the defragging will prevent sleep anyway.) I’ve experienced too many issues with sleep on various computers to trust that it’s not part of the problem.
P.S.: One clarification
You mentioned, “…if I remember right, defrag does not distinguish between data and empty portions of a hard drive”. That’s incorrect. Defragging is only about the actual data on a hard drive, and doesn’t care about (or specifically operate on) free space.
In fact, one way to completely defrag a non-system drive is to:
- Copy the contents of the drive elsewhere.
- Erase everything on the drive.
- Copy the contents of the drive back.
Of course, this assumes you have room somewhere to copy the data to. The process of copying the files back, as long as it happens sequentially (one file after the other), puts them “in order”, which is the ultimate goal of a defrag operation.
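The copy-out, erase, copy-back process above can be sketched in Python. This is a minimal illustration only: the function name and paths are my own invention, it assumes a non-system drive with enough staging space elsewhere, and it requires Python 3.8+ for `dirs_exist_ok`.

```python
import shutil
from pathlib import Path

def copy_defrag(drive: Path, staging: Path) -> None:
    """Defragment a non-system drive by copying everything off,
    erasing the drive, and copying everything back one file at a time."""
    # 1. Copy the drive's contents to a staging area elsewhere.
    shutil.copytree(drive, staging, dirs_exist_ok=True)
    # 2. Erase everything on the drive.
    for entry in drive.iterdir():
        if entry.is_dir():
            shutil.rmtree(entry)
        else:
            entry.unlink()
    # 3. Copy the contents back. copytree walks the tree and writes
    #    one file after another, so files land largely contiguous.
    shutil.copytree(staging, drive, dirs_exist_ok=True)
```

The sequential write in step 3 is what puts the files back “in order”; the staging copy also doubles as a temporary backup while the drive is being erased.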
I monitor my drive with a free software package that pops up every morning and hands me a detailed report on the health and well-being of my drive. Name? CrystalDiskInfo. Here is a link:
http://crystalmark.info/software/CrystalDiskInfo/index-e.html
CrystalDiskInfo also monitors drive temperature in real time, which is useful when you have to decide whether or not you need to turn on the air conditioning.
“And sometimes, when they die, they die hard, destroying all the data that they contain.”
I remember the “ping, ping, clunk” sound of a dead IDE drive trying to power up. I heard it numerous times, and it always meant “sorry, but your drive is dead — if you really need to get your (not backed-up) data off of it, you’ll need to pay big bucks to one of those data recovery places”.
Defraggler helped me identify a USB hard drive that was going bad: its analysis showed the disk was defective. I did wonder whether the Seagate enclosure that failed had damaged the disk before it died. I moved the drive into a new external case, but I can’t be sure the old one didn’t damage it.
I ran SpinRite on the drive all night. It was doing recovery for over 12 hours. After that, the drive was totally inaccessible by either Win 7 or SpinRite. Apparently it was READY to fail.
I just wish I had examined it if possible BEFORE using SpinRite. I doubt there was much on it that was essential and not already backed up elsewhere.
I do image Windows to an SD card just as a CYA to backup USB backup hard drives.
A long time ago, when using Windows 98, I got file corruption while defragging. A drive containing a shared folder was being defragged while some shared files were being accessed remotely. Apparently, the defragger I used at the time was unable to detect that a file was being remotely edited, which was obviously a critical bug in that defragger.
After switching to another, professional grade, product, the problem vanished.
“I had an external drive, which I decided to defrag.” – I’ve never defragmented an external hard drive as I suspect there would be no/very minimal performance benefit. That said, I’m only speculating and have never seen any evidence one way or the other.
I don’t even defrag any more. I suppose Windows 7 might be defragging in the background during idle times. But I haven’t actually run a defragger in years … it just seems to be a waste of my time. I have never noticed any great increase in performance, so I just stopped doing it.
Yeah, all versions of Windows since Vista have defragged automatically.
Another point: If you have an SSD, you must not defragment it. At most, you may want to consolidate the free space, maybe every two years.
Current defraggers should now detect SSDs and automatically avoid working on them.
“Current defraggers should now detect SSDs and automatically avoid working on them.” – Actually, that’s not entirely accurate:
http://www.hanselman.com/blog/TheRealAndCompleteStoryDoesWindowsDefragmentYourSSD.aspx
I have used Auslogics defragger for years on both my computer and external drives without a single problem.
It should probably be noted that this issue was likely unrelated to defragmenting and would likely have occurred no matter which defragmenter was used – or, possibly, even if no defragmenter was used. These days, there’s really no reason to use a third-party defragmenter (but nor is there any particular reason not to).
“The process of copying the files back, as long as it happens sequentially (one file after the other), puts them “in order”, which is the ultimate goal of a defrag operation.”
There is a huge caveat to that suggestion: the “as long as it happens sequentially” part is the key, and Windows may play tricks on you in that regard. At least since Windows 7 (I can’t speak to other versions, but it probably applies to 8 and 10), the system write cache is greatly expanded over previous Windows versions. Windows 7 will use up to 10GB of RAM, if available, as a buffer: it reads the source files in as quickly as possible and writes out *multiple files at once*. Check the queue length of the target drive when copying a large set of medium-sized files, and you’ll see it climb well above 1.0 as bits of many files are written at once from the buffer. Unfortunately, the copy-out-and-back operation could actually make the fragmentation worse than it was before.
All is not lost, however, as third-party file copy utilities (e.g. TeraCopy) can use their own more intelligent file caching and do true sequential transfers.
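As a rough illustration of the “true sequential transfer” idea (this is not TeraCopy’s actual implementation, and the function name is my own), a copy routine can flush and fsync each file before starting the next, so the OS write cache never holds pieces of more than one file at a time:

```python
import os
import shutil
from pathlib import Path

def sequential_copy(src: Path, dst: Path) -> None:
    """Copy a tree one file at a time, forcing each file to disk
    before the next begins, so the write cache cannot interleave
    writes from many files at once."""
    for path in sorted(src.rglob("*")):
        target = dst / path.relative_to(src)
        if path.is_dir():
            target.mkdir(parents=True, exist_ok=True)
            continue
        target.parent.mkdir(parents=True, exist_ok=True)
        with path.open("rb") as fin, target.open("wb") as fout:
            shutil.copyfileobj(fin, fout)
            fout.flush()
            # Block until this file's data is physically written
            # before moving on to the next file.
            os.fsync(fout.fileno())
```

The per-file `fsync` is what makes the copy genuinely sequential; it trades copy speed for write ordering, which is exactly the trade-off you want when the goal is contiguous files rather than a fast copy.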
Love the comment “The lack of warning for this on your site led to the loss of my data.”.
There’s nothing like blaming others for your own stupidity, is there?
OP, tell us please the recovery program (no. 7) you mentioned that worked! The other six would help too. Many thanks.
“The lack of warning for this on your site led to the loss of my data.”
Yes, your failure to back up your data adequately is all Leo’s fault.
No software or hardware is 100% risk free. As adults, it’s up to us to check out the pros and cons of using any code or device with our computer system. We cannot rely on anyone else to do our due diligence for us. Hard drives are especially susceptible to failure. It’s documented all over the Internet. Thinking that we’re safe from problems and don’t need to back up our data is hubris at its best, and blaming someone else for our own folly is childish.
Leo does an exceptional job of trying to enlighten people about the ins and outs of using a computer, and we’re lucky he takes the time to do so. But we cannot expect him to do everything for us. Intelligent people shouldn’t need a label warning them that they shouldn’t stick a paper clip into an electrical outlet.
“Hard drives are especially susceptible to failure.” – Indeed. Every hard drive will fail sooner or later (and sometimes much sooner than you’d like). It’s something you need to plan for.