
A Peek Behind My Backup Curtain

I was asked the other day to describe how I back up my machine. Presumably the intent was to learn some kind of “best practices” from what I choose to do for myself.

Ummm…. no.

Oh, I’ll describe what I do, as best I can, but I’ll warn you: I’m a geeky edge-case, far away from what “average” computer users do. It’s unclear whether anything I do might help you directly.

The big take-away might be that I’m crazy, I back up like crazy, and I automate as much as I can.

Beyond that … well, it’s a maze of twisty little passages.


Virtually everything

I want to start by pointing out that while I have several physical computers, I actually have many more that are virtual.

For example, the Windows 10 and Windows 8.1 machines I use, both for research and a variety of Windows-only applications, are actually virtual machines on my desktop computer.

What that means is that they also have “virtual” external hard drives. Rather than having half a dozen external drives on my desk, I have a single 8-terabyte external drive, on which each of the virtual machines lives, along with its virtual external drive.

Another complication is due to the nature of what I do; it’s important that I evaluate and run several different backup programs. You’ll see that reflected (no pun intended) across the machines below.

Physical inventory

Let’s start with a physical inventory of the actual computers I use or am otherwise responsible for.

Desktop: Mac Pro with 64GB of RAM, a 512GB internal SSD, the 8TB external hard drive I mentioned earlier, and another 1TB external. This is my office machine, where I do most of my work.

Laptop: MacBook Pro with 16GB of RAM and a 1TB internal SSD. My travel and “family room” computer.

Microsoft Surface Pro, first edition, with 4GB of RAM, a 110GB internal SSD, and a 512GB external drive. After a very smooth upgrade to Windows 10, this machine now sits in a corner, dedicated for use with two connected scanners.

Dell XPS laptop with 8GB RAM, 512GB internal SSD, and a 1TB external HD. This machine also had a very smooth upgrade to Windows 10, even though it’s at least six or seven years old. Once upon a time, this was my primary laptop for travelling. These days, it’s dedicated to playing audio into our stereo system.

Puget Systems Desktop with 8GB RAM, three internal drives and five external drives, running Linux Mint. This was my old desktop machine, and now lives in my basement, playing the role of “network attached storage”, or NAS. You’ll see it plays a prominent role in backing up, below.

My wife’s laptop: a MacBook Air I’m responsible for maintaining and backing up as well.

Believe it or not, this may be the lowest computer count I’ve had in years. :-)


Backing up the Macs

The internal SSD on my Mac desktop is backed up to the 1TB external drive using Apple’s Time Machine application.

I also run CrashPlan (free edition) on my desktop Mac, my MacBook, and my wife’s MacBook Air, and have it configured to back up the primary hard drive on each to the Linux box in my basement. This is the only formal backup used on the two laptops (though as you’ll see shortly, much of what’s on those machines is backed up in other ways).

Portions (though not all) of that gigantic 8-terabyte external drive are copied¹ to some of the external drives on the Linux machine. Specifically, the files that make up the virtual machine images, the collection of files that go into creating my videos, and my collection of photographs are copied each night by a custom script.
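That script isn’t anything exotic; per the footnote, “copied” just means new and changed files. As a rough illustration only – not the actual script, and with hypothetical placeholder paths – the logic looks something like this in Python:

    import os
    import shutil

    # (source, destination) pairs -- hypothetical placeholder paths
    JOBS = [
        ("/Volumes/External8TB/VirtualMachines", "/mnt/nas/VirtualMachines"),
        ("/Volumes/External8TB/Videos", "/mnt/nas/Videos"),
        ("/Volumes/External8TB/Photos", "/mnt/nas/Photos"),
    ]

    def changed(src, dst):
        """True if dst is missing, or src differs in size or is newer."""
        if not os.path.exists(dst):
            return True
        s, d = os.stat(src), os.stat(dst)
        return s.st_size != d.st_size or s.st_mtime > d.st_mtime

    def copy_new_and_changed(src_root, dst_root):
        for dirpath, _dirs, files in os.walk(src_root):
            rel = os.path.relpath(dirpath, src_root)
            target = os.path.join(dst_root, rel)
            os.makedirs(target, exist_ok=True)
            for name in files:
                src = os.path.join(dirpath, name)
                dst = os.path.join(target, name)
                if changed(src, dst):
                    shutil.copy2(src, dst)  # copy2 preserves timestamps

    for src, dst in JOBS:
        copy_new_and_changed(src, dst)

Run nightly by a scheduler (cron, launchd, or the like), that’s the whole job.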

There’s more going on with that 8TB drive that I’ll get to in a moment.

Backing up the Windows 10 laptops

Neither of the Windows laptops is treated as “portable” anymore, so each has an attached external hard drive.

The Dell XPS laptop runs Macrium Reflect Free, and creates periodic full and differential backups according to Macrium’s default settings.

The Surface Pro runs Macrium Reflect Home, and performs a full image backup once a month.

Backing up off-site

In addition to the backups I’ve discussed so far, there are several forms of off-site, or “cloud” backup, and related technologies.

Being a software engineer at heart, I use a software version control system (Subversion²) to maintain a wide variety of files across my servers and local computers. Besides allowing me to make changes anywhere while maintaining version consistency across all those machines, everything is backed up in multiple places – on multiple computers and on the askleo.com server.

I use online cloud storage – Amazon Web Services – for a couple of purposes, including delivering large video and audio files to my sites. I also use it for online backup. In fact, uploading my entire collection of photographs – perhaps my single most irreplaceable digital asset – has been a priority for years.

I use BitTorrent Sync, a utility very much like Dropbox but without online storage, to automatically copy files between the multiple machines I might be working on. Thus, any files I’m actively working on are automatically replicated to several machines every time I hit “Save”.

One set of files – my personal and business records – is also synchronized to my server “in the cloud” for an additional layer of backup. Because these are sensitive files, however, I use BoxCryptor to store and upload only encrypted copies.

I also use OneDrive and Dropbox, though mostly as cross-machine workspaces with semi-real-time online backup. I use both to share files with my staff and others, and I also use Dropbox to automatically back up any files my wife works on using her laptop.

About the Linux box

With no fewer than eight internal and external disk drives, you might expect that the Linux box in my basement plays an important role in my personal backup strategy. It includes:

  1. Three terabytes of NAS that can be used by any machine on my local network. It’s another way to transfer files, but it also ends up being a repository for downloads, documents, archives, files, email (some dating back to 1986), all of my music, and my complete photo collection. Backup images from other machines are also stored here.
  2. More archives. Backup images of other, older machines, or other information that is less critical. These archives are archives of convenience, and could be lost without concern.
  3. A complete backup of the NAS. Each night, the NAS is copied by a custom script to another hard drive (a sketch of that kind of mirror follows).
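That nightly NAS-to-backup copy is another custom script. On a Linux box the heavy lifting is usually best left to rsync; here’s a minimal Python wrapper as a sketch – the paths are hypothetical, not the actual script:

    import subprocess

    SRC = "/srv/nas/"          # trailing slash: copy the contents, not the folder
    DST = "/mnt/nas-backup/"   # hypothetical mount point for the backup drive

    # -a preserves permissions and timestamps; --delete makes an exact mirror.
    subprocess.run(["rsync", "-a", "--delete", SRC, DST], check=True)

One design note: --delete means deletions (including accidental ones) propagate to the copy on the next run; leaving it out trades an exact mirror for a little extra safety.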

Backing up virtual machines

On my desktop machine (remember, it’s a Mac) I use VirtualBox to run virtual machines of other operating systems. There are at least nine, including virtual machines that can run:

  • Windows 3.1
  • MS-DOS
  • Windows XP
  • Windows Vista
  • Windows 7
  • Windows 8
  • Windows 8.1
  • Windows 10
  • Linux Mint

No, I don’t run them all at the same time (though I have run XP, Vista, 7, 8.1, and 10 all at once, just to see if I could).

As we saw above, the files that make up the virtual machines are backed up nightly to the Linux box. However, because these are the machines on which I often test, evaluate, or simply learn about backup software, some may include additional backups using Windows-based software.

Specifically, at this time:

  • The Windows 8.1 machine runs Macrium Reflect Home, and performs my recommended “daily incremental, monthly full” backup scheme to a simulated external drive.
  • The Windows 10 machine runs EaseUS Todo Backup Home, once again performing a “daily incremental, monthly full” backup to a simulated external drive.

Belt, meet suspenders

So far, just about everything is backed up in at least a couple of places, if not more.

But I’m just a little obsessed with backing up. In particular, I’ve heard too many stories of account loss to rely on just one online backup strategy, or even a single encryption technology.

As a result, at least once a week (one such job is sketched after this list):

  • The contents of my Dropbox are zipped into a single large file, which is then encrypted using GPG and uploaded to my Amazon storage.
  • The contents of my OneDrive account are zipped into a single large file, which is then encrypted using GPG and uploaded to my Amazon storage.
  • The unencrypted files that are normally encrypted using BoxCryptor are zipped into several files, each of which is then encrypted using GPG and uploaded to my Amazon storage.
  • On my server, the source code control repositories are – you guessed it – zipped into large files, which are encrypted using GPG and uploaded to my Amazon storage.
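Each of those weekly jobs follows the same zip-encrypt-upload pattern. For illustration, here’s a rough Python sketch of one of them – the folder, bucket, and passphrase file below are hypothetical placeholders, not my actual setup. It requires the gpg command-line tool and the boto3 library:

    import shutil
    import subprocess
    import boto3  # AWS SDK for Python

    FOLDER = "/home/user/Dropbox"       # hypothetical folder to back up
    ARCHIVE = "/tmp/dropbox-weekly"     # ".zip" gets appended automatically
    BUCKET = "example-backup-bucket"    # hypothetical S3 bucket

    # 1. Zip the folder into a single large file.
    zip_path = shutil.make_archive(ARCHIVE, "zip", FOLDER)

    # 2. Encrypt it with GPG (symmetric AES256). The passphrase lives in a
    #    tightly permissioned file so the job can run unattended.
    subprocess.run(
        ["gpg", "--batch", "--yes", "--pinentry-mode", "loopback",
         "--passphrase-file", "/home/user/.backup-passphrase",
         "--symmetric", "--cipher-algo", "AES256", zip_path],
        check=True,
    )

    # 3. Upload the encrypted archive (.zip.gpg) to Amazon storage.
    boto3.client("s3").upload_file(zip_path + ".gpg", BUCKET,
                                   "weekly/dropbox.zip.gpg")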

This is one of the reasons I often tell people that losing something that’s actually made it into my system would typically require the simultaneous destruction of all my machines at home, the failure of my hosting company, and the failure of several different online service providers.

I feel pretty safe.

An example: how my photographs are backed up

While all of that is pretty darned complex (I tried to draw a diagram once – I couldn’t), it’s also almost completely automated and takes very little of my time. Once a month I do a little cleanup, and I also snag backup copies of things that can’t be automated, like an export of my LastPass database.

But it is complex, so let’s walk through what happens when I take a picture – as I said, I consider photographs one of my most valuable assets; not because they’re that great, but because if lost, there’s simply no way to recover them.

  • Any photo I take on my phone is automatically uploaded to the “Camera Roll” folder in OneDrive, and appears on several of my computers running OneDrive within minutes.
  • Once a night, that OneDrive folder is automatically copied to my photo collection, stored on the 8-terabyte drive (a sketch of that nightly copy follows this list).
  • If I take a photo using my digital camera, I manually copy the photo to my photo collection.
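That nightly OneDrive-to-collection copy is another small custom script (mine happens to be PHP). The gist, sketched in Python with hypothetical paths, is just “copy over anything the collection doesn’t already have”:

    import os
    import shutil

    CAMERA_ROLL = os.path.expanduser("~/OneDrive/Pictures/Camera Roll")  # hypothetical
    COLLECTION = "/Volumes/External8TB/Photos/CameraRoll"                # hypothetical

    os.makedirs(COLLECTION, exist_ok=True)
    for name in os.listdir(CAMERA_ROLL):
        src = os.path.join(CAMERA_ROLL, name)
        dst = os.path.join(COLLECTION, name)
        # Copy only files the collection doesn't already have.
        if os.path.isfile(src) and not os.path.exists(dst):
            shutil.copy2(src, dst)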

From there, it’s a series of automated copies:

  • Each night, the photo collection on my 8TB drive is automatically copied to my NAS.
  • Each night, the NAS is automatically copied to its backup.
  • Once a week, images are automatically uploaded to my Amazon storage.

It’s overkill!!

I’m crazy, and I know it. I’m obsessive about backing up.

But you already knew that.

And, no, this isn’t even close to what I recommend the “average computer user” should do. For you, an external hard disk and some good backup software, coupled with something like Dropbox or OneDrive, puts you in great shape.

On the other hand, if you’re in IT, if you run a business, if you have a complex combination of virtual and real machines as well as a mixture of operating systems and online services ….

Well, who knows. Maybe you need to be a little crazy too.



Footnotes & references

¹ Throughout this article, when I say “copy”, it’s typically a copy of new and changed files only. There’s no need to copy again any files that are already in the destination.

² Someday I may switch to Git, as it seems that all the cool kids are using it.

64 comments on “A Peek Behind My Backup Curtain”

  1. While that’s pretty involved, I think for what you do for a living it’s pretty essential.

The one thing you touched on that is more important than people realize is ‘if you run a business’. That can even be as little as running a small Etsy store… losing your data for that would be a nightmare, and depending on jurisdiction, might even run into legal issues over ‘record keeping’!

    One thing you’ve not mentioned that I’d like an opinion on Leo, is what do you think of ‘offsite backup’ consisting of (encrypted) backups on an external drive stored at a friend/relative’s house? Maybe with reciprocal storage of their backups on your premises. With the price of external drives so low, it seems a reasonable thing to do these days, provided of course you encrypt the data to make sure nobody can access it without authority!

  2. Outta curiosity: why not simply make the NAS the backup target for every device and then back the NAS up to both the cloud and an external device – another NAS or hard drive?

    We back up all our devices to a fire- and waterproof NAS (an ioSafe) which uses built-in apps to back up to both an external drive and Glacier. Another app enables data to be synced across multiple devices (much like BitTorrent Sync). The processes are all automated and hands-free.

    • Much of what I do is to test “real world” scenarios – hence my plethora of external drives. Since I recommend using ’em, I ought to use ’em myself for at least some scenarios. On the Mac side it’s really REALLY difficult (and fragile) to get Time Machine to backup to a non-Apple network device. So that external drive is just pragmatism. Finally, I don’t have the internet speed to backup my NAS to the cloud – I have to be very choosy about prioritizing what goes up.

      • “On the Mac side it’s really REALLY difficult (and fragile) to get Time Machine to backup to a non-Apple network device.” – Yeah, it is. Were Microsoft to attempt this type of lock-in, they’d be absolutely pilloried!

  3. I have owned and run a small business for the last 40 years. I have 3 computers, 2 of them running Quickbooks: numbers 1 & 2 run XP, are not connected to the web, and are cloned and fitted with a removable hard drive. Number 1 is at work, number 2 is at home; the removable drive is backed up, removed, and uploaded at home each night (now I have three backups). Number 3 is connected to the internet and does internet banking and email to my accountant. All run AVG Free and Malwarebytes Anti-Malware. I use CCleaner every 2 weeks; never had a virus, ever. (Can’t say the same for the family computer that the kids use.)

    • I use Glacier: it’s excellent and extremely cheap. Or, more accurately, storage costs are cheap. Retrieval costs, however, are relatively high. Consequently, Glacier is best used as a store for data that you’ll only need to access very infrequently. Note too that it’s not as speedy as S3, meaning retrieval times will be slower – which likely isn’t an issue for an average home user with an average amount of data.

      In my case, data is stored locally on a fire- and waterproof NAS – an ioSafe – which is backed up to an external HDD and so it’s extremely unlikely that I’ll ever have reason to access/retrieve the data stored in Glacier. It’s simply a safety net that I’ll hopefully never need.

    • I need to look into it further. The tools I use (command line) are just starting to catch up with Glacier, and may allow me to upload there in a manner similar to the way I do to S3. Glacier does look like a viable backup solution. (I do use S3’s “reduced redundancy” to reduce cost somewhat already, but Glacier is even cheaper.)

        • For the moment, my “off-site” backup is very classical: regularly, I copy my centralized encrypted backup of all my machines onto another disk by hand, and I put that disk at my office, which is in a nuclear research center – so I figure that if both my house and that nuclear center are destroyed, I won’t care about the lost data any more :-) In fact, I have 2 disks, so that at no moment is there no backup off-site. If I take one disk home to make a new copy, the other one, with the old copy, remains at the office, so that even if disaster strikes while I’m in the process of copying, I have a copy elsewhere.

        I was thinking of replacing that manual procedure with Glacier. But even with Glacier, it is hard to beat the cheapness of storing a few remote disks in a cupboard, and when I look at my upload bandwidth, I don’t actually think it’s practical for me.

        Say, 2 TB on Glacier induces a monthly cost on the order of $15. Now that’s not much, but it corresponds to buying a 1TB disk every 4 months (and that price will keep decreasing). So after 16 months of Glacier fees, you have actually paid more or less the price of the two 2TB disks for off-site storage. And I must have an upload bandwidth of somewhat less than half a MB/s, which brings me to somewhat more than a GB an hour. I’d need 2000 hours to upload 2 TB to Glacier… that’s more than 2 months!
        So in the end, Glacier is not an option for me…

        • My wife used to have a retail store for which I was naturally the IT guy. I got two identical external hard drives – one for home, one for the store – onto which each location would periodically back up. I would then manually swap them from time to time. :-)

          • Yes, finally, I will keep using such a “low tech” solution, as I hadn’t realized that my *upload* bandwidth won’t reasonably allow me to back up to the cloud, no matter the price of the storage.

  4. You have a very thorough setup – I’m glad I’m not the only paranoid one! I’d like to share a few of the thoughts I’ve had in this area, in case you find them useful too.

    I looked at BitTorrent Sync as well, but found a number of concerning security issues with it. Worst was that knowing the ID of a file is sufficient to retrieve it, and it seems those IDs may be given to their servers. I decided to use Syncthing instead, personally. I’d definitely recommend researching this, if you haven’t already.

    It looks like most of your backup methods are “push”? I worry that if someone were to compromise my PC, and the PC can access the backup location(s), the attacker may be able to remove my backups as well. A clever attacker could even lie in wait until a usually-disconnected external drive is reattached for the next backup. I like to have at least one pull-only method, to a dedicated machine with no inbound remote access of any kind. Files can’t be removed without sitting at the keyboard, no matter what other machines, services, or accounts get compromised.

    • I think you’re hitting the nail right on the head here. If you really want to be secure against malware in all respects, some backup steps should in fact remain manual, because if they aren’t, then by definition the computer (which can potentially be infected) has the means to reach the backup and corrupt it. Of course, the way to access the backup can be complicated, and you may hope that the malware won’t be smart enough to find out, but that protection is only relative. If the malware opens a reverse shell and a human hacker has access to your machine, then if he’s smart enough, he might find out.
      You are of course right that if you use “push”, then the machine you back up, when corrupted, can destroy its own backup, so from that PoV, pull is safer. However, if you use “pull”, you have the problem on the other side! If the machine that does the pulling gets corrupted, it can destroy the backup too!
      So the only way to make a backup that is “guaranteed safe” from malware corruption is one that you can only do manually. (And then you forget to do so…)

      • “And then you forget to do so.” – Exactly. I think that, for most folk, manual backup is a more risky proposition than automated backup.

        • My point was not to rely *only* on manual backup, of course; let us be clear about that. I think Leo also insists very often that there should be automatic backups. But I had been wondering about malware-induced corruption of backups. For instance, ransomware is kind of ridiculous if you have good backups. Ransomware is only effective if there are no backups (many people :-) ), or if the ransomware can also corrupt the backup. And by definition, if the backup is automatic, the ransomware can *in principle* access it too. So, in order to protect yourself from that kind of thing, there should be a “manual gap” between the system to be protected and (at least) one backup.

          • “Ransomware is only effective if there are no backups, or if the ransomware can also corrupt the backup.” – Which is something certain variants can do. It’s also more of a risk in business environments where backups may not be immediate – for example, in a business that runs its backup routines overnight, the data will sit vulnerable for several hours.

            Realistically, if you have tiered backup processes and exercise caution with email attachments – which is pretty much the only way current crypto variants spread – the chance of being impacted is extremely small. And using something like OpenDNS adds an extra layer of protection and further reduces the likelihood of it happening.

  5. Good article! I’d like to see more on how you configure and use GPG for the encryption/decryption.

    Here I run Crashplan on our three machines. It backs up to the Crashplan Cloud and an external local drive. It also makes backups so that each of the machines backs up to one of the others. This is to prevent data loss in the event of theft or hardware failure.

    I also run Reflect on each machine. The images are saved locally and I rotate through copying the images off-site to a friend’s system. He does something similar with me serving as the host.

    I see two concerns to keep in mind when planning your backup strategy: 1) Being able to recover your data when there is a hardware failure and 2) Being able to recover from your entire local system being destroyed by fire, storms, floods, etc. or stolen by someone.

  6. Just my two cents,

    I keep a data folder named “main” where every bit of my new, changed, and modified data goes.

    This folder is regularly sync’d with a flash drive which goes home with me daily.

    At home there is a laptop and a desktop pc where I sync manually again.

    So that’s 1 plus 3 copies of my data.

    My operating system generally remains untouched and I have made image backups of the OS partition on all PCs which I have restored successfully a couple of times.

    Small but effective because my daily managed data is not that huge.

    Ravi.

  7. I back up my data to the cloud using iDrive services. I also back up my entire drive using Macrium Reflect Free to a locally attached USB drive. What scares me is the potential for ransomware to either delete my Reflect image files on the USB drive, or encrypt them, thereby eliminating my ability to recover from a ransomware attack.

    I’m wondering if I could protect the Reflect image backup files as follows: Configure Reflect to use a different Windows account (call it “backup”) to run the scheduled Reflect backups. Give the “backup” account read/write (maybe full access?) access to the folder in which the Reflect image files are stored. Give my normal Windows account (the one I log on with every day to do my work) read-only access to the backup folder. In this way, only the “backup” account can delete or modify files in the backup folder, and I (logged on as my normal account) can view the contents of the backup folder, but do nothing more.

    Then, if I, using my normal Windows account, get hit by any ransomware virus, it would not be able to modify or delete the Reflect image files located in the backup folder on the USB drive, because it wouldn’t have the necessary NTFS permissions on that folder to encrypt or delete the image files. Can anyone see a flaw in this plan? It assumes that the cryptovirus would execute with the permissions of the logged-on user that triggered the virus — is this assumption correct?

    Also, it depends on being able to configure Macrium Reflect to use a Windows account other than the currently logged-on account for doing the backup, but I don’t know if Macrium includes this feature. Any thoughts or feedback would be greatly appreciated.

    • “It assumes that the cryptovirus would execute with the permissions of the logged on user that triggered the virus — is this assumption correct?”

      In short, no.

      In principle, it should be. But then, if there were no exploits, no security holes, the virus could not get onto your machine in the first place – except by you yourself installing it like a pro, by running mail attachments or the like. In both cases, your hypothesis of “sandboxing” the virus in your normal account fails.

      Indeed, there are regularly exploits (that is, discovered bugs) in system software which allow for what’s called “privilege escalation”: https://en.wikipedia.org/wiki/Privilege_escalation If the virus came onto your machine not through your own mistake, but through exploits (for instance, exploits in Flash Player when browsing content on the web, and visiting – possibly through ads – malicious sites targeting that exploit with a ransomware payload), then you are already assuming that your system has safety failures (which is a pretty safe assumption in itself), and so you can just as well presume that there are privilege escalation exploits too.

      If the virus came onto your machine by your own somewhat silly action of installing the malware yourself, then chances are that you will allow this malware to be installed with system privileges (like you would for any newly installed driver for instance).

      So in both cases where you got the virus on your machine, you also have a similar path to administration privileges for the virus.

      But of course it can help. You make life somewhat harder for the malware.

    • “It assumes that the cryptovirus would execute with the permissions of the logged on user that triggered the virus — is this assumption correct?”

      In short – and to contradict Patrick – yes. Or, at least, that’s the answer today (tomorrow, it could be a completely different answer). As far as I know, no current crypto variants are able to jump between user accounts or elevate privileges.

      That said, given that your backup strategy is in pretty good shape, I don’t think you need be overly concerned about cryptos, for a couple of reasons. Firstly, they’re very easily avoided. Current variants propagate almost entirely by email attachments – and if you don’t open an infected attachment, your system will not be compromised. Secondly, even if your system was compromised, iDrive’s versioning would enable you to easily restore earlier versions of your files that hadn’t been encrypted:

      https://www.idrive.com/idrive-rewind-faq#2

      • Ray, you are right, because current ransomware is still in Kindergarten. If the only way to spread your trojan or virus is by e-mail attachment, then you’re still in Malware Engineering 101, and you’re not among the top students of the class :-)

        My point was, however, that even given the sorry state of sophistication of ransomware today, you are not protected against this malware running with system privileges: if you install it yourself, you might be silly enough to give it system privileges *yourself*! Imagine the malware being installed the same way adware gets installed, piggybacked on another installer; when your OS asks you for system privileges during installation, you give them to the malware installer yourself. If you are “silly” enough to get that kind of delivered malware in the first place, you’re probably also “silly” enough to give it system privileges. So either you’re “smart” enough to avoid this kind of Kindergarten ransomware in the first place, or you’re silly enough that the sandbox of a non-administrator account is pretty much illusory.

        But as I said, I consider ransomware still in Kindergarten if it needs e-mail attachment delivery. There are many Trojans and viruses that *can* propagate using exploits. They only need to be charged with a ransomware *payload*. If I were a ransomware writer, I would:

        1) use a vector such as BadUSB to propagate
        2) hide myself in pre-boot firmware or in early boot stuff
        3) not use bitcoin, but rather an anonymous cryptocurrency, such as bytecoin or dash or the like.

        But then I’m not a ransomware writer :-) I only study it as a hobby…

  8. Yes, the NSA is recording everything that goes over your network, but I’m not sure they’d agree to send it back to you if you needed it as a backup. It’s a pity, because otherwise the taxpayer money lost on them would serve a purpose :-)

  9. What a fascinating article. Thanks, Leo. I thought I was a geek. I have at least half a dozen real external drives. But I have a question. I take many irreplaceable hi-def movies. My grandson is a great 11-year-old hockey player. Each game is about 6GB and I try to never miss a game and especially a tournament which could be in another province, state or country. He plays on 3 teams including the top one in North America. I’m sure you can identify with these being irreplaceable as well as taking lots of space. I share them with the coaches for training purposes and also convert to AVS and post on a private YouTube channel so the kids can watch at home and listen to their dads telling them everything they did wrong ;-)

    In the early years, I used to make duplicate copies on DVDs. A few years ago I bought a Synology NAS and put in 5 x 2TB WD Black drives. Now I have upgraded two of them to 4TB each and use the 2TB ones that were replaced in my Blackbird, which holds 5 drives including a 1TB SSD. When a 2TB drive in the Blackbird is full of movies, I set it aside to keep as another backup. The second, or original, backup of the movies is in the NAS. My regular non-movie stuff on the SSD gets backed up onto an internal drive in the Blackbird as well as onto an external USB drive. For these, I use both Acronis and Norton Ghost, which is probably not necessary. In fact, since you are a much bigger and smarter geek than me, I have been thinking seriously about switching to Macrium. If you think it’s the best, it must be.

    Now my question is regarding the safety of using the NAS as a backup. I bought it just for this purpose, i.e. backup. But someone in the Synology forum told me a year ago that it is no good to use for backup because it is RAID. In this case, I used the Synology version of RAID, but he meant any type of RAID, I think. Geeze, is that true? I know that RAID can fail, but I thought one purpose of it was safety. I don’t think I can un-RAID it at this stage unless I can first back the whole thing up to some other giant device, which I certainly do not have.

    • “Geeze, is that true?” – I think the point he was trying to make is that RAID is not a substitute for backup – in other words, you shouldn’t put the only copy of your data on a RAID and assume that it’s safe (it isn’t: the RAID controller could fail, the device could be killed by a power spike, etc., etc.). Outside of that, there’s absolutely no reason not to use a RAID as a backup target (I use an ioSafe NAS – which is basically a fire- and waterproof Synology – that’s then both synced to Glacier and backed up to an external hard drive).

      • Thanks, Ray,

        That makes total sense to me, and I don’t know why I didn’t see it before. A regular hard drive can fail just like a RAID array. I have had a RAID 0 array fail in my desktop before finally replacing it with the SSD. I have also had a single HD fail as well. I guess the key for backing up irreplaceable data is to have more than one backup, which I do.

        • RAID 0 is generally best avoided. Unlike other RAID levels – which duplicate the data across multiple drives – RAID 0 puts each bit of data onto only a single drive without any duplication. In other words, the failure of any one drive will result in data loss – and the more drives you have in the RAID, the more likely it is that one will fail.

          • I should have explained, Ray: my RAID 0 array, which I set up in 2007, was not for backup but for speed, in the days when SSDs were too expensive. I have 2 x 150GB 10,000rpm Raptor drives for the RAID 0, but they are currently sitting idle because last year I got the 1TB SSD, which is like greased lightning. With RAID 0 you absolutely must back up constantly, for the reasons you mention. The original pair of drives actually failed and had to be replaced under warranty. My desktop is a bit of a hotrod, i.e. an HP Blackbird 002, which has been rebuilt and upgraded since I bought it new in 2007. It has an i7 and 24 GB of RAM. Although it’s wha

  10. I have a small computer “fix anything” part-time business. I push backup hard among my clients, and have had two clients whose hard drives failed this year (and they were desktops, not laptops); neither had a full backup. The one thing I still say that you probably don’t is that I still don’t think it is a good idea to put income tax returns or other sensitive financial data on any cloud storage. With all the hacks we hear of so often (just read Krebs on Security), I think it is better to keep backups of those sensitive things on an external hard drive in a safe deposit box. Also, for my main, “carry everywhere” laptop, I also clone the hard drive so I have it ready to go for an almost instant, “be back in business” fix if the one in the laptop fails. Keep pushing backups; they really save a lot of tears.

    • I completely disagree. It’s much more likely that local data will be lost due to fire, flood, crypto, device failure, theft, etc., etc., etc. than it is that clouded data will be hacked. How many people do you know who’ve lost locally stored data? Likely plenty. How many people do you know who’ve had their cloud data hacked? Likely none. Yeah, I know, it could happen, and cloud companies lose data too – but the incidents are few and far between.

      It’s all about risk – and putting your stuff in the cloud is much less risky than keeping it local.

      • Maybe you missed the words, “safe deposit box”. Having worked in corporate America, the other problem with cloud storage is a couple of bad quarters, followed by a layoff, followed by some deleted files by an administrator who lost his/her job. But an income tax return is just safer in a safe deposit box and not on somebody’s server.

        • I keep mine on Dropbox encrypted with a strong password with several backup copies at home on disk and on paper.

        • “But an income tax return is just safer in a safe deposit box and not on somebody’s server.” – Not if the drive drives, it isn’t.

          “Having worked in corporate America, the other problem with cloud storage is a couple of bad quarters, followed by a layoff, followed by some deleted files by an administrator who lost his/her job.” – Which is why you’d choose a reliable provider – Google, Amazon, Microsoft, etc. – none of which are likely to go belly-up any time soon. And when did you ever hear about a disgruntled administrator deleting files from somebody’s cloud storage? Do you seriously believe that a BOFH could hit a button and delete the PBs of customer data – consisting of trillions of objects – that Amazon stores? Obviously, there are safeguards in place to ensure that it cannot happen.

          • I meant to say, “Not if the drive dies, it isn’t.” A combination of local and cloud storage really is the best option. It covers you against pretty much every eventuality and makes data loss significantly less likely.

          • Indeed. In fact, it’s probably safer. So long as data is properly encrypted, it cannot be improperly accessed – no matter where it’s stored. And putting the data in the cloud mitigates against all the scenarios – fire, flood, theft, power spikes, hardware failure, etc., etc. – that could result in the loss of the local copy of the data. If bandwidth isn’t a problem, keeping an additional copy of your data in the cloud is a no-brainer.

    • Buckman, a US-based global privately owned chemical company, requires all sales reps to carry a second hard drive with them in case one fails.

      • To be clear, I’m not suggesting that local backup should be replaced by cloud backup: they shouldn’t. However, bolstering local backups with cloud backups is the easiest, (usually) least expensive and most sure-fire way for people to minimize the chance of data loss.

  11. Hi Leo,
    Excellent article and I do enjoy your posts.
    I am looking at your desktop arrangement: 512GB of internal storage, an 8TB external hard drive, and another 1TB external.

    Now I have something similar, but much smaller and cannot figure out how to make things work.
    My hard drive is only 250GB and I have 2 x 2TB Buffalo external storage drives.

    Nothing at all is stored on the C: drive except programs.
    I work from the G: drive and merely wish to automate my backups from G: to F:.

    The only advice I’ve been given is to do it manually and regularly, but that is tedious and subject to some forgetfulness.

    Imagine working from your 8TB hard drive and then automating the back-up to another drive. How is it done?

    Many thanks for your anticipated reply.
    Toni Hughes

    • I sometimes use a little 1-line backup batch file (backup.cmd) using robocopy.exe to copy all of the subfolders:
      robocopy "c:\Users\{USER}\Documents" e:\docu /e
      echo Backup Complete
      pause
      Well, actually 3 lines, but 1 does all the work. (Note that there’s no trailing backslash before the closing quote – cmd treats \" as an escaped quote, which garbles robocopy’s arguments.) You can add a line for any folders you need to back up, or back up the whole drive in one go. You can run it on startup or use the Task Scheduler to run it.

    • You could configure a backup program like Macrium or EaseUS to back up G: to F: in an automated fashion quite easily. In fact, that’s probably what I’d do. (Though I’d back up C: as well :-) )

  12. “Any photo I take on my phone is automatically uploaded to the “Camera Roll” folder in OneDrive, and appears on several of my computers running OneDrive within minutes.”
    Can you explain how you link your phone camera app to OneDrive so any photo you take will be automatically uploaded to the Camera Roll in OneDrive?

    I am using Samsung Galaxy S4 default camera app.

    Thanks

  13. I’ve noted this in previous posts … storing a physical backup away from the physical building where your computing equipment resides. I use Macrium Reflect to back up our PCs onto external USB drives, and copy the NAS to one also. About every month or so I unplug these drives, haul them to a steel outbuilding far away from the main house, drop them in a safe, and pull out the other set of drives; take those to the house and plug ’em in, spend a couple minutes fiddling with the configuration (just forcing the correct drive letters), and it’s done. This also forces a new image every time I swap; then the nightly auto-increments kick in.

    I am intrigued by Leo’s use of cloud backup but more so by all the comments; will determine if this is something I’ll add to our backups. For a business it makes sense.

    Like Leo, and most folks I’d assume, photographs are the most precious data we have, and shipping copies off to family and friends is another way to preserve them. Although we’ve digitized all the old photo albums, there’s something about those old photos that sends us back … just like listening to vinyl records with the skips & scratches brings back memories. So we keep the old photo albums, film negatives, and most of the records in the outbuilding also.

    • “For a business it makes sense.” – And for home users too. As I said previously, bolstering local backups with cloud backups is the easiest and least expensive way to reduce the chance of data loss.

      Adding cloud into the mix protects your data from local disasters, your own incompetence (we all have oh-no moments) and from Murphy’s Law-style coincidences. Imagine a lightning strike/power surge toasting your connected equipment, and then discovering that the hard drive in the outbuilding had decided to join the choir invisible. The odds are obviously against it, but things like this do happen.

  14. “Once a night, that OneDrive folder is automatically copied to my photo collection, stored on the 8-terabyte drive.”
    What software do you use, and how do you set it up?
    Thanks

    • It’s a custom script I wrote myself. It happens to be in PHP, but it’s of no general use outside of my own situation. (Yes, to those who follow, I often use PHP as a batch file language. :-) )

    • Here’s an example of a little one-line .cmd file I’ve installed on people’s computers. (It’s actually 3 lines, but you can get away with just the first once you’ve tested it out, if you want it to be transparent. Personally, I like to see that it’s done its job.)

      robocopy "c:\Users\{USER}\Documents" e:\docu /e
      echo Backup Complete
      pause

      You can copy and paste these lines into Notepad, edit the folder names, and save it as backup.cmd. (Don’t leave a trailing backslash before the closing quote – cmd treats \" as an escaped quote and robocopy gets garbled arguments.) Include a line like the first for as many folders as you need. Use the Task Scheduler, or place it in the c:\Users\{User}\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Startup\ folder, which will run it on startup.

  15. I am curious why you trust Dropbox to back up anything, especially your wife’s files. Dropbox has actually lost user data in the past, and although I consider myself someone who gives people second chances, this is downright unacceptable. Dropbox is also a Silicon Valley unicorn – a wildly overvalued company that has never made a profit and will never IPO. They are dead in my and my clients’ eyes.

    • I have doubts about Dropbox’s long-term prospects too, but the company certainly isn’t going to go belly-up any time soon. As for the fact they’ve previously lost data, so has every other online backup/storage company. Amazon, Microsoft, Google, Code42: all have lost customer data at some point. And it’ll happen again too.

      This is why people should have a tiered strategy that combines both cloud and local backups. The cloud backup is a safeguard against the loss of the local data; the local backup is a safeguard against the cloud company losing data or shuffling off the mortal coil completely.

    • I’ve been very happy with their service. I also don’t put all my eggs in one basket (meaning: I back up). I would never trust ANY online service (or computer or whatnot) to have the ONLY copy of my data.

      • I’ve always thought that the old eggs/basket idiom is a little too simplistic, and doesn’t entirely make sense when it comes to backups. Simply dividing your eggs between baskets doesn’t necessarily mean that they’ll be better protected. For example, you’ll likely have a better chance of getting your eggs home safely from the grocery store if you carry them in one solidly constructed and well-padded basket than if you split them between a couple of flimsy old baskets whose handles are about to fall off. The position is similar when it comes to data: having a single copy in AWS is likely safer than having two copies on cheap hard drives, both of which are stored in the home and both of which would be lost in a fire, flood, etc.

        The bottom line is that you should indeed use multiple baskets, but you also need to think about the quality of those baskets.

  16. After your recommendation to take advantage of the IDrive One HDD + free cloud storage, I placed my order. The HDD arrived today. When I connected it by USB and prepared to do an image backup, a warning came up that the HDD is FAT32, and limited to files smaller than 4GB. Should I reformat to NTFS (possibly wiping out the IDrive software)? Or is FAT32 of no concern?
    I also use iCloud with zero problems.
    I use Win10.

    • I’ve not recommended iDrive (I think Bob Rankin does). I would back up the contents of the drive, reformat it to NTFS, restore the files to the drive, and carry on. :-)

