
Always Have Good Backups

19:47 Tue 19 Jul 2011. Updated: 17:39 23 Jul 2011

At time of writing, the server hosting my blog and email is down, and the possibility of data recovery is uncertain. This makes me feel a little dumb, as I don’t have everything backed up. Not good, especially since I’m highly aware of the need for backups. But this server is where I generally back things up to, and keeping backups of it in turn is something I was once better at but have since lost the habit of. So if the data isn’t recoverable, I’m missing quite a lot, and getting it back would be, at the least, a significant headache.

How should it have been set up? At the least, I should have been using rsync to copy absolutely everything from the server to other machines. That means a certain amount of overlap, since I’d be copying down things I was also backing up there, but that’s not worth worrying about. It would give me recovery options whether the server or the client machines had problems, and it’s the rational way to set things up.
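Something like this would do it; the following is a minimal sketch, with the hostname and paths as placeholders for the real ones:

    # Mirror the server's home directories to a local backup directory.
    # -a preserves permissions, ownership, times, and symlinks;
    # --delete removes local files that were deleted on the server,
    # keeping the mirror exact.
    rsync -avz --delete server.example.com:/home/ ~/backups/server/home/

    # /etc and anything else worth keeping gets the same treatment:
    rsync -avz --delete server.example.com:/etc/ ~/backups/server/etc/

Run from cron, that happens without my having to remember it. The one caveat is that --delete means an accidental deletion on the server propagates to the mirror on the next run, which is part of why version control, below, is the better answer.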

Better, though, would probably be to keep more or less everything in a distributed version control repository of some kind, and to synchronize repos across all machines whenever any change is made. I’m hoping it’s not too difficult to move to that setup (away from my current situation, which uses Subversion, with a master server), and I’ll probably try it with git, if only because that’s the dominant distributed version control system at the moment.
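Roughly, and with the URL, path, and machine names as stand-ins for whatever I actually end up using, the move might look like this:

    # One-time conversion of the existing Subversion history:
    git svn clone https://svn.example.com/repo ~/everything

    # On each other machine, clone it; every clone carries the full history:
    git clone firstmachine.example.com:everything

    # Then, after any change, on whichever machine made it:
    git add -A
    git commit -m "describe the change"
    git push origin master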

In addition, I’ll need to be more diligent about exporting database contents, particularly for things like WordPress, and those exports will also need to go into version control.
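For WordPress that would be something like the following, with the database name and user as placeholders; --skip-extended-insert writes one row per line, which keeps the dumps diff-friendly once they’re under version control:

    # Dump the WordPress database to a text file that can be committed:
    mysqldump --user=wp --password --skip-extended-insert wordpress \
        > ~/everything/backups/wordpress.sql

    # Then it goes into the repository with everything else:
    cd ~/everything
    git add backups/wordpress.sql
    git commit -m "WordPress database dump"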

Here’s hoping that I get a chance to do that with the data whose recovery is currently uncertain…


One Response to “Always Have Good Backups”

  1. Seth Milliken Says:

    For the vcs stuff, git is a great choice for a much better reason than being popular: every clone is a complete copy of the entire repository. To restore the server repository after a failure that resulted in data loss, `git push` is all you’d need. Having migrated all of my repositories from svn and hg just before the drive failure, I was not worried at all since at most I would have lost a few unworkable-specific configuration files that are not yet checked in anywhere.
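    For instance, rebuilding the server’s copy from any clone would be something like this (host and path made up):

        # Recreate an empty bare repository on the rebuilt server:
        ssh user@server.example.com 'git init --bare everything.git'

        # Push the complete local history, all branches and tags, back up:
        git push --mirror user@server.example.com:everything.git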

    Synching repositories across multiple machines should be fairly easy with git, too, since you can set up multiple remotes for a working copy. Or you could set up a post-receive hook.
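    E.g. (hostnames made up):

        # Each machine is just another remote of the same repository:
        git remote add laptop user@laptop.example.com:everything.git
        git remote add desktop user@desktop.example.com:everything.git

        # After committing, push to each:
        git push laptop master
        git push desktop master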

    On the other hand, large binary files are not handled terribly gracefully in git natively. Since git stores each version in its entirety, your repository can quickly get quite large.
