Remote Backup

This page could fit in a more general section because it does contain some good general information.  But because my ultimate goal and focus here is to back up a ClearOS server without paying for the service from ClearSDN, I'm leaving it here!

I've been steadily moving my personal work into the cloud.  That typically means Google, but there are a number of services I'm keeping an eye on.  I'm most interested in LiveDrive because of the value they provide: at $120/year, you get unlimited storage and unlimited clients.  There are three reasons I can't recommend the service, though:
  1. It's not totally reliable.  Sharing links doesn't always work.  But this isn't why I'm using the service, so I'm happy to overlook this issue.
  2. More important, data is only encrypted in transit.  It is very important to notice that data is stored in the clear on their servers.
  3. Perhaps equally important, their TOS prohibits automation of their service.  I assume this includes using SyncToy to back up files to a LiveDrive.  I'm overlooking that restriction for my personal use, but wouldn't overlook it for business applications.
Otherwise, I've been relatively satisfied with their product.

Here's a quick summary of a few services I'm watching (I'll try to come back and update these; I was turned off of several services because their licensing disallowed backing up network shares):
 Google Apps (7G free?): must use web interface
 DropBox (2G free): looks professional; supports Linux
 SkyDrive / Windows Live (25G free): must use web interface
 iDrive (2G free): must use cheesy application
 box.net (1G free): expensive; watching it because it seems popular and looks professional
 Mozy Home (2G free): $55/year, but also $5/pc/month; hidden pricing always turns me off!
 Carbonite (free trial): $55/year "unlimited"; I forget what I disliked about this one
 LiveDrive (free trial): $120/year; data stored in the clear, and sharing often doesn't work; the software integrates with Windows, creating a virtual hard drive (allowing automation of backups), though it should be noted that automation is against the TOS, in section 6.3
In short, none of these services fills my need for an automated offsite backup at a reasonable price.  LiveDrive would, but their software is Windows-only, which means I can't automate a ClearOS backup directly to a LiveDrive (setting aside the fact that this practice is against their TOS).  Therefore I'm stuck rolling my own scripts to push backups offsite for the time being.

The basic theory for my scripts:
  1. tar/bzip2 a directory to make it more manageable.  This gives me one file to work with for each directory I want to backup.  It also compresses the backup size.
  2. Encrypt the archive(s).  I had previously done this with GPG, but managing the keys was a pain.  I'm now using OpenSSL and only have to maintain a password file.
  3. Use rsync to push the encrypted files offsite.  My understanding of rsync is that it gives me better reliability that the data actually makes it offsite, which is important because we're typically talking about pushing tens of gigs across home-user-grade bandwidth (expect upload throttling around 30K per connection).
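The three steps above can be sketched in shell.  This is a hedged, self-contained illustration using throwaway paths; on the real box the sources are directories like /var/flexshare/shares, the staging area is /root/bak, and user@remotehost is a placeholder:

```shell
# Sketch of the tar -> encrypt -> rsync pipeline, using temp dirs so it is
# safe to run anywhere. All names here are illustrative.
SRC=$(mktemp -d); BAK=$(mktemp -d)
echo "example data" > "$SRC/file.txt"
echo "a-long-random-passphrase" > "$BAK/pass.des3"  # stand-in for /root/pass.des3

# 1. tar/bzip2 the directory into one compressed archive:
tar -cjpf "$BAK/archive.tar.bz2" --directory "$(dirname "$SRC")" "$(basename "$SRC")"

# 2. Encrypt the archive, reading the passphrase from a file:
openssl des3 -salt -in "$BAK/archive.tar.bz2" \
        -out "$BAK/archive.des3" -pass "file:$BAK/pass.des3"

# 3. Push the encrypted file offsite (placeholder destination, left commented):
# rsync -v --progress -e ssh "$BAK/archive.des3" user@remotehost:/path/to/bak/
ls -l "$BAK/archive.des3"
```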
For the rsync step to work in an automated system, you'll need to set up no-password SSH.  This doesn't mean SSH is insecure without a password: it uses public-key authentication, so your SSH host is kept as safe as your private key.  I also recommend using an unprivileged user account on your SSH host.  My backup script runs as root, so if you make changes to the script, I suggest you be careful.  To get no-password SSH set up, as root run: ssh-keygen -t rsa and follow the prompts (without entering a passphrase).  This will create two keys in /root/.ssh/.  You just need to append the contents of /root/.ssh/id_rsa.pub to [/remote/host/limited/user/home/].ssh/authorized_keys on the remote host.
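A condensed sketch of that setup (backupuser@remotehost is a placeholder; the key pair is generated into a temp dir here so the sketch is safe to run, where the real script would use /root/.ssh):

```shell
# No-password SSH setup sketch. backupuser@remotehost is a placeholder;
# the key is generated into a temp dir so this can run without touching
# /root/.ssh, which is where the real key would live.
KEYDIR=$(mktemp -d)
ssh-keygen -t rsa -N "" -q -f "$KEYDIR/id_rsa"   # -N "" = empty passphrase

# Install the public key for the unprivileged backup account (not run here):
# ssh-copy-id -i "$KEYDIR/id_rsa.pub" backupuser@remotehost
# ...or append it by hand on the remote side:
# cat "$KEYDIR/id_rsa.pub" | ssh backupuser@remotehost \
#     'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys'
ls "$KEYDIR"
```

After the key is installed, ssh backupuser@remotehost should log in without prompting for a password, which is what lets the script run unattended.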

If you're backing up to a remote ClearOS box, you'll need to enable SSH access for the remote user, e.g. by granting the account shell access in webconfig (or giving it a real login shell with usermod -s /bin/bash).

This script will not run in its current condition!  It is intended as a guide, and has been edited for my security (note the bracketed placeholders).
Here's the backup script:
#!/usr/bin/perl
use strict;
my $DRYRUN = 0;
#Build a YYYY-MM-DD date stamp for today's backup directory.
my($sec,$min,$hour,$mday,$mon,$year,$wday,$yday,$isdst)=localtime(time);
$year += 1900;                      #localtime() returns years since 1900
$mon  = sprintf("%02d", $mon + 1);  #localtime() months are 0-based
$mday = sprintf("%02d", $mday);

my $destination = "/root/bak/$year-$mon-$mday";
#TODO: improve fault recovery.
if($DRYRUN){
  print "Attempting to make directory: $destination\n";
}else{
  die("Died trying to create $destination\n  -- $!") unless mkdir $destination;
}

runBackup("/var/flexshare/shares", $destination, "flexShares");
runBackup("/home", $destination, "homes");
runBackup("/etc", $destination, "etc");
runBackup("/var/samba/", $destination, "samba");
runBackup("/var/lib/mysql", $destination, "mysql");

#Now consolidate the archives above.
runBackup("/root/bak/$year-$mon-$mday", "/root/bak", "$year-$mon-$mday");
#Now encrypt the archive. This portion will take the longest time.
#Note that /root/pass.des3 must exist, and should include a complex passkey!
if($DRYRUN){
    print "Home stretch: encrypt, clean, post!\n";
}else{
    my $command = "openssl des3 -salt -in /root/bak/$year-$mon-$mday.tar " .
                  "-out /root/bak/$year-$mon-$mday.des3 -pass file:/root/pass.des3";

    system $command;
    #Do a little housekeeping:
    system "rm -f /root/bak/$year-$mon-$mday.tar" if(-e "/root/bak/$year-$mon-$mday.tar");
    system "rm -rf /root/bak/$year-$mon-$mday" if(-d "/root/bak/$year-$mon-$mday");

    #We're all done, and ready for offsite!
    my $postCMD  = qq{rsync -v --progress -e /usr/bin/ssh };
    $postCMD    .= qq{/root/bak/$year-$mon-$mday.des3 [remoteuser]\@[hostname]:[/path/to/storeBak/]};
    system($postCMD);
}
sub runBackup{
  my($argBackupDir, $argDestinationDir, $argBackupName) = @_;
  my @temp = split("/", $argBackupDir);
  $argBackupDir = pop(@temp);
  my $working_dir = join("/", @temp);
  $working_dir = "/" unless $working_dir;
  
  #TODO: add fault recovery.
  my $tar_command = "tar -cpf $argDestinationDir/$argBackupName.tar ".
                    "--recursion --directory $working_dir $argBackupDir";
  if($DRYRUN){
    print "Attempting backup: $tar_command\n";
  }else{
    #Use system() so the exit status is checked; backticks discarded it.
    system($tar_command) == 0 or warn "tar failed for $argBackupName: $?\n";
  }
}
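One thing the script doesn't cover is getting the data back.  Restoring just reverses the steps: decrypt the .des3 with the same password file, then untar (the day's wrapper archive contains the per-directory archives).  A hedged, self-contained sketch with throwaway paths; on the real box the input would be the dated .des3 under /root/bak and the pass file /root/pass.des3:

```shell
# Restore sketch: build a small encrypted archive, then recover it.
# All paths here are throwaway stand-ins for /root/bak/<date>.des3 etc.
WORK=$(mktemp -d); cd "$WORK"
echo "hello" > data.txt
echo "a-long-random-passphrase" > pass.des3
tar -cpf day.tar data.txt
openssl des3 -salt -in day.tar -out day.des3 -pass file:pass.des3
rm day.tar data.txt

# The actual restore: decrypt, then unpack straight from the pipe.
openssl des3 -d -in day.des3 -pass file:pass.des3 | tar -xpf -
cat data.txt   # prints "hello"
```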