# Secure, Free, Incremental and Instant Backups for Linux
I use two different methods to back up the data on my laptop. The code that I am
working on at any one time is backed up off-site as soon as it changes.
Additionally, it sits under revision control. Alongside this, my entire
filesystem is incrementally backed up to network attached storage on an hourly
basis. Both solutions use encryption, so only I am able to recover data from
those backups. What follows is a description of the technology and services
that I use to achieve this.
**Full System Backups**
I like to back up the entire filesystem on my laptop. You might think that
backing up directories like /sbin/ is a waste of time, but I prefer to err on
the side of caution and keep everything. My backups are stored on a NAS, which I
mount on my laptop using CIFS. You can use the same method to back up to a USB
attached hard drive, to a machine with SSH/FTP access, or even to
[Amazon S3](https://aws.amazon.com/s3/). To achieve this, I use some free
software called [Duplicity](https://duplicity.nongnu.org/).
I use full disk encryption on my laptop, so it would be silly not to use it for
the backups as well. Duplicity tars up your files, then utilises
[GnuPG's](https://gnupg.org/) symmetric option to encrypt the tarball with a
passphrase. Only files which have changed since the previous backup are backed
up.
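Before scheduling anything, it is worth taking the first full backup by hand and checking that it can be read back. A minimal sketch, assuming the NAS is already mounted at backupdir and using the same passphrase as below:

```bash
# Duplicity reads the encryption passphrase from this variable.
export PASSPHRASE="theEncryptionPassphrase"

# One-off full backup of /, staying on the root filesystem.
duplicity full --exclude-other-filesystems / file://backupdir

# Compare the backup against the live filesystem to confirm it is readable.
duplicity verify --exclude-other-filesystems file://backupdir /

unset PASSPHRASE
```

This matters because `duplicity incr` will refuse to run if no backup chain exists yet, so the hourly cron job needs a full backup to build on.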
I make sure that the NAS is automatically mounted on my laptop, and add the
following to my crontab.
```text
PASSPHRASE="theEncryptionPassphrase"
30 * * * * duplicity incr --full-if-older-than 4W --exclude-other-filesystems / file://backupdir
29 * * * * duplicity cleanup --force file://backupdir
28 * * * * duplicity remove-all-but-n-full 2 --force file://backupdir
```
If there is an environment variable named PASSPHRASE, duplicity will use it for
the encryption. In this example configuration, the first job runs an
incremental backup once an hour, and every 4 weeks a new full system backup is
performed. The second and third jobs clean up the backup directory. At any one
time, I keep a minimum of two full system backups and their increments. The
--exclude-other-filesystems option skips other filesystems like /proc/ and
/dev/, as well as the backup mount itself. You may want to create additional
jobs for other partitions like /home/, depending on your partitioning scheme.
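To see what the cron jobs have actually produced, duplicity can report on the archive without restoring anything. A quick sketch, again assuming the backupdir target from the crontab:

```bash
export PASSPHRASE="theEncryptionPassphrase"

# Summarise the backup chains: when each full backup and its increments ran.
duplicity collection-status file://backupdir

# List every file present in the most recent backup.
duplicity list-current-files file://backupdir
```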
Say I wanted to restore a copy of /home/foo, exactly as it looked 10 days ago.
I'd run a command similar to this:
```text
duplicity -t 10D --file-to-restore home/foo file://backupdir /home/foo.restored
```
**Instant Code Backups**
I value the code that I write more highly than the rest of the data on my
laptop. I use [Git](https://git-scm.com/) for revision control,
[Dropbox](https://www.dropbox.com/) for automatic synchronisation of code
off-site (to Amazon S3), and [EncFS](https://vgough.github.io/encfs/) for
encrypting the code before it leaves my system.
All of my development is done in project-specific directories beneath
/home/mike/git/, which is mounted from /home/mike/Dropbox/git.encfs/ using the
encfs command:
```bash
encfs /home/mike/Dropbox/git.encfs /home/mike/git
```
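On the very first run, encfs will notice that /home/mike/Dropbox/git.encfs does not yet contain an encrypted filesystem and will walk you through creating one, prompting for a password. When I'm done working, the decrypted view can be detached like any other FUSE mount:

```bash
# Unmount the decrypted view; only the ciphertext in Dropbox remains on disk.
fusermount -u /home/mike/git
```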
Now, whenever I save anything in /home/mike/git/, an encrypted version in
/home/mike/Dropbox/git.encfs/ is created or updated, and that change is
immediately synced to the cloud. If Dropbox is compromised, I don't have to
worry about my code being exposed, as it is encrypted.
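Git needs nothing special on top of this: each project is just an ordinary repository that happens to live beneath the decrypted mount. A sketch, using a hypothetical project called demo:

```bash
# "demo" is a hypothetical project name; $HOME/git stands in for the
# decrypted EncFS mount point (/home/mike/git in my setup).
GIT_ROOT="$HOME/git"
mkdir -p "$GIT_ROOT/demo"
cd "$GIT_ROOT/demo"
git init   # an ordinary repository, nothing EncFS-specific
```

From Git's point of view the encryption is invisible; only the ciphertext under Dropbox/git.encfs ever leaves the machine.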