Yes, I’ve tried multiple approaches in the past to get an offsite backup up and running. I started with rsync, then moved on to obnam and duplicity with encryption; both worked well with remote SSH hosts.
But times are changing and I wanted to use a cloud-storage provider instead of a dedicated remote host for my backups – so I started with duplicity/duply to get AWS S3 access running, and that worked quite well.
Things I like:
- growing user base on GitHub
- it’s encrypted (there is a very interesting read on this)
- uses deduplication
- usage is straightforward, no hard-to-read config files with dozens of options
- no distinction between full and incremental backups – every backup is a “snapshot”
- they promise not to change the repository format in the future
- access to cloud providers (I tested AWS S3 and Google Cloud Storage) was very easy to set up
- it uses a local cache to speed things up (but does not depend on it)
- you can move the backups to a different location using standard tools and they remain usable
- local backups (e.g. to USB drives) work the same way: every destination is treated as “remote” and is encrypted
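The last two points can be sketched with nothing but standard tools. This is a minimal demonstration, not a real backup run: the repository here is a fake directory created just for the sketch (in real use it would be your encrypted repository, and the destination a mounted USB drive such as `/mnt/usb`), and temp dirs keep it runnable anywhere. The point is that the repository is an ordinary directory tree, so copying it with `cp` (or `rsync`) yields an equally usable copy.

```shell
#!/bin/sh
# Sketch only: REPO stands in for an encrypted backup repository,
# DEST for a USB mount point like /mnt/usb. Paths are hypothetical.
set -e
WORK="$(mktemp -d)"
REPO="$WORK/repo"        # stands in for the real repository
DEST="$WORK/usb-copy"    # stands in for the USB drive

# Fake a minimal repository layout for the demonstration.
mkdir -p "$REPO/data" "$DEST"
printf 'encrypted-chunk\n' > "$REPO/data/chunk-0001"

# Copy with a standard tool; -a preserves timestamps and modes.
cp -a "$REPO/." "$DEST/"

# The copy is byte-identical, so it is just as usable as the original.
diff -r "$REPO" "$DEST" && echo "copies identical"
```

Because nothing about the destination is special, the same copy step works for any target the tool can reach, local or remote.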
The documentation is good, so I won’t share usage details here. But I started backing up to Google Cloud Storage last week and it feels great…