An easy-to-use backup tool designed for servers.
The server is the bacup service: it runs as a daemon and executes the backups of the configured services on the remotes.
The goal of bacup is to make the configuration straightforward: a single file where everything is defined in a very simple way.
The configuration takes 3 steps: configure the remotes, configure the services, and schedule the backups.
When configuring the backups, the `when` field accepts configuration strings in the format:

- `daily $hh:$mm`, e.g. `daily 15:30`
- `weekly $day $hh:$mm`, e.g. `weekly mon 12:23` or `weekly monday 12:23`. The `weekly` keyword can be omitted.
- `monthly $day $hh:$mm`, e.g. `monthly 1 00:30`

Standard cron strings are also accepted (see the `when = "5 0 * 8 *"` example below).

NOTE: The time is ALWAYS in the UTC timezone.
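For instance, here is a minimal sketch of a backup entry using the short weekly form (the entry and remote names are illustrative):

```toml
# Hypothetical entry: "mon 12:23" is equivalent to "weekly mon 12:23"
[backup.example_weekly]
what = "folders.service1"
where = "ssh.remote_host1"
when = "mon 12:23"
remote_path = "~/backups/example/"
compress = true
```

The complete configuration file puts remotes, services, and backups together: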
```toml
[aws]

[aws.bucket_name]
region = ""      # "eu-west-3"
access_key = ""
secret_key = ""

[ssh]

[ssh.remote_host1]
host = ""        # example.com
port = ""        # 22
username = ""    # myname
private_key = "" # ~/.ssh/id_rsa

[localhost]
# Like a local copy-paste. The underlying infrastructure manages
# the remote part (if any). Below, 2 examples.

[localhost.samba]
path = "" # local path where samba is mounted
[localhost.disk2]
path = "" # local path where the second disk of the machine is mounted
[git]

[git.remote_repo]
host = ""        # github.com
port = ""        # 22
username = ""    # git
private_key = "" # ~/.ssh/id_rsa
repository = ""  # "galeone/bacup"
branch = ""      # master

[postgres]

[postgres.service1]
username = ""
db_name = ""
host = ""
port = ""

[folders]

[folders.service1]
pattern = ""

[backup]

# Compress the DB dump and upload it to aws
# every day at 01:00 UTC
[backup.service1_db_compress]
what = "postgres.service1"
where = "aws.bucket_name"
when = "daily 01:00"
remote_path = "/service1/database/"
compress = true
# Dump the DB and upload it to aws (no compression)
# every first day of the month
[backup.service1_db]
what = "postgres.service1"
where = "aws.bucket_name"
when = "monthly 1 00:00"
remote_path = "/service1/database/"
compress = false
# Archive the files of service 1 and upload them to
# the ssh.remote_host1 in the remote ~/backups/service1 folder.
# Every Friday at 05:00 UTC
[backup.service1_source_compress]
what = "folders.service1"
where = "ssh.remote_host1"
when = "weekly friday 05:00"
remote_path = "~/backups/service1"
compress = true
# Incrementally sync folders.service1 with the remote host
# using rsync (authenticated through ssh)
# At 00:05 in August
[backup.service1_source]
what = "folders.service1"
where = "ssh.remote_host1"
when = "5 0 * 8 *"
remote_path = "~/backups/service1_incremental/"
compress = false # no compression = incremental sync
# Compress the DB dump and copy it to the localhost "remote"
# where, for example, samba is mounted
# every day at 01:00 UTC
[backup.service1_db_on_samba]
what = "postgres.service1"
where = "localhost.samba"
when = "daily 01:00"
remote_path = "/path/inside/the/samba/location"
compress = true
# Sync the files of service 1 to the root of the remote
# git repository, every day at 15:30 UTC
[backup.service1_source_git]
what = "folders.service1"
where = "git.remote_repo"
when = "daily 15:30"
remote_path = "/" # the root of the repo
compress = false
```
When `compress = true`, the file/folder is compressed using gzip and the file is archived in the desired remote location with the name format:

```
YYYY-MM-DD-hh:mm-filename.gz # or .tar.gz if filename is an archive
```
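For example, a dump of the service1 database taken at 01:00 UTC on 2021-02-11 would land on the remote as `2021-02-11-01:00-service1.sql.gz` (the dump file name is illustrative).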
You need to build the server binary:

```bash
cargo build # or cargo build --release
```
And then you have 2 options:

1. Put the `config.toml` file next to the executable.
2. Use the `CONF_FILE` env variable to configure the location of the configuration file (see the example below).
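For instance, a minimal sketch of launching the daemon with `CONF_FILE`, assuming a release build and that the produced binary is named `bacup` (binary name and config path are assumptions):

```bash
# Assumption: `cargo build --release` produced ./target/release/bacup
CONF_FILE=/etc/bacup/config.toml ./target/release/bacup
```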
Configuring the remotes is straightforward. Every remote has a different way of getting the access credentials; here we try to share some useful references.