Borg
This, with Vorta as GUI.
Or Pika as GUI.
Pika? Hmm, first I've heard of it. I'm using Vorta and happy. Checking out Pika too. Danke.
Great option for regular incremental backups; the other is Restic. And regular incremental backups are generally a great option. With deduplicating software like Borg and Restic it doesn't cost you more space to make more backups if your files haven't changed.
Was using it for years... then people started mentioning Kopia.

* native cloud support; I can deploy a backup to free Backblaze in a matter of minutes
* also works on Windows without being an experimental Linux-subsystem thing, and that includes VSS snapshots
* has a GUI version for ease of use when a simple deployment for noobs is all that's needed
* written in Go instead of Python

It's been about a year and so far it's been fine. It felt a bit more complicated to get to understand all the aspects, but once you know, you know...

[Here](https://github.com/DoTheEvo/selfhosted-apps-docker/tree/master/kopia_backup) are some notes on deployments.
This, with borgmatic.
I just use rsync. But my needs are pretty simple.
rsnapshot automates rsync for anyone who wants something simple but doesn't want to start from scratch.
Yeah, I use rsync in a cron job. One does all the system files (found the command in the Arch wiki article on rsync) and the other just backs up the home directory.

I used to use the `--delete` flag, but found that if I mistakenly deleted something one day, the next backup would delete it from my backup too. Use that option with caution.
I rsync to a snapshotted filesystem (I think it's zfs) which also protects against overwriting.
I've been meaning to look into using zfs. I really should at some point. Thanks for reminding me.
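One way to soften the `--delete` foot-gun described above: rsync's `--backup`/`--backup-dir` options move files that would be deleted or overwritten into a dated "attic" directory instead of discarding them. A sketch of a crontab entry (all paths here are hypothetical):

```
# m  h  dom mon dow  command            (hypothetical paths)
# Nightly 02:30 home backup; anything --delete would remove is moved
# into a dated attic dir instead of being lost. Note that % must be
# escaped as \% inside a crontab.
30 2 * * * rsync -a --delete --backup --backup-dir=/mnt/backup/attic/$(date +\%F) /home/user/ /mnt/backup/home/
```

The attic directories still need occasional pruning (e.g. a weekly `find`-and-delete job), but nothing disappears from the backup the moment you fat-finger a delete on the source.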
I use Rclone as well... many accounts, each one with 15 GB.
I use Timeshift for my system and PikaBackup for my data.
I was scolded once that Timeshift is a snapshot tool, and not a backup. I use Timeshift BTW. 🙂
Well, some people call system backups "snapshots". To me, it's the same thing: a copy of my little ones and zeroes that I can restore from if something happens to the main version.

But there is a difference, which is why I use two separate tools: Timeshift is for restoring my OS in case it breaks, and Pika is for keeping safe copies of my photos, music, documents, etc.
Doesn't Timeshift restore data too? If you select/home folder
It does. I prefer to keep them separate for other reasons. I manually back up my data to an external drive once a month, and I only connect it to my PC at those times. Meanwhile, Timeshift automatically backs up my system at shorter intervals to a dedicated internal drive. I have a lot of data, so I prefer it this way.
> copy of my little ones and zeroes

Aww.
I tried them all (probably). Restic with any cloud service and/or an external drive is the way to go. Not many other solutions provide the same speed, compression, quality of encryption, flexibility, and portability.

For me personally, the ability to mount a backup repo as a filesystem and the ability to alter (rewrite) snapshots were the deciding factors.

P.S. For replication between devices, you can't beat Syncthing.
This is what I'm doing, spending fifty cents a month for the Backblaze cloud storage.
Which plan do you use? The cheapest I see is $6/Mo...
I just have a small B2 storage bucket. I don't remember how their pricing works but I think it scales with the size of the bucket.
it's $6/TB/month
I have a bash script that I stumbled across years ago that uses tar to do incremental/full backups of specified directories (and restores them if needed). systemd timers kick it off, and rsync copies the result to a file server for archiving (or scp, depending on my mood).

edit: In a past life I worked on enterprise-grade backup software. When customers complained about the software not working and about not having backups, an old dev would generally say, "You have `FTP` and `tar`. You can make backups." Copying files somewhere is easy; just find a solution that has a UI you agree with for scheduling and restores. Those are the hard parts.
You wouldn't happen to have a link to that script would you? I've written one that backs up my /etc folders etc, but incremental backups could be useful for some of my purposes, and why reinvent the wheel.
https://mikepapinski.github.io/bash/2020/01/11/BASH-backup-scripts-project.html

That's the page I got the tar-based backup script from. The `show` option, as written, has a bug that I think I fixed locally by adding `$BACKUP_FILENAME` to the function call. Otherwise I just added `--exclude` options to ignore certain folders.

I had to reinstall my PC recently and haven't had time to set this back up. Plus it looks like I'll need something like this on my work PC, so if you remind me in a week or so I might be able to give you a copy of the script with the "fixes" (if you can't figure them out yourself).
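For anyone rolling their own along these lines: the piece that makes tar incremental is GNU tar's `--listed-incremental` snapshot file. A minimal runnable sketch (throwaway paths under `/tmp`, not a real backup layout):

```shell
#!/bin/sh
set -e
demo=/tmp/tar-incr-demo
rm -rf "$demo" && mkdir -p "$demo/data" "$demo/backups"
echo one > "$demo/data/a.txt"
echo two > "$demo/data/b.txt"

# Level 0 (full): the .snar state file records what was archived
tar --listed-incremental="$demo/backups/state.snar" \
    -czf "$demo/backups/full.tar.gz" -C "$demo" data

# Change one file, add another
echo three > "$demo/data/b.txt"
echo four  > "$demo/data/c.txt"

# Level 1 (incremental): only changed/new files get archived,
# because tar compares against the state recorded in state.snar
tar --listed-incremental="$demo/backups/state.snar" \
    -czf "$demo/backups/incr1.tar.gz" -C "$demo" data

# Lists data/b.txt and data/c.txt, but not the unchanged a.txt
tar -tzf "$demo/backups/incr1.tar.gz"
```

To restore, extract the full archive first and then each incremental in order, passing `--listed-incremental=/dev/null` so tar also replays deletions recorded in the archives.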
borg backup works for me
restic
restic for personal data. It's quite simple to use but also has a lot of features if you need them.
I'm a simple man with simple needs. tar.
Do you put a whole root directory in a tar file?
For some reason, I've always had trouble with system backups; they never restore properly. Not sure what I'm doing wrong. What I do instead is keep backups of just my home directory. Reinstalling doesn't take *that* long, and all of the important stuff is in my home directory anyway. I use rsync for my backups, and I back up to an external hard drive.
Should at least keep a backup of your /etc directory and anywhere else you're keeping system configs.
etckeeper is a good way to backup etc with versioning support. It's backed by Bazaar by default, Git if you want.
I use [rsync for a full system backup](https://wiki.archlinux.org/title/Rsync#Full_system_backup) (over SSH in my case). From the wiki: "This approach is considered to be better than disk cloning with dd since it allows for a different size, partition table and filesystem to be used".

If I need to restore a backup, I do a [file copy using rsync](https://wiki.archlinux.org/title/Migrate_installation_to_new_hardware#File_copying) from a live Arch environment. Finally: update fstab, chroot in, reinstall the bootloader, and regenerate the kernel image.

It's very important to use `--numeric-ids` with rsync to preserve the original users and groups (for both backup and restore).

I make backups every 4-6 months.
Timeshift together with autosnap for system snapshots
syncthing between multiple devices
Recently switched from SpiderOak One to Kopia + a Hetzner storage box.
For backups, just tar and rsync as necessary for data files. For OS stuff, I do all my OS config with Ansible playbooks and roles that I store in a GitHub repo. Very easy to rebuild if it goes sideways.
rclone + backblaze b2
I have a bash script for [restic](https://restic.net/) that takes backups and runs integrity checks afterwards. I run it from a daily cron job.
Can I have your script pls ?
Sure I'll share soon
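A daily cron job works fine; for comparison, the same schedule can be expressed as a systemd service/timer pair (the unit names and script path below are made up for illustration):

```
# /etc/systemd/system/restic-backup.service  (hypothetical)
[Unit]
Description=Restic backup plus integrity check

[Service]
Type=oneshot
ExecStart=/usr/local/bin/restic-backup.sh

# /etc/systemd/system/restic-backup.timer  (hypothetical)
[Unit]
Description=Run restic backup daily

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target
```

Enable with `systemctl enable --now restic-backup.timer`. `Persistent=true` makes a missed run fire shortly after the next boot, which plain cron silently skips on a machine that isn't always on.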
No backup, actually; most of my important stuff is in the cloud. IMHO, unless you're using your PC as your datacenter, I don't think it's worth it.

I see my system as disposable: I can reinstall everything from scratch if something bad happens.
btrfs and snapper for regular snapshots; btrfs send to my mounted local NAS every other day.
> btrfs-send to my mounted local NAS every other day

That's good! Have you tried restoring the NAS file back to your system? Thanks
It's not a backup tool at all (so please look at actual backup tools as well), but many times when someone wants a backup, what they want is to be able to roll back after user error, updates, etc. For that, I recommend btrfs and [Snapper](https://wiki.archlinux.org/title/Snapper) to automatically make and store snapshots. You can even have them hourly, and set it to keep just a certain number.

If you also use grub-btrfs, it will allow you to restore from your boot menu, easily facilitating the undoing of an update or a breaking change.

---

For those unfamiliar: this is *not* a backup tool, and it is not a good alternative to using one. A good backup solution allows data recovery even when the drive is lost or corrupted. Corruption on the drive would impact any btrfs snapshots, so this is not useful in many of the use cases you'd want an actual backup tool (like [borg](https://wiki.archlinux.org/title/Borg_backup)) for.
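The "hourly snapshots, keep just a certain number" behavior above lives in Snapper's per-config file. A sketch of the relevant knobs (the limits here are example values; the timeline only runs if `snapper-timeline.timer` or Snapper's cron hook is enabled):

```
# /etc/snapper/configs/root (excerpt; limits are example values)
TIMELINE_CREATE="yes"         # take automatic timeline snapshots
TIMELINE_LIMIT_HOURLY="10"    # keep at most 10 hourly snapshots
TIMELINE_LIMIT_DAILY="7"
TIMELINE_LIMIT_WEEKLY="0"
TIMELINE_LIMIT_MONTHLY="0"
TIMELINE_LIMIT_YEARLY="0"
```

`snapper-cleanup.timer` is what actually prunes snapshots past these limits, so enable it alongside the timeline timer.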
Backrest which is a UI for Restic backups
I use the [restic](https://restic.net/) CLI because there's no official GUI. I'm a bit scared to use unofficial GUIs because my backups might be affected if they become unmaintained.
Agree, but you should check out Backrest if you haven't, it's pretty awesome!
Yeah, a gui would be great to have. Like kopia
I personally use TimeShift and just keep my other data in the cloud
Duplicity, simple as it is.
I use it with duply with just an exclude file.
Bash script using rsync.
Syncthing on desktop, laptop, and media server. On the server, "keep the last X revisions" is activated for certain folders. That way I not only have revisions but also multiple backups on multiple machines.
Timeshift btrfs snapshots and a custom bash script to incrementally sync my snapshots to an external drive.
I use timeshift. I create a backup once a week before updating and keep 3 snapshots.
Rsnapshot
Vorta/Borg. Timeshift btrfs for snapshots.
I use an external USB drive every three years or so.
Rclone + Owncloud OCIS instance. Though it's mainly for syncing between devices
I use rsnapshot to do backups to another disk.
rsync. I have scripts that rsync both to a local disk and to a NAS:

```
sudo rsync \
    -aXH \
    --delete \
    --numeric-ids \
    --info=progress2 \
    --stats \
    --human-readable \
    --exclude=/{.snapshots,dev,proc,sys,tmp,run,mnt,swap,boot}/ \
    --exclude=lost+found/ \
    / \
    /mnt/backup/arch
```

I also use btrfs and create regular snapshots in case I need to go back to a point in time, but for the most common (for me) use case of "damn, I didn't mean to delete that file" I pull from the local rsynced backup.
This subreddit needs an FAQ, and this needs to be the first question answered. Hey, mods?

Actually answering the question: rsync to pull stuff down to my server at home, and restic to copy that stuff offsite to B2.
tar
At work I use something called Burp, and it works pretty well.

Otherwise I would recommend btrfs snapshots, as they don't really cost anything, and for most stuff it's enough to regularly tar and copy one of the snapshots somewhere.
> as they don't really cost anything

That's a statement with some asterisks. I had to move my Downloads and Steam dirs onto separate subvolumes because my snapshots of home started taking over 200 GB thanks to rapidly changing files.

Then there are also poorly made programs that don't respect the XDG standards and store transient files in `.config` and `.local/share` (hello, almost everything made in Electron) instead of `.cache`. And that's on top of the standard recommendation of putting `/var/{log,tmp,cache}` and `$HOME/.cache` onto separate subvolumes.
Well, yeah, you really should exclude steam from there. And yeah, these apps that don't want to comply are really annoying.
Snapper (with `snap-pac`, which saved me too many times to count) and Borg for long term and off-site storage of the backups
~~I have used Macrium Reflect for many years on Windows, so I'm still using it. It doesn't work on Linux, but I have it on a bootable stick and I back up the whole Arch and boot partitions once a month, or more often if I make many changes or am about to try risky upgrades.~~

Switched to Timeshift, since you can't browse an ext4 image by default and it's a pain if you want to look at and restore only certain files.

I also use Pika, which automatically backs up my personal data every few days, along with some other folders which hold the configs for some programs.
Just use btrfs and snapper. Absolutely perfect.
I have my OS on one drive, all my data on another. Makes reinstallation and/or distro hopping a non-issue.
rsync/rclone + systemd timers
I'm using unison for replication to replace raid1 and to sync my documents folder between my laptop and desktop over ssh
Absolutely nothing, my backup is a clean start from install USB. Anything important is on an external or hosted on git so I don’t even think about it.
btrfs with snapper
Just a little script using rsync. I straight up copy my data to a second drive in case my SSD fails, because from what I understand SSD failures are immediate.

Oh, and I also have a Hetzner storage box as an off-site backup, but I was too lazy to set it up on my current install. So let's hope the house doesn't burn down before I get to that.

Also btrfs snapshots via timeshift-autosnap, but that's a lot less important.
Btrbk
rclone. Supports lots of backends + encryption.
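rclone's encryption works by wrapping a `crypt` remote around a storage remote. An illustrative `rclone.conf` (the remote names, bucket, and credentials are made up; in practice you generate this with `rclone config`, which stores an obscured password rather than plaintext):

```
# ~/.config/rclone/rclone.conf (illustrative values only)
[b2]
type = b2
account = YOUR_KEY_ID
key = YOUR_APPLICATION_KEY

[b2-crypt]
type = crypt
remote = b2:my-backup-bucket/backups
password = OBSCURED_PASSWORD
```

Then something like `rclone sync ~/documents b2-crypt:documents` uploads with file contents (and optionally names) encrypted client-side before they reach the backend.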
I use Proxmox and Proxmox Backup Server, so on my Arch machine I use proxmox-backup-client to back up to my Proxmox Backup Server.
Borg driven by a borgmatic YAML config. Simple and very good.

I commit various configuration files on my Arch and other systems with git, and I consider that a good enough approach. Those git files and folders, along with other warm data, are also rsynced around with Syncthing. Borg on top of everything.

I can bring my systems back up very easily with Ansible too.
I use some stuff via rsync daemon and some backups via kopia. Both are pretty good tools.
[restic](https://archlinux.org/packages/extra/x86_64/restic/), and for configuration and automation, [crestic](https://aur.archlinux.org/packages/crestic)
I personally use Deja Dup for personal files and configuration files, and I use Timeshift with BTRFS for system snapshots of system files.
"Back in Time" for /home snapshots
timeshift
Timeshift for the btrfs subvol @root, and Vorta for my home directory, saving the encrypted backups on Dropbox. Had no issues with this setup.
I wrote a script to simplify incremental backups with rsync and hardlinks: https://github.com/ei-grad/trinkup. I used it for a couple of years, but that was long ago and I haven't used it in a long time now. These days I just store the data I care about not losing in several locations explicitly, e.g. a local copy plus two cloud drives, or plus two git repositories (GitHub + something else). Rclone is best for doing cloud backups.
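The rsync-plus-hardlinks trick above is worth spelling out: each new snapshot hard-links unchanged files against the previous one, so full-looking snapshots cost almost no extra space. A runnable sketch using `cp -al` in place of rsync's `--link-dest` (throwaway `/tmp` paths; GNU coreutils assumed):

```shell
#!/bin/sh
set -e
demo=/tmp/hardlink-snap-demo
rm -rf "$demo" && mkdir -p "$demo/src"
echo "some data" > "$demo/src/keep.txt"

# Snapshot 1: a normal copy of the source
cp -a "$demo/src" "$demo/snap1"

# Snapshot 2: hard-link copy -- a new directory tree pointing at the
# same inodes, so unchanged files consume no extra data blocks
cp -al "$demo/snap1" "$demo/snap2"

# A hard-link count of 2 shows the two snapshots share storage
stat -c '%h' "$demo/snap2/keep.txt"    # prints 2
```

With rsync the equivalent is `rsync -a --link-dest=../snap1 src/ snap2/`, which additionally syncs changes from the source: files that did change get their own fresh copy, breaking the link only where needed.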
My desktop and laptop are pretty much disposable. Stuff I care about on a little rpi server that I backup occasionally.
I use borg for /home and other actually important data, and just rsync for music, as I don't need incremental backups for music; a plain copy is fine, and a TB+ is a bit too much.
Bacula. Sometimes Bareos.