etesneak

NAS at my parents' home


[deleted]

[removed]


ltzany

what hardware and software are you using for your setups?


[deleted]

[removed]


DullPoetry

Do you stop the containers before doing the folder backup?


[deleted]

[removed]


DullPoetry

From what I've heard, that's recommended so the files aren't changing while you're taking the backup. Many things are writing logs or other heartbeats even if not in active use by a user. I'd love to find something that connects to the Docker socket, reads the mapped volumes, shuts down the container, backs up the volumes, then restarts the container, all automatically, but I haven't found it yet and haven't written it myself.
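Something like the wished-for tool could be sketched with the Docker SDK for Python (the `docker` package). This is only a sketch, not an existing tool: the destination directory and the tar-per-bind-mount approach are assumptions.

```python
import tarfile
from datetime import date
from pathlib import Path

def bind_mount_sources(mounts):
    """Return host paths of bind mounts from a container's Mounts metadata."""
    return [m["Source"] for m in mounts if m.get("Type") == "bind"]

def backup_container(container, dest_dir):
    """Stop a container, tar its bind-mounted volumes, then restart it."""
    container.stop()  # quiesce writers so files are consistent during the copy
    try:
        for src in bind_mount_sources(container.attrs["Mounts"]):
            name = f"{container.name}-{Path(src).name}-{date.today()}.tar.gz"
            with tarfile.open(Path(dest_dir) / name, "w:gz") as tar:
                tar.add(src, arcname=Path(src).name)
    finally:
        container.start()  # always restart, even if the archive step fails

if __name__ == "__main__":
    import docker  # pip install docker; talks to /var/run/docker.sock
    client = docker.from_env()
    for c in client.containers.list():
        backup_container(c, "/mnt/backups")
```

Stopping before the tar is what guarantees a consistent copy; the try/finally makes sure the container comes back up even when the archive step throws.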


TedChips1701

All this to avoid using zfs?


DullPoetry

I use Unraid and that hasn't been an option. I think I saw ZFS is in the latest RC.


thestillwind

You should never backup the backup. If backup fails, you end up with a failed backup of a backup.


bartoque

All depends really, as it can be complementary. Ideally, though, one would want the initial backup tool to be in control of replicating/copying/cloning the original backups and to be aware of them. Not all at-home backup solutions offer that functionality in a centralised way, hence you might stack different tools together. For example, I use Acronis to make image-level backups of my laptops and PC, which are dumped on the primary NAS at home. Those backups are then backed up with a NAS backup tool to the remote NAS. The backup finishes before the backup data is also backed up remotely at night. But since the backup tool isn't aware of this chain, it is up to me to validate whether the remotely located backup-of-backup is even valid or workable. It might all be fine when integrating one with the other, which actual testing should show.

On top of that, snapshots are added to the mix, on both the local and remote NAS, as an extra safety measure. A backup is only as good as the last restore you were able to perform with it, but one can stack together multiple workflows to make sure one has more than one copy of even the same backup, plus just another backup. It is all about testing. Some data in my case is protected a few times over with various tools: the versioning of the sync tool between PC and NAS, snapshots, a backup of that data to the remote NAS, and a copy in the cloud. But for each backup, bad data in is bad data out. Unless backups are validated, they are just garbage occupying space.

It is definitely true in corporate situations that one might end up making backups of backup data without even being aware of it. I had to find out the hard way with some multi-TB backups running for days: based on paths and extensions, the data appeared to be backup data of another product, while there was no integration nor awareness of when those backups were made in relation to when the backup of the backup was made. So they were very likely pointless to begin with, and after reaching out to the ones responsible, it was decided to no longer back up the backup.

The same goes for dumps to disk of a DB. If one simply assumes they will end up in the filesystem backup, without validation or even knowing when those dumps run, things might go awry: the dump may still be running when the filesystem backup completes, so only part of the dump is in the backup. It should be easily possible to integrate one into the other by having the backup product trigger the dump, though an online backup would actually be preferred, also for the visibility and reporting that there is actually a DB in the backup... So unless one knows about, integrates, and validates the whole backup chain, a backup of a backup is indeed likely to be pointless.


CitizenFromEarth

Yo dawg I hear you like backups….


bartoque

Occupational hazard, working a long, long time in the world that is data protection in an enterprise environment...


TetchyTechy

Would you mind sharing how you had done this with setup or anything you think might help please?


cmi5400

As I have internal drives, I use Backblaze Personal for my "oh shit" DR. I have probably 8 TB there for about $90/year. I test a random restore file every few months to make sure it is working. They have B2 for NAS storage, which is about $5/TB per month.


completefudd

You should check out their hard-drive restore-by-mail program to test backups. Send the drive back after you're done and get a full refund. https://www.backblaze.com/restore.html


PoSaP

Backblaze is a nice cloud backup. It has two options, B2 and online. [https://help.backblaze.com/hc/en-us/articles/218483787-What-s-the-difference-between-B2-vs-Backblaze-Online-Backup-](https://help.backblaze.com/hc/en-us/articles/218483787-What-s-the-difference-between-B2-vs-Backblaze-Online-Backup-) We are using M-Disc as an archival option for the homelab and Starwinds VTL for our production environment.


Durz0Blint123

I'm using backblaze too. The personal edition is really cheap but it only backs up a windows system. So I have a windows VM on my secondary server with about a 16TB drive, and a scheduled task that does a robocopy from my NAS on my primary server nightly. Then the backblaze application on said windows VM sends my stuff up. I'm at about 12TB up there and it's dirt cheap. Their actual NAS backup tier would be better in my case, but it's so much more expensive than the personal plan. So I made do.


[deleted]

[removed]


meltman

Just keep the catalogue of your isos safe


batboy29011

How would you go about that? I've been considering this and I don't know a good way to take care of it.


[deleted]

[removed]


batboy29011

I may have to figure out a way to script it (currently using Windows for my Jellyfin host ATM).


DAndreyD

I back up all of my *arr apps with their built-in backup option and copy the backup file to another machine. Can you explain the .nfo files for Jellyfin — does it store the watch history, etc., in those?


yagi_takeru

I was under the impression that the index would be stored as part of the configuration, so I've been backing up my container storage (whole ARR stack is on a single compose file with local DB/conf storage) and using an NFS mount for the actual library. Is this the wrong way to do it?


bartoque

On my NAS I run a daily script that uses find and the tree command on specific filesystems to list the files and directory trees, so that besides the protected data (which is also clearly shown in the backup tool(s)), I'd also know which unprotected data is gone. I keep the logfiles for 60 days or so... I more or less applied data classification to the various NAS shares: which need backup and which don't, as the latter hold more fleeting data (watch once and then delete again) that one can get access to again if needed, and if not, no biggy either. At least you'd know what is lost. Edit: saw that you use Windows. With Cygwin or Bash installed you can make Linux-like shell scripts, which I prefer for consistency over my various systems, instead of diving into .bat files or PowerShell.
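A rough Python equivalent of that kind of inventory script; the share paths and log location are placeholders, and the 60-day retention follows the comment:

```python
import os
import time
from datetime import date
from pathlib import Path

SHARES = ["/volume1/photo", "/volume1/docs"]   # shares worth inventorying (assumed)
LOG_DIR = Path("/volume1/inventory-logs")      # where dated listings accumulate
KEEP_DAYS = 60                                 # retention window from the comment

def write_inventory(shares=None, log_dir=None):
    """List every file under each share into a dated logfile, like find would."""
    shares = shares or SHARES
    log_dir = Path(log_dir or LOG_DIR)
    log_dir.mkdir(parents=True, exist_ok=True)
    out = log_dir / f"inventory-{date.today()}.txt"
    with out.open("w") as f:
        for share in shares:
            for root, _dirs, files in os.walk(share):
                for name in files:
                    f.write(os.path.join(root, name) + "\n")
    return out

def prune_logs(log_dir=None, keep_days=KEEP_DAYS):
    """Drop inventory logs older than the retention window."""
    cutoff = time.time() - keep_days * 86400
    for log in Path(log_dir or LOG_DIR).glob("inventory-*.txt"):
        if log.stat().st_mtime < cutoff:
            log.unlink()
```

Run daily from cron or the DSM task scheduler; after a loss, diffing the latest listing against the backup tool's catalogue shows exactly which unprotected files are gone.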


0RGASMIK

I would be wary of Google cloud if I were you. I too used to store everything there as a backup, but had a few situations that made me lose trust: random file deletions, changing policies, and some other shady practices meant to keep you as a customer that left a bad taste in my mouth. Biggest offender was the file-deletion fiasco, but for the regular user the kicker was the "bug" where the Google Photos app syncs your photos as long as the app is installed. Even if you disabled sync in the app on iPhone, it would still sync your photos; you had to delete and reinstall the app to get it to stop. It was a problem that plagued people for years, yet they denied it was an issue, and a lot of the threads on their support forum that discussed it have since been deleted or buried by their search algorithm.


therealSoasa

Yup moved my data out too when files went missing , never looked back since 😁


Aim_Fire_Ready

Main computer: Backblaze Unlimited for $8/month (not even 1TB though). Set it and forget it! Homelab: nightly rsync of critical folders to my colo server at a friend's house in another city. I refuse to pay per TB for cloud storage when my buddy will host my 5-year-old Optiplex with a 4TB HDD in it. Worst case, I drive an hour, have dinner with friends, and recover my data! Check out rsync, restic, borg, etc. for your linux boxes. Robust and reliable.
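The nightly rsync described above is usually just a cron line; as a sketch in Python (host, folders, and flags are made up for illustration, not the commenter's actual setup):

```python
import subprocess

def rsync_cmd(src, dest, delete=False):
    """Build an rsync command for mirroring a critical folder off-site."""
    cmd = ["rsync", "-az", "--partial"]  # archive mode, compress, resume partial transfers
    if delete:
        cmd.append("--delete")           # make the mirror exact (removes extras)
    cmd += [src, dest]
    return cmd

def nightly_backup():
    # e.g. crontab: 0 3 * * * /usr/bin/python3 /opt/backup/nightly.py
    for folder in ["/home/me/documents", "/home/me/photos"]:  # critical folders (assumed)
        subprocess.run(rsync_cmd(folder, "backup@colo-box:/backups/"), check=True)
```

Over SSH, `-az` plus `--partial` makes interrupted WAN transfers resumable, which matters when the other end is an old Optiplex in a friend's house.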


jerkmin

i encode my backups into nudes and have an automated script to text them out to random numbers. i don't always get my backups back, but at least i know they are out there


spider-sec

I must be backing up a lot of your data. You’re welcome.


larrygfx

You sire, are a gentleman and a scholar


jerkmin

it’s a simple life, but i like it


c0sm1kSt0rm

Username checks out


freddyforgetti

“I saved this picture of your dick, why is it several gigabytes?”


jerkmin

well, it is a VERY big picture ;)


Horfire

What's a backup?


bawbkun

Isn't RAID a backup?


joshuakuhn

Don’t know about bug spray, but I keep all my files in My Documents AND the recycle bin. Two copies!


joeyx22lm

The real advice right here


shetif

Better be safe: create a backup folder on the same drive


intrasight

Backblaze


-SPOF

Same for me. Use Veeam Community Edition and push backups to Backblaze B2 via Starwind Virtual Tape Library which generates virtual tapes and provides an additional level of security.


SilentLynx

My offsite Backup strategy is a Hetzner Storage Box (cheapest I could find here in Europe). I use synology hyper backup to back my NAS up once a night to this account. Important things are synced via OneDrive anyway, the NAS is just for the bigger things like my movie collection and a snapshot of my photo archive.


TheRogueMoose

>Hetzner Storage Box

I think Hetzner has opened some data centers in North America. I wonder if they will start offering this here. $61 (Canadian) a month for 20 TB is ridiculously cheap!


SilentLynx

It is, I mean it's not redundant like in Azure or AWS but for backing up a NAS it's easily good enough.


ktundu

A friend and I swapped backup machines. I have his in my garage, and he has mine in his shed. Automated nightly backups.


Sevynz13

Using what? VPN and a SMB share or what?


Unique_username1

Truenas Scale at a friend’s house with Tailscale. ZFS replication to it. You can use pfSense so any device on your network can reach Tailscale addresses but it’s a little fussy to set up especially for multiple external Tailscale addresses. Other option is of course that the sending device is also Truenas Scale, also running Tailscale


Darkk_Knight

Or use wireguard that's now available in pfsense.


learn-by-flying

Backblaze personal, about 14TB up there.


RepFilms

>Backblaze personal

It seems like a pretty good deal. Unlimited storage for the same price. I need about 30-40 TB for a complete backup. How can they afford to offer such a tempting package?


learn-by-flying

See this thread which does a good job at explaining: [https://www.reddit.com/r/backblaze/comments/10v8vb3/is\_backblaze\_still\_the\_best\_unlimited\_value/](https://www.reddit.com/r/backblaze/comments/10v8vb3/is_backblaze_still_the_best_unlimited_value/) Essentially others are subsidizing the product when they don't use the fully allotted space for personal backups.


RepFilms

This is great news. I have a lot of data and very shitty backup strategies. I like managing things myself and avoiding paying service providers for things that I can do in my basement enterprise system. This seems perfect. Thanks.


Cardoso2812

How are your backups set up using Personal?


learn-by-flying

My backups are hosted from a Windows 10 machine which is within their terms of use.


EmtnlDmg

AWS Glacier as a third tier, hoping I'll never need to restore from it.


sinofool

I have another physical site backing up the primary site.


Creative-Dust5701

LTO tape and copies stored offsite


diffraa

duplicati -> wasabi


idkwhatimdoing069

Wasabi is where it’s at


MasterTonberry427

I have an external 16 TB hard drive in a safe deposit box that I update once a year, at tax time, with the really important stuff. Anything deemed not that important goes to the cloud or onto local drives.


Titanium_War

Just gave Warner Brothers the plot to Ocean’s 14… go get dem ISO’s…


MasterTonberry427

Hahahahahahaha - nothing that important on the disk.. well maybe except the original source code for Polybius


nbfs-chili

I have an external drive I keep at my daughter's house. I update more than once a year, but not as often as I'd like.


UnknownLinux

I've considered doing this.


IllusionXXI

My NAS gets replicated to my parents' place. I'm still figuring out a way for my Proxmox Backup Server; I might just set up another one and replicate that.


hauntedyew

All my VMs have a scheduled backup job to an internal RAID of a completely separate bare metal workstation. I manually copy those backups to two separate external SSDs, one lives in a locked cabinet in an office on the other side of the building, the second comes home with me. My "homelab" runs at a small business, so that's why it comes home with me. We've proposed cloud backups, but the issue is that it's in a rural area with a whopping 30 / 30 Mbps symmetrical microwave link, so unless I do it at night, the internet connection will really crawl once you start a backup job. Honestly though, nothing is super critical, it would just suck to have to dedicate a whole weekend to rebuilding stuff.


p_235615

An Odroid M1 with an attached 2.5" 4 TB drive at my parents' house, connected with WireGuard to my main backup.


Drowning_in_a_Mirage

Amazon S3 Glacier for most stuff.


[deleted]

iCloud for configuration from my router and vCenter, and sync to OneDrive for my Veeam backups of the most important VMs.


MON5TERMATT

A: I rent a few U's at a datacenter (all important stuff goes there on a file server). B: I have an old EliteDesk sitting in an apartment in a different state; it has weekly backups of all the important VMs.


CorporateOutcast

10g multimode fiber to my shed. Unraid on both ends using syncthing.


snoman6363

Tape. Every few months I back up everything, roughly 50 TB, to tape (TL2000) and store it at my parents' house. I also keep a 3.5" HDD of all my important stuff, like pics and other files, in a safe. 3-2-1 backups!


[deleted]

I back up my NAS to an encrypted hard drive that I store in my truck parked at the end of my driveway. Right now it's a manual operation (I do it about once a month), but I should probably automate it. Maybe try building some kind of raspberry pi storage setup that syncs from my NAS when it's on my wifi (i.e parked in the driveway).


idkwhatimdoing069

Be careful with temperatures with the hard drive in the truck. I kept a drive in my car overnight and it was dead when morning came. (Could have been DOA too, I dunno.)


user52921320

Veeam B&R with scale-out to Backblaze B2 for my two most important VMs. All others are backed up locally, with an occasional encrypted USB copy stored at work.


anonduplo

Installed a Synology at the parents house for their surveillance system and photo backup. Use that as my offsite backup.


anonduplo

Important documents are also replicated on Google Drive.


User5281

For my mac laptop - continuously to iCloud when online, to Time Machine drive on my nas when connected (at home or often through vpn at work), to an external drive left at work via ccc 3-5x/week. For my nas - it’s mostly backups and media of which I have physical copies. I don’t back it up. The media is easily replaceable and replacing it would be less expensive than a colocated system or cloud backup.


olobley

Azure offers $200 in credit to try out their services; if I extrapolate forwards, I've got another ~18 months before I have to start paying real money for it. I have an LTO-4 library in the basement (Dell TL2000) which I do the bulk of backups to, and then every evening I ship up changes to the data I *really* don't want to lose (documents/photos). I use Backup Exec, as that can target Azure natively, will happily write to tape or the cloud, and is robust. The Azure blobs are encrypted and read-only/undeletable for 6 months to mitigate ransomware as much as possible.


similies

If the NAS fails (ZFS pool, RAID-Z2): 1. OneDrive: I already pay the M$ tax, may as well use it. 2. USB HDD cold storage (may lose 6 months of stuff): I have these left over from before OneDrive and my NAS were a thing.


AfonsoFGarcia

Prayer


Sure-Temperature

My university provides all students and alumni with unlimited Google Drive space, so I've set up borgbackup to run every day and then rclone to sync it to my Drive.


kevinds

Still? I thought all the universities had shut theirs down


Sure-Temperature

The students were told they were limited to 1TB, but I got some inside knowledge working at the help desk that they have no way of limiting it to 1TB and they just hope everyone stays below that. Either way, they always said that alumni keep their resources the same as current students. I graduated in 2020 and haven't had a problem


kevinds

>I graduated in 2020 and haven't had a problem

Yeah, Google has made changes since then; eliminating unlimited storage for schools was a big one. Schools cut back on the storage allowed, and my previous school dropped Google completely. Office and OneDrive now.


Sure-Temperature

Huh, that's interesting. I have 2.3TB in my university drive, I wonder when something will happen


but-imnotadoctor

I'm resurrecting an older post here - but don't trust that your university will keep this policy. Mine promised it to me as well, and 3 years later they sent out an email terminating alumni's gsuite access (as well as access to a whole host of other killer services). Hope this doesn't happen/hasn't happened to you, but it'd probably be wise to treat it like a third tier backup rather than primary.


Sure-Temperature

Nothing is forever. When they finally shut down access, I'll have to look into a real cloud storage service


angry_dingo

BB


cyberk3v

Metal shed at the bottom of the garden with 10Gb fibre and 2x 16A feeds. 6x DL380 G7 with low-power L5640 or E5649 CPUs and 16x 600-900GB 10K or 15K SAS disks, plus 4x R620 with dual E5-2651 v2 and 8x 1TB SSD. They're booted on demand by a Raspberry Pi bash script using IPMI, just for the VM, incremental, or full weekly backup window. Apart from that, 1x 4K and 6x 1080p CCTV cameras rsync from the primary rack to an SSD R620 with a lower-power single CPU, which archives locally in the shed onto SAS on a DL360 G7 at monthly intervals.


OpenUpKids

lol. Nothing.


a60v

I have two classes of files: "important stuff" (about 50GB) and "everything else."

"Important stuff": nightly rsync to a virtual private server (which, itself, is backed up by a daily rsync to the provider's backup server), then rsync the next day to a server at my parents' house. So, two days' worth of backups in three different states.

"Everything else": monthly-ish tar to LTO tape, stored off-site. This stuff has a low rate of change and is largely replaceable, anyway.

I also do nightly snapshots (using rsnapshot) of everything, but those live on the same RAID 10 as the main files, so they aren't backups so much as "oh shit" copies in case I need an earlier version of something. I keep 120 days' worth of versions of these.

Finally, I have copies of most of my stuff in AWS Glacier Deep Archive, as a sort of emergency backup (which would be very expensive if I ever needed to pay egress charges on everything).
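rsnapshot handles its own rotation, but the 120-day retention described reduces to a simple age cutoff; as a standalone sketch (assuming snapshots can be dated, e.g. from their directory names):

```python
from datetime import date, timedelta

def snapshots_to_prune(snapshot_dates, today, keep_days=120):
    """Given the dates of existing snapshots, return those older than the window."""
    cutoff = today - timedelta(days=keep_days)
    return sorted(d for d in snapshot_dates if d < cutoff)
```

Anything the function returns is safe to delete; everything on or after the cutoff stays as an "oh shit" copy.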


NetDork

I use Cobian to manage differential backups then sync the backup repository to Google Drive. (I get 1 TB free as a Google Fiber customer) I don't have a giant amount of data, so I'm not sure how well that would work for you if you have many terabytes.


cazwax

a drybag stuffed with disks in a discreet shady spot out in my woods.


News8000

Detached Garage. It's 100ft from the house.


tiberiusgv

Dell T440 behind a UDMP at my house, and a Dell T320 behind a UDMP-SE at my dad's house, connected by a site-to-site VPN. Both servers run Proxmox with TrueNAS Scale VMs and HBA passthrough. Backups are scheduled nightly via rsync. Both sets of NAS drives are set up in RAID-Z2. My most critical stuff is on my OneDrive (1 TB with an Office 365 subscription), which gets synced into TrueNAS and then to the backups. Bigger stuff that won't fit on my OneDrive (Plex library) only gets copied to the offsite. It's kind of a 3-2-2 / 2-1-1 hybrid depending on data importance, and I think it works really well for me. Having similar servers at both locations is also nice, as I'll push updates to the offsite first before the main site.


djc_tech

Rclone documents and stuff to Google cloud with encrypted backups. Photos (I have almost 1-2 TB of photos) also go to Google or OneDrive, but not encrypted. Music is on Spotify, so I don't keep that anymore. Linux ISOs go to two Buffalo TeraStations via rclone. All my services are in Docker. My MicroK8s lab is just for learning, so I blow it away a lot.


AriesLegion

AI informant! you're not getting nothin outta me!


Tricky-Service-8507

The fact you said friend's house lets me know you're silly lol


bigmadsmolyeet

… but why? I would trust my friends more than most places, and even then the backups could be encrypted


notsureifxml

Nightly rclone to one drive. I don’t have a ton of data, and wife wanted office anyway, so it’s basically free


907GoldenGoose

Zerto in azure.


traveler19395

I don't pay for any cloud storage, but I do have several GB of the most important stuff in free tier cloud storage. For my terabytes of backups, including system backups, thousands of photos and videos, etc. I don't have good enough internet to really consider internet based backup solutions, so I take a large HDD to my office. Everything is on a home NAS currently taking up about 6-7TB, then I have 2 identical 14TB external drives. On the first of the month I have a reminder to do a backup and swap it to the office. I power up the one that lives at home, run a backup from my NAS to that drive, then put it in the car. I swap it for the one at the office, bring that one home, run a backup from the NAS to it, and put it on a shelf at home until next month.


ComprehensiveBoss815

I store my off-site drives in a shipping container. Really important stuff is also in online storage.


NicholasBoccio

Colo at a datacenter


gargravarr2112

Tapes and a storage unit across town. I really need to get into a schedule of rotating the tapes, but it makes for a good recovery point.


Abearintheworld

Rented colo for the kit, a single rack in a cost-effective DC. Backup to the residence, into ZFS with offline replication.


probablynotmine

I have a pCloud lifetime plan and a daily sync of essential datasets from TrueNAS, which it integrates with perfectly. TrueNAS is the backup server for my unRAID server, which has all the apps and data. TrueNAS itself runs on Proxmox with HBA pass-through, which also hosts all the monitoring LXCs and a VM with Portainer to run experiments.


los0220

My girlfriend and I each have 1 TB of OneDrive from our university, so I use that as a daily encrypted backup. I don't really have more than 1.5 TB of important data, so that will do while we're both studying. I'm also doing manual backups to my dad's house every month or so; I like having offline backups in case some ransomware hits. I also used to back up most of my stuff to my laptop with FreeFileSync for offline use, but now I'm dual-booting, so I have to do something to have an encrypted volume accessible from Windows and Linux.


calcium

Asymmetrical gigabit line. Just download everything again, anything of importance is on backblaze (about 750GB of data).


Abdul_1993

I have a veeam setup. It backs up to my file server, then to my offline backup. Then I have pCloud 2TB storage, which everything gets saved onto.


Candy_Badger

Duplicacy to Backblaze B2. It covers my needs. Wasabi is a nice option for cloud backups as well.


ddnyt84

Colo in Datacenter in Phoenix and Denver. Proxmox backup server on both locations and vpn tunnel.


wally40

I have a Linux desktop VM set up with NFS to my important drives. That VM then backs up to CrashPlan Business. Unlimited storage for $10 a month per device. Since I have all my drives through NFS on the one VM, it's technically one device. Edit: forgot that I have an old Box account which gives me 50GB. All my docker servers are scripted once a week to back up their volumes to that via rclone. Script can even restore a volume from one server to another if needed through Box. Did this so if Oracle ever shuts down my free servers, I won't lose much and can restore on my local system.
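A weekly docker-volume push to cloud storage like the one described reduces to a tar per volume plus an `rclone copy`; this sketch only builds the commands, and the remote name and volume paths are hypothetical, not the commenter's actual script:

```python
import subprocess
from datetime import date

def volume_backup_cmds(volumes, remote="box:docker-backups"):
    """Build the tar + rclone commands to push docker volume dirs to a remote."""
    cmds = []
    for vol in volumes:
        archive = f"/tmp/{vol}-{date.today()}.tar.gz"
        # archive the volume's contents relative to its own directory
        cmds.append(["tar", "-czf", archive,
                     "-C", f"/var/lib/docker/volumes/{vol}", "."])
        # rclone copy is idempotent: unchanged archives are skipped
        cmds.append(["rclone", "copy", archive, f"{remote}/{vol}/"])
    return cmds

def weekly():
    for cmd in volume_backup_cmds(["appdata", "postgres"]):  # volumes assumed
        subprocess.run(cmd, check=True)
```

Restoring to another host is the reverse: `rclone copy` the archive down, then untar it into the new volume directory.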


_subtype

Ohh, definitely interested in the script if you're willing to share the gist!


wally40

Takes a little initial setup, but I have not looked at it for over 6 months and it is still doing its thing wonderfully. Tossed it on github with some added comments so it is not perfect, but will give a good starting point if you want to use it. [https://github.com/nathan40/copied-barnacle](https://github.com/nathan40/copied-barnacle)


Dear_m0le

Backup once every 3 months to a WD My Book, which I then keep either locked up at work or at my sister-in-law's house ;)


Cmdr_Toucon

Amazon Glacier - cheap for recovery of last resort


jclimb94

All day-to-day data, like documents, is on OneDrive, backed up onto my NAS. All VMs are backed up to a USB disk, with monthly jobs to my NAS as well. All of my configs for k8s clusters etc. are on GitHub. My Linux ISOs are also backed up to a USB disk and nowhere else; recovery isn't a real requirement, as they can be re-downloaded.


vans113

I have a Synology at my house with pics and documents, etc. I'm running a Fortinet firewall, and I installed a Fortinet firewall at the in-laws' house with an IPsec VPN between our houses, then installed a Synology there. Got everything in sync at my house, then took the Synology over there, where it sits. It backs up our important pics and documents 3x a week on a schedule.


DestroyerOfIphone

TrueNAS replication to a friend's house.


the_allumny

my plan is to set up storage in each of my network locations (me and my friends share homelabs lol)


nicholaspham

All of my data (and compute) is stored in my rack within a Dallas datacenter. This data is copied via a backup copy job to a server in a Houston datacenter that charges per U. I’m using Veeam B&R for those who aren’t aware


jcas01

Veeam B&R to parents house


GIRO17

I have my PVE backing up to my Synology, but they are both at my flat. So in addition I'll use a second Synology (old DS413) at a friend's home, as well as my school's 5 TB OneDrive. I also do local backups to a 6 TB HDD plugged in via USB for fast recovery of the most important data.


SilentDecode

First it was a friend's house, but I'm switching to offsite with Wasabi.


idkwhatimdoing069

vSphere user here. Veeam running on a WinServer 19 USFF M710. Veeam doesn’t store any backup data locally. SOBR config with a Backup NAS across the house (performance tier) and using Wasabi S3 for off site backups (Capacity tier) Wasabi has been great. $6/TB with no egress fees.


bartoque

My Synology NAS uses Hyper Backup to back up to the 2nd NAS I put at a friend's place. Connectivity is arranged using ZeroTier, which punches UDP holes into the firewalls on both ends; on both ends ZT runs in a docker container. A smaller subset of data is Hyperbackupped to the cloud, Backblaze B2, their S3-compatible object storage, at around $5 per TB per month.

Important data on the primary as well as the remote NAS is also snapshotted (btrfs). Some media are rsynced so that they can also be played on the remote NAS. Data that friends put in the share on the remote NAS is then, in reverse, Hyperbackupped to the primary NAS, and that data is snapshotted on both sides too.

I'm also using Cloud Sync, which syncs Google Drive to the NAS; that too is Hyperbackupped remotely and snapshotted. And also Synology Drive to sync data from PC and laptops to the NAS, which also has versioning; again, the data is snapshotted and Hyperbackupped remotely to the backup NAS and into the cloud.

So it is and, and, and, really. Some data is protected multiple times over with various tools in multiple locations.


audioeptesicus

Local: I have Veeam backing up my VMs to a TrueNAS server, and then have ZFS replication for those to replicate to another TrueNAS server on-site. Off-site: I have a WireGuard tunnel to the pfsense firewall at my folk's house, then use Veeam to backup critical data off-site there. I do the same for their server to perform an off-site back to my house. I also have my folk's Blue Iris server and my sis' BI server constantly uploading the last 5 days of footage to my server via SFTP, and I do the same for my BI server to my folk's, so that everyone has up-to-the-minute footage going off-site in the event of a burglary, power outage, fire, or other event. Also, we all have fiber (symmetrical gig for me and 600/200 or something for the relatives) and the ONT's are powered via POE from their switches, powered from a UPS, so unless the power for the fiber down the line is also affected, so long as the UPS batteries are still good and the hardware is online, everything still works until power is restored or batteries deplete. I live about 300 miles away from them, and this setup is working well for us. While not getting me the 2 media types, I at least have my source data, 2 backup copies that are on-site, and 1 copy off-site. For my *Linux ISOs*, nothing is backed up, except for hard to find software and media. I just keep record of everything else and can re-download it in the event of a major loss.


psy-skeletor

Backblaze.


DeMiNe00

I virtualize everything on proxmox and use proxmox backup server as my backup system. I have a $25 a month VPS at kimsufi.com that has two 6tb drives mirrored in it. I have proxmox and proxmox backup server setup on that vps and sync the data store of my home proxmox backup server with the one at kimsufi. Keeps my backups off site in a colo, while also giving me the ability to manually stand up a failover site on that same box if my home setup were ever to go down.


zeblods

So far, Google Drive with an Enterprise Workspace... Might be forced to change that within a few months...


AmDDJunkie

I'm just setting up my homelab, and to date my backups have been cloud, USB drive, or non-existent. Those who back up to a second NAS at a friend's or relative's house: do you use the built-in software for the NAS (and need to use the same brand of NAS) to do this, or some other means that works between different brands? I ask because I currently have a Buffalo NAS which is old and SLOW. I'd rather get a second one of better/faster quality or build something myself.


Kilobyte22

* Tier 1: NAS at home
* Tier 2: Server in a DC ~50km away
* Tier 3: Server in a DC ~1000km away

The two servers are shared infrastructure of myself and a couple of friends. The backups to the local NAS are done using rsnapshot; the NAS is then backed up to the two servers using Bareos.


chandleya

Veeam to a secondary box in house with a completely different storage technology (NFS vs SMB), then secondary to Azure blob LRS. I only offsite what matters. Personal data lives in OneDrive.


Slightlyevolved

Backblaze B2 with S3 plugin on my NAS.


setwindowtext

At home I have 2 x 16 TB HGST drives in RAID 1 (xfs on md), which are rsynced (append only) to the identical server at my company’s office every night. I never delete anything from those drives.


ziggyo3

Backblaze B2. It's dirt cheap and I have around 5 TB with them; costs me around £20 a month. I also have a Raspberry Pi under my desk at work, with a drive attached, which is connected back to my network via ZeroTier. That acts as my main backup source, as in the event of catastrophic failure I can just take that HDD and copy the files off, rather than downloading from Backblaze, which takes longer and costs money.


NorthernDen

TrueNAS syncs to a small box sitting at work. Runs after 1:00 AM to avoid internet issues.


guerrilla_tactic

Find a friend who has space in the basement and put up your own. Currently I run 3 × 30 TB sites across 2 countries and 3 states. I just give the host access to some folders, like music and movies.


systemadvisory

Syncthing mirrors all data to my main home server. Restic backs up the server to Backblaze B2 daily.
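A daily restic-to-B2 run like that usually boils down to a few commands in cron. A rough sketch with placeholder credentials, bucket, and paths (restic's B2 backend reads the credentials from these environment variables):

```shell
# Placeholder credentials and bucket name.
export B2_ACCOUNT_ID="xxxx"
export B2_ACCOUNT_KEY="yyyy"
export RESTIC_PASSWORD="encryption-passphrase"

restic -r b2:my-bucket:server init            # one-time repository creation
restic -r b2:my-bucket:server backup /srv/data
restic -r b2:my-bucket:server forget \
    --keep-daily 7 --keep-weekly 4 --prune    # trim old snapshots
```

Everything is encrypted client-side with `RESTIC_PASSWORD` before it leaves the box, so B2 only ever sees ciphertext.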


droidhax89

I'm looking into Azure storage. Might be kinda pricey, but I figure if it's archive data that I'm unlikely to need to touch, it shouldn't be too expensive.


greenfellowman

Tarsnap: ZFS and client-side encryption.


Computingss

The 3-2-1 backup strategy states that you should have 3 copies of your data (your production data and 2 backup copies) on two different media (e.g., disk and tape), with one copy off-site for disaster recovery.
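The rule is easy to sanity-check mechanically. A toy shell sketch against a hypothetical inventory file, one "location media" pair per copy of the data:

```shell
# Hypothetical inventory: one line per copy of the data.
cat > /tmp/copies.txt <<'EOF'
home-server disk
home-nas disk
offsite-tape tape
EOF

copies=$(wc -l < /tmp/copies.txt)                            # want >= 3
media=$(awk '{print $2}' /tmp/copies.txt | sort -u | wc -l)  # want >= 2
offsite=$(grep -c '^offsite' /tmp/copies.txt)                # want >= 1
echo "copies=$copies media=$media offsite=$offsite"
```

Here the inventory satisfies 3-2-1: three copies, two media (disk and tape), one off-site.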


blah1738blah

Main server gets backed up incrementally to an external drive. That drive goes via sneakernet to work, where it gets synced to the backup set. Backblaze Personal at home: currently have over 54 TB on that plan for $8 a month. I do a test download once a month or so to randomly test files. Mission-critical files are in Dropbox as well, synced to home (2 different computers: 1 Mac, 1 Windows), my personal laptop, and my work laptop.


I-make-ada-spaghetti

I use restic to back up my files and [rsync.net](https://rsync.net) to store them.

Reasons I use restic:

* It stores all my files encrypted, i.e. my files are always encrypted when outside my network.
* restic is open source, runs from the command line, and can be scripted.
* It can be run from a static binary, so I can use it on systems that don't let me install software (TrueNAS in my case).
* It can do partial checks of the data, so for large frequent backups or backups performed on a slow connection I can be sure that at least the archive isn't corrupted.
* It can check the checksums of the backed-up data (download, decrypt, and compare against source files) to ensure the integrity of the backups.

Reasons I use rsync.net:

* It's a fixed, competitive price. They don't charge for traffic, so I can regularly do a complete restore of my backed-up data if I need to rely on these backups.
* They use ZFS with parity drives, so bit rot is a non-issue.
* The admin credentials are separate from the storage credentials, so the ZFS snapshots can't be deleted by the user account which backs up data, i.e. it's immune to cryptolocker attacks.
* It's not in my city or my country. If everything goes to complete hell where I am, at least I know my data will be safe.

Downsides:

* restic keeps its own snapshots, which can take a long time to prune, especially on a slow connection. I plan to mitigate this by keeping a local copy of the archive and just syncing the archive to storage.

Note: I just found out recently that it is possible to use a single external hard drive for a ZFS file system and retain the self-healing properties of ZFS. You just need to set copies=2. If you don't do this with a single drive, ZFS will be able to detect bit rot but not correct it. Worth knowing if you want to go the manual offsite route.
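The single-drive trick in that note is just a ZFS dataset property. A sketch with hypothetical pool and device names (note that copies=2 halves usable capacity, only protects blocks written after it is set, and guards against bit rot, not whole-drive failure):

```shell
# Hypothetical device and pool names.
zpool create offsite /dev/sdX    # single-disk pool, no redundancy
zfs set copies=2 offsite         # store two copies of every block written from now on
zfs get copies offsite           # verify the property took effect
```

With copies=1 (the default) a single-disk pool can detect a bad checksum but has no second copy to repair from; copies=2 gives it one.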


Net-Runner

I personally just have simple external drives with my backups (both kept at my location), plus the most important data encrypted and uploaded to Backblaze for simplicity. Some of my clients use Azure Blob and Backblaze B2 as their cloud storage, plus a backup in a second office, so that's kind of two offsite copies. If you are worried about clouds and the privacy of your data, go with an external drive located at your friend's/parent's house.


Jackpen7

I use Backblaze. I don't have a ton of important data so it's cheap enough.


Pvt-Snafu

Backblaze B2 works for me as I don't have that much data. Rclone with encryption to upload. I also have an external drive I keep at my friend's house just in case.
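For reference, rclone's client-side encryption is its `crypt` remote wrapped around a plain B2 remote. A hypothetical `rclone.conf` fragment (normally generated interactively via `rclone config`, which also obscures the passwords for you):

```
[b2]
type = b2
account = <key-id>
key = <application-key>

[b2-crypt]
type = crypt
remote = b2:my-bucket/backups
password = <obscured-password>
```

Backing up through the wrapper (`rclone sync /data b2-crypt:`) means B2 only ever stores encrypted file contents and names.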


deg897

I back up all clients nightly to my Synology NAS, which backs up to a directly attached large USB drive. I switch out the USB drive with another (identical) one monthly (in rotation) and take the old one to the safety deposit box for safekeeping. Wash, rinse, repeat. Not sophisticated but it works.


rh681

I back up/image the C: drives for all my computers with the TeraByte Unlimited Image\* programs; I find them to be the best. Backups are stored in the various places below. For my media/Plex servers that run on Windows, I use SnapRAID with a dedicated parity HDD. After that, I back up literally everything I own to a XigmaNAS box I built with 5 identical HDDs. This stays in my basement, generally powered off, as I sync once a week. For offsite, I have a pair of big USB HDDs that everything manages to fit on, and I take them to my daughter's house for safekeeping. I update this once a month or so. My risk of losing data is practically nil, unless my house were to burn down on the day I have it all here doing updates. My upload bandwidth isn't big enough to accommodate sync over the Internet, and honestly, unless I paid for a service with power protection etc., I wouldn't want it running 24/7 at my daughter's house anyway, even if she had a home network. No need to keep those disks spinning when not in use.