How To Back Up Your Linux Computer to a Synology NAS

This post was published on The Next Tech

If you’re the proud owner of a Linux desktop that you’ve extensively customized since installation, then you’ve probably given some thought to backup and data recovery.

Thankfully, when it comes to backups, there are plenty of good options out there for Linux these days.

These range from the versatile and powerful rsync command-line tool to Timeshift, an excellent utility for creating system snapshots you can roll back to, offering much the same functionality as Time Machine (macOS) and System Restore (Windows).
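For a sense of what the CLI route looks like, here is a minimal rsync invocation; the paths are placeholders rather than anything from my actual setup:

    # Mirror the home directory to a second drive: -a preserves permissions,
    # ownership, and timestamps, -v prints each file as it is transferred, and
    # --delete removes files at the destination that no longer exist at the source.
    rsync -av --delete /home/youruser/ /mnt/backup-drive/home-backup/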

While my own Linux backup strategy includes both Timeshift snapshots and bare-metal Clonezilla backups, my approach to onsite 3-2-1 compliance (the rule that calls for keeping backup copies on two different storage media) has for years simply consisted of adding more internal drives to my desktop.

This is… not ideal.

Backing up to a local server or network attached storage (NAS) confers several advantages over storing backups on the host machine itself. For one, the 3-2-1 strategy also calls for keeping a backup copy offsite.

And unless you’re lucky enough to have business-grade home internet (in which case I am truly jealous), pushing backup data up to the cloud, particularly the first time you do it, can take days, weeks, or, excruciatingly, even months.

As I tend to allow my desktop some shut-eye when I do, my previous approach (yes, really) consisted of attaching a Post-It to the front of my computer warning me not to turn off the device until the backup had finished running.

Not ideal — but by using a backup server or NAS you can keep those jobs running effortlessly.

Secondly, using an NAS makes it really easy to set up a Redundant Array of Independent Disks (RAID), so, depending on the RAID level you use, you don’t need to worry about a failed disk in your onsite storage posing an unforeseen threat to your backup strategy.

Just to be clear: RAID isn’t backup (it’s redundancy). But keeping your onsite backups on RAID-enabled storage makes that onsite copy even safer.

Finally, consider the fact that Synology NASes run their own operating system, DiskStation Manager (DSM), which features a great cloud sync engine.

This makes it simple to push the backups you store on the NAS up to the cloud without having to get your hands dirty with things like cron scripting and automation.

All around, I would argue that backing up to an NAS rather than a local drive or plug-in SSD is a win-win.

Do I have you convinced?

If so, and without further ado, here are the tools I got running to replicate my previous backup setup on my Ubuntu 20.04 LTS desktop.

1. Grsync / Cloudberry for incremental backups

If you’re a fan of the Timeshift GUI like I am, then you might be disappointed to learn that it does not support network devices as a backup target.

Thankfully, all hope is not lost.

You can simply use grsync or Cloudberry to sync full system backups to the NAS.

Note: I’m using some artistic license in calling these ‘incremental backups’. They’re really change-only syncs, and the end result, unless you dig a bit deeper into the available parameters, is a full backup on the destination. (Incremental backups mean something quite specific, even though any backup taken over rsync is sometimes described that way.)
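To make the distinction concrete, here is a rough sketch (the username, IP address, share name, and dates are placeholders, and rsync access on the NAS needs to be enabled first, as described below): the first command is the change-only style of sync that grsync performs, while the second uses rsync’s --link-dest option to keep dated snapshots that only consume space for changed files.

    # Change-only sync: the destination always holds one full, current copy.
    rsync -av --delete /home/youruser/ \
      backupuser@192.168.1.50:/volume1/linux-backups/current/

    # Dated snapshots: unchanged files are hard-linked against the previous
    # snapshot on the NAS, so each run only stores what actually changed.
    rsync -av --delete \
      --link-dest=/volume1/linux-backups/2024-06-01 \
      /home/youruser/ \
      backupuser@192.168.1.50:/volume1/linux-backups/2024-06-02/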

As for the tools themselves: Grsync is a pretty bare-bones frontend to rsync, while MSP360™ (CloudBerry) Backup for Ubuntu takes things a bit further by letting you create backup plans and run them on a schedule.

Finally, you can use Cloud Sync to create a local-to-remote job that syncs those system backups up to the cloud or another offsite repository.

Setting up grsync between my Linux desktop and the NAS was quite straightforward, although the program does not support full-fledged backup plans and scheduling.

By contrast, using Cloudberry I was able to create backup plans that I could schedule and run over the local area network (LAN) directly onto the NAS.

Setting this backup up was as simple as creating a new SFTP destination pointing at the NAS’s local IP address.

The one caveat to creating all these backups from a local host to the NAS is that the SSH server is not enabled by default in DSM.

In order to open up SSH, rsync, and FTP access, I needed to manually enable these services on the NAS: rsync and FTP are switched on under ‘File Services’ in the DSM control panel, while SSH lives under ‘Terminal & SNMP’.
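With those services switched on, a quick sanity check from the Linux desktop confirms the NAS is reachable; the username, IP address, and share name below are placeholders for whatever you set up on your own device:

    # Confirm SSH access to the NAS; DSM prompts for the user's password.
    ssh backupuser@192.168.1.50

    # Dry-run an rsync transfer over SSH: -n lists what would be copied to the
    # shared folder without actually writing anything.
    rsync -avn /home/youruser/Documents/ \
      backupuser@192.168.1.50:/volume1/linux-backups/test/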

2. Disk imaging with Clonezilla

In addition to taking more regular ‘incremental’ backups that give me snapshot points for easy restores, I like to create full ‘bare metal’ disk images too.

As mentioned, I typically use the excellent and very powerful Clonezilla to write these images to a separate SSD in my computer.

Replicating the process but backing up directly onto the NAS turned out to be extremely straightforward.

Firstly, as described above, I enabled SSH from the ‘Terminal & SNMP’ section of the DSM control panel (and rsync, under ‘File Services’, for good measure).

Next, I created a separate shared folder just to house the Clonezilla disk images.

As I have been testing a few different backup methodologies, I created a new shared folder for each one, just to keep things well separated on the filesystem and to let me create dedicated users for individual backup services if required.

Next, I booted into Clonezilla in the usual way but, of course, opted for ‘SSH server’ as the backup destination rather than a local device.

Some users prefer backing up to the Samba (SMB) server, which can also be enabled, but I had success just using SSHFS.

Finally, when the program asked which directory to mount as /home/partimag, I replaced the default path with my new shared folder’s path relative to the root of the NAS.
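If you want to double-check that path before rebooting into Clonezilla, you can mount the shared folder over SSHFS from the desktop first; a quick sketch, assuming the sshfs package is installed and using placeholder names for the user, IP, and shared folder:

    # Mount the NAS shared folder locally to confirm it exists and is writable
    # by the backup user; shared folders live under their volume, e.g. /volume1.
    mkdir -p ~/nas-test
    sshfs backupuser@192.168.1.50:/volume1/clonezilla ~/nas-test

    # Unmount once you've confirmed everything looks right.
    fusermount -u ~/nas-test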

In about the usual timeframe, the program then created a full disk image as normal on the NAS.

Take All Your Linux Desktop Host Backups Over LAN Onto Your NAS

Performing backups onto an NAS makes a lot more sense than keeping them on your Linux desktop machine itself.

For one, like any server, it’s designed to be kept running 24/7 — so if you’re pushing large backups like disk images offsite you won’t need to keep your computer turned on while large amounts of data leave your network for the cloud.

No Post-It notes stuck to your desktop while the push runs for days or weeks. Just set the cloud sync up (whether manually or through a tool like Synology’s Cloud Sync) and let the server do all the heavy lifting.
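And if you prefer to script that offsite push yourself rather than use Cloud Sync, DSM’s Task Scheduler can run a user-defined script on whatever schedule you like; a minimal sketch, assuming an offsite server you can reach over SSH and placeholder host and path names:

    # Run on the NAS via DSM's Task Scheduler (or cron): push the onsite backup
    # share to an offsite machine over SSH.
    rsync -av --delete /volume1/linux-backups/ \
      offsite-user@backup.example.com:/srv/offsite/linux-backups/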

A second advantage is that — to not run afoul of the different storage media rule — you won’t even need to worry about buying external SSDs or adding more internal drives just to house your backups.

By backing up everything on your Linux desktop to the NAS, you’re already putting it on different storage media.

Synology NASes run a nice operating system (OS), DSM, which makes it easy to configure the various servers running on the device, and to sync the onsite backups you store on it up to the cloud, thereby fulfilling an important part of the 3-2-1 approach.

Follow the above tips to migrate your Linux desktop backups onto a local NAS device.