Mastering rsync and Bash to Backup Your Linux Desktop or Server

by Rob Williams on October 4, 2013 in Software

Keeping good backups of your data is important; don’t be the sucker who loses important files and has to deal with it afterwards! In this in-depth guide, you’ll learn about using rsync and lftp to transfer files, writing your own scripts and automating them, and backing up to external storage, a NAS, and a remote server without passwords.

Page 5 – Scripts, crontab, Final Thoughts

Backing up files on a regular basis isn’t too difficult, but if we’re talking about executing a backup procedure every single day, then that changes things. Thanks to cron, the Unix scheduler, and its crontab configuration, the pain is lessened to the point of nonexistence. The best part: no install is required; your distro undoubtedly has it built in.

To make good use of crontab, though, some Bash scripts need to be written. These scripts can be edited with a GUI or CLI text editor and saved anywhere on the PC. As a personal preference, I store all of my scripts in /home/scripts; you could keep yours somewhere else, however, such as /home/user/scripts.

To start, here’s a simple script that syncs one folder to another in archive mode:

#!/bin/bash
rsync -av /home /mnt/ntfs/Backups

Once you save that to a file with an ‘sh’ extension (e.g. bashscript.sh), you can execute it with sh bashscript.sh. Simple stuff. However, for more flexibility, you can get a little more advanced:

#!/bin/bash
echo "Beginning Backup..."
mount -t vfat /dev/sdc1 /mnt/thumb
rsync -a --delete /home /mnt/thumb
umount /mnt/thumb
echo "Backup Complete!"

That script will announce that the process has started, mount the drive, rsync, unmount, and then tell you it’s finished (this of course assumes the drive either stays plugged in or shows up with the same device name, /dev/sdc1 here, each time it’s connected).
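
If the drive won’t always appear as /dev/sdc1, one way around that is to mount by filesystem label (or UUID) and bail out if the mount fails. A quick sketch, assuming the thumb drive carries the label THUMB (adjust the label and mount point to your own setup):

#!/bin/bash
# Mount by label so the script doesn't care which /dev/sdX the drive gets.
# "THUMB" and /mnt/thumb are assumptions -- match them to your own drive.
echo "Beginning Backup..."

if ! mount -t vfat LABEL=THUMB /mnt/thumb; then
    echo "Could not mount the backup drive; aborting." >&2
    exit 1
fi

rsync -a --delete /home /mnt/thumb
umount /mnt/thumb
echo "Backup Complete!"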

What if you want to set up a script to run on a regular basis, or at a specified time when you’re not around? Here’s another.

#!/bin/bash
backup_log=/var/log/backup.log
rsync -av /home /mnt/ntfs >> $backup_log

This one establishes a log location and then proceeds to perform the rsync with all output thrown into the backup.log file for you to take a look at later if need be.
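
If you also want rsync’s error messages in that log, plus a timestamp for each run, here is a slightly expanded sketch using the same assumed paths as above:

#!/bin/bash
backup_log=/var/log/backup.log

# Timestamp each run, and capture errors (2>&1) along with normal output.
echo "Backup started: `date`" >> $backup_log
rsync -av /home /mnt/ntfs >> $backup_log 2>&1
echo "Backup finished: `date`" >> $backup_log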

What about nightly backups of your source code? How about creating date-coded tar files that automatically save to your storage device?

#!/bin/bash
backup=nightly_`date +%m-%d-%Y`
backup_log=/var/log/nightly_backup.log

echo "Backup Start: `date +%m-%d-%Y-%T`" >> $backup_log
tar zvcf /mnt/nas/$backup.tar.gz /home/techgage/projects/application_name >> $backup_log
echo "Backup End: `date +%m-%d-%Y-%T`" >> $backup_log

Note: Those are backticks (grave accents) `, not regular apostrophes ’

This example establishes a backup filename format along with our static log file. The time that the tar process begins is written to the log, followed by the output of the tar command itself, and then the end time. Using this format, the output file would be: nightly_10-04-2013.tar.gz. You can hit up man date and configure the date format to your liking.
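
If the default format doesn’t suit you, here are a few variations to try in a terminal before dropping them into the script:

date +%m-%d-%Y      # 10-04-2013 (the format used above)
date +%Y-%m-%d      # 2013-10-04 (ISO-style; sorts cleanly in a directory listing)
date +%Y%m%d-%H%M   # 20131004-0500 (adds the time, handy for multiple runs per day)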

Want to symlink the fresh .tar.gz file to latest.tar.gz for easy handling? That way, any script that uploads the archive to a remote server or external device can point directly at latest.tar.gz, yet it will always transfer the newest file. You could add this line to the very end of the above script:

ln -sfn /mnt/nas/$backup.tar.gz /mnt/nas/latest.tar.gz
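
From there, an upload script never needs to know the date-stamped filename. A minimal sketch, where user@backuphost and the destination path are placeholders for your own remote server:

#!/bin/bash
# Push whatever latest.tar.gz currently points at up to a remote server.
# -L (--copy-links) transfers the real archive rather than the symlink itself.
rsync -avL /mnt/nas/latest.tar.gz user@backuphost:/backups/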

The possibilities of scripting are endless, and if you have a specific need, there is surely a way to get what you need accomplished with a simple script. Now, how about automation?

Setting Up crontab

So, you want to run your script at a certain time each day, week, or month; it’s time to make use of crontab! Once ready to edit, run nano -w /etc/crontab in a terminal (replace nano with your preferred text editor). If you launch a GUI editor as root (or via sudo), you can edit the file that way as well.
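
As a side note, each user also has a personal crontab if you’d rather not touch the system-wide file; just keep in mind that per-user entries drop the user field shown in the examples below.

crontab -e    # edit your own per-user crontab (no user field in its entries)
crontab -l    # list what is currently scheduled for your user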

Want to run the script at 5:00 AM each morning?

0 5 * * * root sh /home/scripts/nas_backup.sh

To give a quick cron primer, each entry begins with five scheduling fields: minute, hour, day of month, month, and day of week (followed by the user to run as, since this is the system-wide /etc/crontab). So if you wanted to run your script at 3:00 AM each Wednesday, it would be:

0 3 * * 3 root sh /home/scripts/nas_backup.sh

Or for 10:30 AM on Mondays and Fridays (Sunday = 0, Saturday = 6), and as your own user:

30 10 * * 1,5 username sh /home/scripts/nas_backup.sh
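
As a quick reference, here’s how the fields in that last entry line up:

# minute  hour  day-of-month  month  day-of-week  user      command
30        10    *             *      1,5          username  sh /home/scripts/nas_backup.sh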

To read more on cron, I highly recommend checking out its Wikipedia page.

Final Thoughts

After reading through the article, you should have a good idea of what to do now: how to rsync and how to create a script. But what exactly do you want to back up? The /home folder is always a good start; that way, if a data disaster occurs, you can easily recover your files. But it can go deeper than that.

All around your hard drive, there could be files worth backing up. Gentoo is my distro of choice, and it has a lot of configuration files scattered around that are worth backing up, such as Portage’s package files, the ‘world’ file, and make.conf (with its compiler flags), among others. Being able to restore these files from a backup would make getting back up and running a lot easier.
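
A short sketch of how that could look with rsync’s --relative option, which recreates the full source paths under the destination. The paths below are typical locations on a Gentoo install (older installs kept make.conf at /etc/make.conf), and the NAS destination is an assumption:

#!/bin/bash
# Back up Portage configuration and the world file, preserving their paths.
rsync -avR \
    /etc/portage \
    /var/lib/portage/world \
    /mnt/nas/config_backup/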

We just touched on the basics here. You could write a robust script to do a variety of things; mine, for example, performs the same actions each night in sequential order. It first cleans up the computer, empties the trash and various caches, and then backs up my /home, configuration files, and my kernel source code folder. Then it grabs a few nightly backups off one server with wget, and rsyncs our website server to make sure we have a current backup. All of this is done while saving the output into a static log file.
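
To give a feel for the shape of such a script, here is a rough sketch of a sequential nightly job along those lines; every path, host, and URL below is a placeholder rather than the actual setup described above:

#!/bin/bash
backup_log=/var/log/nightly.log

echo "Nightly run started: `date`" >> $backup_log

# 1. Clean up: empty the trash (example path only).
rm -rf /home/user/.local/share/Trash/files/* >> $backup_log 2>&1

# 2. Back up /home and configuration files to the NAS.
rsync -a --delete /home /etc /mnt/nas/desktop_backup >> $backup_log 2>&1

# 3. Grab a nightly dump from another server.
wget -P /mnt/nas/server_dumps http://example.com/nightly/db_backup.sql.gz >> $backup_log 2>&1

# 4. Pull a current copy of the web server over SSH.
rsync -a user@webhost:/var/www /mnt/nas/website_backup >> $backup_log 2>&1

echo "Nightly run finished: `date`" >> $backup_log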

Yes… there is a lot you can do with a simple script, as long as you don’t mind spending some time setting things up properly.

So what are you waiting for? Accidents can strike at any time…

This article was originally published on August 27, 2007, and since updated.


