Bash Scripting for Automated Backups in DevOps
Author: Zayan Ahmed | Estimated reading time: 4 minutes
Scenario 1: Backing Up a Web Server
Imagine you are a DevOps engineer managing a web server. This server hosts important
files, databases, and configurations that should be backed up regularly. If a server crash or
accidental deletion occurs, you could lose all your data. To prevent this, you decide to write a
Bash script that automatically creates backups every day and stores them safely.
Bash Script for Web Server Backup
#!/bin/bash
# Variables
SOURCE_DIR="/var/www/html" # Directory to back up
BACKUP_DIR="/backup" # Where backups will be stored
DATE=$(date +"%Y-%m-%d") # Current date
BACKUP_FILE="$BACKUP_DIR/backup-$DATE.tar.gz" # Backup filename
# Create backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"
# Create a compressed backup (quotes guard against spaces in paths)
tar -czf "$BACKUP_FILE" "$SOURCE_DIR"
# Print success message
echo "Backup created successfully: $BACKUP_FILE"
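One weakness of the script above is that it prints a success message even if tar fails. A more defensive variant is sketched below; the set -euo pipefail line and the if/else reporting are additions, and temporary directories stand in for the real /var/www/html and /backup paths so the example runs anywhere:

```shell
#!/bin/bash
# Abort on the first failing command, unset variable, or pipe failure
set -euo pipefail

SOURCE_DIR=$(mktemp -d)   # stand-in for /var/www/html
BACKUP_DIR=$(mktemp -d)   # stand-in for /backup
DATE=$(date +"%Y-%m-%d")
BACKUP_FILE="$BACKUP_DIR/backup-$DATE.tar.gz"

echo "<html></html>" > "$SOURCE_DIR/index.html"   # sample content

# Only report success if tar actually succeeded
if tar -czf "$BACKUP_FILE" -C "$SOURCE_DIR" . ; then
    echo "Backup created successfully: $BACKUP_FILE"
else
    echo "Backup FAILED for $SOURCE_DIR" >&2
    exit 1
fi
```

Using -C "$SOURCE_DIR" . archives the directory contents by relative path, which also avoids tar's "Removing leading /" warning when given an absolute source path.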
Scenario 2: Archiving Old Logs
Log files accumulate over time and can take up a lot of disk space. To keep storage under
control, you need to archive logs older than 7 days and store them as zip files in a backup
directory.
Bash Script for Archiving Old Logs
#!/bin/bash
# Variables
LOG_DIR="/var/log/myapp" # Directory where logs are stored
BACKUP_DIR="/backup/logs" # Directory to store archived logs
DATE=$(date +"%Y-%m-%d") # Current date
# Create backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"
# Find logs older than 7 days and compress them
find "$LOG_DIR" -type f -mtime +7 -exec zip "$BACKUP_DIR/logs-$DATE.zip" {} +
# Delete old logs after archiving
find "$LOG_DIR" -type f -mtime +7 -delete
# Print success message
echo "Old logs archived successfully in $BACKUP_DIR/logs-$DATE.zip"
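One caveat: the second find deletes every log older than 7 days whether or not the zip step succeeded. A safer sketch deletes only after zip exits successfully; temporary directories stand in for the real /var/log/myapp and /backup/logs paths so the example is runnable anywhere:

```shell
#!/bin/bash
LOG_DIR=$(mktemp -d)      # stand-in for /var/log/myapp
BACKUP_DIR=$(mktemp -d)   # stand-in for /backup/logs
DATE=$(date +"%Y-%m-%d")

# Simulate one old log and one recent log
touch -d "10 days ago" "$LOG_DIR/old.log"
touch "$LOG_DIR/recent.log"

# Archive first; delete only if the archive was written successfully.
# find returns non-zero if any zip invocation fails, so the delete
# step is skipped on error.
if find "$LOG_DIR" -type f -mtime +7 -exec zip -q "$BACKUP_DIR/logs-$DATE.zip" {} + ; then
    find "$LOG_DIR" -type f -mtime +7 -delete
fi
```

After the run, old.log has been archived and removed while recent.log is untouched.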
Scenario 3: Backing Up a Database
Databases store critical information, and losing data could be disastrous. To ensure safety, a
scheduled backup of the database should be performed.
Bash Script for Database Backup
#!/bin/bash
# Variables
DB_NAME="mydatabase" # Database name
DB_USER="dbuser" # Database user
DB_PASSWORD="dbpassword" # Database password
BACKUP_DIR="/backup/db" # Backup directory
DATE=$(date +"%Y-%m-%d") # Current date
BACKUP_FILE="$BACKUP_DIR/db-backup-$DATE.sql.gz"
# Create backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"
# Backup the database and compress it
mysqldump -u "$DB_USER" -p"$DB_PASSWORD" "$DB_NAME" | gzip > "$BACKUP_FILE"
# Print success message
echo "Database backup created successfully: $BACKUP_FILE"
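A security note: passing the password on the command line with -p exposes it to any user who runs ps while the dump is in progress, and it may linger in shell history. A common alternative is a MySQL client option file readable only by the backup user (a sketch; the file path is an assumption):

```ini
# /etc/mysql/backup.cnf  (chmod 600, owned by the backup user)
[client]
user=dbuser
password=dbpassword
```

The dump command then becomes mysqldump --defaults-extra-file=/etc/mysql/backup.cnf mydatabase | gzip > "$BACKUP_FILE" (note that --defaults-extra-file must be the first option on the command line).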
Scenario 4: Syncing Backups to Remote Storage
Local backups are useful, but if the server fails, the backups could also be lost. To ensure
safety, backups should be synced to a remote storage location.
Bash Script for Remote Backup Sync
#!/bin/bash
# Variables
LOCAL_BACKUP_DIR="/backup" # Local backup directory
REMOTE_USER="backupuser" # Remote server user
REMOTE_HOST="backupserver.com" # Remote server address
REMOTE_DIR="/remote/backup" # Remote backup directory
# Sync backups to remote storage
rsync -avz "$LOCAL_BACKUP_DIR/" "$REMOTE_USER@$REMOTE_HOST:$REMOTE_DIR/"
# Print success message
echo "Backups synced successfully to remote server: $REMOTE_HOST:$REMOTE_DIR"
How These Scripts Help in DevOps
● Automation: The scripts eliminate the need for manual backups.
● Reliability: Data is safely stored in case of failures.
● Efficiency: Compressed and archived backups save storage space.
● Scheduling: These scripts can be scheduled using cron to run regularly without
human intervention.
Scheduling the Scripts (Cron Job)
To automate these backups, open your crontab with crontab -e and add the following lines:
0 0 * * * /path/to/web_backup.sh # Web server backup at midnight
0 1 * * * /path/to/log_backup.sh # Log archiving at 1 AM
0 2 * * * /path/to/db_backup.sh # Database backup at 2 AM
0 3 * * * /path/to/remote_sync.sh # Sync backups to remote server at 3 AM
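By default, cron mails any script output to the local user or silently discards it, so a failing backup can go unnoticed. A common pattern is to append both stdout and stderr to a log file (the log path here is an assumption; adjust it to your setup):

```
0 0 * * * /path/to/web_backup.sh >> /var/log/backup.log 2>&1
```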
By using Bash scripting, you can ensure that important data is always backed up without
having to remember to do it manually.
Want more?
Follow me on LinkedIn