
Build an automated backup schedule

Building an automated backup schedule involves defining what data to back up, how often to back it up, where to store the copies, and which tools or scripts will execute the backups automatically. Below is a complete guide to designing and implementing a reliable automated backup schedule.


1. Define Backup Objectives

a. Identify Critical Data

  • Databases (MySQL, PostgreSQL, MongoDB, etc.)

  • Application files or source code

  • Configuration files

  • Documents and user-generated content

  • System logs (if needed for auditing or diagnostics)

b. Choose Backup Types

  • Full Backup: Copies all selected data

  • Incremental Backup: Copies only changed data since the last backup

  • Differential Backup: Copies changed data since the last full backup
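To make the full-vs-incremental distinction concrete, GNU tar can track state between runs with `--listed-incremental`. The sketch below uses temporary directories as stand-ins for real source and backup paths; it assumes GNU tar (the `-g` option is not available in BSD tar):

```bash
# Sketch: full vs. incremental backups with GNU tar's --listed-incremental.
# Temporary directories stand in for real source and backup paths.
set -eu
SRC=$(mktemp -d)
DEST=$(mktemp -d)
echo "one" > "$SRC/a.txt"

# Full backup: tar creates the snapshot file on the first run
tar -czf "$DEST/full.tar.gz" -g "$DEST/snapshot" -C "$SRC" .

# Add a file, then take an incremental backup against the same snapshot
echo "two" > "$SRC/b.txt"
tar -czf "$DEST/incr.tar.gz" -g "$DEST/snapshot" -C "$SRC" .

# The incremental archive records only what changed since the full run
tar -tzf "$DEST/incr.tar.gz"
```

A differential scheme works the same way, except each run reuses a copy of the snapshot taken right after the full backup, so every differential archive is measured against the full backup rather than the previous run.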

c. Determine Recovery Point Objective (RPO) and Recovery Time Objective (RTO)

  • RPO: How much data you can afford to lose (e.g., 1 hour)

  • RTO: How fast you need systems restored (e.g., within 2 hours)


2. Choose Backup Destinations

  • Local storage: Faster recovery, but vulnerable to physical damage or theft

  • External hard drives: Good for small setups

  • Network-attached storage (NAS): Ideal for internal backups

  • Cloud Storage: AWS S3, Google Cloud Storage, Azure Blob, Dropbox, Backblaze

  • Remote server: Backup to another data center via SSH/rsync


3. Select Tools for Automation

Linux/Unix-based Systems:

  • cron for scheduling

  • rsync for file-based backups

  • tar for archiving

  • mysqldump or pg_dump for database backups

  • duplicity, borgbackup, restic for encrypted incremental backups

Windows Systems:

  • Task Scheduler

  • PowerShell scripts

  • Robocopy

  • Backup and Restore (Windows built-in)

  • Veeam Agent for Windows

Cloud Backup Tools:

  • Rclone (multi-cloud)

  • AWS CLI for S3

  • Google Cloud SDK


4. Backup Schedule Examples

Daily Schedule

  • 2:00 AM: Incremental file backup to local NAS

  • 3:00 AM: Database dump to encrypted archive

  • 3:15 AM: Upload encrypted database archive to AWS S3

Weekly Schedule

  • Sunday 2:00 AM: Full system image

  • Sunday 3:00 AM: Sync all media assets to Google Cloud Storage

Monthly Schedule

  • 1st of each month 2:00 AM: Archive full system image to external drive or offline cold storage


5. Sample Cron Job Schedule (Linux)

bash
# Daily backup at 2:00 AM
0 2 * * * /usr/local/bin/backup-script.sh >> /var/log/backup.log 2>&1

# Weekly full backup on Sunday at 3:00 AM
0 3 * * 0 /usr/local/bin/full-backup.sh >> /var/log/full-backup.log 2>&1

6. Sample Bash Backup Script

bash
#!/bin/bash
# backup-script.sh

DATE=$(date +%F)
BACKUP_DIR="/backup/$DATE"
SOURCE_DIR="/var/www/html"
DB_USER="root"
DB_PASS="yourpassword"
DB_NAME="mydatabase"

mkdir -p "$BACKUP_DIR"

# File backup
rsync -avz "$SOURCE_DIR" "$BACKUP_DIR/files"

# MySQL database backup
mysqldump -u"$DB_USER" -p"$DB_PASS" "$DB_NAME" > "$BACKUP_DIR/db-$DB_NAME.sql"

# Archive and compress
tar -czf "/backup/$DATE.tar.gz" -C "/backup/$DATE" .
rm -rf "$BACKUP_DIR"

# Optional: Upload to cloud
rclone copy "/backup/$DATE.tar.gz" remote:your-backup-folder

7. Security Measures

  • Encrypt sensitive backups using GPG or AES

  • Use secure channels (SFTP, rsync over SSH)

  • Store credentials in environment variables or use secret managers

  • Restrict access permissions on backup files
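As one concrete sketch of the "GPG or AES" point, OpenSSL's `enc` subcommand can apply AES-256 symmetric encryption to an archive. The passphrase below is a placeholder for illustration; in practice it should come from an environment variable or a secret manager, never be hard-coded:

```bash
# Sketch: encrypt and decrypt a backup archive with AES-256 via openssl.
# A temp dir and dummy file stand in for a real archive.
set -eu
WORK=$(mktemp -d)
echo "dummy backup data" > "$WORK/backup.tar.gz"

PASS="example-passphrase"   # placeholder: supply via env var or secret manager
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in  "$WORK/backup.tar.gz" \
    -out "$WORK/backup.tar.gz.enc" \
    -pass "pass:$PASS"

# Decrypt to verify the round trip
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in  "$WORK/backup.tar.gz.enc" \
    -out "$WORK/restored.tar.gz" \
    -pass "pass:$PASS"

cmp "$WORK/backup.tar.gz" "$WORK/restored.tar.gz" && echo "round-trip OK"
```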


8. Testing and Monitoring

  • Perform routine test restores to validate backup integrity

  • Monitor backup logs for errors and storage limits

  • Use automated alerting via email or Slack on failure
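A minimal freshness check along these lines can run from cron after the nightly backup. This sketch uses a temporary directory as a stand-in for `/backup` and a plain `echo` where a real mail or Slack hook would go:

```bash
# Sketch: alert if no recent, non-empty backup archive exists.
# A temp dir stands in for /backup; replace 'echo' with a real alert hook.
set -eu
BACKUP_DIR=$(mktemp -d)
echo "dummy data" > "$BACKUP_DIR/latest.tar.gz"

# Newest archive modified within the last 24 hours
LATEST=$(find "$BACKUP_DIR" -name '*.tar.gz' -mtime -1 | head -n 1)

if [ -n "$LATEST" ] && [ -s "$LATEST" ]; then
    echo "backup OK: $LATEST"
else
    echo "ALERT: no fresh backup found" >&2   # hook mail/Slack alerting here
fi
```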


9. Retention Policies

  • Daily: Keep last 7 days

  • Weekly: Keep last 4 weeks

  • Monthly: Keep last 6–12 months

  • Use tools like logrotate or cleanup scripts to automate retention

Sample cleanup script:

bash
find /backup -type f -mtime +30 -name "*.tar.gz" -exec rm {} \;
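Age-based pruning with `find -mtime` can delete everything if backups silently stop running for a month. A count-based variant keeps the newest N archives regardless of age; this sketch uses a temporary directory with dummy files as a stand-in for `/backup`:

```bash
# Sketch: keep only the newest 7 archives in a backup directory.
# A temp dir with dummy archives stands in for /backup.
set -eu
BACKUP_DIR=$(mktemp -d)
for i in 01 02 03 04 05 06 07 08 09 10; do
    touch "$BACKUP_DIR/2025-05-$i.tar.gz"
done

# List newest first, skip the first 7, delete the rest
ls -1t "$BACKUP_DIR"/*.tar.gz | tail -n +8 | while read -r old; do
    rm -- "$old"
done

ls -1 "$BACKUP_DIR"
```

Combining both rules (age-based and count-based) gives a retention policy that degrades safely when the backup job itself fails.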

10. Cloud Backup Integration Example with Rclone

bash
# Configure Rclone once
rclone config

# Use it in scripts
rclone copy /backup/2025-05-19.tar.gz gdrive:site-backups/

Conclusion

An effective automated backup schedule ensures data integrity, availability, and disaster resilience. Combine local and off-site backups, automate with scripts and cron jobs, secure with encryption, and regularly test your restore processes. Tailor the frequency and destinations to your infrastructure’s complexity and data criticality.
