The Palos Publishing Company

Build a Python tool to backup your website

Creating a Python-based website backup tool involves automating the process of downloading website files and optionally backing up the database if your site uses one. The script below provides a solid foundation for backing up both static and dynamic websites hosted on a server.

This tool will:

  1. Back up website files (via SCP, or a direct folder copy for locally hosted sites).

  2. Back up the database (MySQL/MariaDB or PostgreSQL).

  3. Compress the backup into a ZIP file.

  4. Store the archive locally (cloud upload can be added as an extension).
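Step 3 on its own is nearly a one-liner with the standard library; this sketch (function name and paths are illustrative) zips a whole directory tree the way the full script below does by hand:

```python
import shutil

def zip_directory(dir_path):
    """Compress dir_path into dir_path.zip and return the archive's path."""
    # make_archive walks the tree and deflates every file it finds
    return shutil.make_archive(dir_path, "zip", root_dir=dir_path)
```

The full script uses `zipfile` directly instead, which gives finer control over which files are included.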

⚙️ Prerequisites

  • Python 3

  • SSH access to your server (for remote sites)

  • The paramiko and scp Python packages, plus the mysqldump or pg_dump command-line client if backing up a database (the script shells out to these rather than using a Python driver such as mysql-connector-python or psycopg2)

  • Access to your local/remote web root
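The two third-party Python packages can be installed with pip; the database dump utilities come from your OS package manager (e.g. `mysql-client` or `postgresql-client` on Debian/Ubuntu):

```shell
pip install paramiko scp
```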


✅ Full Python Script: Website Backup Tool

```python
import os
import zipfile
import datetime
import subprocess

import paramiko
from scp import SCPClient

# Configuration
REMOTE_HOST = "your.server.com"
REMOTE_PORT = 22
REMOTE_USER = "your_ssh_username"
REMOTE_PASSWORD = "your_ssh_password"
REMOTE_WEB_DIR = "/var/www/html"
LOCAL_BACKUP_DIR = "/path/to/store/backup"

BACKUP_DB = True
DB_TYPE = "mysql"  # or "postgres"
DB_NAME = "your_db"
DB_USER = "db_user"
DB_PASSWORD = "db_password"
DB_HOST = "localhost"


def create_backup_dir():
    """Create a timestamped directory to hold this run's backup."""
    today = datetime.datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
    backup_path = os.path.join(LOCAL_BACKUP_DIR, f"backup_{today}")
    os.makedirs(backup_path, exist_ok=True)
    return backup_path


def ssh_connect():
    """Open an SSH connection to the web server."""
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(REMOTE_HOST, port=REMOTE_PORT,
                username=REMOTE_USER, password=REMOTE_PASSWORD)
    return ssh


def backup_website_files(backup_path):
    """Copy the remote web root into the local backup directory over SCP."""
    print("[+] Backing up website files...")
    ssh = ssh_connect()
    scp = SCPClient(ssh.get_transport())
    local_dir = os.path.join(backup_path, "website_files")
    os.makedirs(local_dir, exist_ok=True)
    scp.get(REMOTE_WEB_DIR, local_path=local_dir, recursive=True)
    scp.close()
    ssh.close()
    print("[+] Website files backup completed.")


def backup_database(backup_path):
    """Dump the database to a .sql file inside the backup directory."""
    print("[+] Backing up database...")
    dump_file = os.path.join(backup_path, f"{DB_NAME}_backup.sql")
    if DB_TYPE == "mysql":
        dump_cmd = (f"mysqldump -h {DB_HOST} -u {DB_USER} "
                    f"-p{DB_PASSWORD} {DB_NAME} > {dump_file}")
    elif DB_TYPE == "postgres":
        dump_cmd = (f"PGPASSWORD={DB_PASSWORD} pg_dump -h {DB_HOST} "
                    f"-U {DB_USER} {DB_NAME} > {dump_file}")
    else:
        print("[!] Unsupported DB type.")
        return
    try:
        subprocess.run(dump_cmd, shell=True, check=True, executable="/bin/bash")
        print("[+] Database backup completed.")
    except subprocess.CalledProcessError as e:
        print(f"[!] Database backup failed: {e}")


def compress_backup(backup_path):
    """Zip the backup directory and return the archive's path."""
    print("[+] Compressing backup...")
    zip_path = backup_path + ".zip"
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zipf:
        for root, _, files in os.walk(backup_path):
            for file in files:
                abs_path = os.path.join(root, file)
                # Store paths relative to the backup root inside the archive
                rel_path = os.path.relpath(abs_path, backup_path)
                zipf.write(abs_path, rel_path)
    print(f"[+] Backup compressed at {zip_path}")
    return zip_path


def clean_up(backup_path):
    """Remove the uncompressed backup directory once the ZIP exists."""
    print("[+] Cleaning up uncompressed backup...")
    for root, dirs, files in os.walk(backup_path, topdown=False):
        for name in files:
            os.remove(os.path.join(root, name))
        for name in dirs:
            os.rmdir(os.path.join(root, name))
    os.rmdir(backup_path)
    print("[+] Cleanup complete.")


def main():
    print("[*] Starting website backup tool...")
    backup_path = create_backup_dir()
    backup_website_files(backup_path)
    if BACKUP_DB:
        backup_database(backup_path)
    zip_file = compress_backup(backup_path)
    clean_up(backup_path)
    print(f"[✓] Website backup completed successfully: {zip_file}")


if __name__ == "__main__":
    main()
```

🔐 Security Note

  • Avoid hardcoding sensitive credentials in production. Use .env files or environment variables.

  • Use SSH keys instead of passwords for better security.
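As a minimal sketch of the first point, settings can be pulled from environment variables instead of being hardcoded (the helper and variable names below are illustrative):

```python
import os

def env_or_fail(name, default=None):
    """Read a setting from the environment; fail fast if it is missing."""
    value = os.environ.get(name, default)
    if value is None:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# In the script's configuration section, for example:
# REMOTE_PASSWORD = env_or_fail("BACKUP_SSH_PASSWORD")
# DB_PASSWORD = env_or_fail("BACKUP_DB_PASSWORD")
```

Failing fast at startup is preferable to discovering a missing credential halfway through a backup run.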


💡 Features You Can Add

  • Cloud upload (e.g., AWS S3, Google Drive)

  • Automatic scheduling with cron or Windows Task Scheduler

  • Logging to file

  • Email notification after backup


This script is useful for regularly backing up websites hosted on Linux servers, especially WordPress or static HTML sites. For production use, enhance it with encryption, more robust error handling, and cloud storage integration.
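To run the backup on a schedule, a crontab entry like the following (paths are illustrative) would start it every night at 02:00:

```crontab
# Nightly website backup at 02:00; append output to a log file
0 2 * * * /usr/bin/python3 /path/to/backup_tool.py >> /var/log/site_backup.log 2>&1
```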

