Backups for dedicated servers and VPS hosting are easy to ignore… until a bad deploy, hack, or hardware failure wipes out your site.
This guide walks through practical, real-world ways to handle dedicated server backups and VPS backups using tools you probably already have.
You’ll see how to get faster restores, more stable services, and fully automated jobs, without buying heavy, enterprise backup software.
When you run your own dedicated server or VPS, there’s no big “Undo” button.
A simple mistake can break everything:
You run a quick rm -rf in the wrong folder.
An update corrupts a database.
A security issue takes your site offline.
If you have a recent backup, this is annoying but fine.
If you don’t, you’re suddenly calculating how much downtime is “acceptable” and how long it would take to rebuild everything from scratch.
Two things to keep in mind:
Your hosting provider might keep disaster-recovery backups, but those are for their emergencies, not your convenience.
RAID (even RAID 1 mirroring) is not a backup. It just keeps the latest version of your broken files perfectly mirrored.
So, think of backups as your own personal time machine. Let’s build one.
In the hosting industry, you’ll usually see two big categories:
Managed dedicated servers / managed VPS – the provider often gives you a backup panel or add‑on backup service.
Unmanaged dedicated servers / unmanaged VPS – you get root access and a “good luck” wave. Backups are your job.
Some providers offer cloud backup tools powered by products like JetBackup or similar:
You log into a control panel.
You pick what to back up.
You pick how often.
You pay a bit extra each month.
If your provider offers a reliable backup add‑on, it’s often worth enabling it as a safety net.
But even then, having your own backup strategy (especially off‑site) is still smart.
One practical tip: pick a provider that makes it quick to spin up servers and doesn’t lock you into a single backup tool. That way you can try out different backup scripts, cron jobs, and remote targets on a real box instead of just reading docs.
👉 Launch a GTHost dedicated server and test your backup strategy on a real machine in minutes
Once you’ve played with backups on a test server, you’ll feel much calmer doing it on production.
If your dedicated server or VPS uses cPanel/WHM, you already have a powerful backup system built in.
What WHM backups are good for
Backing up all cPanel accounts on the server.
Including system files and configuration.
Sending backups to remote locations (S3, FTP, another server, etc.).
Typical flow
Log in to WHM as root.
Go to the backup section (Backup Configuration).
Choose:
How often to back up (daily/weekly/monthly).
Where to store them (local disk, remote destination).
What to exclude (big logs, cache folders, etc.).
Save and wait for the first backup window.
If you’re using WHM, let it handle the “whole server backup” job.
You can still add your own custom backups for secondary copies or specific folders.
If you only control a single cPanel account (for example, on a VPS or managed hosting plan), cPanel’s built‑in backup tool is enough for many small sites.
What you can do from cPanel
Create full account backups.
Download partial backups (home directory, databases, email).
Restore individual pieces when you mess something up.
Simple pattern
Log in to cPanel.
Open the backup tool or “Backup Wizard.”
Generate full backups regularly.
Download important ones to your own machine or a safe cloud drive.
This is a nice starting point, but it still depends on you remembering to click “backup.”
To get more consistent protection, let’s move to automated backups—and to smarter places to store them.
Storing backups on the same server protects you from small mistakes, but not from big disasters:
Disk failure.
Provider outage.
Accidental full-server wipe.
This is where remote cloud storage comes in.
You can use command‑line tools and cron jobs to send backups to:
An Amazon S3 bucket.
A Google Drive account.
Another object storage service with an S3‑compatible API.
Typical pattern
Use a CLI tool (like awscli, rclone, or vendor tools).
Point it at your backup folder.
Run it regularly via cron.
Result: your dedicated server backups and VPS backups live off‑site, so even if the whole server dies, your data is still safe somewhere else.
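As a concrete sketch, here is what that pattern can look like with rclone. The remote name s3remote and the bucket my-backups are placeholders—you create your own remote once with rclone config—and the guards at the top just make the script exit quietly on machines where rclone isn’t set up yet:

```bash
#!/bin/bash
# Push the local backup folder to an S3-compatible bucket with rclone.
# "s3remote" and "my-backups" are placeholder names -- configure your
# own remote first with `rclone config`.
BACKUP_DIR="/home/username/backup"

# Exit quietly if rclone isn't installed or the folder doesn't exist yet.
command -v rclone >/dev/null || exit 0
[ -d "$BACKUP_DIR" ] || exit 0

# "sync" makes the bucket mirror the folder; use "copy" instead if you
# never want files deleted on the remote side.
rclone sync "$BACKUP_DIR" s3remote:my-backups/server1 \
    --transfers 4 \
    --log-file "$BACKUP_DIR/rclone.log"
```

Hook it into cron right after your local backup job, for example: 20 0 * * * /bin/bash /home/username/backup-to-cloud.sh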
If you want full control—what to back up, where it goes, how files are named—then a simple Bash script plus a cron job is perfect.
We’ll walk through a very common pattern:
Create a backup directory.
Write a script that compresses your data into timestamped archives.
Make the script executable.
Automate everything with cron.
Log in to your VPS or dedicated server over SSH.
Create a folder to store backups
Replace username with your actual username:
bash
mkdir -p /home/username/backup
Create the script file
bash
cd /home/username/backup
nano backup-cron.sh
(Use vim, nano, or any editor you like.)
Add this script content
Replace username with your username and target with the folder you want to back up:
bash
#!/bin/bash
destination_folder="/home/username/backup/$(date +%F)"
archive_file="backup-$(date +%H%M).tar.gz"
mkdir -p "$destination_folder"
/bin/tar --exclude='/home/username/backup' -czvf "$destination_folder/$archive_file" /home/username/target
What this does:
Creates a folder like /home/username/backup/2025-01-01.
Saves a compressed tar.gz file like backup-2350.tar.gz inside it.
Skips the backup folder itself so you don’t back up backups forever.
Make the script executable
bash
chmod 700 /home/username/backup/backup-cron.sh
Test it once manually
bash
/home/username/backup/backup-cron.sh
Check that:
No errors show up.
The backup folder and archive file are created as expected.
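It’s also worth confirming that an archive actually restores. Here is a self-contained check you can run anywhere—it builds a tiny throwaway archive the same way the backup script does, lists it, and extracts it; for a real check, point tar at your actual archive (for example, the one under /home/username/backup):

```bash
#!/bin/bash
set -e
# Build a tiny throwaway archive, then verify it two ways.
# Swap in your real archive path when checking a real backup.
workdir=$(mktemp -d)
mkdir -p "$workdir/site"
echo "hello" > "$workdir/site/index.html"
tar -czf "$workdir/backup-test.tar.gz" -C "$workdir" site

# 1. List the contents without extracting anything.
tar -tzf "$workdir/backup-test.tar.gz"

# 2. Extract into a scratch folder and confirm a file survived the round trip.
mkdir -p "$workdir/restore"
tar -xzf "$workdir/backup-test.tar.gz" -C "$workdir/restore"
cat "$workdir/restore/site/index.html"   # prints "hello"
```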
Now you have a working dedicated server backup script. Time to automate it.
Cron is the built‑in scheduler on Linux. Once you set it up, it quietly runs your backup script on schedule.
You can configure cron two ways: through cPanel or directly from the command line.
Option A: Using cPanel (if available)
Log in to cPanel.
Go to “Cron Jobs.”
Choose how often to run the job (for example, once per day at night).
Set the command, for example:
bash
/bin/sh /home/username/backup/backup-cron.sh
Save the cron job.
This gives you automated VPS backups or dedicated server backups with a couple of clicks.
Option B: Using the Command Line
If you prefer (or only have) shell access, do it by editing your crontab.
Log in via SSH.
Open your crontab in edit mode:
bash
crontab -e
(Optional) At the top, add a line so cron emails you the output of each job it runs:
bash
MAILTO="[email protected]"
Add a line to run the backup once per day at 23:50:
bash
# Run backup at 23:50 every day
50 23 * * * /bin/sh /home/username/backup/backup-cron.sh
Save and exit the editor (for vim: press Esc, then type :wq and hit Enter).
Cron will show something like:
bash
crontab: installing new crontab
To see your cron entries later without editing them:
bash
crontab -l
Now your dedicated server backup script runs automatically every day.
You can tune the schedule to your needs—hourly for busy sites, weekly for small, low‑change projects.
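For reference, those other frequencies look like this in a crontab (same placeholder script path as above):

```bash
# Hourly, at minute 15 past each hour -- for busy sites:
15 * * * * /bin/sh /home/username/backup/backup-cron.sh

# Weekly, Sundays at 23:50 -- for small, low-change projects:
50 23 * * 0 /bin/sh /home/username/backup/backup-cron.sh
```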
If your site uses MySQL or MariaDB, file backups are not enough.
You also want clean, consistent database backups.
A simple way is mysqldump plus gzip.
Create a database backup script
bash
nano /home/username/dbbackup.sh
Add the following content
bash
#!/bin/bash
db_name=dbname
db_user=dbusername
db_password='dbpassword'
backup_filename="$db_name-$(date +%F)"
mysqldump --routines -h localhost -u "$db_user" -p"$db_password" "$db_name" | gzip > /home/username/dbbackup/"$backup_filename".sql.gz
Notes:
The password is wrapped in single quotes in the script so special characters don’t break the command.
The output goes to /home/username/dbbackup, with filenames like mydb-2025-01-01.sql.gz.
Create the backup folder and secure it
bash
mkdir -p /home/username/dbbackup
chmod 700 /home/username/dbbackup
chmod 700 /home/username/dbbackup.sh
Add this script to cron as well
For example, run it once per night:
bash
30 0 * * * /bin/sh /home/username/dbbackup.sh
Now your databases are backed up separately from your files, which makes restores easier and faster.
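When you do need a restore, the dump goes back in through mysql. The database name, user, password, and the date in the filename below are the placeholder values from the backup script above—replace them with your own:

```bash
#!/bin/bash
# Restore a gzipped dump into MySQL/MariaDB. All names here are
# placeholders matching the backup script above.
dump="/home/username/dbbackup/dbname-2025-01-01.sql.gz"

# Exit quietly if the dump isn't there (e.g. on a test machine).
[ -f "$dump" ] || exit 0

gunzip < "$dump" | mysql -h localhost -u dbusername -p'dbpassword' dbname
```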
So far, everything is happening on the same machine.
To protect yourself against full‑server disasters, you want copies somewhere else—another server you control, a backup box, or a storage VPS.
rsync is perfect for this.
Log in to the remote server via SSH.
Create a folder to store backups:
bash
mkdir -p /backup
Ensure rsync is installed on both ends:
On Debian/Ubuntu:
bash
sudo apt-get update
sudo apt-get install -y rsync
On AlmaLinux/Rocky/Fedora:
bash
sudo dnf install -y rsync
On the backup server (the machine that stores the copies), create a script that pulls data from your main server:
bash
nano /home/username/backup-remote.sh
Add this content, adjusting values to your setup:
bash
#!/bin/bash
destination_folder="/backup/$(date +%F)"
mkdir -p "$destination_folder"
rsync -e 'ssh -p 7822' -avl --delete --stats --progress [email protected]:/target "$destination_folder"/
Replace:
7822 with the real SSH port of your main server.
username with a user on the main server that can read the data.
example.com with the main server’s hostname or IP.
/target with the folder on the main server you want to copy.
Make it executable:
bash
chmod 700 /home/username/backup-remote.sh
Test it manually:
bash
/home/username/backup-remote.sh
If it works, hook it into cron on the backup server, scheduled shortly after the main server’s local backup finishes:
bash
10 0 * * * /bin/sh /home/username/backup-remote.sh
Now your dedicated server backup process looks like this:
Local backup script runs.
Database backup script runs.
Remote rsync job sends everything to another server.
That’s a proper backup strategy, not just “I hope the provider has snapshots.”
Q1: How often should I back up my VPS or dedicated server?
For most small sites, once per day is a good baseline.
If you have very busy apps (orders, payments, constant changes), consider hourly database backups plus daily full backups.
Q2: How long should I keep old backups?
It depends on disk space and your risk tolerance. A common pattern:
Daily backups for 7–14 days.
Weekly backups for 1–3 months.
Monthly backups for 6–12 months.
You can write simple cleanup scripts or use backup software options to delete older archives automatically.
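A minimal sketch of such a cleanup script, using find’s modification-time filter—the 14-day cutoff and the folder path are assumptions, so tune both to your retention plan:

```bash
#!/bin/bash
# Delete archives older than 14 days, then remove the empty dated
# folders they leave behind. Path and retention are placeholders.
BACKUP_DIR="/home/username/backup"
[ -d "$BACKUP_DIR" ] || exit 0

# -mtime +14 matches files last modified more than 14 days ago.
find "$BACKUP_DIR" -name '*.tar.gz' -type f -mtime +14 -delete
find "$BACKUP_DIR" -mindepth 1 -type d -empty -delete
```

Schedule it after your backup jobs, for example: 0 1 * * * /bin/bash /home/username/backup-cleanup.sh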
Q3: Where is the best place to store backups?
Use layers:
Local disk (fast restore).
Remote server or object storage (protection from hardware loss).
Optional: download critical backups to your own machine for extra safety.
The more important the data, the more layers you want.
Q4: Are VPS backups different from dedicated server backups?
The tools are the same: tar, mysqldump, rsync, WHM, cPanel, cron.
The difference is mainly scale and control—on a dedicated server you might manage many cPanel accounts and more complex storage, but the backup strategy is very similar.
Q5: Do I still need backups if my provider offers “snapshots”?
Yes. Provider snapshots are great, but:
They might not be frequent enough.
You might not control how long they’re kept.
If you change providers, your snapshots don’t move with you.
Your own dedicated server backup strategy gives you independence and peace of mind.
Backups on dedicated servers and VPS hosting don’t have to be complicated: a few simple scripts, some cron jobs, and one off‑site copy already give you much faster recovery, more stable services, and more controllable risks.
The methods here—WHM backups, cPanel backups, custom Bash scripts, database dumps, and remote rsync—cover most real‑world server backup scenarios without expensive tools.
If you want a provider that fits this kind of do‑it‑yourself backup strategy, 👉 GTHost works well for self‑managed dedicated servers and VPS: you keep full control of your machines and can apply every backup technique from this guide from day one.