Creating a strong backup for Linux
Starting a web development company without any budget is a slippery slope. Losing all your data can cost you months of hard work and development, along with the cashflow they generated. Needless to say, I could not afford that. So I had to come up with a backup solution that works seamlessly with the resources I had: one dedicated Debian box in Frankfurt, my home desktop with abundant disk space, and my flat-rate ADSL line.
It turns out the solution is not hard at all and can be achieved with SSH authentication keys and a little shell trickery. All I have to do is leave my home desktop running overnight, and in the morning there is a fresh backup on my disk. Voila!
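The key-based authentication is a one-time setup. A sketch of how it might look (the user and host names are the article's examples, so substitute your own):

```shell
# Generate a passphrase-less key so cron can log in unattended,
# then push the public half to the home desktop.
KEYFILE="$HOME/.ssh/backup_key"
ssh-keygen -t rsa -b 4096 -N "" -f "$KEYFILE"
ssh-copy-id -i "$KEYFILE.pub" vlatko@hostname.dyndns.org
# Verify non-interactive login works (BatchMode forbids password prompts):
ssh -i "$KEYFILE" -o BatchMode=yes vlatko@hostname.dyndns.org true \
  && echo "key-based login works"
```

A passphrase-less key should be protected by tight file permissions and, ideally, restricted on the remote side to the backup account only.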
The solution I developed is driven by the following crontab entry:
- Code:
# Backup
27 4 * * * root /root/bin/backup.sh > /dev/null
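Note that the six-field format with the `root` user column is the system crontab format, so the entry belongs in /etc/crontab or a drop-in file under /etc/cron.d rather than a per-user crontab. A sketch of installing it as a drop-in (the file name is my own choice):

```shell
# Install the job as a cron.d drop-in; cron picks it up automatically.
cat > /etc/cron.d/evorion-backup <<'EOF'
# Run the nightly backup at 04:27 as root
27 4 * * * root /root/bin/backup.sh > /dev/null
EOF
chmod 644 /etc/cron.d/evorion-backup   # cron skips group/world-writable files
```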
The backup.sh script archives the bare necessities needed to restore the system to full operation:
- Code:
#!/usr/bin/env bash
LOG_FILE="/var/log/evorion-backup.log"
BACKUP_HOST=hostname.dyndns.org
echo "" >> $LOG_FILE
echo "Evorion backup utility started at: `date "+%F %T"`" >> $LOG_FILE
# Must be run as root user
if [ "$UID" -ne "0" ]
then
echo "[`date "+%F %T"`] Error: You must run this script as root!" >> $LOG_FILE
exit 67
fi
echo "[`date "+%F %T"`] User id check successful" >> $LOG_FILE
# Compress directly into ssh connection
echo "[`date "+%F %T"`] Dumping and archiving started" >> $LOG_FILE
nice -n 19 dpkg -l > /root/installed_packages.txt
nice -n 19 mysqldump -u root -pYOURMYSQLPASS --lock-all-tables --all-databases | gzip | ssh -q vlatko@$BACKUP_HOST 'cat > /home/vlatko/abraham_backup/databases_`date "+%F_%T"`.gz'
nice -n 19 tar cz -C / root home/vlatko etc usr/virtualweb | ssh -q vlatko@$BACKUP_HOST 'cat > /home/vlatko/abraham_backup/archive_`date "+%F_%T"`.tgz'
echo "[`date "+%F %T"`] Dumping and archiving completed" >> $LOG_FILE
# Cleanup
rm /root/installed_packages.txt
echo "[`date "+%F %T"`] Finished" >> $LOG_FILE
After the user id is verified to be root, the script stores a list of installed packages; since my system is a clean apt install, that is enough for me. Then mysqldump is started and piped directly into gzip, which in turn is piped into an ssh connection to my home desktop. Because my ssh connections are authenticated with keys, no interactive login is needed, so this line works like a charm. I use the same trick to tarball the folders containing the valuable data. Finally, the cleanup removes the temporary files and we are done.
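For completeness, the restore direction might look like the sketch below. This is my own assumed procedure, not from the script; TIMESTAMP stands in for the actual date suffix the script generates.

```shell
# Restore sketch: copy the newest backup files back from the desktop first,
# then replay the SQL dump and unpack the archived trees.
zcat databases_TIMESTAMP.gz | mysql -u root -p     # replay every database
tar xzf archive_TIMESTAMP.tgz -C /                 # restores root, home/vlatko, etc, usr/virtualweb
# installed_packages.txt comes from `dpkg -l`, so it is a human-readable
# checklist; reinstall the listed packages with apt-get install as needed.
```

The tarball was created relative to `/` (the `-C /` in the backup script), so extracting with `-C /` puts every file back at its original path.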
andry - Moderator
Join date : 2010-05-07