How to: Back up a website with FTPcopy
When running a website, it is important to keep a backup of all site files in case anything requires reloading data (file corruption, moving hosts, etc.). After building my Ubuntu file server, I knew that I had to find a way to mirror this website so files could be recovered if necessary.
Here is a bash script to automate the process and create a daily mirror of whatever FTP server you want to back up.
- Download and install ftpcopy from the repositories.
sudo apt-get install ftpcopy
- Change to a directory that will store the script and open a new text file in vi. Press i to enter insert mode, then copy and paste the following text.
# Change into the local backup directory (placeholder path)
cd /path/to/backup/folder
# Issue FTPcopy command
ftpcopy --no-delete -l 1 -u $USER -p $PASS $HOST $REMOTE .
- Be sure to change the values for website, storage directory, remote directory, host, username and password.
- Save and exit the text file by pressing Esc and then typing :wq.
- Make the script executable by typing
chmod a+x /path/to/script
- Add the script to the crontab so it is executed on a regular basis. I use Webmin for this type of administration work, but it is also possible to use the command line with crontab -e; the example entry below shows the format. Mine runs daily at 12 AM.
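A crontab entry for a daily run at midnight might look like the following line; the script path is a placeholder standing in for wherever you saved the script.

# m h dom mon dow   command
0 0 * * * /path/to/script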
To clarify, the script changes into the backup folder, then issues the ftpcopy command. The remote directory public_html is common on many web servers, but be sure to confirm yours before running the script. The --no-delete option means that files are not removed from the backup even if they have been removed from the web server. The -l option simply tells ftpcopy to report which files are being copied; since cron mails a job's output, that report can be viewed in your user mail.
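Putting the pieces together, the whole script might look something like the sketch below. The credentials, host, remote directory, and backup path are placeholder values to be replaced with your own.

#!/bin/bash
# Placeholder FTP credentials and paths; substitute your own values
USER=ftpuser
PASS=secret
HOST=ftp.example.com
REMOTE=public_html

# Change into the local backup directory so ftpcopy mirrors into "."
cd /path/to/backup/folder || exit 1

# Mirror the remote directory, keeping local copies of files even if they
# are deleted on the server (--no-delete) and reporting copied files (-l 1)
ftpcopy --no-delete -l 1 -u "$USER" -p "$PASS" "$HOST" "$REMOTE" .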
After the scheduled time has passed for the first time, check the folder where your backups reside to make sure files are being copied as planned.
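A quick way to check, using the placeholder path from the sketch above, is to list the backup directory and look at the file timestamps:

# Files in the mirror should carry recent modification times
ls -l /path/to/backup/folder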