Regularly backing up your web server content, or any content for that matter, is essential to avoid data loss in case of a malware attack or accidental deletion. If your web host doesn’t provide an automatic backup service, or if your website is self-hosted on a cloud virtual machine, then manually taking backups on a regular schedule can be tedious.
To address this problem, I decided to create a Python script to fully automate the process. Since my blogs, including this one, all run in Docker containers, all I need to do is create an archive of the directory mapped to the containers, which contains my static files as well as the database.
Once the directory is archived, I need to store the archive in a safe location. Google Drive offers 15GB of free cloud storage, which makes it an ideal place to stash my backups. Additionally, I wanted to be notified of the status of each backup run: if something fails, I can address it quickly, and if the backup succeeds, I have confirmation that the process is working as intended.
Before we start coding, let’s take a look at what our script is supposed to do:
- Create an archive of the directory or file.
- Authenticate the script with the Google Drive API service.
- Check for any previous backups.
- If there is more than one previous backup, delete all of them except the latest one.
- Upload the archive to Google Drive.
- Delete the archive file from the web server.
- Send a notification to your mobile phone.
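The first step, creating the archive, can be handled entirely by Python’s standard library. Here is a minimal sketch using `shutil.make_archive`; for the demonstration it archives a throwaway temporary directory, but in the real script you would point the source path at the directory mapped to your containers:

```python
import shutil
import tempfile
from datetime import datetime
from pathlib import Path

# For this sketch we archive a throwaway directory; in the real script,
# replace source_dir with the directory mapped to your Docker containers.
source_dir = Path(tempfile.mkdtemp())
(source_dir / "index.html").write_text("<h1>hello</h1>")

# A timestamped name ensures successive backups don't collide.
archive_base = str(Path(tempfile.mkdtemp()) / f"backup-{datetime.now():%Y%m%d-%H%M%S}")

# make_archive writes e.g. backup-20240101-120000.tar.gz and
# returns the full path of the archive it created.
archive_path = shutil.make_archive(archive_base, "gztar", source_dir)
print(archive_path)
```

The `"gztar"` format produces a gzip-compressed tarball; `shutil` also supports `"zip"` if you prefer.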
Getting started with the Google Drive API
To use the Google Drive API, we need a service token from Google. Although you can access the API using an API key and access tokens, that approach requires a web interface through which a user signs in to their account and authorizes access, which would make our simple script unnecessarily complex. Since the script will be used by only one user, we can grant it direct access to our Google Drive instead.
To that end, we need to create a service token. Before creating one, we need a Google Cloud Console project. So, head over to https://console.cloud.google.com and click on the combo box displaying your project names next to the Google Cloud Platform logo. In the modal window that appears, click on New Project.
On the New Project page, enter a name for your project and click on Create. I have named my project “AutoBackup”.
Once done, select the name of your project from the combo box on the top bar.
In the side menu, hover over IAM & Admin and select Service Accounts.
Then click on Create Service Account. Give it a name and an ID, and click on Create. I have named my account “PythonScript”.
On the next page, click on Role and select Owner under Project. This grants full access to all the resources in the project.
On the next screen, click Create Key and select JSON. Once you click on Create, the token file will be downloaded to your computer.
Keep this file safe and secure: you won’t be able to download it again, and anyone who obtains it can access your project.
Then, click on Done and you will see the email ID of your service account. Make a note of it, as you will need it later.