Backing up to gcloud storage from Linux/Virtualmin


Using Google Cloud Storage as a backup target for your Linux server is inexpensive if you use the Nearline storage option. This is cheaper than Amazon Web Services (at the time of writing).

The process I use is as follows.

  1. Back up locally
  2. Create a storage bucket / folders and set a lifecycle rule
  3. Write a script to move the backups
  4. Find a way to execute the script

As I use Virtualmin as my control panel, the process details the way to do this from Virtualmin, but you can tailor it to your own needs.

I’m assuming you have already signed up to Google Cloud Console and set up a project / billing as required.

The process uses a local backup copy and then moves / deletes it. So if you are limited in working space, you may need to divide your backups into workable tranches.

You will need gsutil installed on your server via the Google Cloud SDK https://cloud.google.com/sdk/docs/ and properly authorised – e.g. via gcloud init
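If the SDK is not installed yet, a minimal sketch of one install route (the interactive installer; the docs above also cover distribution packages such as apt/yum) looks like this:

# download and run the interactive Cloud SDK installer, then restart the shell
curl https://sdk.cloud.google.com | bash
exec -l $SHELL
# authorise and pick a default project for gcloud / gsutil
gcloud init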

1. Using Virtualmin to create scheduled backups

 


Set up a scheduled backup in Virtualmin (Backup and Restore → Scheduled Backups), e.g. creating backups into the local directory /home/backups/files
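For reference, roughly the same backup can be produced from the shell with Virtualmin's command-line API. This is only a sketch – check virtualmin backup-domain --help for the exact flags on your version:

# back up all virtual servers, with all features, into the local backup directory
virtualmin backup-domain --all-domains --all-features --newformat --dest /home/backups/files/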

 

2. Make a Google Storage bucket

You can use the Google Cloud Console to do this, but as you have the gsutil command and will need to get used to it, here is how to do it from the CLI.

I’m assuming here you want to store in the EU, but you can use US or ASIA as required for -l (location).

I recommend the storage class to be Nearline, as this is the cheaper option: -c nl

(change mybucket to a unique name of course)

gsutil mb -c nl -l EU gs://mybucket/
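To double-check the class and location afterwards, listing the bucket metadata should do it:

gsutil ls -L -b gs://mybucket/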

Set a lifecycle, e.g. delete after 35 days (or as required); that way your storage bills won’t grow forever.

Create a rule file lifecycle_config.json as below (35 days):

{
    "rule":
    [
      {
        "action": {"type": "Delete"},
        "condition": {"age": 35}
      }
    ]
}

and then run

gsutil lifecycle set lifecycle_config.json gs://mybucket/
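You can confirm the policy was applied with:

gsutil lifecycle get gs://mybucket/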

3. Create the upload script in /home/backups (e.g. /home/backups/upload.sh)

#!/bin/bash
#today's date as YYYY-MM-DD
NOW=$(date +"%Y-%m-%d")
#Upload all files to our dated folder in the selected bucket
gsutil cp /home/backups/files/* gs://mybucket/$1/$NOW/
#Delete all the local files after uploading them
rm /home/backups/files/*

It takes a folder argument, which allows different backups to share the same bucket (folders get created automatically).

e.g. /bin/bash /home/backups/upload.sh weekly

make sure your script is executable

sudo chmod +x /home/backups/upload.sh
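A small optional tweak of my own (not part of the original script): chain the copy and the delete in upload.sh so the local files are only removed if the upload reported success.

# delete the local files only if gsutil cp exited without error
gsutil cp /home/backups/files/* gs://mybucket/$1/$NOW/ && rm /home/backups/files/*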

 

4. Find a way to execute the script

With Virtualmin this is easy: in the scheduled backup settings, set the command to run after the backup to the upload script, e.g. /bin/bash /home/backups/upload.sh weekly

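If you are not using Virtualmin, or prefer cron, an entry along these lines does the same job – the timing here is only an example, so adjust it to run after your backup finishes:

# crontab -e : run the upload every Sunday at 03:00
0 3 * * 0 /bin/bash /home/backups/upload.sh weekly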

 

 

Recovering backups

Of course, recovering backups requires copying them locally and then restoring them too – but with gsutil that is fairly easy.

 

list your weekly backups

gsutil ls gs://mybucket/weekly

e.g.

gs://mybucket/weekly/
gs://mybucket/weekly/2016-07-25/
gs://mybucket/weekly/2016-07-31/
gs://mybucket/weekly/2016-08-05/

 

get the full directory (copy recursively) for 2016-08-05 back into /home/backups/files

gsutil -m cp -r gs://mybucket/weekly/2016-08-05/ /home/backups/files

Then restore (in my case using Virtualmin) in the normal way
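If you prefer the shell to the Virtualmin UI for the restore as well, the restore-domain command can read the downloaded files. This is only a sketch – check virtualmin restore-domain --help (in particular whether --source accepts a directory of per-server backups on your version) before relying on it:

# restore every virtual server found in the downloaded backups, with all features
virtualmin restore-domain --source /home/backups/files/ --all-domains --all-features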

 

