[Python] Use python boto3 and crontab to backup MySQL to AWS S3

I wrote a Python script that backs up MySQL and uploads the dump to AWS S3. I will walk through the steps below. First of all, you need to create a user in AWS IAM.

iam1

Create a new user in IAM

Click the “Create New Users” button
iam2

Enter the user’s name
iam3

Don’t forget to download the user’s credentials now. This is your only chance: they cannot be downloaded again afterwards.
iam4

Attach policy to this IAM user
iam5

Since this user only needs access to S3, I grant it full S3 permissions.
iam6
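For reference, the AmazonS3FullAccess managed policy amounts to a document like the following (reproduced from memory; check the console for the current wording):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": "*"
    }
  ]
}
```

For production you would usually scope `Resource` down to the backup bucket instead of `*`.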

Finally, you can see the policy listed under the user’s permissions.
iam7

By the way, if you want to attach an IAM role to an EC2 instance, you must create the role before launching the instance. You can follow similar steps to create an IAM role, then attach it when you launch the EC2 instance.
ec2_iam_role

After creating the IAM user, you need to install pip on your EC2 instance. What’s pip? It is Python’s package installer; it helps you install Python packages quickly.

Install related tools

$ curl "https://bootstrap.pypa.io/get-pip.py" -o "get-pip.py"

$ sudo python get-pip.py

Check pip installation.

$ pip --help

Install AWS CLI by pip.

$ sudo pip install awscli

After installing pip and the AWS CLI, install a Python package named boto3. boto3 is the AWS SDK for Python and a powerful tool for managing AWS resources.

$ sudo pip install boto3

Configure credentials file

Then use the AWS CLI to configure the credentials file:

$ aws configure

Enter the aws_access_key_id and aws_secret_access_key you downloaded earlier, and set a default region. Alternatively, you can create the credentials file yourself. By default, it is located at ~/.aws/credentials:

[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY

You may also want to set the default region yourself. This is done in the configuration file, located by default at ~/.aws/config:

[default]
region=us-east-1
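Both files are plain INI text, so they can also be generated from Python (a sketch using Python 3’s configparser; the key values below are placeholders, not real keys):

```python
# Sketch: generate the credentials file content with configparser.
# In practice you would write this to ~/.aws/credentials; StringIO is
# used here only so the snippet runs anywhere.
import configparser
import io

config = configparser.ConfigParser()
config["default"] = {
    "aws_access_key_id": "YOUR_ACCESS_KEY",        # placeholder
    "aws_secret_access_key": "YOUR_SECRET_KEY",    # placeholder
}

buf = io.StringIO()
config.write(buf)
text = buf.getvalue()
print(text)
```

The output matches the hand-written file above, section header and all.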

Create a bucket in S3

Don’t forget to create a bucket in your S3.
s31

Write a script

Once the above steps are done, we can write a simple Python script to back up our MySQL data.

$ vim wordpress_backup.py
#!/usr/bin/env python
import os
import time
import boto3

username = 'YOUR_DATABASE_USERNAME'
password = 'YOUR_DATABASE_PASSWORD'
hostname = 'localhost' # your database host.
database = 'YOUR_DATABASE_NAME'

filestamp = time.strftime('%Y%m%d')

#===============dump database=================
filePath = "/home/ubuntu/%s-%s.sql" % (database, filestamp)
os.system("mysqldump -u%s -p%s %s -h%s > %s" % (username, password, database, hostname, filePath))  # os.system waits for the dump to finish

#=============upload SQL file to S3=================
s3 = boto3.resource('s3')

bucketName = 'BUCKET_NAME'

filename = "%s-%s.sql" % (database, filestamp)

s3.Object(bucketName, filename).put(Body=open(filePath, 'rb'))

print('Uploaded %s to Amazon S3 bucket: %s' % (filename, bucketName))
os.remove(filePath)

#===========Get object list from a bucket============
myBucket = s3.Bucket(bucketName)
objectList = myBucket.objects.all()

for key in objectList:
    print(key.key)

After finishing these steps, you can run this Python script to back up MySQL, and you will find your MySQL dump in the S3 bucket.

$ python wordpress_backup.py
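A slightly more defensive variant is sketched below (function and variable names are my own, not from the original script). It checks mysqldump’s exit status before uploading, so a failed dump never reaches S3; `main()` is the part you would run from cron:

```python
# Sketch of a more defensive backup script (assumed helper names).
import subprocess
import time


def backup_path(database, directory="/home/ubuntu"):
    """Build the dated dump path, e.g. /home/ubuntu/mydb-20160101.sql."""
    return "%s/%s-%s.sql" % (directory, database, time.strftime("%Y%m%d"))


def dump_command(username, password, database, hostname):
    """Build the mysqldump argument list; list form avoids shell quoting issues."""
    return ["mysqldump", "-u" + username, "-p" + password,
            "-h" + hostname, database]


def main():
    import boto3  # imported here so the helpers above stay testable offline

    path = backup_path("YOUR_DATABASE_NAME")
    with open(path, "wb") as out:
        # check_call raises CalledProcessError if mysqldump exits non-zero,
        # so a broken dump is never uploaded
        subprocess.check_call(
            dump_command("USER", "PASSWORD", "YOUR_DATABASE_NAME", "localhost"),
            stdout=out)
    boto3.resource("s3").Object("BUCKET_NAME", path.split("/")[-1]).put(
        Body=open(path, "rb"))
```

The credential, database, and bucket names are placeholders to replace with your own values.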

Set crontab

Finally, we can use the Linux crontab to run this Python script daily.

$ crontab -e

Add the line below to your crontab file.

00 18 * * * /usr/bin/python /home/ubuntu/wordpress_backup.py >> /home/ubuntu/S3backup.log 2>&1
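For reference, the five leading fields are minute, hour, day of month, month, and day of week, so 00 18 * * * means 18:00 every day. A tiny hypothetical helper that builds such a line:

```python
# Hypothetical helper: build a crontab entry that runs a command daily.
def daily_cron_line(hour, minute, command):
    # fields: minute hour day-of-month month day-of-week command
    return "%02d %02d * * * %s" % (minute, hour, command)

line = daily_cron_line(18, 0,
    "/usr/bin/python /home/ubuntu/wordpress_backup.py"
    " >> /home/ubuntu/S3backup.log 2>&1")
print(line)
```

The `>> … 2>&1` redirection appends both output and errors to the log file, which makes failed runs easy to spot.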

Set a lifecycle rule in the bucket properties

Find the Lifecycle tab and click “Add rule”
s32

You can apply this rule to a specific folder or to the whole bucket.
s33

You can permanently delete the old files.
s34

Finally, review the lifecycle rules, then create and activate this rule.
s35
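The same expiration rule can also be set programmatically with boto3’s put_bucket_lifecycle_configuration. A sketch follows; the rule ID, bucket name, and 30-day window are my assumptions, and the API call itself is left commented out so the snippet runs without AWS access:

```python
# Assumed rule: permanently delete objects 30 days after creation.
lifecycle = {
    "Rules": [{
        "ID": "expire-old-sql-dumps",       # assumed rule name
        "Filter": {"Prefix": ""},           # empty prefix = whole bucket
        "Status": "Enabled",
        "Expiration": {"Days": 30},         # assumed retention window
    }]
}

# To apply it, uncomment and fill in your bucket name:
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="BUCKET_NAME", LifecycleConfiguration=lifecycle)
```

Keeping the rule in code makes the retention policy reviewable alongside the backup script itself.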
