I wrote a Python script that backs up MySQL and uploads the dump to AWS S3. I will walk through the steps below. First of all, you need to create a user in AWS IAM.
Create a new user in IAM
Click the “Create New Users” button
Enter the user’s name
Don’t forget to download the user’s credentials now; you cannot download them again afterward.
Attach the policy to this IAM user
This user only needs access to S3, so I grant it full S3 permissions.
Finally, you can see that there is a policy inside the permissions.
By the way, if you want to attach an IAM role to an EC2 instance, you must create the IAM role before you launch the instance. You can follow instructions similar to the above to create an IAM role, then attach it when you launch the EC2 instance.
After creating the IAM user, you need to install pip on your EC2 instance. What is pip? Pip is the package installer for Python; it helps you install Python packages quickly.
Install related tools
$ curl "https://bootstrap.pypa.io/get-pip.py" -o "get-pip.py"
$ sudo python get-pip.py
Check the pip installation.
$ pip --help
Install the AWS CLI with pip.
$ sudo pip install awscli
After installing pip and the AWS CLI, install a Python package named boto3. boto3 is the AWS SDK for Python and a powerful tool for AWS management.
$ sudo pip install boto3
Configure credentials file
Then use the AWS CLI to configure the credentials file.
$ aws configure
Enter the aws_access_key_id and aws_secret_access_key that you downloaded earlier, and set a default region. Alternatively, you can create the credentials file yourself in its default location:
[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY
You may also want to set a default region yourself. This can be done in the AWS CLI configuration file.
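A minimal example of the configuration file, assuming you want us-east-1 as the default region (the region choice is an assumption; the AWS CLI looks for this file at ~/.aws/config by default):

```ini
[default]
region = us-east-1
```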
Create a bucket in S3
Don’t forget to create a bucket in your S3.
Write a script
Once you finish the steps above, we can write a simple Python script to back up our MySQL data.
$ vim wordpress_backup.py
#!/usr/bin/env python
import os
import time

import boto3

username = 'YOUR_DATABASE_USERNAME'
password = 'YOUR_DATABASE_PASSWORD'
hostname = 'localhost'  # your database host
database = 'YOUR_DATABASE_NAME'

filestamp = time.strftime('%Y%m%d')

# =============== dump database =================
filePath = "/home/ubuntu/%s-%s.sql" % (database, filestamp)
# os.system waits for mysqldump to finish before we upload the file
os.system("mysqldump -u%s -p%s %s -h%s > %s" % (username, password, database, hostname, filePath))

# ============= upload SQL file to S3 =================
s3 = boto3.resource('s3')
bucketName = 'BUCKET_NAME'
filename = "%s-%s.sql" % (database, filestamp)
print('Uploading %s to Amazon S3 bucket: %s' % (filename, bucketName))
# open the dump by its full path so the script also works when run from cron
s3.Object(bucketName, filename).put(Body=open(filePath, 'rb'))
os.remove(filePath)

# =========== get the object list from the bucket ============
myBucket = s3.Bucket(bucketName)
for obj in myBucket.objects.all():
    print(obj.key)
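As a quick illustration of the naming scheme, time.strftime('%Y%m%d') yields the current date, so each day's dump gets a distinct key in the bucket (the database name below is hypothetical):

```python
import time

database = 'wordpress'  # hypothetical database name
filestamp = time.strftime('%Y%m%d')  # e.g. '20240131'
filename = "%s-%s.sql" % (database, filestamp)
print(filename)  # e.g. 'wordpress-20240131.sql'
```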
After you finish these steps, you can run this Python script to back up MySQL, and you will find your MySQL dump in the S3 bucket.
$ python wordpress_backup.py
Finally, we can use the Linux crontab to run this Python script daily.
$ crontab -e
Add the line below to your crontab file; it runs the backup every day at 18:00 and appends the output to a log file.
00 18 * * * /usr/bin/python /home/ubuntu/wordpress_backup.py >> /home/ubuntu/S3backup.log 2>&1
Set the rule of lifecycle in bucket properties
Find the lifecycle tab, and click “Add rule”
You can apply this rule to a specific folder or to the whole bucket.
You can choose to permanently delete the old files after a set number of days.
Finally, review the lifecycle rules, then create and activate this rule.
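If you prefer to set the lifecycle rule from code instead of the console, boto3's put_bucket_lifecycle_configuration accepts a rule document like the sketch below; the rule ID, the empty prefix (whole bucket), and the 30-day expiry are assumptions for illustration:

```python
# Apply with: boto3.client('s3').put_bucket_lifecycle_configuration(
#     Bucket='BUCKET_NAME', LifecycleConfiguration=lifecycle_config)
lifecycle_config = {
    'Rules': [{
        'ID': 'expire-old-sql-dumps',  # hypothetical rule name
        'Filter': {'Prefix': ''},      # empty prefix applies to the whole bucket
        'Status': 'Enabled',
        'Expiration': {'Days': 30},    # permanently delete dumps after 30 days
    }]
}
print(lifecycle_config['Rules'][0]['Expiration']['Days'])
```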