So I have recently set up a backup plan for our MySQL database, and I am pretty happy with how it came out. I use Amazon S3 as the final storage for all of the backups: a fairly simple Ruby script (below) runs mysqldump and then pushes the result out to S3.
I wanted a week's worth of daily backups, a month's worth of weekly backups, and monthly backups for all of eternity. After figuring out how to use the Ruby Date object, the rest was pretty easy. The Ruby Cookbook held the only decent documentation I could find on how to actually use the Date object. Thanks for the excellent documentation!
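In case it saves someone else a trip to the Cookbook, here is a quick sketch (not part of the backup script itself) of the handful of Date tricks the script leans on:

require 'date'

today = Date.today
today - 7            # subtraction is in days: the Date one week ago
today << 1           # Date#<< steps back in months: same day, a month earlier
today.day % 7 == 1   # true on the 1st, 8th, 15th, 22nd, and 29th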
I also used some information from a great post on the Mission Data blog. The code I had at first opened the file, read it all into memory, and then transferred that to S3. The blog post explained how to change Net::HTTP to stream the file instead.
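I won't reproduce their whole post, but the core of the streaming idea, as I understood it, looks something like the sketch below. The host, path, and headers here are illustrative placeholders, not the real signed S3 request:

require 'net/http'

def stream_upload(host, path, filename)
  File.open(filename, 'rb') do |file|
    request = Net::HTTP::Put.new(path)
    request.body_stream = file                 # hand Net::HTTP an open IO
    request.content_length = file.stat.size    # required when streaming a body
    request['Content-Type'] = 'application/octet-stream'
    Net::HTTP.start(host) { |http| http.request(request) }
  end
end

The win is that Net::HTTP reads the file off disk in chunks instead of the script slurping a large dump into memory first.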
Here is the interesting part of the code I wrote:
require 'date'
require 'fileutils'
require 'S3'   # Amazon's sample S3 library (S3.rb)

# create AWS connection
conn = S3::AWSAuthConnection.new(AWS_ACCESS_KEY_ID,
                                 AWS_SECRET_ACCESS_KEY,
                                 USE_SSL)

today = Date.today

# dump the database and compress it on the way to disk
system("mysqldump --opt dbname | " \
       "bzip2 -c > /var/backups/dbname-#{today.to_s}.sql.bz2")
# put out today's backup
putfile conn, :bucket => "db-backup",
              :key    => "dbname-#{today.to_s}.sql.bz2",
              :file   => "/var/backups/dbname-#{today.to_s}.sql.bz2"
# remove old copies from S3
if today.day % 7 == 1
  # a weekly backup day: the week-old file is a weekly, so keep it and
  # instead expire the weekly from one month back (Date#<< steps back a
  # month), skipping first-of-the-month backups, which are kept forever
  if today.day != 1
    delete conn, :bucket => "db-backup",
                 :key    => "dbname-#{(today << 1).to_s}.sql.bz2"
  end
else
  # an ordinary day: expire the daily backup from a week ago
  delete conn, :bucket => "db-backup",
               :key    => "dbname-#{(today - 7).to_s}.sql.bz2"
end
# remove old local copies
FileUtils.rm "/var/backups/dbname-#{(today - 7).to_s}.sql.bz2",
             :force => true
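The putfile and delete helpers aren't shown above; they are thin wrappers around the S3 library calls, roughly along these lines (a sketch, not the exact helpers from my script, and this putfile still reads the file into memory rather than using the streaming trick above):

# hypothetical sketches of the elided helpers
def putfile(conn, opts)
  data = File.open(opts[:file], 'rb') { |f| f.read }
  conn.put(opts[:bucket], opts[:key], S3::S3Object.new(data))
end

def delete(conn, opts)
  conn.delete(opts[:bucket], opts[:key])
end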
If anyone has ideas on how to implement a better backup plan, I would be very interested in hearing them.