Take daily automated backups of a DigitalOcean server to an AWS S3 bucket

I have a DigitalOcean Droplet with files and a MySQL database, and I want to back it up to an AWS S3 bucket every day (i.e. every midnight).

Log in to your Droplet as root.

1. // First, install the AWS CLI

apt install awscli
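
You can quickly confirm the CLI installed correctly before moving on:

aws --version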

2. // Create an S3 bucket, then configure the AWS CLI with your credentials

aws configure

AWS Access Key ID [None]: {enter your access key id}         
AWS Secret Access Key [None]: {enter your secret access key} 
Default region name [None]: {enter your preferred region}    
Default output format [None]: {enter your preferred format} 
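
The bucket itself can also be created from the CLI instead of the AWS console (the bucket name below is just a placeholder):

aws s3 mb s3://{bucket_name}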

3. // Test that the connection works

aws s3 cp file.zip s3://{bucket_name}
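
If the copy succeeds, listing the bucket should show the uploaded file:

aws s3 ls s3://{bucket_name}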

4. // Create the actual backup script (backup.sh)

#!/bin/sh

# Database credentials
DATABASE="database-name"
USERNAME="database_user"
PASSWORD="password"

# Directory to back up
SOURCE_DIR=Files_path_here

# Target
TARGET_DIR=/backup
TARGET_BUCKET=Bucket-name

# Make sure the local backup directory exists
mkdir -p "$TARGET_DIR"

# Output files
NOW=$(date +"%Y_%m_%d_%H_%M_%S")
DB_OUTPUT=$TARGET_DIR/db.$NOW.sql.gz
FILES_OUTPUT=$TARGET_DIR/files.$NOW.zip

# Back up files
zip -r "$FILES_OUTPUT" "$SOURCE_DIR"

# Back up database
mysqldump -u "$USERNAME" -p"$PASSWORD" --single-transaction "$DATABASE" | gzip > "$DB_OUTPUT"

# Upload to S3
aws s3 cp "$DB_OUTPUT" "s3://$TARGET_BUCKET"
aws s3 cp "$FILES_OUTPUT" "s3://$TARGET_BUCKET"

# Remove files older than 14 days
find "$TARGET_DIR" -type f -mtime +14 -delete
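
For reference, restoring from one of these backups is roughly the reverse. The timestamped object names below are hypothetical examples; substitute the actual names from your bucket:

# Download and restore the database dump
aws s3 cp s3://Bucket-name/db.2024_01_01_00_00_00.sql.gz .
gunzip < db.2024_01_01_00_00_00.sql.gz | mysql -u database_user -p database-name

# Download and unpack the files archive
aws s3 cp s3://Bucket-name/files.2024_01_01_00_00_00.zip .
unzip files.2024_01_01_00_00_00.zip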

5. // Make the script executable

chmod 700 backup.sh

6. // Convert this bash file (i.e. backup.sh) to Linux-compatible line endings, in case it was edited on Windows

sed -i -e 's/\r$//' backup.sh

7. // Test the backup from the command line by running:

./backup.sh

8. // Edit the crontab with the following command

crontab -e

9. // Add the following line at the bottom (this schedules the backup for 22:00 server time every day; use 0 0 * * * for midnight):

0 22 * * * /path/to/backup.sh
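
A common variant redirects the script's output to a log file so you can confirm the nightly run actually happened (the log path here is just an example):

0 22 * * * /path/to/backup.sh >> /var/log/backup.log 2>&1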