How to zip files in memory and send to Amazon S3
I ran into this problem. If I write something like the following (where `bunch` is just a list of file paths):
import io
import zipfile

zip_buffer = io.BytesIO()
with zipfile.ZipFile(zip_buffer, "a", zipfile.ZIP_DEFLATED, False) as zipper:
    for file in bunch:
        zipper.write(file)

with open('0.zip', 'wb') as f:
    f.write(zip_buffer.getvalue())
then I get a zip file `0.zip` containing all the files from the `bunch` list. Great!
But when I try to upload from memory to Amazon S3:
import io
import zipfile

import boto3

s3_client = boto3.client("s3")
zip_buffer = io.BytesIO()
with zipfile.ZipFile(zip_buffer, "a", zipfile.ZIP_DEFLATED, False) as zipper:
    for file in bunch:
        zipper.write(file)

s3_client.put_object(Bucket=S3_BUCKET, Key=ZIP_NAME_IN_S3_BUCKET, Body=zip_buffer.getvalue())
then a zip file is created in the Amazon S3 bucket, but it is not a valid zip file that can be extracted. Why does saving locally behave differently from sending to Amazon S3?
I found the solution. What I had to do:
import io
import zipfile

import boto3

s3_client = boto3.client("s3")
zip_buffer = io.BytesIO()
with zipfile.ZipFile(zip_buffer, "a", zipfile.ZIP_DEFLATED, False) as zipper:
    for file in bunch:
        zipper.write(file)

# Rewind the buffer so upload_fileobj reads from the beginning.
zip_buffer.seek(0)
s3_client.upload_fileobj(zip_buffer, S3_BUCKET, ZIP_NAME_IN_S3_BUCKET)
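The key detail is that after the `with` block finishes writing the archive, the `BytesIO` cursor sits at the end of the buffer, so any API that reads from the current position (as `upload_fileobj` does) sees no bytes until you rewind. A minimal local sketch of this behavior, with no S3 involved:

```python
import io
import zipfile

# Build a small zip archive entirely in memory.
zip_buffer = io.BytesIO()
with zipfile.ZipFile(zip_buffer, "a", zipfile.ZIP_DEFLATED, False) as zipper:
    zipper.writestr("hello.txt", "hello world")

# After writing, the cursor is at the end: reading yields nothing.
tail = zip_buffer.read()

# Rewind, and the full archive becomes readable again.
zip_buffer.seek(0)
data = zip_buffer.read()

print(len(tail))                             # 0
print(zipfile.is_zipfile(io.BytesIO(data)))  # True
```

Note that `getvalue()` returns the entire buffer contents regardless of the cursor position, which is why writing `zip_buffer.getvalue()` to a local file works even without a `seek(0)`.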