How to upload a file to an AWS S3 bucket in a different region using Python
import boto
import boto.s3
import sys
from boto.s3.key import Key

AWS_ACCESS_KEY_ID = 'XXXXXXXXXXXXXXXXX'
AWS_SECRET_ACCESS_KEY = 'XXXXXXXXXXXXXXXXXXXXXXXXX'

bucket_name = "s3 bucket_name"

# Connect to S3 and look up the target bucket
conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
bucket = conn.get_bucket(bucket_name, validate=True, headers=None)

testfile = "file_path"
print('Uploading %s to Amazon S3 bucket %s' % (testfile, bucket_name))

# Progress callback: prints a dot for each chunk uploaded
def percent_cb(complete, total):
    sys.stdout.write('.')
    sys.stdout.flush()

k = Key(bucket)
k.key = 'mytestfile.csv'
k.set_contents_from_filename(testfile, cb=percent_cb, num_cb=10)
This is the program I use to upload a .csv file to an S3 bucket. It works as expected when the bucket is in the US East (N. Virginia) region, but I need to upload the file to a bucket in the US East (Ohio) region. When I try that, I get the following exception: boto.exception.S3ResponseError: S3ResponseError: 400 Bad Request. If anyone has a solution, please share it.
It works for me without specifying a region:
import boto3
client = boto3.client('s3')
client.upload_file('foo.txt', 'my-bucket', 'foo.txt')
However, you can also specify the region when connecting to S3:
import boto3
client = boto3.client('s3', region_name='ap-southeast-2')
client.upload_file('foo.txt', 'my-bucket', 'foo.txt')
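If you are not sure which region a bucket lives in, you can ask S3 first and then build a client pinned to that region. A minimal sketch, assuming a bucket named 'my-bucket' (note that S3 reports buckets in N. Virginia with a LocationConstraint of None):

import boto3

# Look up the bucket's region, then create a client for that region
s3 = boto3.client('s3')
location = s3.get_bucket_location(Bucket='my-bucket')['LocationConstraint']
region = location or 'us-east-1'  # None means the us-east-1 (N. Virginia) region
regional_client = boto3.client('s3', region_name=region)
regional_client.upload_file('foo.txt', 'my-bucket', 'foo.txt')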
By the way, you should never need to put credentials in a source (.py) file. Instead, store the credentials in a configuration file and the SDK will retrieve them automatically. The easiest way to create that file is with the AWS Command-Line Interface (CLI) aws configure command. You can then run the code as shown above without worrying about passing credentials.
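For reference, this is roughly what aws configure writes (the key values below are placeholders). ~/.aws/credentials:

[default]
aws_access_key_id = XXXXXXXXXXXXXXXXX
aws_secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXX

and ~/.aws/config:

[default]
region = us-east-2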
With boto 2, you can also connect directly to the region that hosts the bucket (us-east-2 is US East (Ohio)):

import boto
import boto.s3
import boto.s3.connection
import sys
from boto.s3.key import Key

AWS_ACCESS_KEY_ID = '<access_key>'
AWS_SECRET_ACCESS_KEY = '<secret_key>'

Bucketname = 'bucket_name'

# Connect to the specific region that hosts the bucket
conn = boto.s3.connect_to_region(
    'us-east-2',
    aws_access_key_id=AWS_ACCESS_KEY_ID,
    aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
    is_secure=True,  # use HTTPS
    calling_format=boto.s3.connection.OrdinaryCallingFormat(),
)
bucket = conn.get_bucket(Bucketname)

testfile = "filename"
print('Uploading %s to Amazon S3 bucket %s' % (testfile, Bucketname))

# Progress callback: prints a dot for each chunk uploaded
def percent_cb(complete, total):
    sys.stdout.write('.')
    sys.stdout.flush()

k = Key(bucket)
k.key = 'fileName'
k.set_contents_from_filename(testfile, cb=percent_cb, num_cb=10)
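For comparison, a minimal boto3 sketch of the same upload to a bucket in the Ohio region (the bucket and file names here are placeholders):

import boto3

# us-east-2 is the US East (Ohio) region from the question
client = boto3.client('s3', region_name='us-east-2')
client.upload_file('file_path', 'bucket_name', 'mytestfile.csv')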