How to send a JSON file to Amazon SQS queue using BOTO
I need to send several .json files to an AWS SQS queue. Can anyone show me the exact boto code to do this?
Something along these lines should work:
(check the syntax first; I haven't tested it)
import boto.sqs
import json

queue_name = 'YOUR-QUEUE'
sqs = boto.sqs.connect_to_region('us-east-1')
queue = sqs.get_queue(queue_name)
for filename in ['file1.json', 'file2.json']:
    with open(filename) as f:
        # new_message() expects a string, so parse the file and re-serialize it
        queue.write(queue.new_message(json.dumps(json.load(f))))
Boto's new_message() call expects a string argument. Your files already store the JSON as strings, so you don't need the json module. Just don't forget to strip the line endings: \r\n on Windows (CRLF) and \n on Unix (LF).
However, if you have a JSON object, that's another story. Use json.dumps to serialize your JSON object into a string.
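To illustrate both points, here is a minimal sketch with a hypothetical payload; the queue call itself is omitted, since the strings produced below are what you would pass to new_message():

```python
import json

# A Python object you want to send (hypothetical payload, not from the question)
payload = {"event": "signup", "user_id": 42}

# json.dumps() turns the object into the string that new_message() expects
body = json.dumps(payload)

# A line read straight from a file may still carry its line ending; strip it
line = '{"event": "login"}\r\n'
clean = line.rstrip('\r\n')
```

After this, `body` and `clean` are plain JSON strings, ready to be wrapped in a message and written to the queue.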
From the boto (not version 3) docs on connect_to_region():
At this point the variable conn will point to an SQSConnection object in the US-WEST-2 region. Bear in mind that just as any other AWS service, SQS is region-specific. In this example, the AWS access key and AWS secret key are passed in to the method explicitly. Alternatively, you can set the environment variables:
AWS_ACCESS_KEY_ID - Your AWS Access Key ID
AWS_SECRET_ACCESS_KEY - Your AWS Secret Access Key
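On a Unix-like system, those variables can be set in the shell before running the script (placeholder values shown, substitute your own keys):

```shell
# Placeholder credentials -- replace with your actual AWS keys
export AWS_ACCESS_KEY_ID=your-access-key-id
export AWS_SECRET_ACCESS_KEY=your-secret-access-key
```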
I tested this on Python 2.7.12 with boto (not boto 3), and it works.
import boto.sqs
import sys

# either use environment variables for your access keys, or the flags here
conn = boto.sqs.connect_to_region("us-west-2", aws_access_key_id="the_key", aws_secret_access_key="the_secret_key")
q = conn.get_queue('the_queue_name')
if not q:
    print "unable to locate queue! exiting..."
    sys.exit()

for src_file in ['myfile1.json', 'myfile2.json']:
    with open(src_file, 'r') as json_file:
        for line in json_file:
            print "writing %s" % line
            q.write(q.new_message(line.rstrip('\r\n')))