AWS: programmatically launching a Batch job
Hey, I have the following function that kicks off a batch job.
My batch job takes two arguments:
--source
--destination
def kickoff_transfer_batch(self, item):
    try:
        batch = boto3.client('batch')
        bucket, key = get_s3_bucket_and_key(item.source)
        jobName = 'transfer-' + key
        jobQueue = 'aws-strikeforce-on-demand-restore-prod'
        jobDefinition = 'aws-strikeforce-transfer-prod'
        source = '--source ' + item.source
        destination = '--destination ' + item.destination
        command = []
        command.append(source)
        command.append(destination)
        submit_job_response = batch.submit_job(
            jobName=jobName,
            jobQueue=jobQueue,
            jobDefinition=jobDefinition,
            containerOverrides={'command': command}
        )
        job_id = submit_job_response['jobId']
        print('Submitted job {} {} to the job queue {}'.format(jobName, job_id, jobQueue))
    except Exception as err:
        item.errored = True
        print("failed: " + item.source)
        print("error: " + str(err))
        stack_trace = traceback.format_exc()
        self._log_error_notes(item.source, err, stack_trace)
The job launches in Batch, but the container fails to start because of how I'm passing --source and --destination.
Here is the error log:
main.py: error: unrecognized arguments: --source file_test.txt --destination file_test.txt
How do I fix my command list so the arguments are passed correctly?
When I launch the job from the command line, I just type:
--source file --destination file
Answering my own question for future reference. The problem was that each list element contained both the flag and its value (e.g. '--source file_test.txt'); AWS Batch passes each element of containerOverrides' command as a single argv token, so flags and values must be separate elements.
def kickoff_transfer_batch(self, item):
    try:
        batch = boto3.client('batch')
        bucket, key = get_s3_bucket_and_key(item.source)
        jobName = 'transfer-' + key
        jobQueue = 'aws-strikeforce-on-demand-restore-prod'
        jobDefinition = 'aws-strikeforce-transfer-prod'
        # Build one string (note the space before '--destination'),
        # then split it so each flag and value is its own argv token.
        command = '--source ' + item.source + ' --destination ' + item.destination
        command = command.split()
        submit_job_response = batch.submit_job(
            jobName=jobName,
            jobQueue=jobQueue,
            jobDefinition=jobDefinition,
            containerOverrides={'command': command}
        )
        job_id = submit_job_response['jobId']
        print('Submitted job {} {} to the job queue {}'.format(jobName, job_id, jobQueue))
    except Exception as err:
        item.errored = True
        print("failed: " + item.source)
        print("error: " + str(err))
        stack_trace = traceback.format_exc()
        self._log_error_notes(item.source, err, stack_trace)
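As a side note, you can skip the concatenate-then-split step and build the command as a list directly. This is a small sketch (the helper name build_transfer_command is mine, not from the code above); it also avoids a pitfall of str.split(): a source or destination path containing spaces would be split into multiple tokens.

```python
def build_transfer_command(source, destination):
    # Each flag and each value is its own list element, exactly as AWS
    # Batch hands them to the container's entrypoint as argv tokens.
    return ['--source', source, '--destination', destination]

print(build_transfer_command('file_test.txt', 'file_test.txt'))
# → ['--source', 'file_test.txt', '--destination', 'file_test.txt']
```

The resulting list can be passed unchanged as containerOverrides={'command': ...} in submit_job.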