How to start a Dataflow job with Python code
I want to start a Dataflow job as soon as I drop a file into Cloud Storage, and I use a Cloud Function to trigger it. But I don't know how to start the Dataflow job in Python. Can anyone help?
// Requires the googleapis package: npm install googleapis
const { google } = require('googleapis');
const dataflow = google.dataflow('v1b3');

const kickOffDataflow = (input, output) => {
    var jobName = CONFIG.DATAFLOW_JOB_NAME;
    var templatePath = CONFIG.TEMPLETE_FILE_PATH;
    // Build the templates.launch request from the template path and job parameters
    var request = {
        projectId: "test",
        requestBody: {
            jobName: jobName,
            parameters: {
                configFile: input,
                outputFile: output,
                mode: "cluster_test"
            },
            environment: {
                zone: "europe-west1-b"
            }
        },
        gcsPath: templatePath
    };
    console.log("Starting to create " + jobName + " dataflow job");
    // Authenticate with Application Default Credentials, then launch the template
    return google.auth.getClient({
        scopes: ['https://www.googleapis.com/auth/cloud-platform']
    }).then(auth => {
        request.auth = auth;
        return dataflow.projects.templates.launch(request);
    }).catch(error => {
        console.error(error);
        throw error;
    });
}
Take a look at the Dataflow Cloud Composer Example. It describes how to use Cloud Composer together with Cloud Functions to trigger a Python-based Dataflow job when a new file arrives in a GCS bucket.
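If you want to launch the template directly from a Python Cloud Function instead, the Node.js snippet above translates fairly directly. Here is a minimal sketch, assuming the `google-api-python-client` package and Application Default Credentials inside the function; `build_launch_body`, the bucket paths, and the job/template names are illustrative placeholders, not values from your project:

```python
def build_launch_body(input_file, output_file, job_name="test-job"):
    """Assemble the templates.launch body, mirroring the Node.js requestBody."""
    return {
        "jobName": job_name,
        "parameters": {
            "configFile": input_file,
            "outputFile": output_file,
            "mode": "cluster_test",
        },
        "environment": {"zone": "europe-west1-b"},
    }


def kick_off_dataflow(input_file, output_file):
    # Imported here so build_launch_body stays usable without GCP installed.
    # Requires: pip install google-api-python-client
    from googleapiclient.discovery import build

    # Uses Application Default Credentials available inside Cloud Functions.
    service = build("dataflow", "v1b3")
    return (
        service.projects()
        .templates()
        .launch(
            projectId="test",  # replace with your project ID
            gcsPath="gs://my-bucket/templates/my-template",  # hypothetical path
            body=build_launch_body(input_file, output_file),
        )
        .execute()
    )
```

Wire `kick_off_dataflow` into the Cloud Function's GCS trigger handler, passing the uploaded file's path as `input_file`.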