BigQuery - Scheduled Query Error "PermissionDenied: 403 The caller does not have permission"
I'm having a problem creating a scheduled query from a Python script. I get the error
"google.api_core.exceptions.PermissionDenied: 403 The caller does not have permission" when running the script below.
#!/usr/bin/python
import sys
import json
import time

from google.cloud import bigquery_datatransfer
from google.oauth2 import service_account

prj_id = "project-id"
ds_id = "dataset_id"

# Load the service-account key and build scoped credentials from it.
gcp_info = json.load(open("key-file.json"))
creds = service_account.Credentials.from_service_account_info(gcp_info)
s_creds = creds.with_scopes(
    [
        'https://www.googleapis.com/auth/cloud-platform',
        'https://www.googleapis.com/auth/bigquery',
    ]
)
s_acc = "service-account@project_id.iam.gserviceaccount.com"

# Data Transfer Service client that will own the scheduled query.
bq_tc = bigquery_datatransfer.DataTransferServiceClient(credentials=s_creds)
dataset = prj_id + '.' + ds_id + '.'


def main():
    argc = len(sys.argv) - 1
    if argc != 1:
        print("Usage: python3 /root/gcp_new-query.py <Temp-Table>")
        sys.exit()

    t_id = dataset + sys.argv[1] + '-temp'
    t2_id = dataset + sys.argv[1] + '-data'

    # Scheduled query: purge rows older than 60 minutes from the temp table.
    q2 = """
    DELETE FROM `{}` WHERE AddressID > 0 AND MsgTS < TIMESTAMP_SUB(CURRENT_TIMESTAMP(),
    INTERVAL 60 MINUTE)
    """.format(t_id)

    p = bq_tc.common_project_path(prj_id)

    tc_cfg2 = bigquery_datatransfer.TransferConfig(
        destination_dataset_id=ds_id,
        display_name=sys.argv[1] + "-RM-Old-Data",
        data_source_id="scheduled_query",
        params={
            "query": q2,
        },
        schedule="every hour",
    )

    # This is the call that raises PermissionDenied: 403.
    tc_cfg2 = bq_tc.create_transfer_config(
        bigquery_datatransfer.CreateTransferConfigRequest(
            parent=p,
            transfer_config=tc_cfg2,
            service_account_name=s_acc,
        )
    )

    print("Created scheduled query '{}'".format(tc_cfg2.name))


main()
I get the error as soon as the script reaches create_transfer_config().
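As a quick sanity check, the identity behind the key file (i.e. the "caller" the 403 refers to) can be printed like this; this is just a debugging sketch reusing the same key-file.json:

import json
from google.oauth2 import service_account

# Debugging sketch: print the identity behind the key file, i.e. the "caller"
# that the 403 message refers to.
gcp_info = json.load(open("key-file.json"))
creds = service_account.Credentials.from_service_account_info(gcp_info)

print(creds.service_account_email)  # identity the client authenticates as
print(gcp_info["client_email"])     # same value, straight from the key file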
I've read the documentation and made sure that the required permissions have been granted to "service-account@project_id", namely:
- BigQuery Data Transfer Service Agent
- BigQuery Admin
Am I missing a permission, or is something not quite right in my script?
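In case it helps, this is a sketch of how I would test the caller's effective permissions from Python; I'm assuming bigquery.transfers.update is the permission scheduled queries need, and the project ID and key file path are the same placeholders as above:

import json
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

prj_id = "project-id"

# Build cloud-platform scoped credentials from the same key file.
gcp_info = json.load(open("key-file.json"))
creds = service_account.Credentials.from_service_account_info(
    gcp_info, scopes=["https://www.googleapis.com/auth/cloud-platform"]
)

# Ask Resource Manager which of these permissions the caller actually holds
# on the project; the response only echoes back the ones that are granted.
session = AuthorizedSession(creds)
resp = session.post(
    "https://cloudresourcemanager.googleapis.com/v1/projects/{}:testIamPermissions".format(prj_id),
    json={"permissions": ["bigquery.transfers.get", "bigquery.transfers.update"]},
)
print(resp.json().get("permissions", []))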
Apologies if I haven't explained everything clearly.
Edit: I have also made sure the service account has an associated JSON key file.
-A.
In the end I found the solution. I simply used:
import os
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/json.json"
instead of:
gcp_info = json.load(open("key-file.json"))
creds = service_account.Credentials.from_service_account_info(gcp_info)
s_creds = creds.with_scopes(
    [
        'https://www.googleapis.com/auth/cloud-platform',
        'https://www.googleapis.com/auth/bigquery',
    ]
)
s_acc = "service-account@project_id.iam.gserviceaccount.com"