spring-cloud-gcp-starter-bigquery ignores spring.cloud.gcp.credentials.location from property file
I am playing around with Spring Cloud GCP. My first example, with a GCP bucket, works correctly and uses the right Google account, which I pointed to in the property file:

spring.cloud.gcp.credentials.location=file:secret.json
As the next step I tried to reproduce the BigQuery example. To do that, I created a dataset on the GCP side and added the dataset name to the property file:
spring.cloud.gcp.bigquery.datasetName=my_dataset
I also copied the controller:
@Controller
public class BigQueryController {

    @Autowired
    BigQuerySampleConfiguration.BigQueryFileGateway bigQueryFileGateway;

    @Autowired
    BigQueryTemplate bigQueryTemplate;

    @Value("${spring.cloud.gcp.bigquery.datasetName}")
    private String datasetName;

    @GetMapping("/bigquery")
    public ModelAndView renderIndex(ModelMap map) {
        map.put("datasetName", this.datasetName);
        return new ModelAndView("index.html", map);
    }

    /**
     * Handles a file upload using {@link BigQueryTemplate}.
     *
     * @param file the CSV file to upload to BigQuery
     * @param tableName name of the table to load data into
     * @return ModelAndView of the response to send back to users
     * @throws IOException if the file cannot be loaded
     */
    @PostMapping("/uploadFile")
    public ModelAndView handleFileUpload(
            @RequestParam("file") MultipartFile file, @RequestParam("tableName") String tableName)
            throws IOException {
        ListenableFuture<Job> loadJob = this.bigQueryTemplate.writeDataToTable(
                tableName, file.getInputStream(), FormatOptions.csv());
        return getResponse(loadJob, tableName);
    }

    /**
     * Handles CSV data upload using Spring Integration {@link BigQuerySampleConfiguration.BigQueryFileGateway}.
     *
     * @param csvData the String CSV data to upload to BigQuery
     * @param tableName name of the table to load data into
     * @return ModelAndView of the response to send back to users
     */
    @PostMapping("/uploadCsvText")
    public ModelAndView handleCsvTextUpload(
            @RequestParam("csvText") String csvData, @RequestParam("tableName") String tableName) {
        ListenableFuture<Job> loadJob =
                this.bigQueryFileGateway.writeToBigQueryTable(csvData.getBytes(), tableName);
        return getResponse(loadJob, tableName);
    }

    private ModelAndView getResponse(ListenableFuture<Job> loadJob, String tableName) {
        String message;
        try {
            Job job = loadJob.get();
            message = "Successfully loaded data file to " + tableName;
        } catch (Exception e) {
            e.printStackTrace();
            message = "Error: " + e.getMessage();
        }
        return new ModelAndView("index")
                .addObject("datasetName", this.datasetName)
                .addObject("message", message);
    }
}
And the configuration:
@Configuration
public class BigQuerySampleConfiguration {

    @Bean
    public DirectChannel bigQueryWriteDataChannel() {
        return new DirectChannel();
    }

    @Bean
    public DirectChannel bigQueryJobReplyChannel() {
        return new DirectChannel();
    }

    @Bean
    @ServiceActivator(inputChannel = "bigQueryWriteDataChannel")
    public MessageHandler messageSender(BigQueryTemplate bigQueryTemplate) {
        BigQueryFileMessageHandler messageHandler = new BigQueryFileMessageHandler(bigQueryTemplate);
        messageHandler.setFormatOptions(FormatOptions.csv());
        messageHandler.setOutputChannel(bigQueryJobReplyChannel());
        return messageHandler;
    }

    @Bean
    public GatewayProxyFactoryBean gatewayProxyFactoryBean() {
        GatewayProxyFactoryBean factoryBean = new GatewayProxyFactoryBean(BigQueryFileGateway.class);
        factoryBean.setDefaultRequestChannel(bigQueryWriteDataChannel());
        factoryBean.setDefaultReplyChannel(bigQueryJobReplyChannel());
        // Ensures that BigQueryFileGateway does not return double-wrapped ListenableFutures
        factoryBean.setAsyncExecutor(null);
        return factoryBean;
    }

    /**
     * Spring Integration gateway which allows sending data to load to BigQuery
     * through a channel.
     */
    @MessagingGateway
    public interface BigQueryFileGateway {
        ListenableFuture<Job> writeToBigQueryTable(
                byte[] csvData, @Header(BigQuerySpringMessageHeaders.TABLE_NAME) String tableName);
    }
}
And index.html (I don't think I need to copy it here). But when I try to write something to the BigQuery dataset, I see the following error:
2020-03-03 15:01:32.147 ERROR 16224 --- [nio-8080-exec-1] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is com.google.cloud.bigquery.BigQueryException: 404 Not Found
{
"error": {
"code": 404,
"message": "Not found: Dataset my_production_project:my_dataset",
"errors": [
{
"message": "Not found: Dataset my_production_project:my_dataset",
"domain": "global",
"reason": "notFound",
"debugInfo": "[NOT_FOUND] time: 2020-03-03T04:01:31.971-08:00, Reason: code=NOT_FOUND message=Dataset bx-dev-contactmatch:my_dataset not found debug=time: 2020-03-03T04:01:31.971-08:00 errorProto=domain: \"cloud.helix.ErrorDomain\"\ncode: \"NOT_FOUND\"\nargument: \"Dataset\"\nargument: \"bx-dev-contactmatch:my_dataset\"\ndebug_info: \"time: 2020-03-03T04:01:31.971-08:00\"\n\n\tat com.google.cloud.helix.common.Exceptions$Public.resourceNotFound(Exceptions.java:281)\n\tat com.google.cloud.helix.common.Exceptions$Public.resourceNotFound(Exceptions.java:285)\n\tat com.google.cloud.helix.server.metadata.DatasetTrackerSpanner.lambda$getDatasetAsync(DatasetTrackerSpanner.java:236)\n\tat com.google.common.util.concurrent.AbstractTransformFuture$AsyncTransformFuture.doTransform(AbstractTransformFuture.java:213)\n\tat com.google.common.util.concurrent.AbstractTransformFuture$AsyncTransformFuture.doTransform(AbstractTransformFuture.java:202)\n\tat com.google.common.util.concurrent.AbstractTransformFuture.run(AbstractTransformFuture.java:118)\n\tat com.google.common.util.concurrent.MoreExecutors.run(MoreExecutors.java:1158)\n\tat com.google.common.context.ContextRunnable.runInContext(ContextRunnable.java:89)\n\tat com.google.common.context.ContextRunnable.run(ContextRunnable.java:78)\n\tat io.grpc.Context.run(Context.java:602)\n\tat com.google.tracing.GenericContextCallback.runInInheritedContext(GenericContextCallback.java:75)\n\tat com.google.common.context.ContextRunnable.run(ContextRunnable.java:74)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat java.lang.Thread.run(Thread.java:748)\n"
}
],
"status": "NOT_FOUND"
}
}
] with root cause
com.google.api.client.http.HttpResponseException: 404 Not Found
{
"error": {
"code": 404,
"message": "Not found: Dataset my_production_project:my_dataset",
"errors": [
{
"message": "Not found: Dataset my_production_project:my_dataset",
"domain": "global",
"reason": "notFound",
"debugInfo": "[NOT_FOUND] time: 2020-03-03T04:01:31.971-08:00, Reason: code=NOT_FOUND message=Dataset bx-dev-contactmatch:my_dataset not found debug=time: 2020-03-03T04:01:31.971-08:00 errorProto=domain: \"cloud.helix.ErrorDomain\"\ncode: \"NOT_FOUND\"\nargument: \"Dataset\"\nargument: \"bx-dev-contactmatch:my_dataset\"\ndebug_info: \"time: 2020-03-03T04:01:31.971-08:00\"\n\n\tat com.google.cloud.helix.common.Exceptions$Public.resourceNotFound(Exceptions.java:281)\n\tat com.google.cloud.helix.common.Exceptions$Public.resourceNotFound(Exceptions.java:285)\n\tat com.google.cloud.helix.server.metadata.DatasetTrackerSpanner.lambda$getDatasetAsync(DatasetTrackerSpanner.java:236)\n\tat com.google.common.util.concurrent.AbstractTransformFuture$AsyncTransformFuture.doTransform(AbstractTransformFuture.java:213)\n\tat com.google.common.util.concurrent.AbstractTransformFuture$AsyncTransformFuture.doTransform(AbstractTransformFuture.java:202)\n\tat com.google.common.util.concurrent.AbstractTransformFuture.run(AbstractTransformFuture.java:118)\n\tat com.google.common.util.concurrent.MoreExecutors.run(MoreExecutors.java:1158)\n\tat com.google.common.context.ContextRunnable.runInContext(ContextRunnable.java:89)\n\tat com.google.common.context.ContextRunnable.run(ContextRunnable.java:78)\n\tat io.grpc.Context.run(Context.java:602)\n\tat com.google.tracing.GenericContextCallback.runInInheritedContext(GenericContextCallback.java:75)\n\tat com.google.common.context.ContextRunnable.run(ContextRunnable.java:74)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat java.lang.Thread.run(Thread.java:748)\n"
}
],
"status": "NOT_FOUND"
}
}
From the error we can see that the application tries to access my_production_project, which is not what I expected. The content of secret.json:
{
"type": "service_account",
"project_id": "spring-samples-269912",
"private_key_id": "04d22c73e3ef53dd82f20c322f91a79e2fbc76d9",
"private_key": "-----BEGIN PRIVATE KEY-----******-----END PRIVATE KEY-----\n",
"client_email": "spring-samples-service-account@spring-samples-269912.iam.gserviceaccount.com",
"client_id": "117486490087851987327",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/spring-samples-service-account%40spring-samples-269912.iam.gserviceaccount.com"
}
As you can see, the project spring-samples-269912 is the one mentioned here. How can I fix this?
P.S. Both examples (GCP bucket and BigQuery) live in the same project, so they use the same application.properties file and the same secret.json.
The problem disappeared after I set

spring.cloud.gcp.bigquery.project-id=spring-samples-269912

or

spring.cloud.gcp.project-id=spring-samples-269912
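Putting it together, a minimal application.properties for this setup might look like the sketch below (values taken from the question). The point is that the credentials file only supplies the service-account key; the target project is resolved separately, and when no project-id property is set, Spring Cloud GCP falls back to the environment default (e.g. the GOOGLE_CLOUD_PROJECT variable or the active gcloud configuration), which is how my_production_project leaked in. spring.cloud.gcp.project-id pins the project for all Spring Cloud GCP starters; the bigquery-prefixed variant would override it for BigQuery only:

```properties
# Service-account key; this supplies credentials, NOT the target project
spring.cloud.gcp.credentials.location=file:secret.json

# Pin the project explicitly so the environment default cannot take over
spring.cloud.gcp.project-id=spring-samples-269912

# BigQuery dataset used by BigQueryTemplate
spring.cloud.gcp.bigquery.datasetName=my_dataset
```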