BigQuery Transfer Service does not copy rows from S3

I created a BigQuery transfer from AWS S3 to Google BigQuery. It fails with the following error:

No new files found matching "gs://bqdts-amazon_s3-prod-eu-w5jetqct8ohvcjih85apf7gvkbibvbkcj9o6l67/test/files"

However, the data was moved from S3 to Google Cloud successfully:

Moving data from Amazon S3 to Google Cloud complete: Moved 10 object(s).

I have also created a table in the BigQuery dataset.

Please help me resolve this issue.

I found a similar error described under the general issues for Amazon S3 transfers:

Files are transferred from Amazon S3 but not loaded into BigQuery. The transfer logs may look similar to this:

Moving data from Amazon S3 to Google Cloud complete: Moved object(s). No new files found matching .

Confirm that the Amazon S3 URI in the transfer configuration is correct.

If the transfer configuration was meant to load all files with a common prefix, ensure that the Amazon S3 URI ends with a wildcard. For example, to load all files in s3://my-bucket/my-folder/, the Amazon S3 URI in the transfer configuration must be s3://my-bucket/my-folder/*, not just s3://my-bucket/my-folder/.

应该 "gs://bqdts-amazon_s3-prod-eu-w5jetqct8ohvcjih85apf7gvkbibvbkcj9o6l67/test/files" 是 Amazon S3 URI:s3://

I ran into the same problem before; the transfer only worked after I added "*" to the end of the Amazon S3 URI.

Example:

Before (showed the same error as yours): s3://mybucket/path/

After (worked successfully): s3://mybucket/path/*

The General issues section of the Amazon S3 transfers page lists these recommended actions:

Confirm that the Amazon S3 URI in the transfer configuration is correct.

If the transfer configuration was meant to load all files with a common prefix, ensure that the Amazon S3 URI ends with a wildcard. For example, to load all files in s3://my-bucket/my-folder/, the Amazon S3 URI in the transfer configuration must be s3://my-bucket/my-folder/*, not just s3://my-bucket/my-folder/.
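
For reference, this is a minimal sketch of creating such a transfer with the Python client, with the data path ending in a wildcard. The bucket, dataset, table, and credential values are placeholders; the parameter names are the ones documented for Amazon S3 transfers:

```python
from google.cloud import bigquery_datatransfer

# Hypothetical placeholders; replace with your own values.
project_id = "your-project-id"

client = bigquery_datatransfer.DataTransferServiceClient()

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="your_dataset",
    display_name="s3_to_bigquery",
    data_source_id="amazon_s3",
    params={
        # Note the trailing /*: without it, the transfer can report
        # "No new files found matching ..." even though objects were copied.
        "data_path": "s3://mybucket/path/*",
        "destination_table_name_template": "your_table",
        "file_format": "CSV",
        "access_key_id": "YOUR_AWS_ACCESS_KEY_ID",
        "secret_access_key": "YOUR_AWS_SECRET_ACCESS_KEY",
    },
)

transfer_config = client.create_transfer_config(
    parent=client.common_project_path(project_id),
    transfer_config=transfer_config,
)
print("Created transfer config:", transfer_config.name)
```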