Move bucket file to other bucket with googleCloudStorageR library

I have an R script in a Docker container so that I can execute it on Google Cloud Run.

After fully processing a file X.csv located in the bucket "Input", I want to move it to the bucket "Done". How can I do this with googleCloudStorageR?

`googleCloudStorageR::gcs_upload("myfile.csv")` 

does not seem to support the gs:// syntax:

googleCloudStorageR::gcs_upload("gs://Input/X.csv")

returns an error:

 Path 'gs://Input/X.csv' does not exist

Also, the bucket name does not appear in the gcs_upload() function arguments. Do I have to set the default bucket to "Done" beforehand?

googleCloudStorageR::gcs_global_bucket("Done")

Thanks.

If you are using the CRAN release (cran.r-project.org), this documentation shows how to upload an object:

## upload a file - type will be guessed from file extension or supply type
filename <- "mtcars.csv"   # a local file path
write.csv(mtcars, file = filename)
gcs_upload(filename)

## upload an R data.frame directly - will be converted to csv via write.csv
gcs_upload(mtcars)

## upload an R list - will be converted to json via jsonlite::toJSON
gcs_upload(list(a = 1, b = 3, c = list(d = 2, e = 5)))

## upload an R data.frame directly, with a custom function
## function should have arguments 'input' and 'output'
## safest to supply type too
f <- function(input, output) write.csv(input, row.names = FALSE, file = output)

gcs_upload(mtcars, 
           object_function = f,
           type = "text/csv")

If you are using the cloudyr (GitHub) version, the function signature is:

gcs_upload(
  file,
  bucket = gcs_get_global_bucket(),
  type = NULL,
  name = deparse(substitute(file)),
  object_function = NULL,
  object_metadata = NULL,
  predefinedAcl = c("private", "bucketLevel", "authenticatedRead",
    "bucketOwnerFullControl", "bucketOwnerRead", "projectPrivate", "publicRead",
    "default"),
  upload_type = c("simple", "resumable")
)
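As the signature above shows, gcs_upload() does take a `bucket` argument, so you can target a specific bucket per call without changing the global default. A minimal sketch, assuming a bucket named "Done" and that authentication has already been configured (e.g. via gcs_auth() or the GCS_AUTH_FILE environment variable):

```r
library(googleCloudStorageR)

# assumes authentication is already set up (gcs_auth() / GCS_AUTH_FILE)
gcs_upload("X.csv",          # local file to upload
           bucket = "Done",  # target bucket, overriding any global default
           name = "X.csv")   # object name inside the bucket
```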

## raise the size limit (in bytes) at which uploads switch from simple to resumable
gcs_upload_set_limit(upload_limit = 5000000L)

If you want to set the bucket:

## set global bucket so don't need to keep supplying in future calls
gcs_global_bucket("my-bucket")

You can find this documentation in the source tree at:

googleCloudStorageR-master/docs/reference/gcs_upload.html
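For the actual "move" from "Input" to "Done": Cloud Storage has no single move operation, so the usual pattern is copy (or download and re-upload) the object, then delete the original. A sketch under those assumptions (bucket names "Input"/"Done" and prior authentication are from the question; the local round trip is one possible approach, not the only one):

```r
library(googleCloudStorageR)

# assumes authentication is already configured

# 1. download X.csv from the "Input" bucket to local disk
gcs_get_object("X.csv", bucket = "Input",
               saveToDisk = "X.csv", overwrite = TRUE)

# 2. upload it to the "Done" bucket
gcs_upload("X.csv", bucket = "Done", name = "X.csv")

# 3. delete the original from "Input" to complete the move
gcs_delete_object("X.csv", bucket = "Input")
```

Recent versions of googleCloudStorageR also offer gcs_copy_object(), which, if available in your version, avoids the local download step; check your installed documentation before relying on it.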