Can we send data from Google cloud storage to SFTP server using GCP Cloud function?

I want to take a .csv file from Google Cloud Storage and send it to an SFTP server. I don't know what I am doing wrong, but the error I get says the file cannot be found in Cloud Storage. My code is as follows:

import pysftp
from google.cloud import storage

def hello_sftp(event, context):
    
    #defining credentials for the transfer
    myHostName = 'lmno.abcd.com'
    myUsername = 'pqr'
    myPassword = 'xyz'
    filename = 'test_file.csv'
    path = "gs://testing/"

    copy_file_on_ftp(myHostName, myUsername, myPassword, filename, path)
   
def copy_file_on_ftp(myHostName, myUsername, myPassword, filename, localpath):
    
    remotepath = '/Export/' + str(filename)
    print(' ')
    print(localpath)
    print(' ')
    cnopts = pysftp.CnOpts()
    cnopts.hostkeys = None
    
    with pysftp.Connection(
    host=myHostName, username=myUsername, password=myPassword, cnopts=cnopts
    ) as sftp:
        print("Connection successfully established . . . ")
        print("Exporting . . . ")
        print("local path and file name is : ", localpath+filename)
        sftp.put(localpath=localpath + filename, remotepath=remotepath)
    print("export to SFTP successful!")

But I get the error:

FileNotFoundError: [Errno 2] No such file or directory: 'gs://testing/test_file.csv'

Is it not possible to send data this way?

pysftp does not understand gs:// paths.

Instead, use the Google Cloud Storage API to download the file directly into a file-like object that represents the target file on the SFTP server, opened with pysftp:

storage_client = storage.Client()

# bucket_name and source_blob_name identify the CSV in Cloud Storage,
# e.g. bucket_name = 'testing', source_blob_name = 'test_file.csv'
bucket = storage_client.bucket(bucket_name)
blob = bucket.blob(source_blob_name)

# Open the remote file for writing (32768-byte buffer) and stream the
# blob's contents into it, without staging the file on local disk
with sftp.open(remotepath, 'w', 32768) as f:
    blob.download_to_file(f)
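This works because blob.download_to_file accepts any writable file-like object, and the handle returned by sftp.open is one, so the bytes stream from Cloud Storage to the SFTP server without ever touching the function's local disk. A stdlib-only sketch of the same pattern (the GCS blob and the SFTP file are stood in by io.BytesIO buffers, and the CSV content is made up for illustration):

```python
import io
import shutil

# Stand-in for the GCS blob's content (hypothetical CSV data)
source = io.BytesIO(b"col1,col2\n1,2\n3,4\n")

# Stand-in for the file-like object returned by sftp.open(remotepath, 'w')
target = io.BytesIO()

# blob.download_to_file(target) does essentially this: read chunks from
# the source and write them to whatever file-like object it was given
shutil.copyfileobj(source, target, length=32768)

print(target.getvalue().decode())
```

The same idea works in reverse (SFTP to Cloud Storage) with blob.upload_from_file and a remote file opened in 'r' mode.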