While downloading a file from Azure Blob Storage using Golang, curl returns "Empty reply from server", but the file is downloaded in the background
I am trying to download a file from Azure Blob Storage via an HTTP request. The file does download, but curl in the terminal returns "Empty reply from server". I tried increasing the timeout, but that did not fix it. I looked at other questions about this curl response, but they did not help. This code works perfectly for small files, but fails for larger files, say 75 MB.
containerURL := azblob.NewContainerURL(*URL, pipeline)
blobURL := containerURL.NewBlockBlobURL(splitArray[1])
ctx := context.Background()
downloadResponse, err := blobURL.Download(ctx, 0, azblob.CountToEnd, azblob.BlobAccessConditions{}, false)
if err != nil {
.
.
.
}
// Read the whole body into an in-memory buffer, retrying up to 20 times on transient failures.
bodyStream := downloadResponse.Body(azblob.RetryReaderOptions{MaxRetryRequests: 20})
downloadedData := bytes.Buffer{}
_, err = downloadedData.ReadFrom(bodyStream)
if err != nil {
    // handle error
}
// Write the buffered data to a temporary file.
file, err := os.OpenFile(
    "/tmp/"+fileName,
    os.O_RDWR|os.O_TRUNC|os.O_CREATE,
    0777,
)
if err != nil {
    // handle error
}
file.Write(downloadedData.Bytes())
file.Close()
// Reopen the file and stream it back to the client.
filePath := "/tmp/" + fileName
file, err = os.Open(filePath)
if err != nil {
    // handle error
}
return middleware.ResponderFunc(func(w http.ResponseWriter, r runtime.Producer) {
    defer os.Remove(filePath) // `err := defer ...` is not valid Go; defer is a statement
    defer file.Close()
    fn := filepath.Base(filePath)
    w.Header().Set(CONTENTTYPE, "application/octet-stream")
    w.Header().Set("Content-Disposition", fmt.Sprintf("attachment; filename=%q", fn))
    io.Copy(w, file)
})
I am considering implementing the above logic with goroutines. Do I even need goroutines here?
Any constructive feedback would be helpful.
After analyzing the packets in Wireshark, I found the connection was being closed from my side due to a timeout. Since I am using go-swagger, I increased the timeout in configure.go. go-swagger provides built-in hooks for handling such scenarios, like TLS and timeouts. The code below is for reference.
// As soon as server is initialized but not run yet, this function will be called.
// If you need to modify a config, store server instance to stop it individually later, this is the place.
// This function can be called multiple times, depending on the number of serving schemes.
// scheme value will be set accordingly: "http", "https" or "unix"
func configureServer(s *http.Server, scheme, addr string) {
	s.WriteTimeout = 5 * time.Minute // WriteTimeout is a field on http.Server, not a method
}