How do you compress a stream and upload it to Azure Blob Storage without writing to a file?
I've written a Post method in ASP.NET Core that compresses the request body and uploads it to Azure Blob Storage. The method takes a parameter like this:
public async Task<IActionResult> Post([FromHeader] string AssignmentId)
It then sets up various strings, including the connection string for the storage account:
string fileName = $"{AssignmentId}.gz";
string compressedFilePath = Path.Combine(hostEnvironment.ContentRootPath, $"Test JSONs/{fileName}");
string connectionString = Environment.GetEnvironmentVariable("AZURE_STORAGE_CONNECTION_STRING");
I initialize the BlobClient:
BlobClient blobClient = new BlobClient(connectionString, "assignments", fileName);
Then I create a file and use a GZipStream to compress the request body stream into it:
using (FileStream compressedFileStream = System.IO.File.Create(compressedFilePath))
{
    using GZipStream compressionStream = new GZipStream(compressedFileStream, CompressionMode.Compress);
    using Stream bodyStream = HttpContext.Request.Body;
    await bodyStream.CopyToAsync(compressionStream);
}
Finally, I read back the file I just wrote and upload it using a FileStream:
using (FileStream fileStream = System.IO.File.OpenRead(compressedFilePath))
{
    await blobClient.UploadAsync(fileStream);
}
This solution works, but I'm concerned about the speed cost of constantly writing the file to disk and reading it back. I tried passing a MemoryStream into the GZipStream instead, but it ended up uploading only a 10 B file when it should have been 1 KB+.
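For illustration, here is a minimal sketch of how that attempt can end up at 10 B (a hypothetical reconstruction, not the exact code): GZipStream emits its 10-byte gzip header early but keeps the compressed body in an internal buffer until it is flushed or disposed, so uploading while it is still open sends little more than the header.
// Hypothetical sketch of the failing in-memory attempt (for illustration only).
// GZipStream writes the 10-byte gzip header on the first write but buffers the
// compressed body until it is flushed or disposed, so the upload below sends
// essentially just the header.
using MemoryStream memoryStream = new MemoryStream();
using GZipStream compressionStream = new GZipStream(memoryStream, CompressionMode.Compress);
await HttpContext.Request.Body.CopyToAsync(compressionStream);
memoryStream.Position = 0;                  // rewinding alone is not enough...
await blobClient.UploadAsync(memoryStream); // ...this uploads only ~10 B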
Any suggestions would be appreciated.
Here is the full method:
public async Task<IActionResult> Post([FromHeader] string AssignmentId)
{
    string fileName = $"{AssignmentId}.gz";
    string compressedFilePath = Path.Combine(hostEnvironment.ContentRootPath, $"Test JSONs/{fileName}");
    string connectionString = Environment.GetEnvironmentVariable("AZURE_STORAGE_CONNECTION_STRING");

    BlobClient blobClient = new BlobClient(connectionString, "assignments", fileName);

    using (FileStream compressedFileStream = System.IO.File.Create(compressedFilePath))
    {
        using GZipStream compressionStream = new GZipStream(compressedFileStream, CompressionMode.Compress);
        using Stream bodyStream = HttpContext.Request.Body;
        await bodyStream.CopyToAsync(compressionStream);
    }

    using (FileStream fileStream = System.IO.File.OpenRead(compressedFilePath))
    {
        await blobClient.UploadAsync(fileStream);
    }

    return Ok();
}
I ended up solving this by both leaving the MemoryStream open when the compression stream is disposed (the leaveOpen constructor argument) and resetting the position of the MemoryStream the compression stream was writing to (thanks @MitchWheat!):
using MemoryStream memoryStream = new MemoryStream();
using (Stream bodyStream = HttpContext.Request.Body)
{
    using (GZipStream compressionStream = new GZipStream(memoryStream,
        CompressionMode.Compress, leaveOpen: true))
    {
        await bodyStream.CopyToAsync(compressionStream);
    }
}
memoryStream.Position = 0;
await blobClient.UploadAsync(memoryStream, overwrite: true);
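This still buffers the whole compressed payload in memory before uploading. If that is also a concern, one further option (a sketch only, not part of the accepted answer) is to open a writable stream on the blob and compress straight into it, so nothing is staged on disk or in memory. This assumes BlockBlobClient from Azure.Storage.Blobs.Specialized, whose OpenWriteAsync uploads the data as it is written:
// Sketch: compress the request body directly into blob storage, with no
// intermediate file or MemoryStream. Assumes Azure.Storage.Blobs.Specialized
// and the same connection string, container, and blob name as above.
BlockBlobClient blockBlobClient = new BlockBlobClient(connectionString, "assignments", fileName);
using (Stream blobWriteStream = await blockBlobClient.OpenWriteAsync(overwrite: true))
using (GZipStream compressionStream = new GZipStream(blobWriteStream, CompressionMode.Compress))
{
    await HttpContext.Request.Body.CopyToAsync(compressionStream);
}
For request bodies of only a few kilobytes, though, the MemoryStream version above is likely just as fast and simpler to reason about.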