How can I dynamically add files to a zip archive stored in Azure blob storage?

I have a process in Azure that generates a large number of PDF report files and stores them in blob storage. Rather than sending users links to each of these files individually, I generate a zip file and send the user a single link to it.

This is all done in one process, and it has worked well until now. Recently I started hitting OutOfMemory exceptions while adding files to the zip archive, and I am struggling to find a solution.

Below is the code I use to create the zip file (note: it uses the SharpZipLib library). Currently it fails with an OutOfMemoryException after adding around 45 PDF files of roughly 3.5 MB each. The failure occurs when it hits the line zipStream.PutNextEntry(newEntry).

Does anyone know how I can improve this process? This seems like far too small a zip file to be failing at this point.

    Using outputMemStream As New MemoryStream()

        Using zipStream As New ICSharpCode.SharpZipLib.Zip.ZipOutputStream(outputMemStream)
            zipStream.SetLevel(7)

            Dim collD3 As UserSurveyReportCollection = GetFileList(RequestID)

            For Each entityD2 As UserSurveyReport In collD3

                Try
                    Dim strF As String = entityD2.FileLocation

                    'Download the blob as a MemoryStream and rewind it before reading
                    Using msR As MemoryStream = objA.DownloadBlobAsMemoryStream(azureAccount, ReportFolder, entityD2.FileName)
                        msR.Seek(0, SeekOrigin.Begin)

                        'Determine the file name used for this item in the zip archive
                        Dim strZipFileName As String = DetermineZipSourceName(entityD2, strFolder, strFileName)

                        'Add the MemoryStream contents to the zip stream
                        Dim newEntry As New ICSharpCode.SharpZipLib.Zip.ZipEntry(strZipFileName)
                        newEntry.DateTime = DateTime.Now

                        zipStream.PutNextEntry(newEntry)
                        msR.CopyTo(zipStream)
                        zipStream.CloseEntry()
                    End Using

                    zipStream.Flush()

                    intCounter += 1

                Catch exZip As Exception
                    'Swallowing the exception hides failed files; log it in production

                End Try

            Next

            'Keep the underlying MemoryStream open after the ZipOutputStream is closed
            zipStream.IsStreamOwner = False
            zipStream.Finish()
            zipStream.Close()

            outputMemStream.Position = 0

            Dim bytes As Byte() = outputMemStream.ToArray()
            result.Comment = objA.UploadBlob(bytes, azureAccount, ReportFolder, entityReport.FileName).AbsolutePath

        End Using
    End Using

I found a solution. This approach seems to minimize the memory used for in-memory zip creation and streams the resulting zip archive into blob storage in Azure. It uses the native System.IO.Compression library rather than a third-party zip library.

I created a class called ZipModel that just holds a file name and a blob. I build a List of these and pass it to the function below. I hope this helps someone else in the same predicament.
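The ZipModel class itself isn't shown in the post; a minimal sketch inferred from the properties the function below actually uses (FolderName, FileName, ZipBlob) might look like this — the author's real class may differ:

```vb
' Hypothetical sketch of ZipModel; property names are inferred
' from the usage in SendBlobsToZipFile and are not from the post.
Public Class ZipModel
    Public Property FolderName As String
    Public Property FileName As String
    Public Property ZipBlob As CloudBlockBlob
End Class
```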

    Private Function SendBlobsToZipFile(ByVal destinationBlob As CloudBlockBlob, ByVal sourceBlobs As List(Of ZipModel)) As Boolean

        Dim result As Boolean = True
        Dim resultCounter As Integer = 0

        'Open a write stream directly on the destination blob so the zip
        'is streamed to storage instead of being built in memory
        Using blobWriteStream As Stream = destinationBlob.OpenWrite()

            Using archive As New ZipArchive(blobWriteStream, ZipArchiveMode.Create)

                For Each zipM As ZipModel In sourceBlobs
                    Try
                        'Zip entry names should use forward slashes as path separators
                        Dim strName As String = String.Format("{0}/{1}", zipM.FolderName, zipM.FileName)
                        Dim archiveEntry As ZipArchiveEntry = archive.CreateEntry(strName, CompressionLevel.Optimal)

                        'Stream each source blob directly into its zip entry
                        Using archiveWriteStream As Stream = archiveEntry.Open()
                            zipM.ZipBlob.DownloadToStream(archiveWriteStream)
                            resultCounter += 1
                        End Using
                    Catch ex As Exception

                        result = False

                    End Try

                Next

            End Using
        End Using

        Return result

    End Function

For anyone using C# who wants to write large zip files to blob storage:

var blob = container.GetBlockBlobReference(outputFilename);
using (var stream = await blob.OpenWriteAsync())
using (var zip = new ZipArchive(stream, ZipArchiveMode.Create))
{
    for (int i = 0; i < 2000; i++)
    {
        using (var randomStream = CreateRandomStream(2))
        {
            var entry = zip.CreateEntry($"{i}.zip", CompressionLevel.Optimal);
            using (var innerFile = entry.Open())
            {
                await randomStream.CopyToAsync(innerFile);
            }
        }
    }
}
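CreateRandomStream isn't defined in the snippet above; a plausible stand-in that produces a seekable stream of n megabytes of random bytes (my guess at the author's test helper, not their actual code) would be:

```csharp
// Hypothetical stand-in for the CreateRandomStream helper used above.
// Returns a seekable stream containing sizeInMb megabytes of random
// data; the author's real implementation may differ.
private static Stream CreateRandomStream(int sizeInMb)
{
    var buffer = new byte[sizeInMb * 1024 * 1024];
    new Random().NextBytes(buffer);
    return new MemoryStream(buffer);
}
```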

This works surprisingly well. Application memory stays at around 20 MB, and CPU usage is very low while streaming to Azure. I've created very large output files (> 4.5 GB) with no problems.
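The same streaming pattern carries over to the newer Azure.Storage.Blobs SDK (v12), where OpenWriteAsync lives on BlockBlobClient rather than CloudBlockBlob. This is a sketch, not tested code; `container`, `outputFilename`, and `sourceStream` are assumed to exist in your context:

```csharp
// Sketch of the same approach with the v12 SDK (Azure.Storage.Blobs).
// GetBlockBlobClient is an extension in Azure.Storage.Blobs.Specialized.
using Azure.Storage.Blobs.Specialized;
using System.IO.Compression;

var blob = container.GetBlockBlobClient(outputFilename);
using (var stream = await blob.OpenWriteAsync(overwrite: true))
using (var zip = new ZipArchive(stream, ZipArchiveMode.Create))
{
    var entry = zip.CreateEntry("report.pdf", CompressionLevel.Optimal);
    using (var entryStream = entry.Open())
    {
        // Any readable source stream works here
        await sourceStream.CopyToAsync(entryStream);
    }
}
```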