CsvHelper stream too long

I'm having a problem using CsvHelper to save a large amount of data (> 2 GB) to Azure Blob Storage: I get the error "Stream was too long." Can anyone help me solve this? Thanks in advance! Here is my code:

    public static void EXPORT_CSV(DataTable dt, string fileName, ILogger log)
    {
        try
        {
            // Retrieve storage account from connection string.
            var cnStorage = Environment.GetEnvironmentVariable("cnStorage");
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(cnStorage);
            // Create the blob client.
            CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
            // Retrieve reference to a previously created container.
            CloudBlobContainer container = blobClient.GetContainerReference("dataexport");
            bool exists = container.CreateIfNotExists();
            // Retrieve reference to a blob named "myblob".
            CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);

            var stream = new MemoryStream();

            using (var writer = new StreamWriter(stream))
            using (var csvWriter = new CsvWriter(writer, CultureInfo.InvariantCulture))
            {
                csvWriter.Configuration.TypeConverterOptionsCache.GetOptions<DateTime>().Formats = new[] { "dd/MM/yyyy" };
                foreach (DataColumn column in dt.Columns)
                {
                    csvWriter.WriteField(column.ColumnName);
                }

                csvWriter.NextRecord();

                foreach (DataRow row in dt.Rows)
                {
                    for (var i = 0; i < dt.Columns.Count; i++)
                    {
                        csvWriter.WriteField(row[i]);
                    }
                    csvWriter.NextRecord();
                }
                csvWriter.Flush();
                writer.Flush();
                stream.Position = 0;

                log.LogInformation($"C# BatchDataExportCSVsegnalazioni START UploadFromStream  at: {DateTime.Now}");
                blockBlob.UploadFromStream(stream);
                log.LogInformation($"C# BatchDataExportCSVsegnalazioni END UploadFromStream  at: {DateTime.Now}");
            }
        }
        catch (Exception ex)
        {
            log.LogError("Error upload BatchDataExportCSVsegnalazioni: " + ex.Message);
        }
    }

The error is most likely caused by using a MemoryStream for that much data, not by CsvHelper itself: a MemoryStream is backed by a single Int32-indexed byte array, so it cannot grow past about 2 GB and throws "Stream was too long." See whether the problem can be solved in one of the following ways:

  1. Write the data directly to a FileStream instead of a MemoryStream, then upload the finished file to the blob (see the sketch after the snippet below).

    // Open the target file (either form works):
    // using (var fileStream = File.Create(path))
    using (var fileStream = new FileStream(filePath, FileMode.OpenOrCreate))
    {
        using (var writer = new StreamWriter(fileStream))
        using (var csvWriter = new CsvWriter(writer, CultureInfo.InvariantCulture))
        {
            // write the header and the rows here, exactly as in the code above
        }
    }
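
Once those writers are disposed, the CSV is complete on disk and can be uploaded in a single call. A minimal sketch, assuming the same blockBlob reference as in your code and an illustrative temporary file path:

    // Hypothetical follow-up to option 1: upload the finished file.
    // CloudBlockBlob.UploadFromFile uploads large files in blocks internally,
    // so there is no 2 GB in-memory limit.
    var filePath = Path.Combine(Path.GetTempPath(), fileName); // illustrative location
    blockBlob.UploadFromFile(filePath);
    File.Delete(filePath); // optionally clean up the temporary file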

(or)

  2. Write straight to the blob: the Azure.Storage.Blobs assembly exposes extension methods in the Azure.Storage.Blobs.Specialized namespace that let a block blob client open a writable stream directly on a blob in Azure Storage.

See Handling Large Files in Azure with Blob Storage Streaming.

For example:

    var stream = blob.OpenWrite();

See also Do's and Don'ts for Streaming File Uploads to Azure Blob Storage with .NET.
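
A minimal sketch of that approach, assuming the Azure.Storage.Blobs (v12) package and the same connection string and "dataexport" container as above; the variable names are illustrative and the header/row loop is the same as in your code:

    using Azure.Storage.Blobs;
    using Azure.Storage.Blobs.Specialized;

    var serviceClient = new BlobServiceClient(Environment.GetEnvironmentVariable("cnStorage"));
    var container = serviceClient.GetBlobContainerClient("dataexport");
    container.CreateIfNotExists();
    var blockBlob = container.GetBlockBlobClient(fileName); // extension method from Azure.Storage.Blobs.Specialized

    // OpenWrite returns a Stream that stages blocks as data is written and commits
    // them when the stream is disposed, so the whole CSV never has to fit in memory.
    using (var blobStream = blockBlob.OpenWrite(overwrite: true))
    using (var writer = new StreamWriter(blobStream))
    using (var csvWriter = new CsvWriter(writer, CultureInfo.InvariantCulture))
    {
        // write the header and the rows here, as in the original code
    }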

I solved the problem by writing directly to Azure Blob Storage with blob.OpenWriteAsync():
    public static async Task UPLOAD_CSVAsync(DataTable dt, string fileName, ILogger log)
    {
        try
        {
            // Retrieve storage account from connection string.
            var cnStorage = Environment.GetEnvironmentVariable("cnStorage");
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(cnStorage);
            // Create the blob client.
            CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
            // Retrieve reference to a previously created container.
            CloudBlobContainer container = blobClient.GetContainerReference("dataexport");
            bool exists = container.CreateIfNotExists();
            // Retrieve reference to a blob named "fileName".
            CloudBlockBlob blockBlob = container.GetBlockBlobReference(fileName);

            log.LogInformation($"C# BatchExpCSVsegnalazioni START UploadFromStream  at: {DateTime.Now}");
            await WriteDataTableToBlob(dt, blockBlob);
            log.LogInformation($"C# BatchExpCSVsegnalazioni END UploadFromStream  at: {DateTime.Now}");
        }
        catch (Exception ex)
        {
            log.LogError("error upload BatchExpCSVsegnalazioni: " + ex.Message);
        }
    }
    public static async Task WriteDataTableToBlob(DataTable dt, CloudBlockBlob blob)
    {
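        // OpenWriteAsync returns a CloudBlobStream that uploads data to the blob in blocks
        // as it is written, so the CSV is streamed instead of being buffered in memory.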
        using (var writer = await blob.OpenWriteAsync())
        using (var streamWriter = new StreamWriter(writer))
        using (var csvWriter = new CsvWriter(streamWriter, CultureInfo.InvariantCulture))
        {
            csvWriter.Configuration.TypeConverterOptionsCache.GetOptions<DateTime>().Formats = new[] { "dd/MM/yyyy" };
            foreach (DataColumn column in dt.Columns)
            {
                csvWriter.WriteField(column.ColumnName);
            }
            csvWriter.NextRecord();

            foreach (DataRow row in dt.Rows)
            {
                for (var i = 0; i < dt.Columns.Count; i++)
                {
                    csvWriter.WriteField(row[i]);
                }
                csvWriter.NextRecord();
            }
            csvWriter.Flush();
        }
    }
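
For reference, a hypothetical call site (the DataTable loading and the file name are illustrative):

    // e.g. from an Azure Function or batch job; LoadSegnalazioni() is a placeholder
    // for however the DataTable is actually filled.
    DataTable dt = LoadSegnalazioni();
    string fileName = $"segnalazioni_{DateTime.UtcNow:yyyyMMdd_HHmmss}.csv";
    await UPLOAD_CSVAsync(dt, fileName, log);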