How Can I Reduce the Memory Usage for a Huge File Transfer?

I have to transfer some large files (around 2 GB) to a web service:

public bool UploadContent(System.Web.HttpContext context)
{
    var file = context.Request.Files[0];
    var fileName = file.FileName;

    byte[] fileBytes = new byte[file.ContentLength];
    file.InputStream.Read(fileBytes, 0, fileBytes.Length);
    client.createResource(fileBytes);
    return true;
}

The HttpContext already holds the file's contents in Files[0], but I can't see any way to pass those bytes to the web service's createResource(byte[] contents) method without copying them into a byte array... so I'm eating memory like candy.

Is there a more efficient way to do this?

EDIT: client.createResource() is part of a COTS product and modifying it is outside our control.

You can send the file in chunks instead of sending the whole byte array. Seek progressively through the file being uploaded and merge each chunk into the bytes already saved on the server. You will need to update your client.CreateResource method, which only works if you are allowed to modify it :)

Add the following parameters:

string fileName // to locate the file when the chunks start arriving
byte[] buffer // the chunk that will be sent to the server via the web service
long offset // tells you how much data has already been uploaded, so you can seek into the file and merge the buffer

Now your method will look like this:

public bool CreateResource(string FileName, byte[] buffer, long Offset)
{
    bool retVal = false;
    try
    {
        string FilePath = @"d:\temp\uploadTest.extension"; // verbatim string, so \t isn't a tab escape
        if (Offset == 0)
            File.Create(FilePath).Close();
        // open a file stream and write the buffer. 
        // Don't open with FileMode.Append because the transfer may wish to
        // start at a different point
        using (FileStream fs = new FileStream(FilePath, FileMode.Open,
            FileAccess.ReadWrite, FileShare.Read))
        {
            fs.Seek(Offset, SeekOrigin.Begin);
            fs.Write(buffer, 0, buffer.Length);
        }
        retVal = true;
    }
    catch (Exception ex)
    {
        // Log exception or send error message to someone who cares
    }
    return retVal;
}
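The merge logic above (create the file on the first chunk, then seek to the offset and write) is language-agnostic. Here is a minimal sketch of the same idea in Python; the names are illustrative and not part of the web service:

```python
import os
import tempfile

def create_resource(file_path, buffer, offset):
    """Merge one chunk into the target file at the given offset."""
    # First chunk: create (or truncate) the destination file.
    if offset == 0:
        open(file_path, "wb").close()
    # Seek instead of appending, so a retried or resumed chunk
    # lands at the correct position even if it is sent again.
    with open(file_path, "r+b") as f:
        f.seek(offset)
        f.write(buffer)
    return True

# Reassembling two chunks reproduces the original payload.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "uploadTest.bin")
    data = bytes(range(256)) * 10          # 2560-byte payload
    create_resource(path, data[:1024], 0)
    create_resource(path, data[1024:], 1024)
    with open(path, "rb") as f:
        assert f.read() == data
```

Seeking rather than appending is what makes retries safe: resending the same chunk just overwrites the same region.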

Now, to read the file in chunks from the HttpPostedFile's InputStream, try this code:

public bool UploadContent(System.Web.HttpContext context)
{
    //the file that we want to upload
    var file = context.Request.Files[0];
    var fs = file.InputStream;

    long Offset = 0; // starting offset; long, because the file can be larger than int.MaxValue

    //define the chunk size
    int ChunkSize = 65536; // 64 KB

    //define the buffer array according to the chunksize.
    byte[] Buffer = new byte[ChunkSize];

    try
    {
        long FileSize = file.ContentLength; // File size of file being uploaded.
        // reading the file.
        fs.Position = Offset;
        int BytesRead = 0;
        while (Offset != FileSize) // continue uploading the file chunks until offset = file size.
        {
            BytesRead = fs.Read(Buffer, 0, ChunkSize); // read the next chunk 
            if (BytesRead == 0)
            {
                break; // end of stream reached; avoid looping forever
            }
            if (BytesRead != Buffer.Length)
            {
                ChunkSize = BytesRead;
                byte[] TrimmedBuffer = new byte[BytesRead];
                Array.Copy(Buffer, TrimmedBuffer, BytesRead);
                Buffer = TrimmedBuffer; // the trimmed buffer should become the new 'buffer'
            }
            // send this chunk to the server. it is sent as a byte[] parameter, 
            // but the client and server have been configured to encode byte[] using MTOM. 
            bool ChunkAppended = client.createResource(file.FileName, Buffer, Offset);
            if (!ChunkAppended)
            {
                break;
            }

            // Offset is only updated AFTER a successful send of the bytes. 
            Offset += BytesRead; // save the offset position for resume
        }
    }
    catch (Exception ex)
    {
        // Log the exception; Offset records where a resumed upload can restart.
    }
    finally
    {
        fs.Close();
    }
    return Offset == file.ContentLength; // true only if every chunk was sent
}

Disclaimer: I haven't tested this code. It is sample code showing how a large file can be uploaded without exhausting memory.
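The client-side loop boils down to simple offset arithmetic: read a fixed-size chunk, send only the bytes actually read, and advance the offset only after a successful send. A language-neutral sketch of that arithmetic, where send_chunk is a stand-in for client.createResource and not a real API:

```python
import io

def upload_in_chunks(stream, total_size, send_chunk, chunk_size=65536):
    """Feed the stream to send_chunk(buffer, offset) one chunk at a time.

    The offset advances only after a successful send, so a failed
    chunk leaves the offset where a resumed upload should restart.
    """
    offset = 0
    while offset != total_size:
        buffer = stream.read(chunk_size)
        if not buffer:
            break  # end of stream; avoid looping forever
        # The final read may be short; send only the bytes actually read.
        if not send_chunk(buffer, offset):
            break
        offset += len(buffer)
    return offset

# A 150,000-byte payload splits into two full 64 KB chunks plus a remainder.
data = b"x" * 150_000
chunks = []
sent = upload_in_chunks(io.BytesIO(data), len(data),
                        lambda buf, off: chunks.append((off, len(buf))) or True)
assert sent == len(data)
assert chunks == [(0, 65536), (65536, 65536), (131072, 18928)]
```

Because the final offset is returned, a caller could persist it and resume an interrupted upload from that point.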

Reference: Source article.