OutOfMemoryException when I read 500MB FileStream


The code you show reads the entire content of the 500MB file into a contiguous region of memory. It's not surprising that you get an out-of-memory condition.

The solution is, "don't do that."

What are you really trying to do?


If you want to read a file completely, there's a much simpler way than the ReadFully method you use. Try this:

using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
{
    byte[] buffer = new byte[fs.Length];
    int bytesRead = fs.Read(buffer, 0, buffer.Length);
    // buffer now contains the entire contents of the file
    // (strictly speaking, Read is not guaranteed to fill the buffer in a single
    //  call, so production code should loop until all fs.Length bytes are read)
}

But... using this code won't solve your problem. It might work for a 500MB file. It won't work for a 750MB file, or a 1GB file. At some point you will reach the limit of memory on your system and hit the same out-of-memory error you started with.

The problem is that you are trying to hold the entire contents of the file in memory at one time. This is usually unnecessary, and it is doomed to fail as the files grow in size. It's no problem when the file size is 16KB. At 500MB, it's the wrong approach.

This is why I have asked several times: what are you really trying to do?


It sounds like you want to send the contents of a file out to the ASP.NET response stream. That is the real question: not "how do I read a 500MB file into memory?" but "how do I send a large file to the ASP.NET Response stream?"

For this, once again, it's fairly simple.

// emit the contents of a file into the ASP.NET Response stream
using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
{
    Response.BufferOutput = false;   // to prevent buffering
    byte[] buffer = new byte[1024];
    int bytesRead = 0;
    while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
    {
        Response.OutputStream.Write(buffer, 0, bytesRead);
    }
}

What it does is iteratively read a chunk from the file, and write that chunk to the Response stream, until there is nothing more to read in the file. This is what is meant by "streaming IO". The data passes through your logic, but is never held all in one place, just as a stream of water passes through a sluice. In this example, never is there more than 1k of file data in memory at one time (well, not held by your application code, anyway. There are other IO buffers lower in the stack.)

This is a common pattern in streamed IO. Learn it, use it.

The one trick when pumping data out to ASP.NET's Response.OutputStream is to set BufferOutput = false. By default, ASP.NET tries to buffer its output. In this case (a 500MB file), buffering is a bad idea. Setting the BufferOutput property to false prevents ASP.NET from attempting to buffer all the file data before sending the first byte. Use that when you know the file you're sending is very large. The data will still get sent to the browser correctly.

And even this isn't the complete solution. You'll need to set the response headers and so on. I guess you're aware of that, though.
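
For reference, here's a rough sketch of what that header setup might look like, placed before the first chunk is written. The content type and filename below are placeholders I've chosen, not values from the original question:

// Hypothetical header setup for a large download; adjust to your actual file type and name.
Response.ContentType = "application/octet-stream";
Response.AddHeader("Content-Disposition", "attachment; filename=\"bigfile.bin\"");
Response.AddHeader("Content-Length", new FileInfo(filePath).Length.ToString());

Sending Content-Length up front lets the browser show download progress even though the response itself is not buffered.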


You are doubling your buffer size at each reallocation, which means the previously allocated blocks can never be reused (they effectively leak). By the time you get to 500 MB, you've chewed up 1 GB plus overheads. In fact, it might be 2 GB, since if you hit 512 MB, your next allocation will be 1 GB. On a 32-bit system, this bankrupts your process's address space.

Since it's a normal file you are reading in, just query the filesystem for its size and preallocate the buffer in one go.
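
A minimal sketch of that approach, using a hypothetical filePath variable as in the other answer's code:

// Query the file size once and allocate a single buffer of exactly that size.
long length = new FileInfo(filePath).Length;
byte[] buffer = new byte[length];

using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
{
    int offset = 0;
    int bytesRead;
    // Read may return fewer bytes than requested, so loop until the buffer is full.
    while (offset < buffer.Length &&
           (bytesRead = fs.Read(buffer, offset, buffer.Length - offset)) > 0)
    {
        offset += bytesRead;
    }
}

One allocation, no copying, and no leaked intermediate blocks.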


ASP.NET Core Middleware

public static async Task<string> GetRequestBody(HttpContext context)
{
    string bodyText = string.Empty;
    try
    {
        context.Request.EnableRewind();
        int offset = 0, bytesRead = 0;
        var buffer = new byte[5096];
        while ((bytesRead = await context.Request.Body.ReadAsync(buffer, offset, buffer.Length - offset)) > 0)
        {
            offset += bytesRead;
            if (offset == buffer.Length)
            {
                int nextByte = context.Request.Body.ReadByte();
                if (nextByte == -1)
                {
                    break;
                }
                byte[] newBuffer = new byte[buffer.Length * 2];
                Array.Copy(buffer, newBuffer, buffer.Length); // how to avoid copy
                newBuffer[offset] = (byte)nextByte; // how to avoid boxing
                buffer = newBuffer;
                offset++;
            }
            if (offset > 4194304)
            {
                //log.Warn("Middleware/GetRequestBody--> Request length exceeding limit");
                break;
            }
        }
        // decode only the bytes actually read, not the whole (partially empty) buffer
        bodyText = Encoding.UTF8.GetString(buffer, 0, offset);
    }
    catch (Exception ex)
    {
        //log.Debug(ex, "Middleware/GetRequestBody--> Request length exceeding limit");
    }
    context.Request.Body.Position = 0;
    return bodyText;
}
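
As an aside, here is a simpler sketch (mine, not part of the original answer) that avoids the manual buffer doubling. It assumes ASP.NET Core 2.1 or later, where the EnableBuffering extension in Microsoft.AspNetCore.Http replaces EnableRewind, and it does not enforce the 4 MB cap from the code above:

// Buffer the request, copy the body into a MemoryStream (which grows itself),
// then rewind so later middleware can read the body again.
public static async Task<string> GetRequestBodyBuffered(HttpContext context)
{
    context.Request.EnableBuffering();
    using (var ms = new MemoryStream())
    {
        await context.Request.Body.CopyToAsync(ms);
        context.Request.Body.Position = 0;   // rewind for the next reader
        return Encoding.UTF8.GetString(ms.ToArray());
    }
}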