Java OutOfMemoryError in reading a large text file


Try to use java.nio.MappedByteBuffer.

http://docs.oracle.com/javase/7/docs/api/java/nio/MappedByteBuffer.html

You can map a file's contents into memory without copying them manually. Modern operating systems offer memory-mapping, and Java has an API to utilize the feature.

If my understanding is correct, memory-mapping does not load the file's entire content into memory; pages are loaded and unloaded as necessary, so a 10GB file won't eat up your memory.
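For illustration, here is a minimal sketch of mapping a file with FileChannel.map (the file name bigfile.txt is a placeholder, and the newline count is just a stand-in for whatever processing you need). One caveat: a single map() call is limited to Integer.MAX_VALUE bytes (about 2GB), so a 10GB file has to be mapped in windows:

import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class MappedRead {
    public static void main(String[] args) throws IOException {
        // "bigfile.txt" is a placeholder; substitute your own path.
        try (FileChannel channel = FileChannel.open(
                Paths.get("bigfile.txt"), StandardOpenOption.READ)) {
            long fileSize = channel.size();
            long windowSize = 64 * 1024 * 1024; // map in 64MB windows; one map() is capped at 2GB
            long newlines = 0;
            for (long pos = 0; pos < fileSize; pos += windowSize) {
                long len = Math.min(windowSize, fileSize - pos);
                // The OS pages this region in and out on demand; nothing is copied up front.
                MappedByteBuffer window = channel.map(FileChannel.MapMode.READ_ONLY, pos, len);
                while (window.hasRemaining()) {
                    if (window.get() == '\n') {
                        newlines++;
                    }
                }
            }
            System.out.println("Lines: " + newlines);
        }
    }
}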


Even though you can increase the JVM memory limit, doing so is needless here: allocating a huge amount of memory, like 10GB, just to process a file is overkill and resource intensive.

Currently you are using a ByteArrayOutputStream, which keeps an internal buffer to hold the data. This line in your code keeps appending the last 2KB chunk read from the file to the end of that buffer:

bArrStream.write(localbuffer, 0, i);

bArrStream keeps growing and eventually you run out of memory.

Instead you should reorganize your algorithm and process the file in a streaming way:

InputStream inFileReader = channelSFtp.get(path); // file read over SSH
byte[] localbuffer = new byte[2048];
int i = 0;
while (-1 != (i = inFileReader.read(localbuffer))) {
    // Deal with the current chunk here: localbuffer[0..i-1] holds the bytes just read
}
inFileReader.close();
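Since your file is text, if the processing is line-oriented you can also wrap the stream in a BufferedReader so only the current line is held in memory. A sketch, reusing channelSFtp and path from your question and assuming UTF-8 content:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

// channelSFtp and path come from the question's SFTP setup.
try (BufferedReader reader = new BufferedReader(
        new InputStreamReader(channelSFtp.get(path), StandardCharsets.UTF_8))) {
    String line;
    while ((line = reader.readLine()) != null) {
        // Process one line at a time; only this line is held in memory.
    }
}

The try-with-resources block also takes care of closing the stream, even if processing throws.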


The Java virtual machine (JVM) runs with a fixed upper memory limit, which you can modify thus:

java -Xmx1024m ....

e.g. the above option (-Xmx...) sets the limit to 1024 megabytes. You can amend it as necessary (within the limits of your machine, OS, etc.). Note that this is different from traditional applications, which allocate more and more memory from the OS on demand.
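To confirm the limit the running JVM actually picked up, a quick check using the standard Runtime API (maxMemory() roughly corresponds to the -Xmx setting):

public class HeapCheck {
    public static void main(String[] args) {
        // Maximum heap the JVM will attempt to use, roughly the -Xmx value.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}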

However, a better solution is to rework your application so that you don't need to load the whole file into memory in one go. That way you don't have to tune your JVM, and you don't impose a huge memory footprint.