HBase java.lang.OutOfMemoryError


I encountered this error yesterday. What was happening in my case is that I was creating a new HTable instance for every record I put, which spawned far too many threads. (I was using a mapper and creating the HTable inside the map function.)

I'd check whether your connection to HBase is being created repeatedly (inside a loop or a map function). If it is, restructuring the code to instantiate fewer connections to HBase (I used HTable) may solve the problem. It solved mine. A sketch of the fix is below.
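Here's a minimal sketch of what that looks like, assuming the old-style HTable API and a classic MapReduce mapper; the table name, column family, and input types below are made up for illustration. The HTable is created once per task in setup() and closed in cleanup(), so every map() call reuses the same connection:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class MyMapper extends Mapper<LongWritable, Text, Text, Text> {

    private HTable table;

    @Override
    protected void setup(Context context) throws IOException {
        // Create the HTable once per task, not once per record.
        Configuration conf = HBaseConfiguration.create(context.getConfiguration());
        table = new HTable(conf, "my_table");
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Each put reuses the single table instance and its thread pool.
        Put put = new Put(Bytes.toBytes(value.toString()));
        put.add(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes("v"));
        table.put(put);
    }

    @Override
    protected void cleanup(Context context) throws IOException {
        // Release the connection and its threads when the task finishes.
        table.close();
    }
}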

HTH


I encountered this error when I was using an HTablePool instance to get my HTableInterface instances, but after using them I forgot to call the close() method on them.


I also encountered the same issue, and as kosii explained above, the root cause was not closing the HTableInterface instances I got from the HTablePool after use.

HTableInterface table = tablePool.getTable(tableName);
// Do the work...
table.close();
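To make sure the table always goes back to the pool even when the work in between throws, wrap it in try/finally. A minimal sketch, assuming the (since deprecated) HTablePool API; the writeRow helper, table name, and column family are hypothetical:

import java.io.IOException;
import org.apache.hadoop.hbase.client.HTableInterface;
import org.apache.hadoop.hbase.client.HTablePool;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class PooledTableExample {

    // Hypothetical helper showing the close-in-finally pattern.
    static void writeRow(HTablePool tablePool, String tableName, byte[] row)
            throws IOException {
        HTableInterface table = tablePool.getTable(tableName);
        try {
            Put put = new Put(row);
            put.add(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes("value"));
            table.put(put);
        } finally {
            // Always release the table back to the pool; leaked instances
            // are what eventually exhaust threads and memory.
            table.close();
        }
    }
}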