How to persist LARGE BLOBs (>100MB) in Oracle using Hibernate



I was having the same problems as you when attempting to map using the "blob" type. Here is a link to a post I made on the Hibernate forum: https://forum.hibernate.org/viewtopic.php?p=2452481#p2452481

Hibernate 3.6.9
Oracle Driver 11.2.0.2.0
Oracle Database 11.2.0.2.0

To fix the problem I used a custom UserType for the Blob, with java.sql.Blob as the return type.

Here are the key method implementations of this UserType:

public Object nullSafeGet(ResultSet rs, String[] names, Object owner)
      throws HibernateException, SQLException {
   Blob blob = rs.getBlob(names[0]);
   if (blob == null)
      return null;
   return blob;
}

public void nullSafeSet(PreparedStatement st, Object value, int index)
      throws HibernateException, SQLException {
   if (value == null) {
      st.setNull(index, sqlTypes()[0]);
   }
   else {
      InputStream in = null;
      OutputStream out = null;
      // oracle.sql.BLOB
      BLOB tempBlob = BLOB.createTemporary(st.getConnection(), true, BLOB.DURATION_SESSION);
      tempBlob.open(BLOB.MODE_READWRITE);
      out = tempBlob.getBinaryOutputStream();
      Blob valueAsBlob = (Blob) value;
      in = valueAsBlob.getBinaryStream();
      StreamUtil.toOutput(in, out);
      out.flush();
      StreamUtil.close(out);
      tempBlob.close();
      st.setBlob(index, tempBlob);
      StreamUtil.close(in);
   }
}
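The snippet above relies on a `StreamUtil` helper that isn't shown. A minimal stand-in could look like the following, assuming `toOutput` copies all bytes in fixed-size chunks (so the whole BLOB never sits in memory) and `close` closes quietly:

```java
import java.io.Closeable;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Hypothetical stand-in for the StreamUtil helper referenced above;
// the real implementation in the original post may differ.
public final class StreamUtil {
    private static final int BUFFER_SIZE = 8192;

    // Copy every byte from in to out through a small buffer,
    // streaming the data instead of materializing it in memory.
    public static void toOutput(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[BUFFER_SIZE];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
    }

    // Close quietly so a failure on close() does not mask the real error.
    public static void close(Closeable c) {
        if (c == null) {
            return;
        }
        try {
            c.close();
        } catch (IOException ignored) {
        }
    }
}
```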


Personally I store files up to 200MB in Oracle BLOB columns using Hibernate, so I can assure you it works. So...

You should try a newer version of the Oracle JDBC driver. This behavior of using byte arrays instead of streams seems to have changed a little over time, and the drivers are backward compatible. I'm not sure whether that will fix your problem, but it works for me.

Additionally, you should switch to org.hibernate.dialect.Oracle10gDialect - which retires the use of the oracle.jdbc.driver package in favor of oracle.jdbc - and it might also help.
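Assuming an XML-based Hibernate configuration (hibernate.cfg.xml), the dialect switch is a one-line property change:

```xml
<!-- Switch to the Oracle10gDialect, which uses oracle.jdbc instead of
     the deprecated oracle.jdbc.driver package -->
<property name="hibernate.dialect">org.hibernate.dialect.Oracle10gDialect</property>
```

If you configure Hibernate through a properties file or programmatically instead, set the same `hibernate.dialect` key there.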


I just discovered this question when I was having the same problem with Oracle and Hibernate. The issue is in Hibernate's blob handling: depending on the Dialect in use, it seems to copy the whole blob into memory. I guess they do so because it's required by some databases/drivers. For Oracle, though, this behaviour does not seem to be required.

The fix is pretty simple, just create a custom OracleDialect containing this code:

public class Oracle10DialectWithoutInputStreamToInsertBlob extends Oracle10gDialect {
    @Override
    public boolean useInputStreamToInsertBlob() {
        return false;
    }
}

Next you need to configure your session factory to use this dialect. I've tested it with the ojdbc6-11.2.0.1.0 driver against Oracle 11g, and confirmed that it fixes the issue with memory consumption.
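Assuming an XML-based configuration and that the dialect class lives in a package of your choosing (the package name below is a placeholder), wiring it in looks like:

```xml
<!-- hibernate.cfg.xml: point Hibernate at the custom dialect.
     Replace com.example with the actual package of your dialect class. -->
<property name="hibernate.dialect">com.example.Oracle10DialectWithoutInputStreamToInsertBlob</property>
```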

If any of you try this with another Oracle database and/or a different Oracle driver, I would love to hear whether it works for you. If it works with several configurations, I'll send a pull request to the Hibernate team.