Re: Inserting large BLOBs via JDBC - OutOfMemoryError

From: hhaag(at)gmx(dot)de
To: Barry Lind <barry(at)xythos(dot)com>
Cc: hhaag(at)gmx(dot)de, pgsql-jdbc(at)postgresql(dot)org, pgsql-jdbc-owner(at)postgresql(dot)org
Subject: Re: Inserting large BLOBs via JDBC - OutOfMemoryError
Date: 2002-08-16 08:14:31
Message-ID: 19217.1029485671@www9.gmx.net
Lists: pgsql-jdbc

There might be a workaround:

From JDBC 2.0 on, the java.sql.Blob interface allows BLOBs to be manipulated.
AbstractJdbc2Statement.setBlob() (code posted below) appears to create the large
object up front and stream it into the database; the actual INSERT command then
contains only the OID, which avoids the memory problem.

I'll let you know how it goes.

public void setBlob(int i, Blob x) throws SQLException
{
    InputStream l_inStream = x.getBinaryStream();
    int l_length = (int) x.length();
    LargeObjectManager lom = connection.getLargeObjectAPI();
    int oid = lom.create();
    LargeObject lob = lom.open(oid);
    OutputStream los = lob.getOutputStream();
    try
    {
        // could be buffered, but then the OutputStream returned by LargeObject
        // is buffered internally anyhow, so there would be no performance
        // boost gained, if anything it would be worse!
        int c = l_inStream.read();
        int p = 0;
        while (c > -1 && p < l_length)
        {
            los.write(c);
            c = l_inStream.read();
            p++;
        }
        los.close();
    }
    catch (IOException se)
    {
        throw new PSQLException("postgresql.unusual", se);
    }
    // lob is closed by the stream so don't call lob.close()
    setInt(i, oid);
}
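
For reference, a rough, untested sketch of how application code could do the same
thing directly: stream a file into a large object via the LargeObject API and
insert only the resulting OID. The table name (images), the cast to
org.postgresql.PGConnection (older drivers expose getLargeObjectAPI() on their own
Connection class instead), and the 8 KB buffer size are just assumptions for
illustration; large objects also have to be used inside a transaction, so
autocommit is switched off.

import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

import org.postgresql.PGConnection;
import org.postgresql.largeobject.LargeObject;
import org.postgresql.largeobject.LargeObjectManager;

public class LargeObjectInsert
{
    public static void main(String[] args) throws Exception
    {
        Connection conn = DriverManager.getConnection(
            "jdbc:postgresql://localhost/test", "user", "password");
        conn.setAutoCommit(false); // large objects only work inside a transaction

        // create the large object and stream the file into it, one buffer at a time
        LargeObjectManager lom = ((PGConnection) conn).getLargeObjectAPI();
        int oid = lom.create();
        LargeObject lob = lom.open(oid);
        OutputStream los = lob.getOutputStream();

        InputStream in = new FileInputStream(args[0]);
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) > -1)
        {
            los.write(buf, 0, n);
        }
        los.close();   // also closes the underlying large object
        in.close();

        // the INSERT itself carries only the OID, not the BLOB data
        PreparedStatement ps = conn.prepareStatement(
            "INSERT INTO images (name, data) VALUES (?, ?)");
        ps.setString(1, args[0]);
        ps.setInt(2, oid);
        ps.executeUpdate();
        ps.close();

        conn.commit();
        conn.close();
    }
}

Since only one buffer is ever held in memory, this should not hit the
OutOfMemoryError no matter how large the file is.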

--
GMX - The communication platform on the Internet.
http://www.gmx.net
