From: gurkan(at)resolution(dot)com
To: pgsql-jdbc(at)postgresql(dot)org
Subject: setting large bytea values
Date: 2005-09-01 23:32:18
Message-ID: 1125617538.43178f82aaebf@www.resolution.com
Lists: pgsql-jdbc

Hi,
I am working on a conversion program from Informix to PostgreSQL, and one of my
tables holds large binary data (bytea).

I need to copy a single row whose docdata column is close to 32MB. The program
seems to read the data from Informix, but the copy into Postgres (via
PreparedStatement) fails with an OutOfMemoryError, even when I run it with a
larger heap:
java -Xms200m -Xmx700m AddDoc
I do not want to use the LargeObject API (I partition the data by date, moving
rows from one table into many tables, and large objects are all stored in the
pg_largeobject table). Here is the relevant part of the code; any help is
appreciated. How do I copy large values like 32MB or greater?
Thanks

// Read the document from Informix into a buffer (a fixed-size read
// buffer instead of reading one byte at a time).
wbin = informixRs.getBinaryStream("docdata");
ByteArrayOutputStream wbout = new ByteArrayOutputStream();
byte[] buf = new byte[8192];
int n;
while ((n = wbin.read(buf)) != -1) {
    wbout.write(buf, 0, n);
}
byte[] data = wbout.toByteArray(); // copy once and reuse; each call allocates a fresh copy
outln("Size in KB is: " + data.length / 1024);
size += data.length;
inp = new ByteArrayInputStream(data);
wbin = null;

postgresStmt = postgresConn.prepareStatement(
    "INSERT INTO " + tableName + " (id, docdata) VALUES (?, ?)");
postgresStmt.setInt(1, id);
if (docdef_id.compareTo("12720") == 0) {
    outln("\n\nbefore out of memory");
    // use the known length; inp.available() is not guaranteed to be the full size
    postgresStmt.setBinaryStream(2, inp, data.length);
    outln("\n\nafter out of memory"); // never reached
} else {
    postgresStmt.setBinaryStream(2, inp, data.length);
}
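[Editor's note] One likely contributor to the OutOfMemoryError is that the original code calls wbout.toByteArray() three times, allocating a fresh ~32MB copy on each call, and reads the stream one byte at a time. A minimal sketch of a single-copy read helper, which could then be handed to setBytes() or to setBinaryStream() with an explicit length (readAll is a hypothetical helper name; the JDBC calls are shown as comments since they need a live connection):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class CopyBlob {
    // Drain an InputStream into a byte[] using a fixed-size buffer.
    static byte[] readAll(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray(); // allocate the final copy exactly once
    }

    public static void main(String[] args) throws IOException {
        byte[] src = new byte[32 * 1024]; // stand-in for the 32MB document
        byte[] data = readAll(new ByteArrayInputStream(src));
        System.out.println(data.length); // prints 32768

        // In the real program, under the same connection as above:
        //   postgresStmt.setBytes(2, data);
        // or, with an explicit length instead of inp.available():
        //   postgresStmt.setBinaryStream(2, new ByteArrayInputStream(data), data.length);
    }
}
```

Even with a single copy, the JDBC driver itself may buffer or escape the bytea parameter before sending it, multiplying memory use, so a larger heap or a newer driver version may still be needed on top of this change.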

-------------------------------------------------
This mail sent through IMP: www.resolution.com
