Tonight I rewrote the part of an application that deals with HTTP uploads,
because it turned out it has to handle larger files than originally intended,
and I was getting OutOfMemory errors.
So I rewrote everything so that an InputStream is passed to the JDBC driver
and the files are never completely loaded into memory. However I am still
getting an OutOfMemory error for large files. While it is difficult to
pinpoint exactly where due to the lack of a stack trace, it does look like
the driver is causing it.
Does the JDBC driver handle InputStreams intelligently at all? If so, does it
do so under all circumstances? In this case I am putting data into a column
of type 'bytea' and am using PreparedStatement.setBinaryStream().
The backend is PostgreSQL 7.4.1, and I am using the driver for 7.4.1
(pg74.1jdbc3.jar). Running under JDK 1.4.2.
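For reference, the pattern I am using looks roughly like this (table and
column names are made up for illustration; JDK 1.4 style, so no
try-with-resources):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class StreamUpload {
    // Hypothetical schema: CREATE TABLE uploads (name text, data bytea)
    static final String INSERT_SQL =
        "INSERT INTO uploads (name, data) VALUES (?, ?)";

    // Hand the driver an InputStream plus an explicit length instead of a
    // byte[], in the hope that it streams the data to the backend rather
    // than buffering the whole file in memory.
    public static void insertFile(Connection conn, File file)
            throws SQLException, IOException {
        FileInputStream in = new FileInputStream(file);
        try {
            PreparedStatement ps = conn.prepareStatement(INSERT_SQL);
            try {
                ps.setString(1, file.getName());
                // The JDBC 3 setBinaryStream variant requires the
                // stream length up front.
                ps.setBinaryStream(2, in, (int) file.length());
                ps.executeUpdate();
            } finally {
                ps.close();
            }
        } finally {
            in.close();
        }
    }
}
```

The point is that no byte[] of the full file is ever created on my side;
whether the driver itself buffers the stream internally is exactly the
question.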
Do I need to use some other type in the database in order for input streams to
be handled properly? Do I have to use some PostgreSQL specific API? Does the
JDBC driver need to be changed to support this?
I can always fall back to using files on the filesystem, but then I will lose
all the niceties of ACID transactions, which I get automatically by keeping
everything in the database.
/ Peter Schuller, InfiDyne Technologies HB
PGP userID: 0xE9758B7D or 'Peter Schuller <peter(dot)schuller(at)infidyne(dot)com>'
Key retrieval: Send an E-Mail to getpgpkey(at)scode(dot)org
E-Mail: peter(dot)schuller(at)infidyne(dot)com Web: http://www.scode.org