From: David Wall <d(dot)wall(at)computer(dot)org>
To: pgsql-jdbc <pgsql-jdbc(at)postgresql(dot)org>
Subject: JDBC Blob helper class & streaming uploaded data into PG
Date: 2009-02-04 19:18:26
Message-ID: 4989EA02.5020703@computer.org
Lists: pgsql-jdbc
Does anybody have a JDBC Blob helper class so we can use the
setBlob()/getBlob() calls in JDBC for PG 8.3? It would be a class that
implements the java.sql.Blob interface.
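For concreteness, here is a minimal in-memory sketch of such a helper (an assumption on my part, not an existing driver class): it backs the interface with a byte array, leaves the position() and setBinaryStream() methods unimplemented, and remembers that JDBC Blob offsets are 1-based. A production version would wrap a PG large object rather than a byte array.

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.sql.Blob;
import java.sql.SQLException;
import java.util.Arrays;

class ByteArrayBlob implements Blob {
    private byte[] data;

    ByteArrayBlob(byte[] data) { this.data = data.clone(); }

    public long length() { return data.length; }

    // JDBC Blob positions are 1-based, hence the "pos - 1" below.
    public byte[] getBytes(long pos, int length) {
        int start = (int) pos - 1;
        int end = Math.min(start + length, data.length);
        return Arrays.copyOfRange(data, start, end);
    }

    public InputStream getBinaryStream() {
        return new ByteArrayInputStream(data);
    }

    public InputStream getBinaryStream(long pos, long length) {
        return new ByteArrayInputStream(data, (int) pos - 1, (int) length);
    }

    public int setBytes(long pos, byte[] bytes) {
        return setBytes(pos, bytes, 0, bytes.length);
    }

    // Grows the backing array if the write extends past the current end.
    public int setBytes(long pos, byte[] bytes, int offset, int len) {
        int start = (int) pos - 1;
        if (start + len > data.length) data = Arrays.copyOf(data, start + len);
        System.arraycopy(bytes, offset, data, start, len);
        return len;
    }

    public void truncate(long len) { data = Arrays.copyOf(data, (int) len); }

    public void free() { data = new byte[0]; }

    // Stubs: not needed for plain setBlob()/getBlob() round-trips.
    public long position(byte[] pattern, long start) throws SQLException {
        throw new SQLException("position() not implemented");
    }
    public long position(Blob pattern, long start) throws SQLException {
        throw new SQLException("position() not implemented");
    }
    public OutputStream setBinaryStream(long pos) throws SQLException {
        throw new SQLException("setBinaryStream() not implemented");
    }
}
```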
Does anybody have any code samples showing how I'd go about reading a file
in blocks, compressing each block, and then streaming it into a Blob (LO)?
We are starting to process very large files uploaded through a web app, so
it's no longer possible to hold an entire file in memory before it's
stored in PG.
The pseudo-code is something like:

    while ( hasMoreHttpPostData )
    {
        read chunk from HTTP POST
        compress chunk
        write chunk to DB
    }
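The loop above can be sketched concretely with standard java.io streams. This is only an outline of the chunking pattern: the ByteArrayOutputStream stands in for whatever OutputStream you write the large object through (e.g. one obtained via the PG LargeObject API), and the 8 KB chunk size is an arbitrary choice. Nothing beyond one chunk is ever buffered in memory.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.zip.GZIPOutputStream;

class ChunkedUpload {
    // Copies the POST body to the destination stream in 8 KB chunks,
    // compressing on the fly. finish() writes the GZIP trailer without
    // closing dbStream, since the caller may close the LO separately.
    static void copyCompressed(InputStream httpPost, OutputStream dbStream)
            throws IOException {
        GZIPOutputStream gzip = new GZIPOutputStream(dbStream);
        byte[] chunk = new byte[8192];
        int n;
        while ((n = httpPost.read(chunk)) != -1) {  // read chunk from HTTP POST
            gzip.write(chunk, 0, n);                // compress + write chunk
        }
        gzip.finish();
    }
}
```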
If possible, we'd like to use a streaming cipher (for encryption) and
GZIP to write the compressed data, with a Blob to store it in the database.
Do these APIs support writing in chunks, or must we actually loop through
each byte (which seems painfully slow)?
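They do support chunks: both GZIPOutputStream and CipherOutputStream are OutputStream subclasses, so they accept write(byte[], int, int) and never require byte-at-a-time loops. The sketch below (my own illustration, not tested against PG) stacks them as plaintext -> GZIP -> AES -> sink, compressing before encrypting since ciphertext doesn't compress. It uses AES via Cipher.getInstance("AES"), which defaults to ECB with padding; that's fine for a demo but a real deployment would pick an explicit mode with an IV.

```java
import javax.crypto.Cipher;
import javax.crypto.CipherInputStream;
import javax.crypto.CipherOutputStream;
import javax.crypto.spec.SecretKeySpec;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

class StackedStreams {
    static byte[] compressAndEncrypt(byte[] input, SecretKeySpec key)
            throws Exception {
        Cipher enc = Cipher.getInstance("AES");  // demo only: default ECB mode
        enc.init(Cipher.ENCRYPT_MODE, key);
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        // Each layer forwards block writes, so data moves a chunk at a time.
        CipherOutputStream cos = new CipherOutputStream(sink, enc);
        GZIPOutputStream gzip = new GZIPOutputStream(cos);
        gzip.write(input, 0, input.length);
        gzip.close();  // also closes the cipher stream, flushing the final block
        return sink.toByteArray();
    }

    static byte[] decryptAndDecompress(byte[] input, SecretKeySpec key)
            throws Exception {
        Cipher dec = Cipher.getInstance("AES");
        dec.init(Cipher.DECRYPT_MODE, key);
        GZIPInputStream gin = new GZIPInputStream(
                new CipherInputStream(new ByteArrayInputStream(input), dec));
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = gin.read(buf)) != -1) out.write(buf, 0, n);
        return out.toByteArray();
    }
}
```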
Thanks,
David