Re: Inserting 'large' amounts of data

From: dmp <danap(at)ttc-cmc(dot)net>
To: Mario Splivalo <mario(dot)splivalo(at)megafon(dot)hr>
Cc: pgsql-jdbc(at)postgresql(dot)org
Subject: Re: Inserting 'large' amounts of data
Date: 2009-08-26 17:28:42
Message-ID: 4A9570CA.1050505@ttc-cmc.net
Lists: pgsql-jdbc

>
>
>I have a web application which allows users to upload a lot of phone
>numbers. I need to store those numbers in a database. Usually, one would
>upload around 70k-100k of records, totaling around 2 MB in size.
>
>I'm using Tomcat as an application server, and JDBC to connect to a
>PostgreSQL 8.3 database.
>
>I will have around 20-50 concurrent users in peak hours, and even that is
>quite an overestimate.
>
>I could create a temporary file on the filesystem where the database
>cluster is located and then execute COPY mytable FROM
>'/tmp/upload-data/uuidofsomesort.csv' WITH CSV, but the 'problem' is
>that the database server and Tomcat reside on different physical machines.
>
>What would one recommend as the best way to insert those data?
>
> Mario
>
Hello Mario,

If the users already have the data in CSV format, why not let them do it
via the app server? The connection can be made across machines if set up
properly.

http://dandymadeproductions.com/projects/MyJSQLView/docs/javadocs/index.html
CSVDataImportThread.java

This class could be used as a basis, with some work. I have a bug with
data that contains semicolons, and it could be more robust, but it should
serve as a start.
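
Not that class verbatim, but the general idea is roughly this: a minimal
sketch of a batched JDBC insert on the app server side, assuming a
hypothetical single-column phone_numbers table and a CSV with one number
per line (table and column names are just placeholders):

    import java.io.BufferedReader;
    import java.io.Reader;
    import java.sql.Connection;
    import java.sql.PreparedStatement;

    public class CsvBatchImport {

        // Reads one phone number per line and sends the inserts in
        // batches, so the driver makes far fewer round trips than
        // row-by-row inserts would.
        public static void importNumbers(Connection conn, Reader csv)
                throws Exception {
            conn.setAutoCommit(false);
            try (BufferedReader in = new BufferedReader(csv);
                 PreparedStatement ps = conn.prepareStatement(
                         "INSERT INTO phone_numbers (number) VALUES (?)")) {
                String line;
                int pending = 0;
                while ((line = in.readLine()) != null) {
                    ps.setString(1, line.trim());
                    ps.addBatch();
                    if (++pending % 1000 == 0) {
                        ps.executeBatch();   // flush every 1000 rows
                    }
                }
                ps.executeBatch();           // flush the remainder
                conn.commit();
            } catch (Exception e) {
                conn.rollback();
                throw e;
            }
        }
    }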

John wrote:

> I believe you can use org.postgresql.copy.CopyIn() ... there are
> variants that use a writeToCopy() call to send the data, or a
> java.io.InputStream, or a java.io.Reader ...

This sounds a lot cleaner.
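
Something along these lines, I think (untested sketch; it assumes the
driver's CopyManager API, the same placeholder phone_numbers table, and a
direct driver connection that can be cast to PGConnection):

    import java.io.Reader;
    import java.sql.Connection;

    import org.postgresql.PGConnection;
    import org.postgresql.copy.CopyManager;

    public class CopyUpload {

        // Streams CSV data straight to the server over the existing JDBC
        // connection, so no file has to exist on the database host.
        public static long copyCsv(Connection conn, Reader csv)
                throws Exception {
            CopyManager copy = ((PGConnection) conn).getCopyAPI();
            return copy.copyIn(
                    "COPY phone_numbers (number) FROM STDIN WITH CSV", csv);
        }
    }

With a pooled connection the cast may fail, in which case the underlying
PGConnection has to be unwrapped first.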

danap.
