On 18 May 2010 11:51, Mag Gam <magawake(at)gmail(dot)com> wrote:
> I am writing an app where I have to populate file content into a database.
> The file looks like this,
> epoch, name, orderid, quantity
> There are about 100k rows like this. I am curious: is it best to
> process each line one at a time, or to batch all 100k rows into a
> single mass INSERT and then execute it?
Use COPY: http://www.postgresql.org/docs/current/static/sql-copy.html
You may wish to import the file into a temporary staging table first,
then convert the epoch column into a usable date format as you move the
rows into their final table.
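A minimal sketch of that workflow (the table and column names here are assumptions based on the file layout described above, not anything from your schema):

```sql
-- Staging table matching the raw CSV columns (names assumed)
CREATE TEMP TABLE orders_staging (
    epoch    bigint,
    name     text,
    orderid  integer,
    quantity integer
);

-- Bulk-load the whole file in one pass; far faster than 100k INSERTs
COPY orders_staging FROM '/path/to/orders.csv' WITH CSV;

-- Convert the epoch to a real timestamp while moving into the final table
INSERT INTO orders (order_time, name, orderid, quantity)
SELECT to_timestamp(epoch), name, orderid, quantity
FROM orders_staging;
```

COPY runs server-side, so the file path must be readable by the PostgreSQL server process; from a client script you can use psql's \copy instead.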
> Another question,
> Since the data is going to be large, are there any tricks or tips
> anyone knows how to handle time series data into a database? Is
> postgresql the right tool or is there something else out there?
Depends on what you're doing with the data. If you're querying date
ranges, you'll want an index on the date column.
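For example (again assuming a hypothetical orders table with an order_time timestamp column):

```sql
-- Index to support date-range queries (table/column names assumed)
CREATE INDEX orders_order_time_idx ON orders (order_time);

-- A range query of this shape can then use the index
SELECT count(*)
FROM orders
WHERE order_time >= '2010-01-01'
  AND order_time <  '2010-02-01';
```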