Re: batch insert/update

From: Ivan Sergio Borgonovo <mail(at)webthatworks(dot)it>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: batch insert/update
Date: 2007-12-26 22:17:21
Message-ID: 20071226231721.52ddace5@webthatworks.it
Lists: pgsql-general

On Wed, 26 Dec 2007 20:48:27 +0100
Andreas Kretschmer <akretschmer(at)spamfence(dot)net> wrote:

> blackwater dev <blackwaterdev(at)gmail(dot)com> schrieb:
>
> > I have some php code that will be pulling in a file via ftp.
> > This file will contain 20,000+ records that I then need to pump
> > into the postgres db. These records will represent a subset of
> > the records in a certain table. I basically need an efficient
> > way to pump these rows into the table, replacing matching rows
> > (based on id) already there and inserting ones that aren't.
> > Short of looping through the result and inserting or updating
> > based on the presence of each row, what is the best way to
> > handle this? This is something that will run nightly.

> Insert your data into an extra table and work with regular SQL to
> insert/update the destination table. You can use COPY to load the
> data into your extra table; this is very fast, but you need a
> suitable file format for it.

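A minimal sketch of that staging-table approach might look like this
(assuming a destination table items(id, name, price) and a staging
table with the same layout -- the names are made up for illustration):

  -- staging table with the same columns as the target
  CREATE TEMP TABLE items_staging (LIKE items);

  -- COPY reads a server-side file here; from psql you would use \copy,
  -- from PHP something like pg_copy_from()
  COPY items_staging FROM '/path/to/nightly_file.csv' WITH CSV;

  -- update the rows whose id already exists in the destination
  UPDATE items i
     SET name = s.name, price = s.price
    FROM items_staging s
   WHERE i.id = s.id;

  -- insert the rows that are not there yet
  INSERT INTO items (id, name, price)
  SELECT s.id, s.name, s.price
    FROM items_staging s
   WHERE NOT EXISTS (SELECT 1 FROM items i WHERE i.id = s.id);
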
What if you know in advance which rows should be inserted and you
have a batch of rows that should be updated?

Is it still the fastest approach to load them all into a temp table
with COPY?

What about the rows that have to be updated, if you have all the
columns, not just the changed ones?
Is it faster to delete & insert, or to update?

The rows to be updated come with the same pk as the rows already in
the destination table.
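
For comparison, the delete & insert variant I have in mind would be
roughly this (same made-up names as in the sketch above):

  BEGIN;
  -- drop the rows that are about to be replaced, keyed on the pk
  DELETE FROM items USING items_staging s WHERE items.id = s.id;
  -- re-insert everything from the staging table
  INSERT INTO items SELECT * FROM items_staging;
  COMMIT;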

thx

--
Ivan Sergio Borgonovo
http://www.webthatworks.it
