
Re: How import big amounts of data?

From: "Steinar H. Gunderson" <sgunderson(at)bigfoot(dot)com>
To: pgsql-performance(at)postgresql(dot)org
Subject: Re: How import big amounts of data?
Date: 2005-12-29 10:50:49
Lists: pgsql-performance
On Thu, Dec 29, 2005 at 10:48:26AM +0100, Arnau wrote:
>   What is the best way to import data into tables? I have to import
> 90,000 rows into a table, and doing it as INSERTs takes ages. Would it
> be faster with COPY? Is there any other alternative to INSERT/COPY?

There are multiple reasons why your INSERT might be slow:

- Are you running each INSERT in its own transaction instead of batching
  them into one or a few transactions? (The per-transaction cost is usually
  much higher than the per-row insertion cost.)
- Do you have a foreign key without a matching index on the referenced
  table? (In newer versions of PostgreSQL, EXPLAIN ANALYZE can help with
  this; do a single insert and see where the time is spent. Older versions
  won't show such details, though.)
- Do you have an insertion trigger taking time? (Ditto wrt. EXPLAIN ANALYZE.)
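To make the first point concrete, here is a minimal sketch of batching the
inserts into one transaction and then timing a single insert with EXPLAIN
ANALYZE (the table and column names are made up for illustration):

```sql
-- Wrap all the inserts in a single transaction instead of
-- paying the commit cost once per row:
BEGIN;
INSERT INTO items (id, name) VALUES (1, 'foo');
INSERT INTO items (id, name) VALUES (2, 'bar');
-- ... remaining rows ...
COMMIT;

-- On newer versions, see where a single insert spends its time;
-- trigger and foreign-key check timings show up in the output:
EXPLAIN ANALYZE INSERT INTO items (id, name) VALUES (3, 'baz');
```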

COPY will be faster than INSERT regardless, though (for more than a few rows,
at least).
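For comparison, the same load as a single COPY might look like this (the
file path, format, and table are again only illustrative; COPY FROM a file
runs on the server, while psql's \copy reads a client-side file):

```sql
-- Load all 90,000 rows in one statement from a CSV file:
COPY items (id, name) FROM '/path/to/items.csv' WITH CSV;

-- Or, from psql with a file on the client machine:
-- \copy items (id, name) FROM 'items.csv' WITH CSV
```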

/* Steinar */

