Re: Importing Large Amounts of Data

From: Neil Conway <nconway(at)klamath(dot)dyndns(dot)org>
To: "Bruce Momjian" <pgman(at)candle(dot)pha(dot)pa(dot)us>
Cc: cjs(at)cynic(dot)net, pgsql-hackers(at)postgresql(dot)org
Subject: Re: Importing Large Amounts of Data
Date: 2002-04-16 01:49:37
Message-ID: 20020415214937.43255d2a.nconway@klamath.dyndns.org
Lists: pgsql-hackers

On Mon, 15 Apr 2002 21:44:26 -0400 (EDT)
"Bruce Momjian" <pgman(at)candle(dot)pha(dot)pa(dot)us> wrote:
> In the old days, we created indexes
> only after the data was loaded, but when we added PRIMARY KEY, pg_dump
> was creating the table with the PRIMARY KEY then loading it, meaning the
> table was being loaded while it had an existing index. I know we fixed
> this recently but I am not sure if it was in 7.2 or not.

It's not in 7.2 -- but it's fixed in CVS.
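[Editor's note: the load order under discussion can be sketched as follows; the table, column, and file names are hypothetical, and the ALTER TABLE syntax shown is for modern PostgreSQL. Loading into an index-free table and adding the primary key afterward avoids maintaining the index row by row during the bulk load.]

    CREATE TABLE big_import (id integer, payload text);

    COPY big_import FROM '/tmp/import.dat';        -- bulk load, no indexes to maintain

    ALTER TABLE big_import ADD PRIMARY KEY (id);   -- build the index once, at the end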

Cheers,

Neil

--
Neil Conway <neilconway(at)rogers(dot)com>
PGP Key ID: DB3C29FC
