| From: | Rolf Luettecke <rolf(dot)luettecke(at)michael-telecom(dot)de> |
|---|---|
| To: | pgsql-admin(at)postgresql(dot)org |
| Subject: | Re: slow inserts |
| Date: | 2002-03-21 08:56:01 |
| Message-ID: | 20020321095601.7c3eff8b.rolf.luettecke@michael-telecom.de |
| Lists: | pgsql-admin |
Hi Jodi,
> None of the data is actually committed to the database until the scripts
> complete so I believe that autocommit is turned off.
>
What if you try writing the output of your script into a separate file
and then pipe that file into psql as input? What I mean is to strip off the
processing time for the "excel part". Does the job still take 25 minutes?
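For example, something along these lines (the script and database names are
only placeholders for your own):

```sh
# 1. Let the script write plain INSERT statements into a file instead of
#    sending them to the database directly.
./export_from_excel.pl > inserts.sql

# 2. Load that file with psql in a separate step, so you can see how much
#    time the database itself really needs.
psql -d yourdb -f inserts.sql
```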
We often insert similar amounts of data (10,000 - 100,000 rows per job)
within a few seconds/minutes. A few months ago I had the same problem
"writing" a dbf file from Postgres data: the SELECT statement took
milliseconds, but the conversion into dbf format seemed to take forever.
BTW: We also had a table (tens of thousands of rows, vacuumed daily) which
was rather slow during inserts (PostgreSQL 7.1.2). After upgrading to
version 7.1.3 and completely rebuilding the tables, the problem went away.
Hope it helps
R. Luettecke
--
MICHAEL TELECOM AG
Bruchheide 34 - 49163 Bohmte
Fon: +49 5471 806-0
rolf(dot)luettecke(at)michael-telecom(dot)de
http://www.michael-telecom.de