> None of the data is actually committed to the database until the scripts
> complete so I believe that autocommit is turned off.
What if you try writing the output of your script into a separate file
and piping that file into psql as input? What I mean is to strip off the
processing time of the "Excel part". Does the job still take 25 minutes?
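As a rough sketch of what I mean (the table name `mytable`, database `mydb`, and file `bulk.sql` are made-up examples, and the loop just stands in for your Excel conversion step -- substitute your own script there):

```shell
# Step 1: generate the INSERT statements into a file instead of
# sending them to the database directly. This is where your
# "Excel part" would go.
{
  echo "BEGIN;"
  for i in 1 2 3; do
    echo "INSERT INTO mytable (id) VALUES ($i);"
  done
  echo "COMMIT;"
} > bulk.sql

# Step 2: feed the finished file to psql and time only that part:
#   time psql -d mydb -f bulk.sql
```

If step 2 alone runs in seconds, the bottleneck is the conversion, not PostgreSQL.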
We often insert data of a similar size (10,000 - 100,000 rows per job)
within a few seconds or minutes. A few months ago I had the same problem
"writing" a dbf file from PostgreSQL data: the SELECT statement took
milliseconds, but the conversion into dbf format seemed to take forever.
BTW: We also had a table (tens of thousands of rows, vacuumed daily) that was
rather slow during inserts (PostgreSQL 7.1.2). After upgrading to
version 7.1.3 and completely rebuilding the tables, the problem went away.
Hope it helps
MICHAEL TELECOM AG
Bruchheide 34 - 49163 Bohmte
Fon: +49 5471 806-0