slow speeds after 2 million rows inserted

From: James Neff <jneff(at)tethyshealth(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: slow speeds after 2 million rows inserted
Date: 2006-12-29 17:39:03
Message-ID: 459552B7.3040909@tethyshealth.com
Lists: pgsql-general

Greetings,

I've got a Java application that reads data from a flat file and
inserts it into a table. The first 2 million rows (each file
contained about 1 million lines) went in pretty fast: less than 40
minutes to insert into the database.

After that, the insert speed dropped off sharply. On the third file, I
think I could type the data in faster than the Java application is
inserting it.

Table looks like this:

CREATE TABLE data_archive
(
  id serial NOT NULL,
  batchid integer NOT NULL,
  claimid character varying(25) NOT NULL,
  memberid character varying(45) NOT NULL,
  raw_data text NOT NULL,
  status integer DEFAULT 0,
  line_number integer,
  CONSTRAINT data_archive_pkey PRIMARY KEY (id)
);

There is also an index on batchid.

The insert command is like so:

"INSERT INTO data_archive (batchid, claimid, memberid, raw_data, status,
line_number) VALUES ('" + commandBatchID + "', '', '', '" + raw_data +
"', '1', '" + myFilter.claimLine + "');";

where the raw_data variable is the line from the file.
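As an aside on the statement above: building the INSERT by string concatenation breaks outright as soon as raw_data contains a single quote, since the quote terminates the SQL literal early. A minimal sketch of the problem (the escapeLiteral helper and the sample data are illustrative, not from the application; the idiomatic fix in JDBC is a PreparedStatement with ? placeholders, which sidesteps quoting entirely and lets the server reuse the plan):

```java
// Sketch: why concatenating raw_data into the SQL text is fragile.
// escapeLiteral is a hypothetical helper, not part of the original app.
public class InsertSketch {
    // Standard SQL escaping: double any embedded single quote.
    static String escapeLiteral(String s) {
        return s.replace("'", "''");
    }

    public static void main(String[] args) {
        String rawData = "member O'Brien, claim 42"; // made-up flat-file line
        // Naive concatenation, as in the application: the embedded quote
        // ends the literal early and the INSERT fails to parse.
        String naive = "INSERT INTO data_archive (raw_data) VALUES ('"
                + rawData + "');";
        // Escaped version parses correctly.
        String escaped = "INSERT INTO data_archive (raw_data) VALUES ('"
                + escapeLiteral(rawData) + "');";
        System.out.println(naive);
        System.out.println(escaped);
    }
}
```

With a PreparedStatement (prepareStatement once, then setString/addBatch per line and executeBatch per chunk), no quoting is needed at all, and batching also cuts per-row round trips.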

How can I find out what is causing this slowdown, and how do I speed it up?

Database is 8.2.0 on x86_64-unknown-linux-gnu.

There is nothing else running on this database server (other than
standard Linux background processes). ps ax did not show anything else
running, and there are no locks other than the occasional lock taken by
the INSERT itself.

I have done a VACUUM FULL on this table but not a REINDEX (running now).

Thanks in advance,
James
