From: Scott Marlowe <smarlowe(at)g2switchworks(dot)com>
To: Karen Hill <karen_hill22(at)yahoo(dot)com>
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: How long should it take to insert 200,000 records?
Date: 2007-02-06 16:14:44
Message-ID: 1170778484.5451.97.camel@state.g2switchworks.com
Lists: pgsql-performance
On Mon, 2007-02-05 at 18:35, Karen Hill wrote:
> I have a pl/pgsql function that is inserting 200,000 records for
> testing purposes. What is the expected time frame for this operation
> on a pc with 1/2 a gig of ram and a 7200 RPM disk? The processor is
> a 2ghz cpu. So far I've been sitting here for about 2 million ms
> waiting for it to complete, and I'm not sure how many inserts postgres
> is doing per second.
That really depends. Doing 200,000 inserts as individual transactions
will be fairly slow. Since PostgreSQL generally runs in autocommit
mode, if you didn't expressly begin a transaction, each row is
inserted as its own transaction. I.e. this:
for (i = 0; i < 200000; i++) {
    insert into abc values ('test', 123);
}
Is functionally equivalent to:
for (i = 0; i < 200000; i++) {
    begin;
    insert into abc values ('test', 123);
    commit;
}
However, you can put a single begin / commit pair outside the loop like this:
begin;
for (i = 0; i < 200000; i++) {
    insert into abc values ('test', 123);
}
commit;
and it should run much faster.
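To get a rough feel for the effect, here is a minimal sketch in Python. It uses the stdlib sqlite3 module as a stand-in for PostgreSQL (an assumption purely for a self-contained demo; the per-commit overhead idea is the same), and the table name and row count are just for illustration:

```python
import sqlite3
import time

# In-memory database as a stand-in for a real server; table mirrors
# the 'test', 123 rows from the example above.
conn = sqlite3.connect(":memory:")
conn.execute("create table abc (name text, num integer)")

# Variant 1: commit after every insert, mimicking autocommit mode.
start = time.perf_counter()
for i in range(20000):
    conn.execute("insert into abc values ('test', 123)")
    conn.commit()
per_row = time.perf_counter() - start

conn.execute("delete from abc")
conn.commit()

# Variant 2: one transaction around the whole loop, single commit.
start = time.perf_counter()
for i in range(20000):
    conn.execute("insert into abc values ('test', 123)")
conn.commit()
batched = time.perf_counter() - start

print(f"per-row commits: {per_row:.3f}s, single commit: {batched:.3f}s")
```

On a real PostgreSQL server the gap is usually far larger than with in-memory SQLite, because each commit there forces WAL activity to disk.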