From: Alessandro Gagliardi <alessandro(at)path(dot)com>
To: pgsql-novice(at)postgresql(dot)org
Subject: execute many for each commit
Date: 2012-02-17 22:40:18
Message-ID: CAAB3BBJRcHi-mCHwOTuQwDR32xwe-NkZQghhZ4mtYZxhHF0L-w@mail.gmail.com
Lists: pgsql-novice
This is really more of a psycopg2 than a PostgreSQL question per se, but
hopefully there are a few Pythonistas on this list who can help me out. At
a recent PUG meeting I was admonished on the folly of committing after
every execute statement (especially when I'm executing hundreds of inserts
per second). I was thinking of batching a bunch of execute statements (say,
1000) before running a commit, but the problem is that if any one of those
inserts fails (say, because of a unique_violation, which happens quite
frequently), then I have to roll back the whole batch. Then I'd have to come
up with some logic to retry each one individually, or something similarly
complicated.
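(For what it's worth, one way I could imagine structuring this: wrap each insert in a SAVEPOINT so a unique_violation only rolls back that one row, and commit once per batch. This is just a sketch, not something from the thread, and the table and column names (`events`, `id`, `payload`) are made up for illustration:)

```python
def insert_batch(cur, rows):
    """Insert rows with one SAVEPOINT per row, so a unique_violation
    rolls back only the offending row instead of poisoning the
    whole transaction. In practice you would catch
    psycopg2.IntegrityError rather than bare Exception."""
    for row in rows:
        cur.execute("SAVEPOINT batch_row")
        try:
            cur.execute(
                "INSERT INTO events (id, payload) VALUES (%s, %s)", row
            )
        except Exception:  # psycopg2.IntegrityError in real code
            cur.execute("ROLLBACK TO SAVEPOINT batch_row")
        else:
            cur.execute("RELEASE SAVEPOINT batch_row")

def chunked(iterable, size):
    """Yield successive lists of at most `size` items."""
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

# Usage sketch (needs a live connection and cursor):
# for batch in chunked(all_rows, 1000):
#     insert_batch(cur, batch)
#     conn.commit()  # one commit per 1000 rows, not per insert
```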
I look at this problem and I think, "I must be doing something wrong." But
I can't figure out what it is. The closest thing to an answer I could find
using Google was
http://stackoverflow.com/questions/396455/python-postgresql-psycopg2-interface-executemany
which didn't really provide any good solution at all. Perhaps someone here
knows better? Or perhaps that advice was wrong and I simply do have to do a
commit after each insert?
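(A note for later readers of this thread: PostgreSQL 9.5, released well after this 2012 exchange, added `INSERT ... ON CONFLICT DO NOTHING`, which makes the rollback problem disappear entirely: duplicates are silently skipped, so one commit per batch is safe. A minimal sketch using psycopg2's `execute_values` helper (psycopg2 2.7+); the table and column names are placeholders, not from the thread:)

```python
# ON CONFLICT DO NOTHING (PostgreSQL 9.5+) skips duplicate keys
# server-side, so no per-row error handling is needed.
# `events`, `id`, `payload` are hypothetical names for illustration.
INSERT_SQL = (
    "INSERT INTO events (id, payload) VALUES %s "
    "ON CONFLICT DO NOTHING"
)

def insert_all(conn, rows, page_size=1000):
    """Insert `rows` in pages of `page_size` statements,
    committing once at the end instead of once per insert."""
    from psycopg2.extras import execute_values  # psycopg2 >= 2.7
    with conn.cursor() as cur:
        execute_values(cur, INSERT_SQL, rows, page_size=page_size)
    conn.commit()
```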