| From: | Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us> |
|---|---|
| To: | "Harpreet Dhaliwal" <harpreet(dot)dhaliwal01(at)gmail(dot)com> |
| Cc: | "Ron St-Pierre" <ron(dot)pgsql(at)shaw(dot)ca>, "Postgres General" <pgsql-general(at)postgresql(dot)org> |
| Subject: | Re: Duplicate Unique Key constraint error |
| Date: | 2007-07-10 19:09:14 |
| Message-ID: | 5288.1184094554@sss.pgh.pa.us |
| Lists: | pgsql-general pgsql-odbc |
"Harpreet Dhaliwal" <harpreet(dot)dhaliwal01(at)gmail(dot)com> writes:
> Transaction 1 started, saw max(dig_id) = 30, and inserted new dig_id=31.
> When Transaction 2 started and read max(dig_id), it was still 30; by the
> time it tried to insert 31, Transaction 1 had already inserted 31, hence
> the unique key constraint error.
This is exactly why you're recommended to use sequences (ie serial
columns) for generating IDs. Taking max()+1 does not work, unless
you're willing to lock the whole table and throw away vast amounts of
concurrency.
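A minimal sketch of the sequence-based approach, for illustration (the column name dig_id comes from the thread; the table name and other columns are assumed):

```sql
-- A serial column attaches an implicit sequence to dig_id, so the
-- database hands out IDs atomically -- no max()+1, no table lock.
CREATE TABLE dig (
    dig_id  serial PRIMARY KEY,  -- equivalent to integer DEFAULT nextval(...)
    payload text
);

-- Concurrent transactions each get a distinct dig_id automatically:
INSERT INTO dig (payload) VALUES ('from transaction 1');
INSERT INTO dig (payload) VALUES ('from transaction 2');

-- To learn the ID just assigned, use RETURNING instead of re-querying max():
INSERT INTO dig (payload) VALUES ('another row') RETURNING dig_id;
```

Note that nextval() is never rolled back, so a sequence can leave gaps in the ID range after aborted transactions; that is the trade-off for not serializing all inserters behind a table lock.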
regards, tom lane