Re: Avoiding duplicates (or at least marking them as such) in a "cumulative" transaction table.

From: Scott Marlowe <scott(dot)marlowe(at)gmail(dot)com>
To: Allan Kamau <kamauallan(at)gmail(dot)com>
Cc: Postgres General Postgres General <pgsql-general(at)postgresql(dot)org>
Subject: Re: Avoiding duplicates (or at least marking them as such) in a "cumulative" transaction table.
Date: 2010-03-08 02:49:35
Message-ID: dcc563d11003071849r59fcf197p8b65e99202ad10f7@mail.gmail.com
Lists: pgsql-general

On Sun, Mar 7, 2010 at 1:45 AM, Allan Kamau <kamauallan(at)gmail(dot)com> wrote:
> Hi,
> I am looking for an efficient and effective solution to eliminate
> duplicates in a continuously updated "cumulative" transaction table
> (no deletions are envisioned as all non-redundant records are
> important). Below is my situation.

Is there a reason you can't use a unique index, then detect and reject
the failed inserts?
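A minimal sketch of that approach (table and column names here are invented for illustration; note that the `ON CONFLICT` clause shown below was added in PostgreSQL 9.5, well after this thread — on the 8.x releases current at the time you would instead trap the error, as described after the example):

```sql
-- Enforce uniqueness on whatever columns define a duplicate
-- in the cumulative transaction table (hypothetical key shown).
CREATE UNIQUE INDEX transactions_natural_key_idx
    ON transactions (account_id, tx_time, amount);

-- PostgreSQL 9.5+: let the server silently skip duplicate rows.
INSERT INTO transactions (account_id, tx_time, amount)
VALUES (42, '2010-03-07 01:45:00', 19.99)
ON CONFLICT (account_id, tx_time, amount) DO NOTHING;
```

On versions contemporary with this thread, the equivalent is to attempt the plain INSERT and catch the `unique_violation` error (SQLSTATE 23505), either in the application or in a PL/pgSQL block's `EXCEPTION WHEN unique_violation THEN ...` handler.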
