From: Mike Christensen <mike(at)kitchenpc(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Am I best off keeping large chunks of text in a separate table?
Date: 2011-09-18 00:46:24
Message-ID: CABs1bs3wygUf5D0N6TGyddtJYUd9LeuLOyWi3NVtDUV=ez_wFg@mail.gmail.com
Lists: pgsql-general
I have a table that looks something like this:
url - character varying(1024)
date - timestamptz
body - text
The url is the unique primary key. The body can potentially be a couple
hundred kilobytes of text.
There will at first be perhaps 100,000 rows in this table, but at some
point it might get into the millions.
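For reference, a minimal DDL sketch of the single-table design (the table
name "pages" is just my placeholder):

```sql
-- Single-table design: the large body lives alongside the key columns.
CREATE TABLE pages (
    url   varchar(1024) PRIMARY KEY,
    date  timestamptz NOT NULL,
    body  text
);
```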
I need to be able to quickly insert into this table (I might be
inserting several rows per second at times). I also need to be able
to very quickly see if a URL already exists in the table and what its
date value is, or to query for all urls whose date is older than
x days.
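The two lookups above would be something like the following sketch
(assuming the single-table schema; the index name is a placeholder, and
30 days stands in for x):

```sql
-- Existence / date check by primary key:
SELECT date FROM pages WHERE url = 'http://example.com/';

-- An index on date keeps the age query fast:
CREATE INDEX pages_date_idx ON pages (date);
SELECT url FROM pages WHERE date < now() - interval '30 days';
```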
Am I better off with two tables such as:
Table1:
id - uuid or integer (primary key)
url - unique index
date
Table2:
id - FK to Table1.id
body - text
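In DDL, the two-table split might look like this sketch (table and column
names are placeholders; the FK on table2 points each body row back at its
parent):

```sql
CREATE TABLE table1 (
    id    serial PRIMARY KEY,
    url   varchar(1024) NOT NULL UNIQUE,
    date  timestamptz NOT NULL
);

CREATE TABLE table2 (
    id    integer PRIMARY KEY REFERENCES table1 (id),
    body  text
);
```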
It makes the program flow a bit more complicated, and I'd have to use
transactions when inserting new rows. However, for years I've been
told that having rows with large chunks of text is bad for
performance, forces that data to be paged into memory, and causes
various other issues. Any advice on this one? Thanks!
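The transactional insert I mean would be roughly this, using the
placeholder names from above (the generated id is captured client-side
via RETURNING; 42 stands in for whatever comes back):

```sql
BEGIN;
-- Insert the key row and get its generated id back...
INSERT INTO table1 (url, date)
VALUES ('http://example.com/', now())
RETURNING id;
-- ...then insert the body row with that id:
INSERT INTO table2 (id, body)
VALUES (42, '...large page body...');
COMMIT;
```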
Mike
Next message: Scott Marlowe, 2011-09-18 01:02:22, Re: Am I best off keeping large chunks of text in a separate table?
Previous message: Craig Ringer, 2011-09-17 23:02:27, Re: problem connecting postgresql 9.0 tables from vba