50 MB Table

From: JB <jimbag(at)kw(dot)igs(dot)net>
To: pgsql-general(at)postgresql(dot)org
Subject: 50 MB Table
Date: 2000-03-06 22:09:51
Message-ID: 38C42CAF.F65E7BCE@kw.igs.net
Lists: pgsql-general


I have a table of roughly 50 MB in postgres. The data is normalized, so there's
not much I can do about the size. The tuples are about 512 bytes each, so
there's a pile of 'em. I need to search on several fields; a couple
in particular are text fields that need 'LIKE'. The problem is, the
thing is way too slow. So, before I go hunting for some
other solution, could anyone here point me to some ways to (hand)
optimize the searching in postgres? Different indexes, hashing and LIKE?
I'm not sure where to go with this.

The basic criteria are:
- sizes of indexes, etc, are not an issue. There's lots of room on the
box.
- the data is basically static, so a read-only setup (if there is such a
thing) is fine.
- it needs to be FAST
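A sketch of the usual indexing answers, assuming a hypothetical table "docs" with a text column "body" (note: trigram indexes via pg_trgm postdate the era of this thread and apply only to modern PostgreSQL):

```sql
-- A b-tree index can serve LIKE only when the pattern is anchored at
-- the start (no leading wildcard). In non-C locales the column must be
-- indexed with text_pattern_ops for LIKE to use it:
CREATE INDEX docs_body_idx ON docs (body text_pattern_ops);
SELECT * FROM docs WHERE body LIKE 'foo%';   -- index-assisted prefix match

-- Unanchored patterns (LIKE '%foo%') cannot use a b-tree at all.
-- In modern PostgreSQL, a trigram GIN index from the pg_trgm extension
-- can serve them:
CREATE EXTENSION IF NOT EXISTS pg_trgm;
CREATE INDEX docs_body_trgm ON docs USING gin (body gin_trgm_ops);
SELECT * FROM docs WHERE body LIKE '%foo%';  -- index-assisted substring match
```

Since the data is static, the indexes only need to be built once, and heavy index coverage costs nothing at write time; running EXPLAIN on the slow queries will show whether an index is actually being used.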

cheers
jb
