From: Tatsuo Ishii <ishii(at)sraoss(dot)co(dot)jp>
To: teodor(at)sigaev(dot)ru
Cc: pgsql-hackers(at)postgresql(dot)org
Subject: Re: string_to_array eats too much memory?
Date: 2006-11-08 09:38:23
Message-ID: 20061108.183823.116956610.t-ishii@sraoss.co.jp
Lists: pgsql-hackers
> > I'm playing with GIN to make a full text search system. GIN comes with
> > built-in TEXT[] support, and I use string_to_array() to make a
> > TEXT[]. The problem is, if there's a large number of array elements,
> > string_to_array() consumes too much memory. For example, to make ~70k
> > array elements, string_to_array seems to eat several gigabytes of
> > memory. ~70k array elements means there are the same number of words in
> > a document, which is not too big for a large text IMO.
>
> Do you mean 70k unique lexemes? Ugh.
I'm testing how GIN scales.
> Why don't you use the tsearch framework?
? I thought GIN was superior to tsearch2.
From your GIN proposal posted to pgsql-hackers:
"The primary goal of the Gin index is a scalable full text search in
PostgreSQL"
What do you think? :-)
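For reference, the kind of setup described in the quoted text can be sketched as below. This is a minimal sketch, not the actual test; the table and column names are hypothetical, and the split on a single space stands in for whatever tokenization was really used:

```sql
-- Hypothetical table holding one document per row.
CREATE TABLE docs (
    id    serial PRIMARY KEY,
    body  text
);

-- Split each document into a TEXT[] of words and index it with GIN.
-- string_to_array() builds the entire array (~70k elements for the
-- documents in question) in memory at once, which is where the
-- reported memory blow-up occurs.
ALTER TABLE docs ADD COLUMN words text[];
UPDATE docs SET words = string_to_array(body, ' ');
CREATE INDEX docs_words_idx ON docs USING gin (words);

-- Find documents containing a given word via the array containment
-- operator, which the GIN index can serve:
SELECT id FROM docs WHERE words @> ARRAY['postgresql'];
```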
--
Tatsuo Ishii
SRA OSS, Inc. Japan