Re: integrated tsearch doesn't work with non utf8 database

From: Teodor Sigaev <teodor(at)sigaev(dot)ru>
To: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
Cc: Heikki Linnakangas <heikki(at)enterprisedb(dot)com>, Pavel Stehule <pavel(dot)stehule(at)gmail(dot)com>, Oleg Bartunov <oleg(at)sai(dot)msu(dot)su>, PostgreSQL-development <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: integrated tsearch doesn't work with non utf8 database
Date: 2007-09-10 16:21:10
Message-ID: 46E56EF6.2090902@sigaev.ru
Lists: pgsql-hackers

> I think Teodor's solution is wrong as it stands, because if the subquery
> finds matches for mapcfg and maptokentype, but none of those rows
> produce a non-null ts_lexize result, it will instead emit one row with a
> null result, which is not what should happen.
But concatenating with NULL yields NULL, so the "Lexized token" column
will be NULL, as intended.
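To illustrate the point: string concatenation with NULL returns NULL in SQL, so if ts_lexize returns NULL for every dictionary in the map, any column built by concatenating onto its result comes out NULL anyway. A minimal sketch (the column alias is hypothetical, just to mirror the "Lexized token" output):

```sql
-- Concatenation with NULL is NULL per SQL semantics:
SELECT 'english_stem: ' || NULL AS lexized_token;
--  lexized_token
-- ---------------
--  (null)

-- So the same holds when the right-hand side is a NULL ts_lexize result,
-- e.g. ts_lexize(dictname, token) for a token the dictionary rejects.
```

In other words, the subquery emitting a row with a NULL ts_lexize result still displays as NULL in the concatenated column.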
--
Teodor Sigaev E-mail: teodor(at)sigaev(dot)ru
WWW: http://www.sigaev.ru/
