Heikki Linnakangas <hlinnaka(at)iki(dot)fi> writes:
> Yes, I can see that happening here too. The problem seems to be that the
> analyze function detoasts every row in the sample. Tsvectors can be very
> large, so the detoasted copies add up quickly.
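
Right, and for anyone who hasn't looked at the code: DatumGetTSVector()
is just PG_DETOAST_DATUM(), so for a compressed or out-of-line value it
pallocs a full-size copy that nothing ever releases. Schematically, the
sampling loop in compute_tsvector_stats() looks about like this
(simplified sketch, not the exact source):

    for (vector_no = 0; vector_no < samplerows; vector_no++)
    {
        bool        isnull;
        Datum       value = fetchfunc(stats, vector_no, &isnull);
        TSVector    vector;

        if (isnull)
            continue;

        /* may palloc a detoasted copy, which is never freed */
        vector = DatumGetTSVector(value);

        /* ... enter pointers into 'vector' into the lexeme hash ... */
    }

So with a sizable sample of large tsvectors, all those copies stay live
until the stats computation finishes.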
> That's pretty easy to fix: the analyze function needs to free the
> detoasted copies as it goes. But in order to do that, it needs to make
> copies of all the lexemes stored in the hash table, instead of pointing
> directly into the detoasted copies.
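
For the archives, I'd expect the shape of that fix to be roughly as
below (paraphrasing from memory of ts_typanalyze.c, not quoting the
patch): copy the lexeme text when it first enters the hash table, then
release the detoasted tsvector at the bottom of the per-row loop.

    item = (TrackItem *) hash_search(lexemes_tab, &hash_key,
                                     HASH_ENTER, &found);
    if (!found)
    {
        /*
         * New entry: hash_key.lexeme still points into the detoasted
         * tsvector, so copy the lexeme into storage we own before the
         * vector can go away.
         */
        item->key.lexeme = palloc(hash_key.length);
        memcpy(item->key.lexeme, hash_key.lexeme, hash_key.length);
        ...
    }

    /* ... after the last lexeme of this row ... */

    /* free the copy made by DatumGetTSVector, if any */
    if (TSVectorGetDatum(vector) != value)
        pfree(vector);

The hash and match functions only examine the pointed-to bytes, so
overwriting the key's pointer right after hash_search() is safe; and
presumably the copies also get pfree'd when entries are pruned from the
table.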
> Patch attached. I think this counts as a bug, and we should backport this.
+1. I didn't test the patch, but it looks sane to the eyeball.
regards, tom lane