What would be the most efficient way, performance-wise, of storing a lot of
rather big chunks of text in separate records in PostgreSQL? I'm dividing
huge XML documents into smaller pieces and placing each piece in a separate
record. Requests ask for all or just some of the records, and the document is
rebuilt based on the request. So everything is heavily I/O-bound.
What would be the best way to do this? Large objects, the binary blob
feature of PostgreSQL, or something else?
The chunks can be anything from a few lines to entire documents of
several megabytes (OK, that's the extreme case, but still).
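For comparison, here is a minimal sketch of the plain TEXT-column alternative I'm considering (table and column names are made up): one row per chunk, keyed by document and sequence number so the original order can be reconstructed, and a range query to rebuild all or part of a document:

```sql
-- Hypothetical schema: one row per XML fragment.
CREATE TABLE doc_chunk (
    doc_id    integer NOT NULL,  -- which XML document the chunk belongs to
    chunk_no  integer NOT NULL,  -- position of the chunk within the document
    body      text    NOT NULL,  -- the text fragment itself
    PRIMARY KEY (doc_id, chunk_no)
);

-- Rebuild a whole document, or just a slice of it, in order:
SELECT body
  FROM doc_chunk
 WHERE doc_id = 42
   AND chunk_no BETWEEN 10 AND 20
 ORDER BY chunk_no;
```

Note that in PostgreSQL 7.1 and later, large text values are automatically compressed and stored out of line (TOAST), so a plain text column can hold multi-megabyte values without resorting to large objects.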