I have a PostGIS table with about 2 million polyline records. The largest
geometry in the table has about 500 points. I have a simple DBD::Pg Perl
program that selects most of these records, does some processing on them,
and writes them to a file. Unfortunately, I keep getting this error:
DBD::Pg::st execute failed: out of memory for query result
DBD::Pg::st fetchrow_array failed: no statement executing
The program works fine on my next-largest table, which has just under a
million records. I believe it is failing at the first execute after the
prepare on this table. Any ideas on how I can make this work?
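(For context, DBD::Pg buffers the entire result set in client memory when execute returns, which is what exhausts memory on a multi-million-row select. A common workaround is a server-side cursor, fetching the rows in batches. Below is a minimal sketch of that approach; the connection parameters, table name `polylines`, column names, and cursor name are hypothetical, and I'm assuming a PostGIS-era `AsText()` function for the geometry column. Note that a cursor must live inside a transaction, hence `AutoCommit => 0`.)

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical connection details -- adjust for your database.
my $dbh = DBI->connect("dbi:Pg:dbname=gisdb", "user", "pass",
                       { AutoCommit => 0, RaiseError => 1 });

# Declare a server-side cursor so rows are pulled over in batches
# instead of the whole ~2 million-row result being buffered at once.
$dbh->do("DECLARE lines_cur CURSOR FOR " .
         "SELECT id, AsText(the_geom) FROM polylines");

my $sth = $dbh->prepare("FETCH 1000 FROM lines_cur");
while (1) {
    $sth->execute;
    last if $sth->rows == 0;            # cursor exhausted
    while (my @row = $sth->fetchrow_array) {
        # ... process each record and write it to the file ...
    }
}

$dbh->do("CLOSE lines_cur");
$dbh->commit;
$dbh->disconnect;
```

This keeps peak client memory bounded by the batch size (1000 rows here) rather than the full table.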
pgsql-general by date