Re: [GENERAL] BIG Data and Perl

From: Lincoln Yeoh <lylyeoh(at)mecomb(dot)com>
To: pgsql-general(at)postgreSQL(dot)org
Subject: Re: [GENERAL] BIG Data and Perl
Date: 1999-10-18 02:48:01
Message-ID: 3.0.5.32.19991018104801.008d5b10@pop.mecomb.po.my
Lists: pgsql-general

At 09:52 AM 15-10-1999 -0500, Andy Lewis wrote:
>I've got a fairly good size database that has in one table around 50,000
>records in it.
>
>It starts off and processes the first 300-400 rows fast, then gets
>slower over time and eventually just quits. It'll run for about 4-6 hours
>before it quits.
>
>Any idea what may be going on here?

Maybe you're running out of memory: your Perl script may be reading too
many rows into memory at once.

When using the Perl DBI module, I get the impression that the driver
reads the entire result set into client memory as soon as you call
$cursor->execute

I don't know of a way around this within DBI itself, though a server-side
cursor (sketched below) might help. It can be a bit inconvenient when the
result set is large ;).
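One possible workaround is to let the backend hold the result set in a
cursor and FETCH it in batches, so the client never holds more than a few
hundred rows at a time. A rough sketch (untested; assumes DBD::Pg, a
hypothetical table big_table, and placeholder connection details; note
that cursors only exist inside a transaction, hence AutoCommit => 0):

#!/usr/bin/perl -w
use strict;
use DBI;

# Placeholder connection details; AutoCommit is off because a
# cursor only lives inside a transaction.
my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'password',
                       { RaiseError => 1, AutoCommit => 0 });

# Ask the backend to hold the result set in a cursor instead of
# shipping all 50,000 rows to the client at once.
$dbh->do('DECLARE csr CURSOR FOR SELECT * FROM big_table');

my $sth = $dbh->prepare('FETCH 500 FROM csr');
while (1) {
    $sth->execute;
    last if $sth->rows == 0;        # cursor exhausted
    while (my @row = $sth->fetchrow_array) {
        # ... process one row at a time here ...
    }
}

$dbh->do('CLOSE csr');
$dbh->commit;
$dbh->disconnect;

The batch size of 500 is arbitrary; anything big enough to amortize the
round trips but small enough to keep memory flat should do.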

Cheerio,

Link.
