
Re: large resultset

From: AI Rumman <rummandba(at)gmail(dot)com>
To: Andrew McMillan <andrew(at)morphoss(dot)com>
Cc: pgsql-php(at)postgresql(dot)org
Subject: Re: large resultset
Date: 2010-06-15 12:03:24
Message-ID: AANLkTikId6xwBgFhSxQIOpF0rhKps9yCfrUCcdcZeDT4@mail.gmail.com
Lists: pgsql-php
Thanks a lot.
Actually, I am new to PostgreSQL.
I am using PostgreSQL 8.1.
The application gives the administrator a view of all their email activities;
that is why there are 2 million records. It is a CRM application.

On Tue, Jun 15, 2010 at 4:37 PM, Andrew McMillan <andrew(at)morphoss(dot)com> wrote:

> On Tue, 2010-06-15 at 16:01 +0600, AI Rumman wrote:
> > No. I need to send 2 million records. I want to know what is the best
> > possible way to send these records?
> > How should I write the plpgsql procedure to send records one by one to
> > improve the response time for the users?
>
> I don't think you're providing enough information for us to help you.
>
> Your problem with two million records might be:
>
> * But it takes so long to loop through them...
> * I run out of memory receiving the resultset from the far end.
> * How do I optimise this SQL query that fetches 2 million records?
> * Or (likely) something I haven't considered.
>
> Your 'How' question might be:
>
> * Should I be using a cursor to access these efficiently, by sending
> data in several chunks?
>
> * How can I write this so I don't waste my time if the person on the far
> end gave up waiting?
>
> Etc.
>
>
> Fundamentally, sending 2 million of anything can get problematic pretty
> darn quickly, unless the 'thing' is less than 100 bytes.
>
>
> My personal favourite would be to write a record somewhere saying 'so
> and so wants these 2 million records', and give the user a URL where
> they can fetch them from.  Or e-mail them to the user, or... just about
> anything, except trying to generate them in-line with the page, in a
> reasonable time for their browser to not give up, or their proxy to not
> give up, or their ISP's transparent proxy to not give up.
>
> Why do they want 2 million records anyway?  2 million of what?  Will
> another user drop by 10 seconds later and also want 2 million records?
> The same 2 million?  Why does the user want 2 million records?  Is there
> something that can be done to the 2 million records to make them a
> smaller but more useful set of information?
>
>
> Hopefully this stream of consciousness has some help buried in it
> somewhere :-)
>
>
> Cheers,
>                                        Andrew McMillan.
>
>
> --
> ------------------------------------------------------------------------
> http://andrew.mcmillan.net.nz/                     Porirua, New Zealand
> Twitter: _karora                                  Phone: +64(272)DEBIAN
>            Water, taken in moderation cannot hurt anybody.
>                             -- Mark Twain
>
> ------------------------------------------------------------------------
>
>
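The cursor suggestion above can be sketched in SQL. This is a hedged sketch
only: the table name `email_activity`, the cursor name `email_cur`, and the
chunk size are assumptions, not details from the thread.

```sql
-- Fetching a large result set in chunks with a server-side cursor.
-- A cursor must live inside a transaction.
BEGIN;
DECLARE email_cur CURSOR FOR
    SELECT * FROM email_activity ORDER BY id;
FETCH 1000 FROM email_cur;  -- repeat this FETCH until it returns no rows
CLOSE email_cur;
COMMIT;
```

From PHP, each FETCH is an ordinary query call, so the page (or a background
worker) handles 1,000 rows at a time instead of holding all 2 million in
memory at once.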

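The "record a request somewhere, then give the user a URL" approach could be
sketched with a small request table. All names and columns here are
hypothetical illustrations, not taken from the thread.

```sql
-- A background job polls for 'pending' rows, writes the result set to a
-- file (e.g. with COPY), sets status = 'ready', and the page then links
-- the user to result_path instead of generating 2 million rows in-line.
CREATE TABLE export_request (
    id           serial PRIMARY KEY,
    requested_by text NOT NULL,
    status       text NOT NULL DEFAULT 'pending',  -- pending / ready / failed
    result_path  text,          -- filled in once the export file exists
    created_at   timestamp NOT NULL DEFAULT now()
);
```

This decouples the long-running export from the HTTP request, so neither the
browser nor any intermediate proxy has to keep a connection open while the
export runs.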