
Re: large resultset

From: Andrew McMillan <andrew(at)morphoss(dot)com>
To: AI Rumman <rummandba(at)gmail(dot)com>
Cc: pgsql-php(at)postgresql(dot)org
Subject: Re: large resultset
Date: 2010-06-15 12:45:19
Message-ID: 1276605919.3820.16344.camel@happy.home.mcmillan.net.nz
Lists: pgsql-php
On Tue, 2010-06-15 at 18:03 +0600, AI Rumman wrote:
> Thanks a lot.
> Actually I am new to PostgreSQL.
> I am using PostgreSQL 8.1.
> The application gives the administrator all their email
> activities; that's why there are 2 million records. It is a CRM
> application.

OK, so it still seems to me that you should go back to your user and say
"What are you going to do with these records?".  They may well say
something like "I'm going to load them into $SPREADSHEET and do
$ANALYSIS of $SOMETHING" at which point you can shortcut their future
pain by pointing out "but you realise there are 2 million of them, and
we have them in a database already.  Why not tell me what analysis you
want to do now, and I will just present you with the analysis?".  It may
well be that they will only ever want subsets of the results, so you can
write your page to offer them narrower sets of records, and never all at
once.
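That "narrower sets of records" approach amounts to keyset pagination: each page request fetches only the next slice, keyed on the last id the user has seen. A minimal sketch, written in Python with SQLite purely for illustration (the list's context is PHP against PostgreSQL 8.1, where the same SQL shape applies; the `email_activity` table and `subject` column are made-up names):

```python
import sqlite3

def fetch_page(conn, after_id=0, page_size=100):
    # Keyset pagination: return one page of rows with id > after_id,
    # so no request ever materialises the full 2 million records.
    cur = conn.execute(
        "SELECT id, subject FROM email_activity "
        "WHERE id > ? ORDER BY id LIMIT ?",
        (after_id, page_size),
    )
    return cur.fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE email_activity (id INTEGER PRIMARY KEY, subject TEXT)")
    conn.executemany(
        "INSERT INTO email_activity (subject) VALUES (?)",
        [("mail %d" % i,) for i in range(250)])
    page = fetch_page(conn, after_id=0, page_size=100)
    print(len(page))  # one page of 100 rows, ids 1..100
```

The "WHERE id > ?" form stays fast on large tables (it walks the primary-key index), unlike a large OFFSET, which the server must count through on every request.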

Or if they can't really adequately explain what they are going to do
with these records, and still insist they will want the whole 2 million
at once, write the actual processing in a shell script and e-mail them
the results - just trigger it from the web request.
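The core of such a background job is a chunked export: fetch a bounded number of rows at a time and stream them to a file, so memory use stays flat no matter how many records there are. A sketch in Python with SQLite purely for illustration (on PostgreSQL you would fetch through a cursor the same way; the table, column, and file names here are invented):

```python
import csv
import sqlite3

def export_query_to_csv(conn, query, path, chunk_size=1000):
    # Stream a large result set to CSV in fixed-size chunks via
    # fetchmany(), never holding more than chunk_size rows in memory.
    cur = conn.execute(query)
    rows_written = 0
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([d[0] for d in cur.description])  # header row
        while True:
            chunk = cur.fetchmany(chunk_size)
            if not chunk:
                break
            writer.writerows(chunk)
            rows_written += len(chunk)
    return rows_written
```

The web page itself would only record the request and return immediately; a cron job or daemon runs the export and then e-mails the user a link to the finished file, so no browser or proxy has to sit through the full extraction.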

My experience is that users only want ridiculous things if they think
that what they are requesting is reasonable.  So if their request seems
unreasonable then either you don't understand it well enough, or they
don't understand computers, and talking to them isn't ever going to make
it worse.

Cheers,
					Andrew McMillan.

> 
> On Tue, Jun 15, 2010 at 4:37 PM, Andrew McMillan <andrew(at)morphoss(dot)com>
> wrote:
>         On Tue, 2010-06-15 at 16:01 +0600, AI Rumman wrote:
>         > No. I need to send 2 million records. I want to know the
>         > best possible way to send these records. How should I write
>         > the plpgsql procedure to send records one by one to improve
>         > the response time for the users?
> 
>         I don't think you're providing enough information for us to
>         help you.
> 
>         Your problem with two million records might be:
> 
>         * It takes so long to loop through them...
>         * I run out of memory receiving the resultset at the far end.
>         * How do I optimise this SQL query that fetches 2 million
>           records?
>         * Or (likely) something I haven't considered.
> 
>         Your 'How' question might be:
> 
>         * Should I be using a cursor to access these efficiently,
>           sending the data in several chunks?
>         * How can I write this so I don't waste my time if the person
>           on the far end gave up waiting?
> 
>         Etc.
> 
>         Fundamentally, sending 2 million of anything can get
>         problematic pretty darn quickly, unless the 'thing' is less
>         than 100 bytes.
> 
>         My personal favourite would be to write a record somewhere
>         saying 'so and so wants these 2 million records', and give the
>         user a URL where they can fetch them from.  Or e-mail them to
>         the user, or... just about anything, except trying to generate
>         them in-line with the page, in a time short enough that their
>         browser, their proxy, or their ISP's transparent proxy doesn't
>         give up.
> 
>         Why do they want 2 million records anyway?  2 million of what?
>         Will another user drop by 10 seconds later and also want
>         2 million records?  The same 2 million?  Is there something
>         that can be done to the 2 million records to make them a
>         smaller but more useful set of information?
> 
>         Hopefully this stream of consciousness has some help buried in
>         it somewhere :-)
> 
>         Cheers,
>                                                Andrew McMillan.
> 
>         --
>         ------------------------------------------------------------------------
>         http://andrew.mcmillan.net.nz/                Porirua, New Zealand
>         Twitter: _karora                              Phone: +64(272)DEBIAN
>                    Water, taken in moderation cannot hurt anybody.
>                                     -- Mark Twain
>         ------------------------------------------------------------------------
> 

-- 
------------------------------------------------------------------------
andrew (AT) morphoss (DOT) com                            +64(272)DEBIAN
Though a superhero, Bruce Schneier disdains the use of a mask or secret
               identity as 'security through obscurity'.

------------------------------------------------------------------------

