
Re: large resultset

From: Jasen Betts <jasen(at)xnet(dot)co(dot)nz>
To: pgsql-php(at)postgresql(dot)org
Subject: Re: large resultset
Date: 2010-06-15 12:21:20
Message-ID: hv7r80$ogq$
Lists: pgsql-php
On 2010-06-15, Andrew McMillan <andrew(at)morphoss(dot)com> wrote:

> Fundamentally sending 2million of anything can get problematic pretty
> darn quickly, unless the 'thing' is less than 100 bytes.
> My personal favourite would be to write a record somewhere saying 'so
> and so wants these 2 million records', and give the user a URL where
> they can fetch them from.  Or e-mail them to the user, or... just about
> anything, except try and generate them in-line with the page, in a
> reasonable time for their browser to not give up, or their proxy to not
> give up, or their ISP's transparent proxy to not give up.

Email delivery often fails for messages over 10 MB anyway.

> Why do they want 2 million record anyway?  2 million of what?  Will
> another user drop by 10 seconds later and also want 2 million records?
> The same 2 million?  Why does the user want 2 million records?  Is there
> something that can be done to the 2 million records to make them a
> smaller but more useful set of information?

/Nobody/ wants a web page with 2 million lines on it:
scrolling gets tricky when each pixel of the scrollbar represents
2000 lines of data, and most browsers aren't designed to handle
documents that large well.

Still, if it's served with "Content-Disposition: attachment"
they'll be offered it as a download instead.
(Unless they use IE and you use cookies and SSL, in which case
the download doesn't work.)
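
A minimal sketch of the headers involved, in Python for brevity (a PHP script would emit the same headers via `header()`); the function name and the CSV content type are assumptions:

```python
def download_headers(filename):
    """Build the response headers that make a browser offer a download.

    "attachment" (rather than "inline") is what triggers the save dialog.
    As noted above, old IE combined with SSL and cookies may still refuse
    the download regardless of these headers.
    """
    return [
        ("Content-Type", "text/csv"),
        ("Content-Disposition", f'attachment; filename="{filename}"'),
    ]
```

The rows can then be streamed into the response body incrementally, so the server never has to hold all 2 million records in memory at once.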
