Re: Query Progress (was: Performance With Joins on Large Tables)

From: "Joshua Marsh" <icub3d(at)gmail(dot)com>
To: "Bucky Jordan" <bjordan(at)lumeta(dot)com>
Cc: "Jeff Davis" <pgsql(at)j-davis(dot)com>, pgsql-performance(at)postgresql(dot)org, jim(at)nasby(dot)net, bujordan(at)gmail(dot)com
Subject: Re: Query Progress (was: Performance With Joins on Large Tables)
Date: 2006-09-13 18:56:58
Message-ID: 38242de90609131156n6eb102f9p8c7a09d12264c942@mail.gmail.com
Lists: pgsql-performance

On 9/13/06, Bucky Jordan <bjordan(at)lumeta(dot)com> wrote:
>
> Setting to 0.1 finally gave me the result I was looking for. I know
> that the index scan is faster though. The seq scan never finished (I
> killed it after 24+ hours) and I'm running the query now with indexes
> and it's progressing nicely (will probably take 4 hours).
>
>
> In regards to "progressing nicely (will probably take 4 hours)" - is
> this just an estimate or is there some way to get progress status (or
> something similar- e.g. on step 6 of 20 planned steps) on a query in pg?
> I looked through Chap 24, Monitoring DB Activity, but most of that looks
> like aggregate stats. Trying to relate these to a particular query
> doesn't really seem feasible.
>
> This would be useful in the case where you have a couple of long running
> transactions or stored procedures doing analysis and you'd like to give
> the user some feedback where you're at.
>
> Thanks,
>
> Bucky
>

I do it programmatically, not through PostgreSQL. I'm using a cursor,
so I can keep track of how many records I've handled. I'm not aware
of a way to do this in PostgreSQL itself.
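For anyone curious, a minimal sketch of that cursor-based approach in Python. In a real client the batches would come from a server-side cursor (e.g. a psycopg2 named cursor and fetchmany()); here a plain generator stands in for it so the progress logic is runnable on its own, and the function and parameter names are just illustrative:

```python
# Sketch of cursor-based progress tracking for a long-running query.
# fetch_batches() simulates what cursor.fetchmany(batch_size) would
# return from a server-side cursor; the names here are illustrative.

def fetch_batches(rows, batch_size):
    """Yield successive batches of rows, as fetchmany() would."""
    for i in range(0, len(rows), batch_size):
        yield rows[i:i + batch_size]

def process_with_progress(rows, total, batch_size=1000, report=print):
    """Process rows batch by batch, reporting percent complete."""
    handled = 0
    for batch in fetch_batches(rows, batch_size):
        # ... do the actual per-row work on `batch` here ...
        handled += len(batch)
        report("%d/%d rows (%.1f%%)" % (handled, total, 100.0 * handled / total))
    return handled

if __name__ == "__main__":
    data = list(range(10000))
    process_with_progress(data, total=len(data), batch_size=2500)
```

The one prerequisite is knowing `total` up front, which usually means paying for a `SELECT count(*)` (or an estimate from pg_class.reltuples) before the main query starts.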
