best way to kill long running query?

From: "Bill Eaton" <EE2(at)aeroantenna(dot)com>
To: <pgsql-general(at)postgresql(dot)org>
Subject: best way to kill long running query?
Date: 2007-03-21 16:36:34
Message-ID: BHEMIOKCPPFPFJCHEKDCKEIGCFAA.EE2@aeroantenna.com
Lists: pgsql-general

I want to allow some of my users' queries to run for a prescribed period of
time and kill them if they go over that limit. Is there a good way to do this?
Or is this a bad idea?
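
One thing I've come across is the statement_timeout setting, which (if I'm
reading the docs right) makes the server cancel any statement in the session
that runs longer than the given number of milliseconds. The table and WHERE
clause below are just placeholders for my real data. Would something along
these lines be a sane way to do it?

    -- Cancel any statement in this session that runs longer than 30 seconds.
    -- A statement that hits the limit fails with
    -- "ERROR: canceling statement due to statement timeout".
    SET statement_timeout = 30000;

    -- Placeholder query; table and column names are made up for illustration.
    SELECT * FROM test_results WHERE model = 'XYZ' AND test_date > '2007-01-01';

    -- Remove the limit again for the rest of the session.
    RESET statement_timeout;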

I've been struggling to figure out the best way to allow users to browse
through large tables. For example, I have one table with about 600,000 rows
that is growing by about 100,000 rows/month.

I want to allow users to browse through this table, but only if their
effective SELECT statement generates no more than 100 or maybe 1,000 rows.
There are several fields that can be used in the WHERE clause, such as user,
date, model, etc., so it will be difficult for me to predict a priori how
large a result set will be. That's why I want to let the query run for a
prescribed period of time and then kill it.
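
An alternative I've considered, instead of timing the query, is to cap the
result set directly: ask for one row more than the cap and, if that extra row
comes back, tell the user the filter is too broad. Again, the table and
column names here are placeholders:

    -- Fetch at most 1001 rows; if 1001 actually come back, the real result
    -- set is larger than 1000 and the user should narrow the WHERE clause.
    SELECT *
    FROM test_results              -- placeholder table name
    WHERE model = 'XYZ'            -- placeholder filter
    ORDER BY test_date
    LIMIT 1001;

That avoids killing anything, but the query itself could still take a long
time if the WHERE clause isn't selective, which is why I keep coming back to
a timeout.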

I'll probably be using ADO over ODBC at the client, so I could probably kill
the Connection/Recordset there; I just don't know the best way to do it.
pgAdmin allows running queries to be cancelled. How does it do that?
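
My guess, from poking around, is that it calls pg_cancel_backend() on the
backend PID listed in pg_stat_activity, something like the following (the
column names are the ones I see in 8.x, I believe this needs superuser
rights, and the PID value is obviously just an example):

    -- List backends that are currently running a query (8.x column names).
    SELECT procpid, usename, current_query
    FROM pg_stat_activity
    WHERE current_query <> '<IDLE>';

    -- Ask one of those backends to cancel its current query.
    SELECT pg_cancel_backend(12345);   -- 12345 = procpid from the query above

Is that roughly what pgAdmin does, or is there a better hook for this from
the client side?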

Thanks in advance,

Bill Eaton
Thousand Oaks, CA
