| From: | "Kalle Hallivuori" <kato(at)iki(dot)fi> |
|---|---|
| To: | "Dave Page" <dpage(at)postgresql(dot)org> |
| Cc: | pgadmin-hackers(at)postgresql(dot)org |
| Subject: | Re: Submitting cursored query support, docs, etc |
| Date: | 2007-12-12 10:31:59 |
| Message-ID: | c637d8bb0712120231o5fce0748ndc164d0579c201e5@mail.gmail.com |
| Lists: | pgadmin-hackers |
Hi!
2007/12/11, Dave Page <dpage(at)postgresql(dot)org>:
> I haven't looked at it in that much detail for a number of years
> unfortunately. However, off the top of my head, I would start by
> looking at ctlSQLResult::Execute/pgQueryThread::execute in which you can
> create the cursor and retrieve the first batch of results.
Yeah, that works fine; I have a button now that declares a cursor for
the user's query, and I can append 'fetch from' manually to actually
see a row :)
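
For context, a minimal libpq sketch of that step (not pgAdmin code; `conn`, `query` and the cursor name are assumed): declare a cursor for the user's query inside a transaction, then FETCH the first batch of rows.

```cpp
// Hedged sketch: declare a cursor for the user's query and fetch the
// first batch. Error checking is omitted for brevity.
#include <libpq-fe.h>
#include <string>

PGresult *FetchFirstBatch(PGconn *conn, const std::string &query, int batchSize)
{
    PQclear(PQexec(conn, "BEGIN"));                   // cursors need a transaction
    std::string declare = "DECLARE _qry CURSOR FOR " + query;
    PQclear(PQexec(conn, declare.c_str()));

    std::string fetch = "FETCH " + std::to_string(batchSize) + " FROM _qry";
    return PQexec(conn, fetch.c_str());               // caller checks status and PQclear()s
}
```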
> Then, look at
> extending pgSet such that when run with a cursor, an attempt to access a
> row that hasn't been retrieved yet results in the dataset being extended
> before the value is returned. Currently pgSet is tied to a single
> PGResult, but I guess in cursor mode we would probably need to make that
> a list of PGResults each containing <= N rows where N is a configurable
> batch size.
This is exactly the kind of hint I hoped for. Thanks!
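
To make the idea concrete, here is a hedged sketch of that hint, not the real pgSet: keep a list of PGresult batches of at most N rows, and fetch another batch from the cursor whenever a row beyond what is already held is requested. All names (`BatchedSet`, `FetchMore`, the `_qry` cursor) are hypothetical.

```cpp
// Sketch only: a result set backed by multiple PGresult batches that
// grows on demand by fetching more rows from an already-declared cursor.
#include <libpq-fe.h>
#include <string>
#include <vector>

class BatchedSet
{
public:
    BatchedSet(PGconn *conn, int batchSize)
        : m_conn(conn), m_batchSize(batchSize), m_rowCount(0), m_exhausted(false) {}

    ~BatchedSet()
    {
        for (size_t i = 0; i < m_batches.size(); ++i)
            PQclear(m_batches[i]);
    }

    // Return a value, fetching further batches until the row is available.
    const char *GetValue(long row, int col)
    {
        while (row >= m_rowCount && !m_exhausted)
            FetchMore();
        if (row >= m_rowCount)
            return 0;                                  // past the end of the result set
        long offset = row;                             // locate the batch the row lives in
        for (size_t i = 0; i < m_batches.size(); ++i)
        {
            long n = PQntuples(m_batches[i]);
            if (offset < n)
                return PQgetvalue(m_batches[i], (int)offset, col);
            offset -= n;
        }
        return 0;
    }

private:
    void FetchMore()
    {
        std::string fetch = "FETCH " + std::to_string(m_batchSize) + " FROM _qry";
        PGresult *res = PQexec(m_conn, fetch.c_str());
        long n = (PQresultStatus(res) == PGRES_TUPLES_OK) ? PQntuples(res) : 0;
        if (n == 0)
        {
            m_exhausted = true;                        // cursor drained (or errored)
            PQclear(res);
            return;
        }
        m_batches.push_back(res);
        m_rowCount += n;
    }

    PGconn *m_conn;
    int m_batchSize;
    long m_rowCount;
    bool m_exhausted;
    std::vector<PGresult *> m_batches;
};
```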
--
Kalle Hallivuori +358-41-5053073 http://korpiq.iki.fi/