From: Timo Savola <timo(dot)savola(at)codeonline(dot)com>
To: pgsql-jdbc(at)postgresql(dot)org
Subject: Re: ResultSet memory usage
Date: 2002-01-11 16:05:40
Message-ID: 1010765145.10350.9.camel@vorlon
Lists: pgsql-jdbc
> A possible workaround- If you only need to grab a few rows is there some way
> to make those rows float to the top using an "order by" & then apply "limit"
> so you don't have to deal with the huge ResultSet?
I'm using order by, but the point is that I can only make an educated
guess for the limit parameter. And I can't calculate a "big enough"
value.
I need to get the first N entries with duplicates removed based on one
(or two) unique column(s). I can't use distinct since I also need to
select other columns that shouldn't be affected by "distinct". I've thought
about subselects, etc. but so far the best/cleanest approach I've come
up with is to use a HashSet for the unique column values on the Java
end. The downside is that I need to transfer a lot of unnecessary rows
from the database to the application, and with PostgreSQL that means all rows.
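The HashSet approach could be sketched roughly like this (the row shape,
class name, and column indices are hypothetical; in real code the rows
would be read from a JDBC ResultSet instead of a List):

```java
import java.util.*;

// Sketch of the HashSet-based dedup described above: keep the first N
// rows whose value in the "unique" column has not been seen yet. Rows
// are modeled here as String[] for illustration; with JDBC you would
// iterate rs.next() and call rs.getString(keyCol) instead.
public class FirstNUnique {
    static List<String[]> firstNUnique(List<String[]> rows, int keyCol, int n) {
        Set<String> seen = new HashSet<>();
        List<String[]> out = new ArrayList<>();
        for (String[] row : rows) {
            if (out.size() >= n) {
                break;                      // collected enough unique entries
            }
            if (seen.add(row[keyCol])) {    // add() is true only for unseen keys
                out.add(row);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<String[]> rows = Arrays.asList(
            new String[]{"a", "1"},
            new String[]{"a", "2"},   // duplicate key "a" -> skipped
            new String[]{"b", "3"},
            new String[]{"c", "4"}    // not reached once N=2 is satisfied
        );
        for (String[] r : firstNUnique(rows, 0, 2)) {
            System.out.println(r[0] + " " + r[1]);
        }
        // prints:
        // a 1
        // b 3
    }
}
```

The early break is the only saving on the Java side; as the post notes,
the driver may still have fetched every row from the server before the
loop even starts.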
Timo
 | From | Date | Subject
---|---|---|---
Next Message | Timo Savola | 2002-01-11 16:08:23 | Re: ResultSet memory usage
Previous Message | Kovács Péter | 2002-01-11 14:32:45 | Proposal for a configurable ResultSet implementation