From: | "Jonah H(dot) Harris" <jonah(dot)harris(at)gmail(dot)com> |
---|---|
To: | Ernesto Quiñones <ernestoq(at)gmail(dot)com> |
Cc: | pgsql-general(at)postgresql(dot)org |
Subject: | Re: syncing with a MySQL DB |
Date: | 2008-10-26 16:41:39 |
Message-ID: | 36e682920810260941yb0c790dv6fbd817e9e4d2b7a@mail.gmail.com |
Lists: | pgsql-general |
On Sat, Oct 25, 2008 at 1:19 PM, Ernesto Quiñones <ernestoq(at)gmail(dot)com> wrote:
> I use dbi-link and it works fine, but I have problems when I query the
> "linked" MySQL tables when they are big, maybe a million records: the
> response is really slow, and I need to wait 5 or more minutes to get an
> answer even for a simple query like "select * from table limit 10". I
> am thinking maybe dbi-link downloads all the data to pgsql before
> giving me the answer.
Yes, that's what Postgres is doing. DBI-Link is currently incapable
of pushing the predicate down to the remote system, because Postgres
can't give it access to the predicate.
> Anybody knows how improve this?
If I have to push the predicate down, I'll generally write a
set-returning function which takes some of the predicate, limit, and
offset info and builds a dynamic SQL query against the remote database
using dblink.
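A minimal sketch of that approach (function, table, column, and
connection names here are hypothetical, not from any real schema): the
caller supplies the predicate, limit, and offset, the function builds
the remote query as a string, and only the matching rows cross the
wire instead of the whole table.

```sql
-- Sketch only: assumes a dblink connection named 'myconn' is already
-- open, and that the remote table has (id integer, name text).
CREATE OR REPLACE FUNCTION remote_big_table(
    p_where  text,      -- predicate to push down, e.g. 'id > 1000'
    p_limit  integer,
    p_offset integer
) RETURNS SETOF record AS $$
DECLARE
    v_sql text;
BEGIN
    -- Build the query so the REMOTE side applies WHERE/LIMIT/OFFSET.
    v_sql := 'SELECT id, name FROM big_table'
          || CASE WHEN p_where IS NOT NULL
                  THEN ' WHERE ' || p_where ELSE '' END
          || ' LIMIT '  || p_limit
          || ' OFFSET ' || p_offset;
    RETURN QUERY
        SELECT * FROM dblink('myconn', v_sql)
            AS t(id integer, name text);
END;
$$ LANGUAGE plpgsql;

-- Usage: only 10 rows are fetched, not a million.
-- SELECT * FROM remote_big_table('id > 1000', 10, 0)
--     AS t(id integer, name text);
```

Note that concatenating a caller-supplied predicate is an injection
risk; in anything beyond a quick sketch you'd validate it or build it
from quoted values (quote_literal/quote_ident).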
--
Jonah H. Harris, Senior DBA
myYearbook.com