I am not sure if this is the right place to ask this question, but since
the question is about improving performance, I guess I am not too far off.
My question is: if there were a query design that queried multiple
servers simultaneously, would that improve performance?
To make it clear, let's say we have 3 DB servers. One server just
takes the queries, while the other two hold the data. Say we have the
query 'select * from customer_data' and we change it to:
select * from
dblink('db1','select * from customer_data where timestamp between
timestamp \'01-01-2004\' and timestamp \'06-30-2004\'')
as t1(<column definition list>)
union all
select * from
dblink('db2','select * from customer_data where timestamp between
timestamp \'07-01-2004\' and timestamp \'12-31-2004\'')
as t2(<column definition list>)

(dblink() returns a generic record set, so each call needs an AS clause
listing the columns, and the two halves need UNION ALL to combine.)
Would the subqueries above be executed simultaneously by Postgres before
the final query runs, or would they execute one at a time?
If they do execute simultaneously, it would be possible to write code
that converts normal queries into distributed queries, requesting data
from multiple databases at once to improve performance. This would be
advantageous for large amounts of data.
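For what it's worth, plain dblink() calls in a single statement run one
after another, but newer versions of contrib/dblink provide asynchronous
functions that let both remote queries run at the same time. A minimal
sketch, assuming named connections 'conn1'/'conn2' and a hypothetical
column list (id int, ts timestamp) for customer_data:

    -- Open named connections to the two data servers
    select dblink_connect('conn1', 'dbname=db1');
    select dblink_connect('conn2', 'dbname=db2');

    -- Dispatch both queries without waiting; they now run concurrently
    select dblink_send_query('conn1',
      'select id, ts from customer_data where ts between
       timestamp ''01-01-2004'' and timestamp ''06-30-2004''');
    select dblink_send_query('conn2',
      'select id, ts from customer_data where ts between
       timestamp ''07-01-2004'' and timestamp ''12-31-2004''');

    -- Collect the results (each call blocks until its query finishes)
    select * from dblink_get_result('conn1') as t1(id int, ts timestamp)
    union all
    select * from dblink_get_result('conn2') as t2(id int, ts timestamp);

    select dblink_disconnect('conn1');
    select dblink_disconnect('conn2');

The connection strings and column types here are placeholders; adjust
them to the real customer_data schema.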
pgsql-performance by date