From: Mark Moellering <markmoellering(at)psyberation(dot)com>
To: Postgres General <pgsql-general(at)postgresql(dot)org>
Subject: db-connections (application architecture)
Date: 2018-11-15 15:09:38
Message-ID: CAA0uU3XTasTDfiSC-Q=TETWorpmAh_QTn3fBEegrKENb51-jLg@mail.gmail.com
Lists: pgsql-general
So, I am working on some system designs for a web application, and I wonder
if there is any definitive answer on how best to connect to a Postgres
database.
I could have it so that each time a query, or set of queries, for a
particular request needs to be run, a new connection is opened, the queries
are run, and then the connection is closed / dropped.
OR, I could create a persistent connection that will remain open as long as
a user is logged in and then any queries are run against the open
connection.
I can see how, for a modest number of users (hundreds to thousands), the
latter might make more sense, but if I need to scale up to millions, I might
not want all of those connections open.
Any idea of how much time / overhead is added by opening and closing a
connection every time?
Any and all information is welcome.
Thanks in advance
-- Mark M