| From: | Richard Huxton <dev(at)archonet(dot)com> |
|---|---|
| To: | <michael(dot)mattox(at)verideon(dot)com>, <pgsql-performance(at)postgresql(dot)org> |
| Subject: | Re: Performance advice |
| Date: | 2003-06-24 11:33:42 |
| Message-ID: | 200306241233.42361.dev@archonet.com |
| Lists: | pgsql-performance |
On Tuesday 24 Jun 2003 8:39 am, Michael Mattox wrote:
> I'd like to get some feedback on my setup to see if I can optimize my
> database performance. My system has two separate applications:
>
> The first application connects to websites and records the statistics in
> the database. Websites are monitored every 5 or 10 minutes (depending on
> the client); there are 900 monitors, which comes out to 7,800 monitorings
> per hour.
[snip]
> There is a serious
> performance constraint here because, unlike a webserver, this application
> cannot slow down. If it slows down, we won't be able to monitor our sites
> at 5-minute intervals, which will make our customers unhappy.
Others are discussing the performance/tuning stuff, but can I make one
suggestion?
Don't log your monitoring info directly into the database; log straight to one
or more text files and sync them every few seconds. Rotate the files once a
minute (or whatever seems suitable). Then have a separate process that reads
the "old" files and loads them into the database.
The big advantage: you can take the database down for a short period and the
monitoring goes on. Useful for those small maintenance tasks.
--
Richard Huxton