
PostgreSQL performance question

From: "Mark Jones" <mlist(at)hackerjones(dot)org>
To: pgsql-hackers(at)postgresql(dot)org, pgsql-general(at)postgresql(dot)org
Subject: PostgreSQL performance question
Date: 2003-03-02 23:52:37
Lists: pgsql-general, pgsql-hackers

I am working on a project that acquires real-time data from an external
device; the data needs to be stored and must be quick to search and retrieve.
My application receives packets ranging in size from 300 to 5000 bytes every
50 milliseconds, for a minimum duration of 24 hours, before the data is
purged or archived off disk. There are several fields in the data that I
would like to be able to search on to retrieve the data at a later time.
Using a SQL database such as PostgreSQL or MySQL seems like it would make
this task much easier. My questions are: is a SQL database such as
PostgreSQL able to handle this kind of activity, saving a 5000-byte record
20 times a second? And how well will it perform when searching a database
that contains nearly two million records totaling about 8 - 9 gigabytes of
data, assuming that I have adequate computing hardware? I am trying to
determine whether a SQL database would work well for this or whether I need
to write my own custom database for the project. If anyone has any
experience doing anything similar with PostgreSQL, I would love to know
about your findings.
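[Editor's note: a back-of-envelope sketch of where the stated figures come from, assuming one packet every 50 ms for a full 24 hours at the worst-case 5000-byte payload.]

```python
# Worst-case load implied by the numbers in the question:
# one packet every 50 ms, 5000 bytes each, held for 24 hours.
MS_PER_DAY = 24 * 60 * 60 * 1000

packets_per_day = MS_PER_DAY // 50          # one packet per 50 ms interval
worst_case_bytes = packets_per_day * 5000   # 5000-byte maximum payload

print(packets_per_day)                      # 1728000 -- "nearly two million records"
print(worst_case_bytes / 10**9)             # 8.64 -- matches the "8 - 9 gigabytes"
```

So the stated row count and data volume are consistent with the 50 ms arrival rate at the maximum packet size; with smaller packets the total would fall toward the low end of the range.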



pgsql-hackers by date

Next: Rod Taylor, 2003-03-03 00:35:45, "Re: Postgresql performace question"
Previous: Kevin Brown, 2003-03-02 21:43:34, "Re: GiST: Bad newtup On Exit From gistSplit() ?"

pgsql-general by date

Next: Doug McNaught, 2003-03-03 00:25:46, "Re: Hosting a data file on a SAN"
Previous: Tom Lane, 2003-03-02 20:53:09, "Re: pg_relcheck"
