Re: Using Postgres to store high volume streams of sensor readings

From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: "Ciprian Dorin Craciun" <ciprian(dot)craciun(at)gmail(dot)com>
Cc: Grzegorz Jaśkiewicz <gryzman(at)gmail(dot)com>, pgsql-general(at)postgresql(dot)org
Subject: Re: Using Postgres to store high volume streams of sensor readings
Date: 2008-11-21 16:06:20
Message-ID: 1581.1227283580@sss.pgh.pa.us
Thread:
Lists: pgsql-general

"Ciprian Dorin Craciun" <ciprian(dot)craciun(at)gmail(dot)com> writes:
> In short the data is inserted by using COPY sds_benchmark_data
> from STDIN, in batches of 500 thousand data points.

Not sure if it applies to your real use-case, but if you can, try doing
the COPY from a file local to the server instead of across the network
link; it might go faster. Also, as already noted, drop the redundant index.

regards, tom lane
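[Editor's note: a minimal sketch of the two COPY variants being contrasted above. The table name is taken from the thread; the file paths are hypothetical.]

```sql
-- Server-side COPY: the file path is resolved on the database server,
-- so the data never crosses the network link. The file must be
-- readable by the postgres server process.
COPY sds_benchmark_data FROM '/var/lib/postgresql/import/readings.txt';

-- Client-side equivalent (psql \copy, or COPY ... FROM STDIN as in the
-- benchmark): the client streams the file through its connection, so a
-- slow network link caps throughput.
\copy sds_benchmark_data FROM 'readings.txt'
```

Both forms use the same COPY machinery inside the server; the difference is only where the bytes originate.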
