Re: Using Postgres to store high volume streams of sensor readings

From: "Ciprian Dorin Craciun" <ciprian(dot)craciun(at)gmail(dot)com>
To: "Tom Lane" <tgl(at)sss(dot)pgh(dot)pa(dot)us>
Cc: Grzegorz Jaśkiewicz <gryzman(at)gmail(dot)com>, pgsql-general(at)postgresql(dot)org
Subject: Re: Using Postgres to store high volume streams of sensor readings
Date: 2008-11-21 16:52:55
Message-ID: 8e04b5820811210852k6ce7a7b6ub43b4368de33fc8c@mail.gmail.com
Lists: pgsql-general

On Fri, Nov 21, 2008 at 6:06 PM, Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us> wrote:
> "Ciprian Dorin Craciun" <ciprian(dot)craciun(at)gmail(dot)com> writes:
>> In short the data is inserted by using COPY sds_benchmark_data
>> from STDIN, in batches of 500 thousand data points.
>
> Not sure if it applies to your real use-case, but if you can try doing
> the COPY from a local file instead of across the network link, it
> might go faster. Also, as already noted, drop the redundant index.
>
> regards, tom lane

    Hi!

    It wouldn't be that difficult to use a local file (I'm already
running everything on the same machine), but will it really make a
difference? (I mean, have you actually seen this make a difference in
practice?)
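
    Just to make sure we're comparing the same things, what I'm doing
now is roughly the following (a minimal sketch; the real table has more
columns, and the file path below is only a placeholder):

    -- current approach: rows streamed over the client connection,
    -- in batches of 500 thousand
    COPY sds_benchmark_data FROM STDIN;
    -- (the data rows follow on the same connection, ended by \.)

    and the local-file variant you are suggesting would, I assume, look
something like this:

    -- server-side load: the path is resolved on the server machine and
    -- the file must be readable by the postgres server process (this
    -- form needs superuser; from psql, \copy is the client-side
    -- equivalent)
    COPY sds_benchmark_data FROM '/tmp/sds_benchmark_data.csv';

    If I understand it correctly, the second form lets the backend read
the file directly instead of pushing every row through the client/server
protocol, which would explain why it could be faster even with client
and server on the same machine.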

Thanks,
Ciprian Craciun.
