Re: Using Postgres to store high volume streams of sensor readings

From: "Ciprian Dorin Craciun" <ciprian(dot)craciun(at)gmail(dot)com>
To: "Tom Lane" <tgl(at)sss(dot)pgh(dot)pa(dot)us>
Cc: Grzegorz Jaśkiewicz <gryzman(at)gmail(dot)com>, pgsql-general(at)postgresql(dot)org
Subject: Re: Using Postgres to store high volume streams of sensor readings
Date: 2008-11-21 17:42:57
Message-ID: 8e04b5820811210942p76c6402epdae640b7f2e52895@mail.gmail.com
Lists: pgsql-general

On Fri, Nov 21, 2008 at 7:12 PM, Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us> wrote:
> "Ciprian Dorin Craciun" <ciprian(dot)craciun(at)gmail(dot)com> writes:
>> On Fri, Nov 21, 2008 at 6:06 PM, Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us> wrote:
>>> Not sure if it applies to your real use-case, but if you can try doing
>>> the COPY from a local file instead of across the network link, it
>>> might go faster. Also, as already noted, drop the redundant index.
>
>> It won't be that difficult to use a local file (now I'm using the
>> same computer), but will it really make a difference?
>
> Yes. I'm not sure how much, but there is nontrivial protocol overhead.
>
> regards, tom lane
>

Ok, I have tried it, and there is no improvement... (There is also the
drawback that I must run the inserts as the superuser...)
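
Just to make the comparison concrete, here is a rough sketch (the table
name, column list, and file path are made up, not the real benchmark
schema): a server-side COPY opens the file with the server process
itself, which is why it needs superuser rights, while psql's \copy reads
the file on the client and streams it through the same protocol as
before:

    -- Server-side bulk load: the backend reads the file directly from
    -- disk, so the role issuing it must be a superuser.
    COPY sensor_readings (client, sensor, reading_time, value)
        FROM '/tmp/readings.tsv';

    -- Client-side alternative in psql: the file is opened by psql and
    -- streamed over the connection, so no superuser is needed, but the
    -- data still crosses the protocol layer.
    \copy sensor_readings (client, sensor, reading_time, value) from '/tmp/readings.tsv'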

Ciprian Craciun.
