Re: INSERTing lots of data

From: Joachim Worringen <joachim(dot)worringen(at)iathh(dot)de>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: INSERTing lots of data
Date: 2010-06-01 07:51:28
Message-ID: 4C04BC00.4040905@iathh.de
Lists: pgsql-general

On 06/01/2010 05:45 AM, Greg Smith wrote:
> Two thoughts. First, build a performance test case assuming it will fail
> to scale upwards, and look for problems. If you get lucky, great, but
> don't assume this will work--in the past it's proven more difficult for
> others than is obvious.
>
> Second, if you do end up being throttled by the GIL, you can probably
> build a solution for Python 2.6/3.0 using the multiprocessing module for
> your use case: http://docs.python.org/library/multiprocessing.html

Thanks, Greg - multiprocessing looks very usable for my application, and
much more convenient than rolling my own solution with fork() and pipes...
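
A rough sketch of what I have in mind: one worker process per core, each
with its own psycopg2 connection, loading its share of the rows via COPY.
The DSN, the "rawdata" table, the column names and the pool size below are
just placeholders for my actual setup, not a finished implementation:

    # One psycopg2 connection per worker process; each process has its own
    # interpreter, so the GIL never becomes a bottleneck for the load.
    import io
    import multiprocessing

    import psycopg2

    DSN = "dbname=test user=postgres"   # placeholder connection string
    CHUNK = 10000                       # rows per COPY round trip


    def load_chunk(rows):
        """Runs in a worker process: open a private connection, COPY the rows."""
        conn = psycopg2.connect(DSN)
        try:
            buf = io.StringIO()
            for row_id, payload in rows:
                buf.write("%d\t%s\n" % (row_id, payload))
            buf.seek(0)
            cur = conn.cursor()
            # copy_from uses tab as the default separator, matching the buffer
            cur.copy_from(buf, "rawdata", columns=("id", "payload"))
            conn.commit()
        finally:
            conn.close()
        return len(rows)


    def chunks(iterable, size):
        """Yield lists of `size` rows from any row generator."""
        batch = []
        for item in iterable:
            batch.append(item)
            if len(batch) == size:
                yield batch
                batch = []
        if batch:
            yield batch


    if __name__ == "__main__":
        # Fake data source; in reality the rows come from my application.
        source = ((i, "row-%d" % i) for i in range(1000000))

        pool = multiprocessing.Pool(processes=4)
        total = sum(pool.imap_unordered(load_chunk, chunks(source, CHUNK)))
        pool.close()
        pool.join()
        print("loaded %d rows" % total)

With a separate connection per process the GIL is out of the picture, and
COPY should keep the per-row overhead on the server side low.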

Joachim
