large numbers of inserts out of memory strategy

From: Ted Toth <txtoth(at)gmail(dot)com>
To: pgsql-general <pgsql-general(at)postgresql(dot)org>
Subject: large numbers of inserts out of memory strategy
Date: 2017-11-28 17:17:07
Message-ID: CAFPpqQGux3uT=CNN-z=zXr4qSmfBw9tDCURimELqEWUBrBpL7A@mail.gmail.com
Lists: pgsql-general

I'm writing a migration utility to move data from a non-RDBMS data
source to a Postgres DB. Currently I'm generating SQL INSERT
statements involving 6 related tables for each 'thing'. With 100k or
more 'things' to migrate I'm generating a lot of statements, and when
I try to import using psql, Postgres fails with 'out of memory' when
running on a Linux VM with 4G of memory. If I break the import into
smaller chunks, say ~50K statements, then it succeeds. I can change
my migration utility to generate multiple files, each with a limited
number of INSERTs, to get around this issue, but maybe there's
another/better way?
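
For reference, a minimal sketch of the chunking workaround described
above, assuming the utility can be split this way; names like
`generate_inserts`, `things`, and `CHUNK_SIZE` are hypothetical
stand-ins for the real migration code:

    # Sketch: split generated INSERT statements across multiple .sql files
    # so each file stays small enough for psql to import on a 4G VM.
    # 'things' and generate_inserts() are placeholders for the real
    # migration utility's data source and SQL generation.

    CHUNK_SIZE = 50_000  # statements per output file (~50K worked per the post)

    def generate_inserts(thing):
        """Return the INSERT statements (one per related table) for one 'thing'."""
        # Placeholder: the real utility builds 6 INSERTs per 'thing'.
        return [f"INSERT INTO t{i} VALUES ({thing!r});" for i in range(6)]

    def write_chunks(things, prefix="migration"):
        file_index = 0
        statement_count = 0
        out = None
        for thing in things:
            for stmt in generate_inserts(thing):
                # Start a new output file when the current one is full.
                if out is None or statement_count >= CHUNK_SIZE:
                    if out is not None:
                        out.write("COMMIT;\n")
                        out.close()
                    file_index += 1
                    out = open(f"{prefix}_{file_index:04d}.sql", "w")
                    out.write("BEGIN;\n")  # one transaction per file
                    statement_count = 0
                out.write(stmt + "\n")
                statement_count += 1
        if out is not None:
            out.write("COMMIT;\n")
            out.close()

    # Each file can then be imported separately, e.g.:
    #   psql -d targetdb -f migration_0001.sql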

Ted
