
Re: Copying large tables with DBLink

From: Joe Conway <mail(at)joeconway(dot)com>
To: Chris Hoover <revoohc(at)sermonaudio(dot)com>
Cc: PostgreSQL Admin <pgsql-admin(at)postgresql(dot)org>
Subject: Re: Copying large tables with DBLink
Date: 2005-03-24 19:21:10
Message-ID: 42431326.2010908@joeconway.com
Lists: pgsql-admin
Chris Hoover wrote:
> Has anyone had problems with memory exhaustion and dblink?  We were 
> trying to use dblink to convert our databases to our new layout, and had 
> our test server lock up several times when trying to copy a table that 
> was significantly larger than our memory and swap.
> Basically we were doing an insert into <table> select * from 
> dblink('dbname=olddb','select * from large_table') as t_large_table(table 
> column listing);
> 
> Does anyone know of a way around this?


dblink just uses libpq, and libpq reads the entire result set into 
memory. There is no direct way around that, as far as I'm aware. You 
could, however, use a cursor, and fetch/manipulate rows in more 
reasonably sized batches.
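
Something along these lines (untested sketch; the batch size, the 
target table new_table, and the two columns are made-up placeholders 
standing in for your actual layout):

  -- connect once, then open a cursor on the remote side
  SELECT dblink_connect('dbname=olddb');
  SELECT dblink_open('cur_large', 'select * from large_table');

  -- fetch one batch; repeat this until dblink_fetch returns no rows
  INSERT INTO new_table
    SELECT * FROM dblink_fetch('cur_large', 10000)
      AS t_large_table(id integer, data text);  -- hypothetical columns

  -- release the cursor and the connection
  SELECT dblink_close('cur_large');
  SELECT dblink_disconnect();

Each dblink_fetch call pulls only the next batch through libpq, so 
memory use is bounded by the batch size rather than the table size.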

HTH,

Joe
