Re: Moving data from one set of tables to another?

From: "Sean Davis" <sdavis2(at)mail(dot)nih(dot)gov>
To: howard(at)yankeescientific(dot)com
Cc: "Carol Walter" <walterc(at)indiana(dot)edu>, pgsql-novice(at)postgresql(dot)org
Subject: Re: Moving data from one set of tables to another?
Date: 2008-09-18 23:20:07
Message-ID: 264855a00809181620v750ef650y14805e1caef20ab0@mail.gmail.com
Lists: pgsql-novice

On Thu, Sep 18, 2008 at 6:39 PM, Howard Eglowstein
<howard(at)yankeescientific(dot)com> wrote:
> Somewhat empty, yes. The single set of 'data_' tables contains 3 years' worth
> of data. I want to move 2 years' worth out into the 'new_' tables. When I'm
> done, there will still be 1 year's worth of data left in the original tables.
>
> Howard
>
> Carol Walter wrote:
>>
>> What do you want for your end product? Are the old tables empty after you
>> put the data into the new tables?
>>
>> Carol
>>
>> On Sep 18, 2008, at 3:02 PM, Howard Eglowstein wrote:
>>
>>> I have three tables called 'data_a', 'data_b' and 'data_c' which each
>>> have 50 columns. One of the columns in each is 'id' and is used to keep
>>> track of which data in data_b and data_c corresponds to a row in data_a. If
>>> I want to get all of the data in all 150 fields for this month (for
>>> example), I can get it with:
>>>
>>> SELECT * FROM data_a, data_b, data_c WHERE data_a.id = data_b.id AND
>>> data_a.id = data_c.id AND timestamp >= '2008-09-01 00:00:00' AND timestamp
>>> <= '2008-09-30 23:59:59';

You might want to look at INSERT INTO ... SELECT ...

You should be able to do this with 1 query per new table (+ the
deletes, obviously). For a few thousand records, I would expect that
the entire process might take a few seconds.
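
For example, something along these lines should do it (untested, and assuming
the new_* tables have exactly the same column layout as the data_* tables,
that the timestamp column lives in data_a, and that you want to carry the
original id values over rather than generate new ones):

BEGIN;

-- copy the matching rows, keeping the original ids
INSERT INTO new_a
  SELECT * FROM data_a
  WHERE timestamp >= '2008-09-01 00:00:00'
    AND timestamp <= '2008-09-30 23:59:59';

INSERT INTO new_b
  SELECT data_b.* FROM data_b, data_a
  WHERE data_b.id = data_a.id
    AND data_a.timestamp >= '2008-09-01 00:00:00'
    AND data_a.timestamp <= '2008-09-30 23:59:59';

INSERT INTO new_c
  SELECT data_c.* FROM data_c, data_a
  WHERE data_c.id = data_a.id
    AND data_a.timestamp >= '2008-09-01 00:00:00'
    AND data_a.timestamp <= '2008-09-30 23:59:59';

-- then remove them from the original tables, data_a last
DELETE FROM data_b WHERE id IN
  (SELECT id FROM data_a
   WHERE timestamp >= '2008-09-01 00:00:00'
     AND timestamp <= '2008-09-30 23:59:59');

DELETE FROM data_c WHERE id IN
  (SELECT id FROM data_a
   WHERE timestamp >= '2008-09-01 00:00:00'
     AND timestamp <= '2008-09-30 23:59:59');

DELETE FROM data_a
  WHERE timestamp >= '2008-09-01 00:00:00'
    AND timestamp <= '2008-09-30 23:59:59';

COMMIT;

Doing it all inside one transaction also means the rows are never left half
moved if something fails partway through.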

Sean

>>> What I need to do is execute this search which might return several
>>> thousand rows and write the same structure into 'new_a', 'new_b' and
>>> 'new_c'. What i'm doing now in a C program is executing the search above.
>>> Then I execute:
>>>
>>> INSERT INTO new_a (timestamp, field1, field2 ...[imagine 50 of them])
>>> VALUES ('2008-09-01 00:00:00', 'ABC', 'DEF', ...);
>>> Get the ID that was assigned to this row since 'id' is a serial field and
>>> the number is assigned sequentially. Say it comes back as '1'.
>>> INSERT INTO new_b (id, field1, field2 ...[imagine 50 of them]) VALUES
>>> ('1', 'ABC', 'DEF', ...);
>>> INSERT INTO new_c (id, field1, field2 ...[imagine 50 of them]) VALUES
>>> ('1', 'ABC', 'DEF', ...);
>>>
>>> That moves a copy of the three rows of data from the three tables into
>>> the three separate new tables.
>>> From the original group of tables, the id for these rows was, let's say,
>>> '1234'. Then I execute:
>>>
>>> DELETE FROM data_a where id='1234';
>>> DELETE FROM data_b where id='1234';
>>> DELETE FROM data_c where id='1234';
>>>
>>> That deletes the old data.
>>>
>>> This works fine and gives me exactly what I wanted, but is there a better
>>> way? This is 7 SQL calls and it takes about 3 seconds per moved record on
>>> our Linux box.
>>>
>>> Any thoughts or suggestions would be appreciated.
>>>
>>> Thanks,
>>>
>>> Howard
>>>
