I've got to import data into a remote database.
I usually get the data as Excel files that don't fit the db structure
with respect to normalisation, so I import them into Access, clean up
the data, and push the columns from there into the right tables.
For bigger imports (~5000 rows in Excel that get spread over 4-6 tables
in my db) I fetch the remote DB, do the import locally, and transfer a
dump back to the remote site when no one is using the server.
This is getting unwieldy as the db grows, and the connection is not
really that fast.
I can access the server by ssh, so it might be much faster to run a
prepared SQL file consisting of just the new data against the DB on
the server.
Could I build a SQL script that adds the new data and creates the
foreign keys on the fly, without needing to know the new IDs in advance?
Let's suppose I have a couple of foreign-key-linked tables:
company (company_id serial primary key, name, ...)
person (person_id serial primary key, company_fk, name, ...)
contact (contact_id serial primary key, person_fk, contact_date, ...)
The SQL script would need to:
1) INSERT INTO company ...
2) look up the last created company_id
3) INSERT INTO person and use the company_id to set company_fk
4) look up the last created person_id
5) INSERT INTO contact and use the person_id to set person_fk
Probably create some more persons with their contacts, then another
company, and so on.
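In PostgreSQL each serial column is backed by a sequence, and currval() returns the value most recently generated by nextval() for that sequence in the current session, so a plain SQL script can wire up the foreign keys without knowing the IDs in advance. A minimal sketch against the tables above, assuming the default sequence naming pattern `<table>_<column>_seq` and made-up sample data:

```sql
BEGIN;

-- Step 1: insert a company; its serial id is generated here.
INSERT INTO company (name) VALUES ('Acme Ltd');

-- Steps 2-3: currval() of the company sequence is the id just created
-- in this session, so use it directly as the foreign key.
INSERT INTO person (company_fk, name)
VALUES (currval('company_company_id_seq'), 'Alice Example');

-- Steps 4-5: same trick for the person's contacts.
INSERT INTO contact (person_fk, contact_date)
VALUES (currval('person_person_id_seq'), '2008-04-01');

INSERT INTO contact (person_fk, contact_date)
VALUES (currval('person_person_id_seq'), '2008-04-10');

-- Another person for the same company: the company sequence's currval()
-- is unchanged until the next company INSERT.
INSERT INTO person (company_fk, name)
VALUES (currval('company_company_id_seq'), 'Bob Example');

INSERT INTO contact (person_fk, contact_date)
VALUES (currval('person_person_id_seq'), '2008-04-12');

COMMIT;
```

Because currval() is session-local, this stays correct even if other clients are inserting rows at the same time. On PostgreSQL 8.2 and later, INSERT ... RETURNING company_id is an alternative way to capture the generated id explicitly.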
I do this right now with VBA in Access, but it is painfully slow over
the Internet because it constantly has to ask the server for the last
IDs, and the network latency spoils all the fun.
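To cut out the per-statement round trips, the finished script can be copied to the server and executed there in one go. A sketch, with placeholder host, database, and file names:

```shell
# Copy the prepared script to the server (placeholder host and paths),
# then run it on the server itself so each statement avoids WAN latency.
scp import.sql dbserver:/tmp/import.sql
ssh dbserver "psql -d mydb -f /tmp/import.sql"
```

Wrapping the script in BEGIN/COMMIT (as above) also means a failed import rolls back cleanly instead of leaving half the rows behind.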
pgsql-novice by date