Thank you for replying.
I realise this is not a regular use case, but here is my situation.
I am taking over a reporting role at a company that currently uses a
COBOL-based ERP system. Because data extraction from the system is very
slow, overnight extracts are used for day-to-day reporting; these
extracts are stored as dBase files.
Other dBase files containing metadata have also been created to aid reporting.
This situation has led to many dBase files being combined in many mashup
configurations for different report use cases, which is proving to be
quite a maintenance overhead.
My preference is to load all of the raw data into Postgres, where I can
create the indexes and views needed for all of the reports; then the
only data that needs updating is the source data, not a myriad of dBase
files. Unfortunately the raw data consists of many tables with more than
100 columns each, and given the time constraints I am faced with I cannot
commit the time needed to create the tables in Postgres manually.
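(For what it's worth, table creation can potentially be automated from the ODBC driver's own column metadata rather than typed by hand. A minimal Python sketch; the type map below is my assumption and would need extending for whatever type names the COBOL driver reports, and the commented pyodbc part is untested against that driver:)

```python
# Generate CREATE TABLE DDL from ODBC column metadata, so wide
# (100+ column) tables need not be written out manually.
# The type map is an assumption -- extend it for your driver's names;
# anything unrecognised falls back to text, which always loads.

TYPE_MAP = {
    "CHAR": "text",
    "VARCHAR": "text",
    "NUMERIC": "numeric",
    "DECIMAL": "numeric",
    "INTEGER": "integer",
    "DATE": "date",
    "DOUBLE": "double precision",
}

def ddl_from_columns(table, columns):
    """columns: iterable of (name, odbc_type_name) pairs."""
    cols = ",\n  ".join(
        '"%s" %s' % (name.lower(), TYPE_MAP.get(type_name.upper(), "text"))
        for name, type_name in columns
    )
    return 'CREATE TABLE "%s" (\n  %s\n);' % (table.lower(), cols)

# With a live connection, the column list could come from pyodbc, e.g.:
#   import pyodbc
#   conn = pyodbc.connect(dsn)   # 'dsn' is a placeholder for your ODBC DSN
#   cols = [(r.column_name, r.type_name)
#           for r in conn.cursor().columns(table="CUSTOMERS")]
#   print(ddl_from_columns("customers", cols))
```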
I have Python scripts that I am gradually adapting to incrementally
replicate the data once the table structure is in place.
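(As a sketch of the shape such replication scripts might take: a DB-API fetchmany loop keeps memory bounded no matter how wide the tables are. Only the generator below is concrete; the table name, column count, and both connections in the commented part are placeholders:)

```python
# Fetch from the ODBC source in fixed-size chunks and insert into
# Postgres with executemany, so at most `size` rows are in memory.

def batches(cursor, size=10000):
    """Yield lists of rows from any DB-API cursor, `size` rows at a time."""
    while True:
        rows = cursor.fetchmany(size)
        if not rows:
            break
        yield rows

# Hypothetical usage with pyodbc (source) and psycopg2 (destination):
#   def replicate(src_cur, dst_cur, table, ncols):
#       placeholders = ", ".join(["%s"] * ncols)
#       src_cur.execute('SELECT * FROM "%s"' % table)
#       for chunk in batches(src_cur):
#           dst_cur.executemany(
#               'INSERT INTO "%s" VALUES (%s)' % (table, placeholders),
#               chunk)
```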
I have had some limited success using LibreOffice Base. I create one
database connected to my Postgres database and another connected to the
COBOL ODBC connector, then drag and drop tables from one to the other.
There are problems with the detection of some data types, and even once
those are corrected it is very slow and raises an error at about 4
million rows, leaving the data in an incomplete state.
I would love to find a way to do this without error; it would save me so
much time.
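(One way to sidestep the partial-load problem is to stream the rows through Postgres COPY inside a single transaction, so a failure partway through rolls back cleanly instead of leaving half a table behind. A sketch assuming psycopg2 on the Postgres side; the row formatter below is plain Python and handles NULLs and the tab/newline/backslash escaping COPY text format requires:)

```python
# Format one row as a line of PostgreSQL COPY text format.
# NULL becomes \N; backslash, tab, and newline are escaped.

def copy_line(row):
    def field(v):
        if v is None:
            return r"\N"
        s = str(v)
        return s.replace("\\", "\\\\").replace("\t", r"\t").replace("\n", r"\n")
    return "\t".join(field(v) for v in row) + "\n"

# Hypothetical loading step with psycopg2 (connection string and table
# name are placeholders). Exiting the `with` block commits on success
# and rolls back on any exception, so no partial data is left behind:
#   import io, psycopg2
#   with psycopg2.connect(pg_dsn) as conn, conn.cursor() as cur:
#       buf = io.StringIO("".join(copy_line(r) for r in source_rows))
#       cur.copy_expert('COPY "mytable" FROM STDIN', buf)
```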
Hope you can help.
Once I have the current situation under control I plan to introduce the
business to the Kimball approach.
On 7 March 2011 12:47, Chetan Suttraway wrote:
> On Wed, Feb 23, 2011 at 1:52 AM, Craig Barnes <cjbarnes18(at)gmail(dot)com>wrote:
>> I need a way to quickly import many large tables from various ODBC data
>> sources into PostgreSQL for reporting.
>> Ideally I am looking for a simple way to create all the tables to import
>> into, but a tool to take the source and do the whole job would be just as
>> good.
> Could you please elaborate on this use case?
> Chetan Sutrave
> Senior Software Engineer
> EnterpriseDB Corporation
> The Enterprise PostgreSQL Company
> Phone: +91.20.30589523
> Website: www.enterprisedb.com
> EnterpriseDB Blog: http://blogs.enterprisedb.com/
> Follow us on Twitter: http://www.twitter.com/enterprisedb