Re: Best practice to load a huge table from ORACLE to PG

From: Tino Wildenhain <tino(at)wildenhain(dot)de>
To: Adonias Malosso <malosso(at)gmail(dot)com>
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: Best practice to load a huge table from ORACLE to PG
Date: 2008-04-28 19:59:02
Message-ID: 48162C86.1040405@wildenhain.de
Lists: pgsql-performance

Adonias Malosso wrote:
> Hi All,
>
> I'd like to know what's the best practice to LOAD a 70 million row,
> 101 column table from ORACLE to PGSQL.
>
> The current approach is to dump the data to CSV and then COPY it into
> PostgreSQL.
>
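For reference, the plain route boils down to one COPY statement per
table. A minimal sketch (the table name, file path, and index here are
placeholders, not your actual schema); at this scale it usually pays to
load into an index-free table and build the indexes afterwards:

  BEGIN;
  -- Server-side COPY reads the CSV directly on the database host;
  -- the table name and path are made up for illustration.
  COPY big_table FROM '/data/big_table.csv' WITH CSV;
  COMMIT;

  -- Creating indexes after the load is typically much faster than
  -- maintaining them row by row during it.
  CREATE INDEX big_table_id_idx ON big_table (id);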
Uhm. 101 columns you say? Sounds interesting. There are data loaders
such as pgloader (http://pgfoundry.org/projects/pgloader/) which could
speed up loading the data compared to a plain COPY from CSV. I also
wonder how much normalizing the table could help.
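
In case it helps, pgloader 2.x is driven by an INI-style config with a
connection section plus one section per table. The sketch below is only
a rough outline from memory of the examples shipped with pgloader; the
section and key names (host, base, copy_every, table, format, filename,
field_sep, columns) as well as all names and paths are assumptions to
verify against the pgloader docs before use:

  ; hypothetical pgloader 2.x config -- key names are assumptions,
  ; names and paths are placeholders
  [pgsql]
  host = localhost
  port = 5432
  base = mydb
  user = loader
  pass = None

  ; batch rows into COPY chunks instead of row-by-row inserts
  copy_every = 10000

  [big_table]
  table     = big_table
  format    = csv
  filename  = /data/big_table.csv
  field_sep = ,
  ; map CSV field positions to columns; extend to all 101 columns
  columns   = id:1, col_2:2, col_3:3

If I remember right it is then run as "pgloader -c <that file>", but
again, check the docs.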

Tino
